Term | Markov process |
Definition | Markov process n. A stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous. Chinese: 马可夫过程. Also written Markoff process. |
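The Brownian motion mentioned in the definition can be made concrete with a short simulation sketch (not part of the original entry; the function name and parameters below are illustrative): the next state depends only on the current state, which is the Markov property, while the states themselves range over a continuum of real numbers rather than a finite set.

```python
import random

def simulate_markov_process(x0=0.0, steps=1000, dt=0.01, sigma=1.0):
    """Simulate standard Brownian motion on a discrete time grid.

    At each step the next state depends only on the current state
    (the Markov property); the state space is continuous (the reals).
    """
    x = x0
    path = [x]
    for _ in range(steps):
        # Gaussian increment with variance sigma^2 * dt, independent of the past
        x += random.gauss(0.0, sigma * dt ** 0.5)
        path.append(x)
    return path

if __name__ == "__main__":
    path = simulate_markov_process()
    print(f"final state after {len(path) - 1} steps: {path[-1]:.4f}")
```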