Word | Markov chain |
Definition | Markov chain n. A usu. discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved. (Statistics) Markov chain. |
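Below is a minimal Python sketch (an illustrative addition, not part of the dictionary entry) of the random walk the definition names as an example. It shows the Markov property in action: the distribution of the next position depends only on the current position, never on the path by which that position was reached.

```python
import random

def random_walk(n_steps: int, start: int = 0) -> list[int]:
    """Simulate a simple symmetric random walk on the integers.

    At each step the walker moves +1 or -1 with equal probability.
    Markov property: the next position is sampled from the current
    position alone, not from the history of earlier positions.
    """
    path = [start]
    for _ in range(n_steps):
        path.append(path[-1] + random.choice([-1, 1]))
    return path

if __name__ == "__main__":
    # Example run: a 10-step walk starting at the origin.
    print(random_walk(10))
```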