Term: Markov chain
Definition

Markov chain

n.

A usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of the various future states depend only on the present state of the system, or on the immediately preceding state, and not on the path by which the present state was reached.

(Statistics) 马可夫连锁 [Markov chain].
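The defining property above — that the next state depends only on the current state, not on the path taken to reach it — can be sketched in a few lines of Python. The states and transition probabilities below are illustrative, not from the source:

```python
import random

# Transition table: for each state, the possible next states and their
# probabilities. Note that nothing about the history is stored anywhere --
# only the current state is consulted, which is the Markov property.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng=random.random):
    """Pick the next state using only the current state."""
    r = rng()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # fallback in case of floating-point round-off

def walk(start, n):
    """Generate a path of n steps starting from `start`."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path
```

Calling `walk("sunny", 10)` produces a random 11-state path; because `step` looks only at `path[-1]`, the process is a (discrete-time, finite-state) Markov chain in the sense of the definition.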


The Qntrip online English translation dictionary contains 86,352 English entries, covering essentially all common English vocabulary with Chinese-English bilingual translations and usage notes, as a tool for English learners.

 

Copyright © 2000-2024 Qntrip.com All Rights Reserved
京ICP备2021023879号 Last updated: 2024/9/17 3:50:46