Markov chain
- n. [Statistics] Markov chain
Definition and translation of Markov chain:
English definition
Noun:
- a Markov process for which the parameter is discrete time values
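The definition above can be illustrated with a short sketch: a discrete-time Markov chain is a sequence of states where the next state depends only on the current one. The states and transition probabilities below are invented purely for the example.

```python
import random

# Illustrative two-state discrete-time Markov chain.
# The states and probabilities are made up for this example.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state; it depends only on the current state."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Return a path of n_steps transitions starting from `start`."""
    random.seed(seed)
    state = start
    path = [state]
    for _ in range(n_steps):
        state = step(state)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

The key property shown here is that `step` looks only at the current state, never at the history of the path.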
Usage and examples of Markov chain:
Example sentences
Used as a noun (n.)
- To analyze those algorithms, this paper proposes a new method that models point multiplication algorithms as a Markov chain.
- Based on the Markov chain, this paper investigates the time-variant system reliability of a brittle structure under multiple time-varying loads.