Word: Markov chain
Definition

Markov chain

English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process with the Markov property.
    • 2004 July 27, F. Keith Barker et al., “Phylogeny and diversification of the largest avian radiation”, in PNAS, page 11040, column 2:
      The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
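
The Markov property says that, given the present state, the future of the process is independent of the past: P(X_{n+1} = x | X_0, …, X_n) = P(X_{n+1} = x | X_n). As a minimal sketch of the definition above, the following Python snippet simulates a discrete-time Markov chain from a row-stochastic transition matrix; the two-state "weather" chain and the simulate function are illustrative assumptions, not part of the entry:

  import random

  # Hypothetical two-state chain: P[i][j] is the probability of
  # moving from state i to state j.
  states = ["sunny", "rainy"]
  P = [[0.9, 0.1],   # transitions from "sunny"
       [0.5, 0.5]]   # transitions from "rainy"

  def simulate(start, steps, seed=0):
      # The next state is drawn using only the current state's row
      # of P -- this is exactly the Markov property.
      rng = random.Random(seed)
      i = start
      path = [states[i]]
      for _ in range(steps):
          i = rng.choices(range(len(states)), weights=P[i])[0]
          path.append(states[i])
      return path

  print(simulate(start=0, steps=10))  # one random realization of the chain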

Hypernyms

  • chain

Hyponyms

  • discrete-time Markov chain

See also

  • Wikipedia article on Markov chains