2025-07-19

The English-English definition of "Markov"

Word: Markov
Definition: Markov, Math. |ˈmɑːkɒf|
Also Markoff.
[The name of Andrei Andreevich Markov (1856–1922), Russian mathematician, who investigated such processes.]
Markov process: any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at.

Markov chain: a Markov process in which there are a finite or countably infinite number of possible states, or in which transitions between states occur at discrete intervals of time; also, one for which in addition the transition probabilities are constant (independent of time).

Also Markov property, the characteristic property of Markov processes.
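The defining condition can be stated compactly. The formalization below is a standard one and not part of the original entry; X_n denotes the state at time n, and p_ij the probability of a one-step transition from state i to state j:

    P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)

For a chain with stationary transition probabilities, this conditional probability is in addition a constant p_ij, independent of n.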
1939 Jap. Jrnl. Math. XVI. 47 (heading): Markoff process with an enumerable infinite number of possible states.
1942 Trans. Amer. Math. Soc. LII. 37: Then p_ij(t) can be considered a transition probability of a Markoff chain: a system is supposed which can assume various numbered states, and p_ij(t) is the probability that the system is in the jth state at the end of a time interval of length t, if it was in the ith state at the beginning of the interval.
1950 W. Feller, Introd. Probability Theory I. xv. 337: A Markov process is the probabilistic analogue of the processes of classical mechanics, where the future development is completely determined by the present state and is independent of the way in which the present state has developed.
Ibid. 337: A definition of the Markov property.
1953 J. L. Doob, Stochastic Processes v. 170: A Markov chain is defined as a Markov process..whose random variables can (with probability 1) only assume values in a certain finite or denumerably infinite set. The set is usually taken, for convenience, to be the integers 1, …, N (finite case) or the integers 1, 2, … (infinite case).
Ibid. 186: The problem of card mixing is a good example of the application of Markov chains.
1953 J. B. Carroll, Study of Lang. iii. 85: A Markoff process has to do with the different 'states' into which a phenomenon can get, and the statistical probabilities which govern the transition of the phenomenon from one state to another.
1956 Nature 4 Feb. 207/1: The most simple case is when all the atoms of the assembly are supposed to have no volume and no interactions (such as in an ideal gas). In that case it can be treated as a Markov process.
1960 Kemeny & Snell, Finite Markov Chains ii. 25: A finite Markov chain is a finite Markov process such that the transition probabilities p_ij(n) do not depend on n.
1962 J. Riordan, Stochastic Service Syst. iii. 28: The simplest infinite-server system is unique among its fellows in the possession of the Markov property that future changes are independent of the past.
1966 S. Karlin, First Course in Stochastic Processes ii. 27: A discrete time Markov chain {X_n} is a Markov stochastic process whose state space is a countable or finite set, and for which T = (0, 1, 2, …).
Ibid.: The vast majority of Markov chains that we shall encounter have stationary transition probabilities.
1968 P. A. P. Moran, Introd. Probability Theory iii. 140: Thus a Markov chain observed in the reverse direction of time will be a Markov process. However, it will not in general be a Markov chain because the observed transition probabilities will not be independent of t.
1973 Manch. Sch. Econ. & Social Stud. XLI. 401 (heading): A Markov chain model of the benefits of participating in government training schemes.
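The quotations above repeatedly invoke transition probabilities p_ij and chains whose transition probabilities are stationary (Kemeny & Snell, Karlin). As an illustration only, here is a minimal Python sketch of such a chain; the states, matrix values, and function names are invented for demonstration and do not come from the entry:

```python
# Minimal sketch (illustrative, not from the dictionary entry): a finite
# Markov chain with stationary transition probabilities.
import random

# P[i][j] = probability of a one-step transition from state i to state j.
# Each row sums to 1, as required of transition probabilities.
P = [
    [0.7, 0.2, 0.1],  # from state 0
    [0.3, 0.4, 0.3],  # from state 1
    [0.2, 0.3, 0.5],  # from state 2
]

def step(state):
    """Sample the next state given only the current one (the Markov property)."""
    return random.choices(range(len(P)), weights=P[state], k=1)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps transitions starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    random.seed(0)
    print(simulate(start=0, n_steps=20))
```

Because the weights passed to random.choices depend only on the current state, the sampled future is independent of how that state was reached, which is precisely the property the definition above describes.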