Q: What is the difference between a Markov process and a semi-Markov process?
Best Answer

A continuous-time stochastic process is called a semi-Markov process (or 'Markov renewal process') if the embedded jump chain (the discrete process recording which states the process visits) is a Markov chain, and the holding times (the times between jumps) are random variables that may follow any distribution, where that distribution may depend on the two states between which the jump is made. A semi-Markov process in which all the holding times are exponentially distributed is a continuous-time Markov chain (continuous-time Markov process).
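
A minimal Python sketch of this distinction (the two-state chain, its transition probabilities and the holding-time distributions below are made up for illustration, not part of the answer): both processes jump according to the same embedded Markov chain; only the continuous-time Markov chain restricts the holding times to be exponential.

```python
import random

# Embedded jump chain: transition probabilities between two states, A and B.
P = {"A": [("A", 0.3), ("B", 0.7)],
     "B": [("A", 0.6), ("B", 0.4)]}

def next_state(state):
    """Draw the next state of the embedded (discrete) jump chain."""
    r, acc = random.random(), 0.0
    for s, p in P[state]:
        acc += p
        if r < acc:
            return s
    return P[state][-1][0]

def simulate(holding_time, jumps=5, start="A"):
    """Simulate a jump process; only the holding-time law differs."""
    t, state = 0.0, start
    path = [(t, state)]
    for _ in range(jumps):
        t += holding_time(state)      # wait a random time in the current state
        state = next_state(state)     # then jump according to the Markov chain
        path.append((round(t, 3), state))
    return path

# Continuous-time Markov chain: holding times are exponential.
print(simulate(lambda s: random.expovariate(1.0)))

# Semi-Markov process: holding times may follow any distribution,
# e.g. uniform on [0.5, 2.0] here.
print(simulate(lambda s: random.uniform(0.5, 2.0)))
```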

Continue Learning about Engineering

An organization would use Markov analysis to track the?

Internal labour supply.


What are transient and steady-state conditions in Markov chains?

The transient condition describes the chain's behaviour during its first steps, while the state probabilities are still changing and depend on the initial state. The steady-state condition is the long-run equilibrium: a probability distribution over the states that no longer changes from one step to the next and, for a regular chain, does not depend on where the chain started.
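
As a rough illustration (using a made-up two-state transition matrix), repeatedly multiplying the state distribution by the transition matrix shows the transient distributions settling into the steady state:

```python
# Hypothetical 2-state transition matrix (each row sums to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One transition: new_j = sum_i dist_i * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]            # start entirely in state 0
for _ in range(50):          # transient phase: the distribution is still moving
    dist = step(dist, P)

print(dist)                  # converges to the steady state [5/6, 1/6]
```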


What is Markov theory?

I will rephrase your question as "What is a finite Markov chain process?" If this is not your question, please resubmit it. While researching this question, I found some interesting information on the Markovs, since Andrey Markov's brother and his brother's son also contributed to mathematics; see http://en.wikipedia.org/wiki/Andrey_Markov. How much of Markov chain theory actually originated with Andrey, rather than with later generations of Markovs and others, is a good question. Commonly, theories are named after famous mathematicians, for example Bayes, Gauss and Bernoulli.

A Markov chain is a model of a process with a finite number of identifiable states. The system changes in discrete steps, and only the current state affects the next state. A concrete example: I'm hungry and standing in line at McDonald's. Suppose there are a finite number of choices, say 1 to 12 (yes, I know this is not true, but just suppose it). Now suppose that my choice is not completely random but is affected by what the person ahead of me is choosing. If the person ahead of me chooses a number 1, I have a higher chance of choosing a number 1 (say 50%) and a lower chance of choosing any other option. I could run a simulation in which thousands of customers are served and estimate the long-run probability of each of the 12 items being ordered. The state of the system is the customer's selection, and the changes are governed by an unchanging set of probability rules.

More generally, we have a fixed set of rules (conditional probabilities) that, given what the last customer ordered, determine the probabilities for the next customer. This is called a transition matrix of probabilities. You can find more examples at http://en.wikipedia.org/wiki/Examples_of_Markov_chains, which cites the board game Monopoly as an example of a Markov chain, because the square you are currently on determines the probabilities for the "future state", the next square on which you will land.
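
A hedged sketch of the simulation described above, keeping the answer's own made-up assumptions (12 menu items, a 50% chance of repeating the previous customer's order):

```python
import random

N_ITEMS = 12                      # the 12 hypothetical menu items

def transition_probs(prev_choice):
    """50% chance of repeating the previous customer's order,
    the remaining 50% spread evenly over the other items."""
    probs = [0.5 / (N_ITEMS - 1)] * N_ITEMS
    probs[prev_choice] = 0.5
    return probs

def simulate(customers=100_000):
    counts = [0] * N_ITEMS
    choice = random.randrange(N_ITEMS)        # first customer chooses at random
    for _ in range(customers):
        weights = transition_probs(choice)
        choice = random.choices(range(N_ITEMS), weights=weights)[0]
        counts[choice] += 1
    return [round(c / customers, 3) for c in counts]

print(simulate())   # by symmetry, every item settles near 1/12 ≈ 0.083
```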


Related questions

What is the difference between Markov models and hidden Markov models?

Unlike in a Markov model (MM), in a hidden Markov model (HMM) the state is not observed directly; only outputs (emissions) whose probabilities depend on the hidden state are observed.
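
A small illustrative sketch (the weather states and the transition and emission probabilities are invented for illustration): in a plain Markov model the state sequence itself is the observed data, while in an HMM only the emissions are seen.

```python
import random

# Hypothetical hidden states and emission model.
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit  = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
         "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

def draw(dist):
    return random.choices(list(dist), weights=list(dist.values()))[0]

state, hidden, observed = "Sunny", [], []
for _ in range(5):
    state = draw(trans[state])           # the Markov chain over states
    hidden.append(state)
    observed.append(draw(emit[state]))   # emissions that depend on the state

print(hidden)    # directly observable in a plain Markov model
print(observed)  # the only data available when working with an HMM
```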


What are the differences between Ivan Markov and Colonel?

Ivan Markov was a Colonel.


What has the author A. A. Markov written?

A. A. Markov has written: 'Differenzenrechnung' -- subject(s): Difference equations, Interpolation; 'The Correspondence between A.A. Markov and A.A. Chuprov on the Theory of Probability and Mathematical Statistics' -- subject(s): Correspondence, Mathematical statistics, Mathematicians, Probabilities; 'Izbrannye trudy' -- subject(s): Bibliography, Number theory, Probabilities


What is Markov analysis in the decision-making process in management?

Markov analysis is a method of analyzing the current behavior of some variable in an effort to predict the future behavior of that same variable.
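
As a hypothetical sketch (the job categories and transition probabilities are invented, not taken from the answer), Markov analysis repeatedly applies a transition matrix to the current state of the variable, here an internal labour supply, to project its future behaviour:

```python
# Hypothetical yearly transition probabilities between job categories
# (each row sums to 1); the numbers are invented for illustration.
P = {
    "Junior": {"Junior": 0.6, "Senior": 0.3, "Exit": 0.1},
    "Senior": {"Junior": 0.0, "Senior": 0.8, "Exit": 0.2},
    "Exit":   {"Junior": 0.0, "Senior": 0.0, "Exit": 1.0},
}

def forecast(headcount, years=3):
    """Apply the transition matrix once per year to project headcounts."""
    for _ in range(years):
        new = dict.fromkeys(headcount, 0.0)
        for src, n in headcount.items():
            for dst, p in P[src].items():
                new[dst] += n * p
        headcount = new
    return headcount

print(forecast({"Junior": 100, "Senior": 50, "Exit": 0}))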


What has the author H Hirschmann written?

H. Hirschmann has written: 'The semi-Markov process, generalizations and calculation rules for application in the analysis of systems' -- subject(s): Markov processes


What is the birth name of Aleksandr Markov?

Aleksandr Markov's birth name is Aleksandr Vladimirovitch Markov.


What is the birth name of Andrei Markov?

Andrei Markov's birth name is Andrei Viktorovich Markov.


What is the birth name of Evgeniy Markov?

Evgeniy Markov's birth name is Yevgeniy Lvovitch Markov.


What is the birth name of Leonid Markov?

Leonid Markov's birth name is Markov, Leonid Vasilyevich.


How can you use Markov analysis?

Markov analysis can be used wherever a variable moves between a fixed set of states with stable transition probabilities, for example forecasting an organization's internal labour supply, projecting market shares among competing brands, or modelling equipment that moves between working and breakdown states.


When did Evgeniy Markov die?

Evgeniy Markov died in 1991.


How tall is Alexei Markov?

Alexei Markov is 5' 8".