A Markov chain is a mathematical model of a sequence of events or states that describes the probability of events that might occur in the future based only on the current state, not on the previous states. In other words, it is a memoryless process.
A Markov chain consists of a set of discrete states. Each state has its own probability of transitioning to every other state. This can be represented by a weighted directed graph or by a transition matrix.
Let's consider the following Markov chain model represented as a directed graph.
The figure indicates that the probability of transitioning from A to C is 0.3, from B to A is 0.4, and so on. Note that all outgoing probabilities from a single state add up to 1. From this graph, we can build the transition matrix, which is a component of the Markov chain model.
As illustrated above, each row of the matrix indicates the state being transitioned from, while each column indicates the state being transitioned to.
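To make this layout concrete, here is a minimal sketch in Python with NumPy. Only the values quoted in this section (A to C = 0.3, B to A = 0.4, and A to B = 0.5 from the worked example below) come from the figure; the remaining entries are assumed placeholders chosen only so that each row sums to 1.

```python
import numpy as np

# States, in the order used for the rows and columns of the matrix.
states = ["A", "B", "C"]

# Transition matrix P: P[i][j] is the probability of moving from
# states[i] to states[j]. The A row and the B->A entry follow the
# values quoted in the text; the other entries are assumptions.
P = np.array([
    [0.2, 0.5, 0.3],  # from A (A->A derived so the row sums to 1)
    [0.4, 0.3, 0.3],  # from B (last two entries assumed)
    [0.5, 0.3, 0.2],  # from C (entire row assumed)
])

# Every row of a valid transition matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```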
Another component of a Markov chain is the initial state vector, which represents the initial probabilities of all the states.
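For illustration, assuming the chain starts in state A with certainty, the initial state vector could be written as:

```python
import numpy as np

# Initial state vector (order: A, B, C). Starting in A with
# probability 1 is an assumption made for this example.
S = np.array([1.0, 0.0, 0.0])
```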
To calculate the probability of transitioning from one state to another in n steps, we multiply the initial state vector S by the transition matrix P raised to the power n.
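A small sketch of this formula, assuming S and P are NumPy arrays like the ones defined above:

```python
import numpy as np

def n_step_distribution(S, P, n):
    """Return S multiplied by P raised to the power n:
    the state distribution after n steps."""
    return S @ np.linalg.matrix_power(P, n)
```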
Considering the transition matrix given above, let's calculate the probability of transitioning from A to B in one step.
Thus, the probability of transitioning from A to B is 0.5 in one step.
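With the partly assumed matrix from the earlier sketch, this one-step result can be checked as follows (starting in state A corresponds to the initial vector [1, 0, 0]):

```python
import numpy as np

# Transition matrix from the earlier sketch (rows B and C are assumed).
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.3, 0.3],
    [0.5, 0.3, 0.2],
])
S = np.array([1.0, 0.0, 0.0])       # start in state A with certainty

one_step = S @ np.linalg.matrix_power(P, 1)
print(one_step[1])                  # 0.5 -> probability of ending in B
```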
Practice
Calculate the probability of transitioning from A to C in two steps.
A Markov chain is a means of calculating the probability that an event will occur. The transition matrix and the initial state vector are its two main components. Markov chains can be used to estimate traffic flows, model problems in genetics, and analyze computer networks.