When a Markov process is discrete-valued (i.e., has a discrete state space), it is called a Markov chain. Markov chains can be further divided into continuous-parameter (or continuous-time) Markov chains and discrete-parameter (or discrete-time) Markov chains, depending on whether the parameter (time) is continuous or discrete. The terms random process and stochastic process are synonymous.
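As a minimal sketch of the discrete-parameter case, the snippet below simulates a discrete-time Markov chain on a finite state space. The two-state example and its transition probabilities are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Hypothetical two-state chain: a discrete state space with an assumed
# transition matrix P, where P[i, j] = Pr(next state = j | current state = i).
states = ["state_0", "state_1"]
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def simulate_chain(P, start, n_steps, rng=None):
    """Return a sample path of state indices of length n_steps + 1."""
    rng = np.random.default_rng() if rng is None else rng
    path = [start]
    for _ in range(n_steps):
        # The next state depends only on the current state (Markov property).
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

path = simulate_chain(P, start=0, n_steps=10, rng=np.random.default_rng(42))
print([states[i] for i in path])
```

A continuous-parameter (continuous-time) chain would instead draw exponentially distributed holding times between jumps; the discrete-time version above only steps at integer times.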