The time-continuous, discrete-state Markov process is a model for rating transitions. A single parameter, namely the intensity of migrating to an adjacent rating state, endows the ordinal rating scale with an intuitive metric. State-specific intensities generalize this state-stationary model. When the Markov processes are observed under a multiplicative intensity model, the maximum likelihood parameter estimators for both models can be written as martingale transforms of the processes counting transitions between the rating states. A Taylor expansion establishes consistency and asymptotic normality of the parameter estimates, yielding a chi-square-distributed likelihood ratio test of state-stationarity against the state-specific model. The result extends to time-stationarity. Simulations contrast the asymptotic results with finite-sample behavior. An application to a sufficiently large set of credit rating histories shows that the one-parameter model can be a good starting point.
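As a minimal sketch of the one-parameter estimator, the following simulation assumes a chain in which each state migrates to each adjacent state with the same intensity lam; under the multiplicative intensity model the maximum likelihood estimator is the total number of observed transitions divided by the total (neighbour-weighted) time at risk. The setup, parameter values, and function name are illustrative assumptions, not taken from the paper.

```python
import random

def simulate_mle(lam=0.5, n_firms=2000, horizon=5.0, n_states=5, seed=1):
    """Simulate rating histories from a one-parameter adjacent-migration
    model and return the counting-process MLE
    lam_hat = N(T) / integral of Y(s) ds,
    where N counts transitions and Y is the neighbour-weighted exposure."""
    rng = random.Random(seed)
    n_trans = 0
    exposure = 0.0  # sum over firms of integral of (number of neighbours) dt
    for _ in range(n_firms):
        state, t = rng.randrange(n_states), 0.0
        while True:
            k = 1 if state in (0, n_states - 1) else 2  # adjacent states
            wait = rng.expovariate(k * lam)  # total out-migration rate k*lam
            if t + wait >= horizon:
                exposure += k * (horizon - t)  # censored at the horizon
                break
            exposure += k * wait
            t += wait
            n_trans += 1
            # boundary states have one neighbour; interior states move
            # up or down with equal probability
            if state == 0:
                state = 1
            elif state == n_states - 1:
                state = n_states - 2
            else:
                state += rng.choice((-1, 1))
    return n_trans / exposure
```

With many firms the estimate concentrates near the true lam, while small samples show the finite-sample deviations the abstract alludes to.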