### EstiHMM: heat plots for the number of maximal sequences

#### by gertekoo

In a previous post, I mentioned an efficient algorithm for predicting the maximal state sequences for a given output sequence in an imprecise hidden Markov model (iHMM). Jasper De Bock and I have since given this algorithm a name, *EstiHMM*, and have written an implementation in Python that we intend to make public as soon as possible, also via this channel. We also hope to be able to present our work at the coming ISIPTA’11 conference in Innsbruck.

We now know that EstiHMM’s complexity is cubic in the number of states for the hidden variables, quadratic in the number of hidden variables, and linear in the number of maximal sequences. This is comparable to Viterbi’s algorithm, if we take into account that Viterbi resolves ties arbitrarily, something we are not allowed to do for iHMMs.
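For readers who want the precise-model baseline in front of them: here is a minimal Viterbi sketch in Python (our own illustration, not the EstiHMM code), in log space to avoid underflow. Note the `argmax` calls: this is exactly where Viterbi breaks ties arbitrarily, which an iHMM is not allowed to do.

```python
import numpy as np

def viterbi(obs, pi, T, E):
    """Most probable hidden state sequence for a *precise* HMM.

    obs: list of observation indices
    pi:  (k,) initial state distribution
    T:   (k, k) transition matrix, T[i, j] = P(next = j | current = i)
    E:   (k, m) emission matrix, E[i, o] = P(output = o | state = i)
    """
    k, n = len(pi), len(obs)
    logp = np.log(pi) + np.log(E[:, obs[0]])   # log space avoids underflow
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        scores = logp[:, None] + np.log(T)     # scores[i, j]: best path ending i -> j
        back[t] = scores.argmax(axis=0)        # ties broken arbitrarily here
        logp = scores.max(axis=0) + np.log(E[:, obs[t]])
    path = [int(logp.argmax())]                # best final state
    for t in range(n - 1, 0, -1):              # backtrack through the pointers
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```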

While linear complexity in the number of maximal sequences is probably as good as it gets, it also means we can only hope to find all maximal sequences efficiently when their number is reasonably small. Should that number, say, tend to increase exponentially with the length of the chain, then no algorithm, however cleverly designed, could overcome this hurdle.


Because this number of maximal sequences is so important, we decided to study its behaviour in more detail.

In a preliminary analysis, we used Monte Carlo methods to study the distribution of the number of maximal elements for randomly chosen transition and emission models. We found that the expected number of maximal elements is typically rather small, but its variance can become quite large with increasing imprecision, indicating, roughly, that high numbers of maximal elements occur only in restricted regions of (imprecise) probability space.
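The model-sampling step of such a Monte Carlo study is easy to sketch: draw each row of the transition and emission matrices uniformly from the probability simplex, i.e. from a Dirichlet(1, …, 1) distribution. (This sketch covers only the sampling; the maximal-sequence count for each sampled model would come from EstiHMM itself, which we are not reproducing here.)

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def random_hmm(n_states=2, n_outputs=2):
    """Draw a random precise HMM: the initial distribution and each row of
    the transition and emission matrices are sampled uniformly from the
    simplex, i.e. from a flat Dirichlet distribution."""
    pi = rng.dirichlet(np.ones(n_states))
    T = rng.dirichlet(np.ones(n_states), size=n_states)
    E = rng.dirichlet(np.ones(n_outputs), size=n_states)
    return pi, T, E
```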

To get a more definite picture, we decided to concentrate on *binary* hidden Markov chains, where the state variables have two possible values, say 0 and 1, and so do the output variables. This means that a precise transition matrix for going from one state to the next is completely determined by two values: the probability p₀₁ of going from state 0 to state 1, and the probability p₁₀ of going from state 1 to state 0. We can find an imprecise model by contamination: taking convex mixtures of such a precise model, with mixture coefficient 1 − ε, and the vacuous model, with mixture coefficient ε. As ε ranges from zero to one, we go from a precise model to a vacuous one.
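In concrete terms (a small sketch of our own, writing the two states as 0 and 1 and ε for the contamination coefficient), ε-contamination turns each precise transition probability into an interval:

```python
def contaminate(p, eps):
    """Interval [lower, upper] for a transition probability p under
    eps-contamination: the convex mixture (1 - eps) * precise + eps * vacuous.
    The vacuous model lets the contaminated mass eps go anywhere, so the
    probability ranges from (1 - eps) * p to (1 - eps) * p + eps."""
    return ((1 - eps) * p, (1 - eps) * p + eps)
```

At ε = 0 the interval collapses to the precise value; at ε = 1 it becomes the vacuous interval [0, 1], whatever p was.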

We kept the emission model fixed and precise, and for a fixed value of the output sequence and of ε, we allowed the two transition probabilities to range over the unit square, on a grid of size 1000 by 1000. For each grid point, we used the EstiHMM algorithm to calculate the corresponding number of maximal state sequences. The results can be represented quite efficiently and elegantly in a heat plot.
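The sweep itself is straightforward to set up. Here is a sketch: `count_maximal`, and the names `p01`, `p10`, are ours, with `count_maximal` a dummy stand-in for the EstiHMM call (the real implementation is not yet public), on a coarser grid than the 1000-by-1000 one we actually used.

```python
import numpy as np

def count_maximal(p01, p10, eps, outputs):
    """Stand-in for the EstiHMM call: it should return the number of maximal
    state sequences of the binary iHMM with transition probabilities
    (p01, p10), contamination eps, and the given output sequence.
    A dummy value here, so the sweep below runs end to end."""
    return 1

def heat_grid(eps, outputs, n=100):
    """Counts of maximal sequences over an n-by-n grid of the unit square
    of transition probabilities (p01, p10)."""
    ps = np.linspace(0.0, 1.0, n)
    return np.array([[count_maximal(p01, p10, eps, outputs) for p01 in ps]
                     for p10 in ps])

grid = heat_grid(eps=0.1, outputs=[0, 1, 1, 0], n=100)

try:  # render as a heat plot if matplotlib is available
    import matplotlib
    matplotlib.use("Agg")  # off-screen backend
    import matplotlib.pyplot as plt
    plt.imshow(grid, origin="lower", extent=(0, 1, 0, 1))
    plt.xlabel("p01"); plt.ylabel("p10")
    plt.colorbar(label="number of maximal sequences")
    plt.savefig("heatplot.png")
except ImportError:
    pass  # plotting is optional for the sweep itself
```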

Here are a few of the more spectacular ones:

To see how the number of maximal elements evolves as the imprecision increases, take a look at the following animation:

We see that, even for appreciable imprecision, there are large regions of transition probability space where the number of maximal elements remains fairly small. The plots display quite interesting and intricate behaviour. And the nice thing is: **we can explain it**, and even predict it to some extent. But for this, you’ll have to wait until we see each other at the coming ISIPTA meeting. Looking forward to that!

Update: Call us fickle, but we have decided to rename the algorithm iHMMPredict to EstiHMM. It just tastes better.
