Continuous Emission Hidden Markov Models#
AUTHOR:
William Stein, 2010-03
- class sage.stats.hmm.chmm.GaussianHiddenMarkovModel[source]#
Bases:
HiddenMarkovModel
Gaussian emissions Hidden Markov Model.
INPUT:
- A – matrix; the \(N \times N\) transition matrix
- B – list of pairs (mu, sigma) that define the distributions
- pi – initial state probabilities
- normalize – bool (default: True)
EXAMPLES:
We illustrate the primary functions with an example 2-state Gaussian HMM:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,1), (-1,1)],
....:                                   [.5,.5]); m
Gaussian Hidden Markov Model with 2 States
Transition matrix:
[0.1 0.9]
[0.5 0.5]
Emission parameters:
[(1.0, 1.0), (-1.0, 1.0)]
Initial probabilities: [0.5000, 0.5000]
We query the defining transition matrix, emission parameters, and initial state probabilities:
sage: m.transition_matrix()
[0.1 0.9]
[0.5 0.5]
sage: m.emission_parameters()
[(1.0, 1.0), (-1.0, 1.0)]
sage: m.initial_probabilities()
[0.5000, 0.5000]
We obtain a sample sequence with 5 entries in it, and compute the logarithm of the probability of obtaining this sequence, given the model:
sage: obs = m.sample(5); obs  # random
[-1.6835, 0.0635, -2.1688, 0.3043, -0.3188]
sage: log_likelihood = m.log_likelihood(obs)
sage: counter = 0
sage: n = 0
sage: def add_samples(i):
....:     global counter, n
....:     for _ in range(i):
....:         n += 1
....:         obs2 = m.sample(5)
....:         if all(abs(obs2[i] - obs[i]) < 0.25 for i in range(5)):
....:             counter += 1
sage: add_samples(10000)
sage: while abs(log_likelihood - log(counter*1.0/n/0.5^5)) > 0.1:
....:     add_samples(10000)
We compute the Viterbi path, and the logarithm of the probability that the given path of states produced obs:
sage: m.viterbi(obs)  # random
([1, 0, 1, 0, 1], -8.714092684611794)
We use the Baum-Welch iterative algorithm to find another model for which our observation sequence is more likely:
sage: try:
....:     p, s = m.baum_welch(obs)
....:     assert p > log_likelihood
....:     assert (1 <= s <= 500)
....: except RuntimeError:
....:     pass
Notice that running Baum-Welch changed our model:
sage: m  # random
Gaussian Hidden Markov Model with 2 States
Transition matrix:
[ 0.4154981366185841   0.584501863381416]
[ 0.9999993174253741 6.825746258991804e-07]
Emission parameters:
[(0.4178882427119503, 0.5173109664360919), (-1.5025208631331122, 0.5085512836055119)]
Initial probabilities: [0.0000, 1.0000]
- baum_welch(obs, max_iter=500, log_likelihood_cutoff=0.0001, min_sd=0.01, fix_emissions=False, v=False)[source]#
Given an observation sequence obs, improve this HMM using the Baum-Welch algorithm to increase the probability of observing obs.
INPUT:
- obs – a time series of emissions
- max_iter – integer (default: 500); maximum number of Baum-Welch steps to take
- log_likelihood_cutoff – positive float (default: 1e-4); the minimal improvement in log likelihood, relative to the log likelihood of the last iteration, required to continue
- min_sd – positive float (default: 0.01); when reestimating, the standard deviation of emissions is not allowed to be less than min_sd
- fix_emissions – bool (default: False); if True, do not change emissions when updating
OUTPUT:
Changes the model in place; returns the log likelihood and the number of iterations.
EXAMPLES:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: m.log_likelihood([-2,-1,.1,0.1])
-8.858282215986275
sage: m.baum_welch([-2,-1,.1,0.1])
(4.534646052182..., 7)
sage: m.log_likelihood([-2,-1,.1,0.1])
4.534646052182...
sage: m  # rel tol 3e-14
Gaussian Hidden Markov Model with 2 States
Transition matrix:
[ 0.9999999992430161 7.569839394440382e-10]
[0.49998462791192644  0.5000153720880736]
Emission parameters:
[(0.09999999999999999, 0.01), (-1.4999508147591902, 0.5000710504895474)]
Initial probabilities: [0.0000, 1.0000]
We illustrate bounding the standard deviation below. Note that above we had different emission parameters when min_sd was the default of 0.01:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: m.baum_welch([-2,-1,.1,0.1], min_sd=1)
(-4.07939572755..., 32)
sage: m.emission_parameters()
[(-0.2663018798..., 1.0), (-1.99850979..., 1.0)]
We watch the log likelihoods of the model converge, step by step:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: v = m.sample(10)
sage: l = stats.TimeSeries([m.baum_welch(v, max_iter=1)[0]
....:                       for _ in range(len(v))])
sage: all(l[i] <= l[i+1] + 0.0001 for i in range(9))
True
sage: l  # random
[-20.1167, -17.7611, -16.9814, -16.9364, -16.9314, -16.9309, -16.9309, -16.9309, -16.9309, -16.9309]
We illustrate fixing emissions:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.9,.1]],
....:                                   [(1,2),(-1,.5)],
....:                                   [.3,.7])
sage: set_random_seed(0); v = m.sample(100)
sage: m.baum_welch(v, fix_emissions=True)
(-164.72944548204..., 23)
sage: m.emission_parameters()
[(1.0, 2.0), (-1.0, 0.5)]
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.9,.1]],
....:                                   [(1,2),(-1,.5)],
....:                                   [.3,.7])
sage: m.baum_welch(v)
(-162.854370397998..., 49)
sage: m.emission_parameters()  # rel tol 3e-14
[(1.2722419172602375, 2.371368751761901), (-0.9486174675179113, 0.5762360385123765)]
- emission_parameters()[source]#
Return the parameters that define the normal distributions associated to all of the states.
OUTPUT:
a list B of pairs B[i] = (mu, std), such that the distribution associated to state \(i\) is normal with mean mu and standard deviation std.
EXAMPLES:
sage: M = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: M.emission_parameters()
[(1.0, 0.5), (-1.0, 3.0)]
- generate_sequence(length, starting_state=None)[source]#
Return a sample of the given length from this HMM.
INPUT:
- length – positive integer
- starting_state – int (or None); if specified, generate a sequence using this model starting with the given state, instead of using the initial probabilities to determine the starting state
OUTPUT:
a pair consisting of:
- a TimeSeries of emissions
- an IntList giving the corresponding sequence of hidden states
EXAMPLES:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: m.generate_sequence(5)  # random
([-3.0505, 0.5317, -4.5065, 0.6521, 1.0435], [1, 0, 1, 0, 1])
sage: m.generate_sequence(0)
([], [])
sage: m.generate_sequence(-1)
Traceback (most recent call last):
...
ValueError: length must be nonnegative
Verify numerically that the starting state is 0 with probability about 0.1:
sage: counter = 0
sage: n = 0
sage: def add_samples(i):
....:     global counter, n
....:     for _ in range(i):
....:         n += 1
....:         if m.generate_sequence(1)[1][0] == 0:
....:             counter += 1
sage: add_samples(10^5)
sage: while abs(counter*1.0 / n - 0.1) > 0.01:
....:     add_samples(10^5)
Example in which the starting state is 0 (see Issue #11452):
sage: set_random_seed(23); m.generate_sequence(2)
([0.6501, -2.0151], [0, 1])
Force a starting state of 1 even though as we saw above it would be 0:
sage: set_random_seed(23); m.generate_sequence(2, starting_state=1)
([-3.1491, -1.0244], [1, 1])
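The generation procedure is simple to sketch in plain Python. The following is an illustrative reimplementation, not the Sage code; the name `sample_hmm` and the use of the standard-library `random` module are our own:

```python
import random

def sample_hmm(A, B, pi, length, seed=None):
    """Generate (emissions, states) from a Gaussian HMM.

    A  -- transition matrix as a list of rows
    B  -- list of (mu, sigma) emission parameters, one per state
    pi -- initial state probabilities
    """
    rng = random.Random(seed)

    def draw(probs):
        # Sample an index according to the probability vector probs.
        r, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if r < acc:
                return i
        return len(probs) - 1

    states, emissions = [], []
    state = draw(pi)
    for _ in range(length):
        states.append(state)
        mu, sigma = B[state]
        emissions.append(rng.gauss(mu, sigma))  # emit from this state's normal
        state = draw(A[state])                  # move to the next hidden state
    return emissions, states
```

With A = [[.1,.9],[.5,.5]], B = [(1,.5),(-1,3)] and pi = [.1,.9] this mirrors what m.generate_sequence(5) does, though the random streams differ.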
- log_likelihood(obs)[source]#
Return the logarithm of a continuous analogue of the probability that this model produced the given observation sequence.
Note that the “continuous analogue of the probability” above can be bigger than 1, hence the logarithm can be positive.
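To see this concretely: a normal density with a small standard deviation exceeds 1 near its mean, so its logarithm is positive there. A plain Python check (`normal_pdf` is our own helper, not part of Sage):

```python
from math import pi, sqrt, exp

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma) at x; this is a density, not a probability.
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

value = normal_pdf(0, 0, 0.1)   # density at the mean for sigma = 0.1
assert value > 1                # about 3.9894, so log(value) > 0
```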
INPUT:
- obs – sequence of observations
OUTPUT:
float
EXAMPLES:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: m.log_likelihood([1,1,1])
-4.297880766072486
sage: s = m.sample(20)
sage: -80 < m.log_likelihood(s) < -20
True
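The quantity computed here can be sketched with the standard forward algorithm, which sums the emission densities over all hidden paths. This is an illustrative pure-Python version under our own names; a production implementation (including Sage's) rescales at each step to avoid underflow on long sequences:

```python
from math import pi as PI, sqrt, exp, log

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma) at x.
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * PI))

def forward_log_likelihood(A, B, pi, obs):
    # alpha[i] = density of emitting obs[:t+1] and ending in state i.
    alpha = [pi[i] * normal_pdf(obs[0], *B[i]) for i in range(len(pi))]
    for x in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(len(alpha)))
                 * normal_pdf(x, *B[i]) for i in range(len(alpha))]
    return log(sum(alpha))
```

For a one-state model with emission N(0,1) and observation [0], this returns log(1/sqrt(2*pi)), about -0.9189: the log-density of a standard normal at its mean.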
- viterbi(obs)[source]#
Determine “the” hidden sequence of states that is most likely to produce the given sequence obs of observations, along with the probability that this hidden sequence actually produced the observations.
INPUT:
- obs – sequence of emissions
OUTPUT:
- list – “the” most probable sequence of hidden states, i.e., the Viterbi path
- float – log of the probability that the observed sequence was produced by the Viterbi sequence of states
EXAMPLES:
We find the optimal state sequence for a given model:
sage: m = hmm.GaussianHiddenMarkovModel([[0.5,0.5],[0.5,0.5]],
....:                                   [(0,1),(10,1)],
....:                                   [0.5,0.5])
sage: m.viterbi([0,1,10,10,1])
([0, 0, 1, 1, 0], -9.0604285688230...)
Another example in which the most likely states change based on the last observation:
sage: m = hmm.GaussianHiddenMarkovModel([[.1,.9],[.5,.5]],
....:                                   [(1,.5), (-1,3)],
....:                                   [.1,.9])
sage: m.viterbi([-2,-1,.1,0.1])
([1, 1, 0, 1], -9.61823698847639...)
sage: m.viterbi([-2,-1,.1,0.3])
([1, 1, 1, 0], -9.566023653378513)
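The dynamic programming behind this method can be sketched in plain Python. The names below (`viterbi_path`, `log_normal_pdf`) are our own, and this is an illustrative version, not the Sage implementation; working in log space avoids underflow on long sequences:

```python
from math import pi as PI, sqrt, log

def log_normal_pdf(x, mu, sigma):
    # Log-density of N(mu, sigma) at x.
    return -0.5 * ((x - mu) / sigma) ** 2 - log(sigma * sqrt(2 * PI))

def viterbi_path(A, B, pi, obs):
    n = len(pi)
    # best[i]: log probability of the best path ending in state i.
    best = [log(pi[i]) + log_normal_pdf(obs[0], *B[i]) for i in range(n)]
    back = []
    for x in obs[1:]:
        # prev[i]: best predecessor of state i for this step.
        prev = [max(range(n), key=lambda j: best[j] + log(A[j][i]))
                for i in range(n)]
        best = [best[prev[i]] + log(A[prev[i]][i]) + log_normal_pdf(x, *B[i])
                for i in range(n)]
        back.append(prev)
    # Trace the optimal path backwards through the back-pointers.
    state = max(range(n), key=lambda i: best[i])
    logprob = best[state]
    path = [state]
    for prev in reversed(back):
        state = prev[state]
        path.append(state)
    return path[::-1], logprob
```

On the first example above it recovers the same path [0, 0, 1, 1, 0] and log probability, about -9.0604.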
- class sage.stats.hmm.chmm.GaussianMixtureHiddenMarkovModel[source]#
Bases:
GaussianHiddenMarkovModel
Gaussian mixture Hidden Markov Model.
INPUT:
- A – matrix; the \(N \times N\) transition matrix
- B – list of mixture definitions for each state; each state may have a varying number of Gaussians, encoded as pairs (p, (mu, sigma)), with selection probabilities p that sum to 1
- pi – initial state probabilities
- normalize – bool (default: True); if given, input is normalized to define valid probability distributions, e.g., the entries of \(A\) are made nonnegative and the rows sum to 1, and the probabilities in pi are normalized
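The emission density of such a state is the weighted sum of its component normal densities. A minimal sketch of evaluating it (`mixture_pdf` is our own helper, assuming B entries of the form (p, (mu, sigma))):

```python
from math import pi as PI, sqrt, exp

def mixture_pdf(x, components):
    """Density at x of a Gaussian mixture given as [(p, (mu, sigma)), ...]."""
    total = 0.0
    for p, (mu, sigma) in components:
        # Weight each component's normal density by its selection probability.
        total += p * exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * PI))
    return total
```

For a state such as [(.4, (0, 1)), (.6, (1, 0.1))], the density at x = 1 is dominated by the narrow second component.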
EXAMPLES:
sage: A = [[0.5,0.5],[0.5,0.5]]
sage: B = [[(0.9,(0.0,1.0)), (0.1,(1,10000))], [(1,(1,1)), (0,(0,0.1))]]
sage: hmm.GaussianMixtureHiddenMarkovModel(A, B, [1,0])
Gaussian Mixture Hidden Markov Model with 2 States
Transition matrix:
[0.5 0.5]
[0.5 0.5]
Emission parameters:
[0.9*N(0.0,1.0) + 0.1*N(1.0,10000.0), 1.0*N(1.0,1.0) + 0.0*N(0.0,0.1)]
Initial probabilities: [1.0000, 0.0000]
- baum_welch(obs, max_iter=1000, log_likelihood_cutoff=1e-12, min_sd=0.01, fix_emissions=False)[source]#
Given an observation sequence obs, improve this HMM using the Baum-Welch algorithm to increase the probability of observing obs.
INPUT:
- obs – a time series of emissions
- max_iter – integer (default: 1000); maximum number of Baum-Welch steps to take
- log_likelihood_cutoff – positive float (default: 1e-12); the minimal improvement in log likelihood, relative to the log likelihood of the last iteration, required to continue
- min_sd – positive float (default: 0.01); when reestimating, the standard deviation of emissions is not allowed to be less than min_sd
- fix_emissions – bool (default: False); if True, do not change emissions when updating
OUTPUT:
Changes the model in place; returns the log likelihood and the number of iterations.
EXAMPLES:
sage: m = hmm.GaussianMixtureHiddenMarkovModel(
....:     [[.9,.1],[.4,.6]],
....:     [[(.4,(0,1)), (.6,(1,0.1))], [(1,(0,1))]],
....:     [.7,.3])
sage: set_random_seed(0); v = m.sample(10); v
[0.3576, -0.9365, 0.9449, -0.6957, 1.0217, 0.9644, 0.9987, -0.5950, -1.0219, 0.6477]
sage: m.log_likelihood(v)
-8.31408655939536...
sage: m.baum_welch(v)
(2.18905068682..., 15)
sage: m.log_likelihood(v)
2.18905068682...
sage: m  # rel tol 6e-12
Gaussian Mixture Hidden Markov Model with 2 States
Transition matrix:
[ 0.8746363339773399  0.12536366602266016]
[                1.0 1.451685202290174e-40]
Emission parameters:
[0.500161629343*N(-0.812298726239,0.173329026744) + 0.499838370657*N(0.982433690378,0.029719932009), 1.0*N(0.503260056832,0.145881515324)]
Initial probabilities: [0.0000, 1.0000]
We illustrate bounding the standard deviation below. Note that above we had different emission parameters when the min_sd was the default of 0.01:
sage: m = hmm.GaussianMixtureHiddenMarkovModel(
....:     [[.9,.1],[.4,.6]],
....:     [[(.4,(0,1)), (.6,(1,0.1))], [(1,(0,1))]],
....:     [.7,.3])
sage: m.baum_welch(v, min_sd=1)
(-12.617885761692..., 1000)
sage: m.emission_parameters()  # rel tol 6e-12
[0.503545634447*N(0.200166509595,1.0) + 0.496454365553*N(0.200166509595,1.0), 1.0*N(0.0543433426535,1.0)]
We illustrate fixing all emissions:
sage: m = hmm.GaussianMixtureHiddenMarkovModel(
....:     [[.9,.1],[.4,.6]],
....:     [[(.4,(0,1)), (.6,(1,0.1))], [(1,(0,1))]],
....:     [.7,.3])
sage: set_random_seed(0); v = m.sample(10)
sage: m.baum_welch(v, fix_emissions=True)
(-7.58656858997..., 36)
sage: m.emission_parameters()
[0.4*N(0.0,1.0) + 0.6*N(1.0,0.1), 1.0*N(0.0,1.0)]
- emission_parameters()[source]#
Return a list of all the emission distributions.
OUTPUT:
list of Gaussian mixtures
EXAMPLES:
sage: m = hmm.GaussianMixtureHiddenMarkovModel([[.9,.1],[.4,.6]],
....:                                          [[(.4,(0,1)), (.6,(1,0.1))], [(1,(0,1))]],
....:                                          [.7,.3])
sage: m.emission_parameters()
[0.4*N(0.0,1.0) + 0.6*N(1.0,0.1), 1.0*N(0.0,1.0)]
- sage.stats.hmm.chmm.unpickle_gaussian_hmm_v0(A, B, pi, name)[source]#
EXAMPLES:
sage: m = hmm.GaussianHiddenMarkovModel([[1]], [(0,1)], [1])
sage: sage.stats.hmm.chmm.unpickle_gaussian_hmm_v0(m.transition_matrix(),
....:     m.emission_parameters(), m.initial_probabilities(), 'test')
Gaussian Hidden Markov Model with 1 States
Transition matrix:
[1.0]
Emission parameters:
[(0.0, 1.0)]
Initial probabilities: [1.0000]
- sage.stats.hmm.chmm.unpickle_gaussian_hmm_v1(A, B, pi, prob, n_out)[source]#
EXAMPLES:
sage: m = hmm.GaussianHiddenMarkovModel([[1]], [(0,1)], [1])
sage: loads(dumps(m)) == m  # indirect test
True
- sage.stats.hmm.chmm.unpickle_gaussian_mixture_hmm_v1(A, B, pi, mixture)[source]#
EXAMPLES:
sage: m = hmm.GaussianMixtureHiddenMarkovModel([[1]], [[(.4,(0,1)), (.6,(1,0.1))]], [1])
sage: loads(dumps(m)) == m  # indirect test
True