This class is a special case of a DiagonalGMM that implements the MAP algorithm instead of the EM algorithm.
Inheritance: DiagonalGMM
Public Fields
- DiagonalGMM* prior_distribution
  The prior distribution used in MAP
- real weight_on_prior
  The weight to give to the prior parameters during update
- bool learn_weights
  Whether to update the Gaussians' weights
- bool learn_variances
  Whether to update the Gaussians' variances
- bool learn_means
  Whether to update the Gaussians' means
Public Methods
- MAPDiagonalGMM(DiagonalGMM* prior_distribution_)
- virtual void frameViterbiAccPosteriors(int t, real* inputs, real log_posterior)
  The backward step of Viterbi for a frame
- virtual void frameEMAccPosteriors(int t, real* inputs, real log_posterior)
  The backward step of EM for a frame
- virtual void eMUpdate()
  The update performed after each EM iteration
- virtual void setDataSet(DataSet* data_)
  Copies the parameters of the prior distribution
Inherited from DiagonalGMM:
Public Fields
- int n_gaussians
- real prior_weights
- EMTrainer* initial_kmeans_trainer
- MeasurerList* initial_kmeans_trainer_measurers
- real* log_weights
- real* dlog_weights
- real* var_threshold
- Sequence* log_probabilities_g
- int best_gauss
- Sequence* best_gauss_per_frame
- real* sum_log_var_plus_n_obs_log_2_pi
- real** minus_half_over_var
- real** means_acc
Public Methods
- virtual void reset()
- virtual void setVarThreshold(real* var_threshold_)
- virtual void eMIterInitialize()
- virtual void iterInitialize()
- virtual real frameLogProbability(int t, real* inputs)
- virtual real viterbiFrameLogProbability(int t, real* inputs)
- virtual real frameLogProbabilityOneGaussian(int g, real* inputs)
- virtual void sequenceInitialize(Sequence* inputs)
- virtual void eMSequenceInitialize(Sequence* inputs)
- virtual void update()
- virtual void frameBackward(int t, real* f_inputs, real* beta_, real* f_outputs, real* alpha_)
- virtual void frameDecision(int t, real* decision)
Inherited from Distribution:
Public Fields
- real log_probability
- Sequence* log_probabilities
Public Methods
- virtual real logProbability(Sequence* inputs)
- virtual real viterbiLogProbability(Sequence* inputs)
- virtual void eMAccPosteriors(Sequence* inputs, real log_posterior)
- virtual void viterbiAccPosteriors(Sequence* inputs, real log_posterior)
- virtual void decode(Sequence* inputs)
- virtual void eMForward(Sequence* inputs)
- virtual void viterbiForward(Sequence* inputs)
- virtual void viterbiBackward(Sequence* inputs, Sequence* alpha)
Inherited from GradientMachine:
Public Fields
- int n_inputs
- int n_outputs
- Parameters* params
- Parameters* der_params
- Sequence* beta
Public Methods
- virtual void forward(Sequence* inputs)
- virtual void backward(Sequence* inputs, Sequence* alpha)
- virtual void setPartialBackprop(bool flag=true)
- virtual void frameForward(int t, real* f_inputs, real* f_outputs)
- virtual void loadXFile(XFile* file)
- virtual void saveXFile(XFile* file)
Inherited from Machine:
Public Fields
- Sequence* outputs
Inherited from Object:
Public Fields
- Allocator* allocator
Public Methods
- void addOption(const char* name, int size, void* ptr, const char* help="")
- void addIOption(const char* name, int* ptr, int init_value, const char* help="")
- void addROption(const char* name, real* ptr, real init_value, const char* help="")
- void addBOption(const char* name, bool* ptr, bool init_value, const char* help="")
- void addOOption(const char* name, Object** ptr, Object* init_value, const char* help="")
- void setOption(const char* name, void* ptr)
- void setIOption(const char* name, int option)
- void setROption(const char* name, real option)
- void setBOption(const char* name, bool option)
- void setOOption(const char* name, Object* option)
- void load(const char* filename)
- void save(const char* filename)
- void* operator new(size_t size, Allocator* allocator_=NULL)
- void* operator new(size_t size, Allocator* allocator_, void* ptr_)
- void operator delete(void* ptr)
Documentation
This class is a special case of a DiagonalGMM that implements the
MAP algorithm instead of the EM algorithm. This means that the
mean parameters are updated according to the Maximum A Posteriori
criterion, given prior values for the means (supplied through the prior
DiagonalGMM passed to the constructor). Moreover, by default the variances
and weights are not changed, as experimental results suggest that
adapting them has no noticeable effect.
- DiagonalGMM* prior_distribution
  The prior distribution used in MAP
- real weight_on_prior
  The weight to give to the prior parameters during update
- bool learn_weights
  Whether to update the Gaussians' weights
- bool learn_variances
  Whether to update the Gaussians' variances
- bool learn_means
  Whether to update the Gaussians' means
- MAPDiagonalGMM(DiagonalGMM* prior_distribution_)
- virtual void frameViterbiAccPosteriors(int t, real* inputs, real log_posterior)
  The backward step of Viterbi for a frame
- virtual void frameEMAccPosteriors(int t, real* inputs, real log_posterior)
  The backward step of EM for a frame
- virtual void eMUpdate()
  The update performed after each EM iteration
- virtual void setDataSet(DataSet* data_)
  Copies the parameters of the prior distribution
- This class has no child classes.
- Author:
- Samy Bengio (bengio@idiap.ch)
Johnny Mariethoz (Johnny.Mariethoz@idiap.ch)
This page was generated with the help of DOC++.