A Method of Learning Latent Variables Dimensionality for Bayesian Networks
|
pp. 319-326
|
Author(s) |
|
Zan Zhang, Hao Wang, Hongliang Yao
|
Abstract
|
Latent variables often play an important role in improving the quality of learned Bayesian networks and in understanding the nature of the interactions in the model. The dimensionality of a latent variable has a significant effect on both the representational quality and the complexity of the model. The maximum possible dimensionality of a latent variable is the size of the Cartesian product of the state spaces of its Markov blanket variables. To determine the dimensionality of the latent variable, the network score must be computed for every candidate dimensionality, which is computationally expensive. Moreover, neither the data nor the conditional probability table of the latent variable is observed, which makes the task difficult. In this paper, we propose a novel method for learning the dimensionality of a latent variable when the network structure is known. First, we use the latent variable and its Markov blanket variables to extract a local network from the original network; scoring this local network instead of the original network reduces the running time. Second, we apply a state-clustering method to score the network for each candidate dimensionality of the latent variable, introducing a simulated annealing strategy to avoid local optima. Finally, based on these stages, we select the dimensionality of the latent variable that yields the best network score. The method achieves strong learning performance and can handle complex networks. Extensive experiments validate the effectiveness of our method against other algorithms.
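
As a rough illustration of the procedure summarized above, the Python sketch below enumerates candidate cardinalities of the latent variable up to the Cartesian-product bound of its Markov blanket, scores each candidate on a local model, and searches over hard assignments of records to latent states with simulated annealing. The local structure (the latent variable treated as a parent of each Markov blanket variable), the BIC-style score, the annealing schedule, and the function names bic_local_score, anneal_assignment, and choose_latent_cardinality are assumptions made for this sketch, not the authors' implementation.

    import math
    import random
    from collections import Counter

    def bic_local_score(data, blanket_cards, latent_card, assignment):
        # BIC-style score of a simplified local model in which the latent
        # variable is a parent of every Markov blanket variable (an assumption
        # of this sketch, not necessarily the paper's exact local network).
        n = len(data)
        loglik = 0.0
        latent_counts = Counter(assignment)
        for c in latent_counts.values():              # log P(latent state)
            loglik += c * math.log(c / n)
        n_params = latent_card - 1
        for j, card_j in enumerate(blanket_cards):
            joint = Counter((assignment[i], row[j]) for i, row in enumerate(data))
            for (z, _), c in joint.items():           # log P(blanket_j | latent)
                loglik += c * math.log(c / latent_counts[z])
            n_params += latent_card * (card_j - 1)
        return loglik - 0.5 * n_params * math.log(n)  # BIC complexity penalty

    def anneal_assignment(data, blanket_cards, latent_card,
                          t_start=1.0, t_end=1e-3, alpha=0.9, moves=200):
        # Search over hard assignments of records to latent states with
        # simulated annealing, so the state clustering is less likely to
        # stop at a local optimum.
        assignment = [random.randrange(latent_card) for _ in data]
        current = best = bic_local_score(data, blanket_cards, latent_card, assignment)
        t = t_start
        while t > t_end:
            for _ in range(moves):
                i = random.randrange(len(data))
                old = assignment[i]
                assignment[i] = random.randrange(latent_card)
                score = bic_local_score(data, blanket_cards, latent_card, assignment)
                if score >= current or random.random() < math.exp((score - current) / t):
                    current = score                   # accept the move
                    best = max(best, current)
                else:
                    assignment[i] = old               # reject the move
            t *= alpha                                # cool the temperature
        return best

    def choose_latent_cardinality(data, blanket_cards):
        # Upper bound on the dimensionality: size of the Cartesian product of
        # the Markov blanket state spaces.
        max_card = math.prod(blanket_cards)
        best_card, best_score = None, float("-inf")
        for card in range(2, max_card + 1):
            score = anneal_assignment(data, blanket_cards, card)
            if score > best_score:
                best_card, best_score = card, score
        return best_card

Here data is assumed to be a list of integer-coded records over the Markov blanket variables and blanket_cards their cardinalities; a practical implementation would prune the candidate range and update sufficient statistics incrementally rather than rescoring every record after each move.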