Negative Mixture Models via Squaring: Representation and Learning

Date

2024

Publisher

University of Nottingham

Abstract

The truths behind real-world data can be approached by measuring the uncertainty around the data. From a probabilistic viewpoint, uncertainty enters unsupervised learning through learning objectives defined over probability distributions and through inference. Mixture models enhance the expressiveness of probability distributions, and they provide a general framework for clustering data by building more complex distributions. We begin by discussing mixture distributions and introducing the concept of latent variables. We then discuss mixture types with respect to the number of components and their formulation, illustrated with examples of Gaussian mixture models, as well as mixture types with respect to the mixture coefficients. We treat the statistical inference problem of mixture models under different approaches, such as latent variable models, the Markov chain Monte Carlo (MCMC) method and variational methods, and we present several illustrative examples throughout.

We also discuss basic concepts of probabilistic circuits: their representation, formulation and the corresponding inference. In this thesis, we apply probabilistic circuits to probabilistic inference; we discuss how a negative mixture can be represented as a probabilistic circuit whose structure is a tractable computational graph, how squared negative mixture models can be represented as efficiently tensorised computational graphs, and how admitting negative parameters in this class of functions can reduce the model size. Gaussian mixture models are applied in several branches of science, such as machine learning, data mining, pattern recognition and statistical analysis, and the Gaussian mixture model and the negative Gaussian mixture model form an important subclass for learning from data; in this thesis, we focus on these models in both the positive and the negative case. To represent valid negative mixture models, we discuss a generic strategy for supporting negative parameters, called squaring a base mixture, and we then extend this framework to probabilistic circuits; the squaring idea is sketched below.

The main aim of this thesis is to discuss the inference problem in the framework of mixture models, as well as the role played by the positive mixture model and the negative-weight mixture model, especially the standard Gaussian mixture model and the negative-weight Gaussian mixture model, in that problem.
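As a compact illustration of the squaring strategy (stated here in generic notation that may differ from the thesis's own), let \(c(x) = \sum_{i=1}^{K} w_i \,\psi_i(x)\) be a base mixture whose weights \(w_i \in \mathbb{R}\) may be negative. Squaring and renormalising yields a valid, non-negative model:

\[
p(x) \;=\; \frac{c(x)^{2}}{Z}
\;=\; \frac{1}{Z}\sum_{i=1}^{K}\sum_{j=1}^{K} w_i w_j \,\psi_i(x)\,\psi_j(x),
\qquad
Z \;=\; \sum_{i=1}^{K}\sum_{j=1}^{K} w_i w_j \int \psi_i(x)\,\psi_j(x)\,dx .
\]

For Gaussian components, each product \(\psi_i(x)\,\psi_j(x)\) is itself an unnormalised Gaussian, so \(Z\) is available in closed form and the squared mixture behaves like a mixture over the \(K(K+1)/2\) distinct component products.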
The thesis is presented in five subsequent chapters, described as follows. In Chapter 1, we discuss the motivation for mixtures and mixture types, and we present some standard mixture models. In Chapter 2, we discuss mixture types with respect to their coefficients: when some, but not all, of the mixture coefficients are allowed to take negative values, the model is called a negative-weight mixture model. This chapter also treats the statistical inference problem of mixture models under different approaches, such as latent variable models, the Markov chain Monte Carlo (MCMC) method and variational methods. In Chapter 3, we discuss the main ideas around the problem of probabilistic inference: the classes of queries used to compute interesting quantities of a probability distribution, and what makes a family of probabilistic models tractable, with several illustrative examples. We then discuss probabilistic circuits, their representation and inference, and the chapter closes with negative mixture models via squaring and their representation as probabilistic circuits. In Chapter 4, we discuss Gaussian mixture models, which are used to represent subpopulations within an overall population, and we show how Gaussian mixtures constitute a form of unsupervised learning. In the second part of the chapter, we discuss negative-weight Gaussian mixture models, whose negative coefficients make them more expressive than Gaussian mixture models by reducing the number of components and parameters, and we compare the standard and the negative-weight Gaussian mixture model on a real example; a minimal numerical sketch of this kind of comparison follows below. In Chapter 5, we discuss the main contributions of positive and negative-weight mixture models, especially positive and negative-weight Gaussian mixture models, as well as future work that can be developed in the mixture framework.
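A minimal runnable sketch, in one dimension, of the contrast between a standard Gaussian mixture and a squared negative-weight Gaussian mixture. The weights, means and standard deviations are illustrative placeholders, not parameters taken from the thesis, and the normalising constant is computed numerically here even though a closed form exists for Gaussian components.

```python
from scipy.stats import norm
from scipy.integrate import quad

def mixture(x, weights, means, stds):
    # Linear combination of Gaussian densities; weights may be negative.
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds))

means, stds = [0.0, 0.0], [1.0, 0.4]

# Standard Gaussian mixture: non-negative weights summing to one.
pos_w = [0.5, 0.5]

# Negative-weight base mixture: c(x) dips below zero near x = 0, so it
# is not a density on its own -- squaring and renormalising repairs this.
neg_w = [1.0, -0.5]
Z, _ = quad(lambda x: mixture(x, neg_w, means, stds) ** 2, -12.0, 12.0)

def squared_ngmm(x):
    # p(x) = c(x)^2 / Z is non-negative by construction.
    return mixture(x, neg_w, means, stds) ** 2 / Z

# Both models integrate to (approximately) one over the real line.
print(quad(lambda x: mixture(x, pos_w, means, stds), -12.0, 12.0)[0])  # ~1.0
print(quad(squared_ngmm, -12.0, 12.0)[0])                              # ~1.0
```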

Keywords

Mixture Models, Negative Mixture Models, Gaussian Mixture Models, Probabilistic Circuits, Statistical Inference, Machine Learning, Variational Methods, Markov Chain Monte Carlo (MCMC)

Copyright owned by the Saudi Digital Library (SDL) © 2024