• Media type: E-book; thesis (Hochschulschrift)
  • Title: Variational inference for composite Gaussian process models
  • Additional title: German translation of the main title: Variationelle Inferenz für zusammengesetzte Gauß-Prozess Modelle
  • Contributors: Lindinger, Jakob [author]; Lippert, Christoph [academic supervisor]; Deisenroth, Marc Peter [academic supervisor]; Herbrich, Ralf [academic supervisor]
  • Corporate body: Universität Potsdam
  • Published: Potsdam, [2023?]
  • Extent: 1 online resource (ix, 122 pages, 5675 KB); illustrations, diagrams
  • Language: English
  • DOI: 10.25932/publishup-60444
  • Keywords: Hochschulschrift (university thesis)
  • Thesis: Dissertation, Universität Potsdam, 2023
  • Description: Most machine learning methods provide only point estimates when queried to predict on new data. This is problematic when the data is corrupted by noise, e.g. from imperfect measurements, or when the queried data point is very different from the data that the machine learning model was trained on. Probabilistic modelling in machine learning naturally equips predictions with corresponding uncertainty estimates, which allows a practitioner to incorporate information about measurement noise into the modelling process and to know when not to trust the predictions. A well-understood, flexible probabilistic framework is provided by Gaussian processes, which are ideal building blocks for probabilistic models. They lend themselves naturally to the problem of regression, i.e., being given a set of inputs and corresponding observations and then predicting likely observations for new, unseen inputs, and they can also be adapted to many other machine learning tasks. However, exactly inferring the optimal parameters of such a Gaussian process ...
  • Access status: Open access
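To illustrate the uncertainty-aware regression described in the abstract above, here is a minimal sketch of exact Gaussian process regression with an RBF kernel in plain NumPy. It is not taken from the dissertation (which concerns variational inference for composite Gaussian process models); the function names, kernel, and hyperparameter values are illustrative assumptions only.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs (shape: n x d, m x d).
    sqdist = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_regression(X_train, y_train, X_test, noise_std=0.1):
    # Exact GP posterior: returns predictive mean and standard deviation at the test inputs.
    K = rbf_kernel(X_train, X_train) + noise_std**2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)            # cross-covariance train/test
    K_ss = rbf_kernel(X_test, X_test)            # test covariance
    L = np.linalg.cholesky(K)                    # stable inversion via Cholesky factor
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                         # posterior predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0) + noise_std**2  # predictive variance incl. noise
    return mean, np.sqrt(var)

# Noisy observations of a sine function; predictions come with error bars.
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(20, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(20)
X_test = np.linspace(-5, 5, 100)[:, None]
mean, std = gp_regression(X_train, y_train, X_test)
# std grows for test points far from the training data, signalling when not to trust the prediction.
```

The predictive standard deviation returned alongside the mean is the uncertainty estimate the abstract refers to: it reflects both the assumed measurement noise and the distance of a query point from the training data.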