Fundamentals of Probabilistic Graphical Models

Course

Learn to use Bayes Nets to represent complex probability distributions and the algorithms for sampling from those distributions. Then learn the algorithms used to train, predict, and evaluate Hidden Markov Models for pattern recognition. HMMs have been used for gesture recognition in computer vision, gene sequence identification in bioinformatics, speech generation and part-of-speech tagging in natural language processing, and more.

3 weeks

Real-world Projects

Completion Certificate

Last Updated January 18, 2022

Prerequisites:

No experience required

Course Lessons

Lesson 1

Introduction to Probabilistic Models

Welcome to Fundamentals of Probabilistic Graphical Models. In this lesson, we will cover the course overview and prerequisites, and give a brief introduction to probability.

Lesson 2

Probability

Sebastian Thrun briefly reviews basic probability theory for modeling real-world uncertainty, including discrete distributions, independence, joint probabilities, and conditional distributions.
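
As a taste of what these ideas look like in code, here is a minimal sketch (not part of the course materials) that computes a marginal and a conditional probability from a small, made-up joint distribution over two discrete variables:

# Illustrative only: a tiny discrete joint distribution over Weather and
# Traffic. The numbers are invented for demonstration purposes.
joint = {
    ("sunny", "light"): 0.375,
    ("sunny", "heavy"): 0.125,
    ("rainy", "light"): 0.125,
    ("rainy", "heavy"): 0.375,
}

# Marginal: P(Weather = rainy) sums the joint over all Traffic values.
p_rainy = sum(p for (w, t), p in joint.items() if w == "rainy")

# Conditional: P(Traffic = heavy | Weather = rainy) = P(rainy, heavy) / P(rainy).
p_heavy_given_rainy = joint[("rainy", "heavy")] / p_rainy

print(p_rainy)              # 0.5
print(p_heavy_given_rainy)  # 0.75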

Lesson 3

Spam Classifier with Naive Bayes

In this lesson, you'll learn how to build a spam email classifier using the naive Bayes algorithm.
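
For a preview of the workflow, here is a minimal sketch using scikit-learn's CountVectorizer and MultinomialNB; the course may instead build the classifier from scratch, and the example messages below are invented:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy training data: 1 = spam, 0 = ham.
messages = [
    "win money now", "free prize claim now", "lowest price offer",
    "are we still meeting today", "see you at lunch", "project update attached",
]
labels = [1, 1, 1, 0, 0, 0]

vectorizer = CountVectorizer()            # bag-of-words counts
X = vectorizer.fit_transform(messages)
classifier = MultinomialNB()              # multinomial naive Bayes
classifier.fit(X, labels)

test = vectorizer.transform(["claim your free prize", "lunch update"])
print(classifier.predict(test))           # expected: [1 0]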

Lesson 4

Bayes Nets

Sebastian explains how to use Bayes Nets as compact graphical models that encode probability distributions for efficient analysis.
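
To make the idea concrete, here is a small sketch (with invented numbers, not an example taken from the course) of a three-variable net whose joint distribution is stored as a product of local conditional probability tables:

# Two independent causes (Burglary, Earthquake) and one effect (Alarm).
# The joint over the three binary variables factorizes as
# P(B) * P(E) * P(A | B, E), so it is stored with 1 + 1 + 4 = 6 numbers
# instead of a full 8-entry joint table.
p_b = {True: 0.001, False: 0.999}
p_e = {True: 0.002, False: 0.998}
p_a_given_be = {  # P(Alarm = True | B, E); the False case is the complement
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}

def joint(b, e, a):
    """P(B = b, E = e, A = a) from the factored representation."""
    p_alarm = p_a_given_be[(b, e)]
    return p_b[b] * p_e[e] * (p_alarm if a else 1 - p_alarm)

print(round(joint(True, False, True), 8))   # 0.00093812
# The factored entries still sum to 1 over all 8 assignments.
print(round(sum(joint(b, e, a) for b in (True, False)
                for e in (True, False) for a in (True, False)), 6))  # 1.0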

Lesson 5

Inference in Bayes Nets

Sebastian explains probabilistic inference with Bayes Nets, that is, how to use observed evidence to compute probabilities of interest from the network.
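
For illustration, here is a minimal sketch of inference by enumeration on a hypothetical two-node net (Rain -> WetGrass, with made-up numbers): given the evidence that the grass is wet, it computes the posterior probability of rain.

p_rain = {True: 0.2, False: 0.8}              # prior P(Rain)
p_wet_given_rain = {True: 0.9, False: 0.2}    # P(WetGrass = True | Rain)

def joint(rain, wet):
    """P(Rain = rain, WetGrass = wet) from the factored model."""
    p_wet = p_wet_given_rain[rain]
    return p_rain[rain] * (p_wet if wet else 1 - p_wet)

# Enumerate the hidden variable, then normalize by the evidence probability.
evidence = sum(joint(r, True) for r in (True, False))   # P(WetGrass = True)
posterior = joint(True, True) / evidence                # P(Rain | WetGrass = True)
print(round(posterior, 3))                              # 0.529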

Lesson 6

Part of Speech Tagging with HMMs

Learn about Hidden Markov Models and apply them to part-of-speech tagging, a popular problem in natural language processing.
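
As a preview of the decoding step, here is a minimal Viterbi sketch over an invented two-tag toy model; the tags, words, and probabilities are not the course's dataset or the universal tagset used later in the project.

import math

tags = ["NOUN", "VERB"]
start = {"NOUN": 0.6, "VERB": 0.4}                  # P(first tag)
trans = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},        # P(tag_t | tag_{t-1})
         "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit = {"NOUN": {"dogs": 0.5, "runs": 0.1, "park": 0.4},   # P(word | tag)
        "VERB": {"dogs": 0.1, "runs": 0.8, "park": 0.1}}

def viterbi(words):
    """Return the most likely tag sequence for the observed words."""
    # best[t][tag] = (log-probability of the best path ending in tag, backpointer)
    best = [{t: (math.log(start[t] * emit[t][words[0]]), None) for t in tags}]
    for word in words[1:]:
        column = {}
        for t in tags:
            score, prev = max(
                (best[-1][p][0] + math.log(trans[p][t] * emit[t][word]), p)
                for p in tags)
            column[t] = (score, prev)
        best.append(column)
    # Trace back from the highest-scoring final tag.
    tag = max(tags, key=lambda t: best[-1][t][0])
    path = [tag]
    for column in reversed(best[1:]):
        tag = column[tag][1]
        path.append(tag)
    return list(reversed(path))

print(viterbi(["dogs", "runs", "park"]))   # ['NOUN', 'VERB', 'NOUN']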

Lesson 7

Dynamic Time Warping

Thad explains the Dynamic Time Warping technique for working with time-series data.
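
To give a sense of the technique, here is a minimal sketch of the classic DTW dynamic program on two invented one-dimensional sequences of different lengths:

def dtw_distance(a, b):
    """Cost of the cheapest alignment between sequences a and b."""
    n, m = len(a), len(b)
    inf = float("inf")
    # dp[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # step both
    return dp[n][m]

# Two series with the same shape but different timing align at zero cost.
print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 1, 1, 2, 3, 2, 2, 1]))   # 0.0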

Lesson 8 • Project

Project: Part of Speech Tagging

In this project, you'll build a hidden Markov model for part-of-speech tagging using a universal tagset.

Taught By The Best

Sebastian Thrun

Founder and Executive Chairman, Udacity

As the Founder and Chairman of Udacity, Sebastian's mission is to democratize education by providing lifelong learning to millions of students worldwide. He is also the founder of Google X, where he led projects including the Self-Driving Car, Google Glass, and more.

Thad Starner

Professor of Computer Science, Georgia Tech

Thad Starner is the director of the Contextual Computing Group (CCG) at Georgia Tech and is also the longest-serving Technical Lead/Manager on Google's Glass project.
