Course description
Machine Learning: Unsupervised Learning - E-learning from Udacity
This e-learning course is designed to guide learners through Unsupervised Learning, a machine learning task that looks for structure in data without relying on labelled examples. Participants will learn how to use this invaluable tool for analyzing data and uncovering patterns. The course focuses on a number of different approaches for finding structure in unlabeled data.
A graduate-level program - make a movie recommendation system!
This Unsupervised Learning course is the final of three in the Machine Learning series offered in cooperation with Georgia Tech. The primary outcome is to give participants the opportunity to practice a variety of Unsupervised Learning approaches and to solidify those methods through hands-on work on a final project.
Who should attend?
This intermediate level Machine Learning: Unsupervised Learning course is designed for development professionals hoping to learn randomized optimization, clustering, information theory and feature selection and transformation.
Pre-Requisites
This is the final course in a series of three on machine learning. Participants should have completed the first course in the series, Machine Learning: Supervised Learning, and the second, Machine Learning: Reinforcement Learning. Participants should also have a basic understanding of probability theory and statistics, some programming experience, and ideally an introduction to artificial intelligence.
Training content
Training topics for this Machine Learning: Unsupervised Learning course are broken into the following lessons:
Machine Learning is the ROX
- Definition of Machine Learning
- Supervised learning
- Induction and deduction
- Unsupervised learning
- Reinforcement learning
Decision Trees
- Classification and Regression overview
- Classification learning
- Example: Dating
- Representation
- Decision trees learning
- Decision tree expressiveness
- ID3 algorithm
- ID3 bias
- Decision trees and continuous attributes
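To give a flavor of the ID3 algorithm covered in the Decision Trees lesson above, here is a minimal illustrative Python sketch, our own rather than course material, of the entropy and information-gain computations ID3 uses to pick a splitting attribute. The toy dataset and function names are assumptions for illustration only.

    # Illustrative sketch of ID3's splitting criterion (information gain).
    # Not Udacity course code; the toy data and names are our own.
    from collections import Counter
    from math import log2

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        total = len(labels)
        return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

    def information_gain(rows, labels, attribute_index):
        """Expected reduction in entropy from splitting on one attribute."""
        base = entropy(labels)
        groups = {}
        for row, label in zip(rows, labels):
            groups.setdefault(row[attribute_index], []).append(label)
        remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
        return base - remainder

    # Toy data: (outlook, windy) -> go on the date?
    rows = [("sunny", "no"), ("sunny", "yes"), ("rainy", "no"), ("rainy", "yes")]
    labels = ["yes", "yes", "no", "no"]
    # ID3 greedily picks the attribute with the highest information gain.
    best = max(range(2), key=lambda i: information_gain(rows, labels, i))
    print("split on attribute", best)  # attribute 0 (outlook) separates the labels perfectly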
Regression and Classification
- Regression and function approximation
- Linear regression and best fit
- Order of polynomial
- Polynomial regression
- Cross validation
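As a small taste of the linear regression and best-fit material in the Regression and Classification lesson, the sketch below computes the closed-form least-squares line for a handful of points. The synthetic data are our own and are not taken from the course.

    # Illustrative least-squares linear fit (not Udacity course code).
    # The synthetic points are our own; they roughly follow y = 2x + 1.
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [1.1, 2.9, 5.2, 6.8, 9.1]

    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n

    # Closed-form slope and intercept that minimize squared error.
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    intercept = mean_y - slope * mean_x

    print(round(slope, 2), round(intercept, 2))  # roughly 2.0 and 1.0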
Neural Networks
- Artificial neural networks
- Perceptron units
- XOR as perceptron network
- Perceptron training
- Gradient descent
- Comparison of learning rules
- Sigmoid function
- Optimizing weights
- Restriction bias
- Preference bias
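To hint at the perceptron training rule discussed in the Neural Networks lesson, here is a short illustrative sketch, our own and not course code, that learns the logical AND function. The learning rate and number of passes are arbitrary choices.

    # Illustrative perceptron training rule on a linearly separable problem (logical AND).
    # Not Udacity course code; the learning rate and epoch count are arbitrary.

    # Inputs with a constant bias term appended, and target outputs for AND.
    data = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]
    weights = [0.0, 0.0, 0.0]
    eta = 0.1  # learning rate

    def activate(x, w):
        """Threshold unit: fire (1) if the weighted sum is positive."""
        return 1 if sum(xi * wi for xi, wi in zip(x, w)) > 0 else 0

    for _ in range(20):  # a handful of passes is enough for AND
        for x, target in data:
            error = target - activate(x, weights)
            # Perceptron rule: nudge each weight toward correcting the error.
            weights = [wi + eta * error * xi for wi, xi in zip(weights, x)]

    print([activate(x, weights) for x, _ in data])  # expected: [0, 0, 0, 1]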
Instance-Based Learning
- Instance based learning before
- Instance based learning now
- K-NN algorithm
- Won’t you compute my neighbors?
- Domain K-NNowledge
- K-NN bias
- Curse of dimensionality
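As a quick illustration of the K-NN algorithm covered in the Instance-Based Learning lesson, the sketch below labels a query point by majority vote among its k nearest neighbours. The toy 2-D points and the choice of k = 3 are our own, not course material.

    # Illustrative k-nearest-neighbours classifier (not Udacity course code).
    # The toy 2-D points, labels, and choice of k = 3 are our own.
    from collections import Counter
    from math import dist  # Euclidean distance, Python 3.8+

    def knn_predict(train_points, train_labels, query, k=3):
        """Label the query point by majority vote among its k nearest neighbours."""
        neighbours = sorted(zip(train_points, train_labels),
                            key=lambda pl: dist(pl[0], query))
        votes = Counter(label for _, label in neighbours[:k])
        return votes.most_common(1)[0][0]

    points = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
    labels = ["blue", "blue", "blue", "red", "red", "red"]
    print(knn_predict(points, labels, (4.5, 5.0)))  # expected: "red"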
Ensemble B&B
- Ensemble learning: Boosting
- Ensemble learning algorithm
- Ensemble learning outputs
- Weak learning
- Boosting in code
- When D agrees
Kernel Methods and Support Vector Machines (SVMs)
- Support Vector Machines
- Optimal separator
- SVMs: Linearly married
- Kernel methods
Computational Learning Theory
- Computational Learning Theory
- Learning theory
- Resources in Machine Learning
- Defining inductive learning
- Teacher with constrained queries
- Learner with constrained queries
- Learner with mistake bounds
- Version spaces
- PAC learning
- Epsilon exhausted
- Haussler theorem
VC Dimensions
- Infinite hypothesis spaces
- Power of a hypothesis space
- What does VC stand for?
- Internal training
- Linear separators
- The ring
- Polygons
- Sampling complexity
- VC of finite H
Bayesian Learning
- Bayes Rule
- Bayesian learning
- Bayesian learning in action!
- Noisy data
- Best hypothesis
- Minimum description length
- Bayesian classification
Bayesian Inference
- Joint distribution
- Adding attributes
- Conditional independence
- Belief networks
- Sampling from the joint distribution
- Recovering the joint distribution
- Inferencing rules
- Naïve Bayes
- Why Naïve Bayes is cool
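To hint at why Naïve Bayes is "cool", the sketch below applies Bayes' rule under the conditional-independence assumption to a tiny text-classification toy. The documents, labels, and smoothing constant are our own illustrative choices, not course material.

    # Illustrative Naïve Bayes classifier on a toy text problem (not Udacity course code).
    # Documents, labels, and the Laplace smoothing constant are our own choices.
    from collections import Counter, defaultdict
    from math import log

    docs = [("free money now", "spam"), ("meeting schedule today", "ham"),
            ("win money free", "spam"), ("project meeting notes", "ham")]

    # Count words per class and class frequencies.
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    for text, label in docs:
        class_counts[label] += 1
        word_counts[label].update(text.split())

    vocab = {w for c in word_counts.values() for w in c}

    def predict(text):
        """Pick the class maximizing log P(class) + sum of log P(word | class)."""
        scores = {}
        for label in class_counts:
            total_words = sum(word_counts[label].values())
            score = log(class_counts[label] / sum(class_counts.values()))
            for word in text.split():
                # Laplace (add-one) smoothing so unseen words don't zero out the product.
                score += log((word_counts[label][word] + 1) / (total_words + len(vocab)))
            scores[label] = score
        return max(scores, key=scores.get)

    print(predict("free money"))  # expected: "spam"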
Costs
It is free to start this Machine Learning: Unsupervised Learning course.
Estimated time for completion assuming 6 hours per week: approx. 1 month
2-Week Free Trial: Love it or Leave it
All Udacity courses are offered with a two-week free trial, giving learners plenty of time to make sure the program fits their needs. If it's not working out for any reason, learners can cancel their subscription free of charge.