Machine Learning - Decision Trees and Random Forests
Course
Online
*Indicative price
Original amount in USD:
$ 21
Type: Course
Level: Intermediate
Methodology: Online
Duration: Flexible
Start date: Different dates available
Learn Intuitive Machine Learning Techniques by Exploring a Classic Problem.
In an age of decision fatigue and information overload, this “Machine Learning: Decision Trees & Random Forests” course is a crisp yet thorough primer on two powerful machine learning techniques that help cut through the noise: decision trees and random forests. Supplemental material included!
About this course
This Machine Learning: Decision Trees & Random Forests online course teaches you machine learning techniques to predict survival probabilities aboard the Titanic – a classic Kaggle problem!
Subjects
- Operational
- Ensemble Learning
- Bagging
- Boosting & Stacking
- Regularization
- Cross-Validation
- Overfitting
- Installing Python
- Machine Learning Techniques
- Random Forests
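Among the subjects listed, cross-validation is easy to illustrate: k-fold cross-validation partitions the row indices into k folds and holds out each fold once as a test set. A minimal pure-Python sketch (the function name and the toy sizes are ours for illustration, not from the course):

```python
def k_fold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    Splits indices 0..n-1 into k contiguous folds; each fold is used
    once as the held-out test set while the rest form the training set.
    """
    # Distribute any remainder across the first n % k folds
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

folds = list(k_fold_indices(10, 5))
print(len(folds))    # 5 train/test splits
print(folds[0][1])   # first held-out fold: [0, 1]
```

In practice you would fit the model on each training split and average the k test scores, which gives a far less optimistic estimate than scoring on the training data itself.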
Teachers and trainers (1)
Name Name
Teacher
Course programme
- Lesson I: Introduction: You, This Course & Us!
- Lesson II: Planting the seed: What are Decision Trees?
- Lesson III: Growing the Tree: Decision Tree Learning
- Lesson IV: Branching out: Information Gain
- Lesson V: Decision Tree Algorithms
- Lesson VI: Installing Python: Anaconda & PIP
- Lesson VII: Back to Basics: Numpy in Python
- Lesson VIII: Back to Basics: Numpy & Scipy in Python
- Lesson IX: Titanic: Decision Trees predict Survival (Kaggle) – I
- Lesson X: Titanic: Decision Trees predict Survival (Kaggle) – II
- Lesson XI: Titanic: Decision Trees predict Survival (Kaggle) – III
- Lesson I: Overfitting: The Bane of Machine Learning
- Lesson II: Overfitting continued
- Lesson III: Cross-Validation
- Lesson IV: Simplicity is a virtue: Regularization
- Lesson V: The Wisdom of Crowds: Ensemble Learning
- Lesson VI: Ensemble Learning continued: Bagging, Boosting & Stacking
- Lesson I: Random Forests: Much more than trees
- Lesson II: Back on the Titanic: Cross Validation & Random Forests
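Lesson IV above covers information gain, the criterion decision-tree learning uses to choose a split: the reduction in entropy between a parent node and its child nodes. A minimal pure-Python sketch (the function names and toy survivor labels are ours, for illustration):

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def information_gain(parent, splits):
    """Entropy of the parent minus the size-weighted entropy of the splits."""
    n = len(parent)
    weighted = sum(len(s) / n * entropy(s) for s in splits)
    return entropy(parent) - weighted

# Toy node: 4 survivors and 4 non-survivors, perfectly separated by one feature
parent = [1, 1, 1, 1, 0, 0, 0, 0]
gain = information_gain(parent, [[1, 1, 1, 1], [0, 0, 0, 0]])
print(round(gain, 3))  # → 1.0: a perfect split removes all uncertainty
```

A tree-growing algorithm evaluates candidate splits this way and greedily picks the one with the highest gain at each node.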
Additional information
Design and implement the solution to a famous machine learning problem: predicting survival probabilities aboard the Titanic. Understand the perils of overfitting, and how random forests help overcome this risk. Identify the use cases for decision trees as well as random forests.
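The core reason a random forest curbs overfitting is the wisdom-of-crowds effect the course programme mentions: a majority vote over many imperfect but largely independent models is far more reliable than any single model. A minimal pure-Python sketch under the idealised assumption of fully independent errors (real trees are only partially decorrelated by bagging and random feature selection):

```python
import random

random.seed(0)  # deterministic toy simulation

def weak_vote(truth, accuracy):
    """One weak classifier: returns the true label with probability `accuracy`."""
    return truth if random.random() < accuracy else 1 - truth

def ensemble_vote(truth, accuracy, n_models):
    """Majority vote over n independent weak classifiers."""
    votes = sum(weak_vote(truth, accuracy) for _ in range(n_models))
    return 1 if votes > n_models / 2 else 0

trials = 2000
single = sum(weak_vote(1, 0.7) for _ in range(trials)) / trials
forest = sum(ensemble_vote(1, 0.7, 25) for _ in range(trials)) / trials
print(single, forest)  # the 25-model ensemble is right far more often (~0.70 vs ~0.98)
```

A single over-grown decision tree has low bias but high variance; averaging many such trees keeps the low bias while the voting cancels much of the variance.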
No prerequisites are required, though some undergraduate-level mathematics would help. A working knowledge of Python is helpful if you want to complete the coding exercises and understand the provided source code.
Taught by a Stanford-educated ex-Googler and an IIT- and IIM-educated ex-Flipkart lead analyst. This team has decades of practical experience in quant trading, analytics and e-commerce.
Python Activity: Surviving aboard the Titanic! Build a decision tree to predict the survival of a passenger on the Titanic. This is a challenge posed by Kaggle (a competitive online data science community). We’ll start off by exploring the data and transforming it into feature vectors that can be fed to a Decision Tree Classifier.
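The workflow described above (raw passenger records, transformed into numeric feature vectors, fed to a decision tree classifier) can be sketched with pandas and scikit-learn. The column names and toy rows below are illustrative stand-ins, not the actual Kaggle data:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Toy passenger records (illustrative, not the real Kaggle dataset)
df = pd.DataFrame({
    "sex":      ["female", "male", "female", "male", "male", "female"],
    "pclass":   [1, 3, 2, 3, 1, 3],
    "age":      [29.0, 25.0, 40.0, 19.0, 54.0, 8.0],
    "survived": [1, 0, 1, 0, 0, 1],
})

# Encode the categorical `sex` column into numeric dummy features,
# so every row becomes a numeric feature vector the tree can consume
X = pd.get_dummies(df[["sex", "pclass", "age"]], columns=["sex"])
y = df["survived"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # → 1.0 on this toy set: sex alone separates the classes
```

On the real competition data you would also handle missing ages, engineer extra features, and evaluate with cross-validation rather than scoring on the training set.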
Length: 4 hrs 50 min