Improve the performance of your predictive models, build more complex models, and apply techniques that improve the quality of your predictions. Ensemble methods offer a powerful way to improve prediction accuracy by cleverly combining the predictions of many individual predictors. In this course, you will learn how to use ensemble methods to improve accuracy in classification and regression problems.

When using Predictive Analytics to solve real problems, there are many practical considerations beyond models and algorithms: Which features should I use? How many features are enough? Should I create new features? How can I combine features that carry the same underlying information? Which hyper-parameters should I use? We explore topics that will help you answer such questions.

Artificial Neural Networks are models loosely based on how neural networks work in living beings. These models have a long history in the Artificial Intelligence community, with ups and downs in popularity. Nowadays, thanks to increased computational power, improved methods, and software enhancements, they are popular again and form the basis of advanced approaches such as Deep Learning. This course introduces the use of Deep Learning models for Predictive Analytics with the powerful TensorFlow library.

About the Author

Alvaro Fuentes is a Data Scientist with an M.S. in Quantitative Economics and an M.S. in Applied Mathematics, with more than 10 years' experience in analytical roles.
He worked at the Central Bank of Guatemala as an Economic Analyst, building models for economic and financial data. He founded Quant Company to provide consulting and training services in Data Science topics and has been a consultant on many projects in fields such as Business, Education, Psychology, and Mass Media.
Facilities
Location: Online
Start date: Different dates available. Enrolment now open.
About this course
Use ensemble algorithms to combine many individual predictors to produce better predictions
Apply advanced techniques such as dimensionality reduction to combine features and build better models
Evaluate models and choose the optimal hyper-parameters using cross-validation
Learn the foundations for working and building models using Neural Networks
Learn different techniques to solve problems that arise when doing Predictive Analytics in the real world
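The dimensionality-reduction point above can be sketched with PCA in scikit-learn. This is a minimal illustration; the dataset and the choice of two components are assumptions for the example, not materials from the course.

```python
# Minimal sketch: combining correlated features with PCA (scikit-learn).
# The iris dataset and n_components=2 are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)             # 150 samples, 4 features
X_scaled = StandardScaler().fit_transform(X)  # PCA is sensitive to feature scale

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_scaled)       # 4 features combined into 2 components

print(X_reduced.shape)                        # (150, 2)
print(pca.explained_variance_ratio_.sum())    # fraction of variance retained
```

The `explained_variance_ratio_` attribute is the usual way to decide how many components are enough: keep adding components until the cumulative ratio is acceptably close to 1.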
Questions & Answers

Add your question. Our advisors and other users will be able to reply to you.
Reviews
Have you taken this course? Share your opinion
This centre's achievements
2021
All courses are up to date
The average rating is higher than 3.7
More than 50 reviews in the last 12 months
This centre has featured on Emagister for 6 years
Subjects
Approach
Works
Networks
Programming
Programme Planning
Programming Application
IT
IT Management
Information Systems
Information Systems management
Course programme
Ensemble Methods for Regression and Classification (4 lectures, 40:49)

The Course Overview
This video provides an overview of the entire course.

How Do Ensemble Methods Work?
Explains the general idea behind ensemble methods and discusses, at a high level, the intuition behind the main ensemble methods: bagging, random forests, and boosting.
• Explain the general idea on which all ensemble methods are based
• Explain at a high level the idea of bagging
• Explain at a high level the ideas of random forests and boosting

Bagging, Random Forests, and Boosting for Regression
Presents, with a practical example, the procedure for building ensemble methods for regression tasks, and compares their results with those of simpler methods.
• Present the dataset to be used in the example
• Build and train three different ensemble methods
• Compare the results of the models and show the performance of the different methods

Bagging, Random Forests, and Boosting for Classification
Presents, with a practical example, the procedure for building ensemble methods for classification tasks, and compares their results with those of simpler methods.
• Present the dataset to be used in the example
• Build and train three different ensemble methods
• Compare the results of the models and show the performance of the different methods
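The regression lectures above can be sketched in scikit-learn as follows. The synthetic dataset and model settings are illustrative assumptions, not the course's own materials.

```python
# Sketch: building and comparing bagging, random forest, and gradient
# boosting on a regression task. Dataset and hyper-parameters are
# illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.ensemble import (BaggingRegressor, RandomForestRegressor,
                              GradientBoostingRegressor)
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "bagging": BaggingRegressor(n_estimators=100, random_state=0),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "boosting": GradientBoostingRegressor(n_estimators=100, random_state=0),
}
scores = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
for name, score in scores.items():
    print(f"{name}: R^2 = {score:.3f}")
```

The classification workflow is analogous: swap in `BaggingClassifier`, `RandomForestClassifier`, and `GradientBoostingClassifier`, and score with accuracy instead of R².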
Cross-Validation and Parameter Tuning (3 lectures, 31:12)

K-fold Cross-Validation
Explains the main problem with hold-out cross-validation and how k-fold cross-validation solves it. Presents how to do k-fold cross-validation in scikit-learn.
• Give a short review of the need for cross-validation
• Explain the main problem with hold-out cross-validation
• Explain how k-fold cross-validation works and show how to do it in scikit-learn

Comparing Models with K-fold Cross-Validation
Shows how k-fold cross-validation can be used to get a better assessment of model performance and hence to make better model comparisons.
• Build and train three models on the same dataset
• Use k-fold cross-validation to get the performance metrics of the models
• Use the results to compare the three trained models

Hyper-Parameter Tuning in scikit-learn
Explains the need for hyper-parameter tuning when building predictive analytics models and shows how to combine k-fold cross-validation with grid search to do it.
• Explain the need for hyper-parameter tuning
• Show how to use grid search with k-fold cross-validation for hyper-parameter tuning
• Show how to tune hyper-parameters with a practical example

A later lecture on dimensionality reduction is truncated in the source; its surviving objectives are:
• Explain the need for dimensionality reduction
• Explain the PCA method for dimensionality reduction
• Show how to...
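The cross-validation and tuning lectures above can be sketched in scikit-learn as follows. The dataset, model, and parameter grid are illustrative assumptions rather than the course's own examples.

```python
# Sketch: k-fold cross-validation and grid-search hyper-parameter tuning.
# Dataset, model, and grid are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# k-fold cross-validation: average the score over 5 folds instead of
# relying on a single hold-out split
clf = RandomForestClassifier(n_estimators=100, random_state=0)
cv_scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold accuracy: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

# grid search combined with k-fold CV to choose hyper-parameters
param_grid = {"max_depth": [3, 5, None], "n_estimators": [50, 100]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print("best hyper-parameters:", search.best_params_)
print(f"best CV accuracy: {search.best_score_:.3f}")
```

`GridSearchCV` refits the best model on the full dataset by default, so `search` can be used directly for prediction after fitting.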
Additional information
Knowledge of Python and familiarity with its Data Science stack are assumed. Additionally, an understanding of the basic concepts of predictive analytics, and of how to use basic predictive models, is necessary to take full advantage of this course.