Master

In Maynard (USA)

Price on request

Description

  • Type: Master
  • Location: Maynard (USA)
  • Start date: Different dates available

6.867 is an introductory course on machine learning that gives an overview of many concepts, techniques, and algorithms, beginning with topics such as classification and linear regression and ending with more recent topics such as boosting, support vector machines, hidden Markov models, and Bayesian networks. The course gives students the basic ideas and intuition behind modern machine learning methods, as well as a somewhat more formal understanding of how, why, and when they work. The underlying theme of the course is statistical inference, as it provides the foundation for most of the methods covered.
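To make one of the listed topics concrete, here is a minimal sketch of ordinary least-squares linear regression, one of the course's opening topics. The data, noise level, and variable names are illustrative assumptions, not material from the course itself:

```python
# Ordinary least-squares linear regression with NumPy.
# We generate synthetic data y = X w + b + noise, then recover
# the weights and intercept by minimising the squared error.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                       # 100 samples, 2 features
true_w = np.array([2.0, -1.0])                      # ground-truth weights
y = X @ true_w + 0.5 + 0.1 * rng.normal(size=100)   # intercept 0.5, small noise

Xb = np.hstack([X, np.ones((100, 1))])              # append a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)          # argmin_w ||Xb w - y||^2
print(w)                                            # close to [2.0, -1.0, 0.5]
```

With enough samples and low noise, the recovered coefficients approach the true ones; the same least-squares machinery underlies much of the statistical-inference viewpoint the course emphasises.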

Facilities

Location: Maynard (USA), 02139

Start date: Different dates available. Enrolment now open.


Subjects

  • Project
  • Materials
  • Algorithms
  • Networks

Course programme

Lectures: 2 sessions / week, 1.5 hours / session


A list of topics covered in the course is presented in the calendar.




There will be a total of 5 problem sets, due roughly every two weeks. The content of the problem sets will vary from theoretical questions to more applied problems. You are encouraged to collaborate with other students while solving the problems, but you must turn in your own solutions; copying will not be tolerated. If you collaborate, you must list all of your collaborators.


Each problem set will be graded by a group of students under the guidance of the TAs. Each problem set is graded in a single grading session, usually starting at 5pm on the first Monday after it is due. Every student is required to participate in one grading session. Sign up for grading by contacting a TA, by email or in person; doing so early increases your chances of getting your preferred grading slot. Students who do not register for grading by the third week of the course will be assigned to a problem set by us.


If you drop the class after signing up for a grading session, please be sure to let us know so we can keep track of students available for grading. If you add the class during the term, please remember to sign up for grading.


There will be two in-class exams, a midterm midway through the term and a final the last day of class.


You are required to complete a class project. The choice of topic is up to you, so long as it clearly pertains to the course material. To ensure that you are on the right track, you must submit a one-paragraph description of your project a month before the project is due. As with the problem sets, you are encouraged to collaborate on the project. We expect a four-page write-up that clearly and succinctly describes the project goal, methods, and results; the page limit grows with group size (a two-person group may use 6 pages, a three-person group 8 pages, and so on). Each group should submit only one copy of the write-up, listing the names of all group members. Projects will be graded on the basis of your understanding of the overall course material (not, e.g., on how brilliantly your method works). The scope of the project is about 1-2 problem sets.


The projects are due in Lec #23. Electronic submission is required; we can accept only PostScript or PDF documents. The short proposal should be turned in on or before Lec #12.


The projects can be literature reviews, theoretical derivations or analyses, applications of machine learning methods to problems you are interested in, or something else (to be discussed with course staff).


Your overall grade will be determined roughly as follows:


There are a number of useful texts for this course but each covers only some part of the class material.


Bishop, Christopher. Neural Networks for Pattern Recognition. New York, NY: Oxford University Press, 1995. ISBN: 9780198538646.


Duda, Richard, Peter Hart, and David Stork. Pattern Classification. 2nd ed. New York, NY: Wiley-Interscience, 2000. ISBN: 9780471056690.


Hastie, T., R. Tibshirani, and J. H. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. New York, NY: Springer, 2001. ISBN: 9780387952840.


MacKay, David. Information Theory, Inference, and Learning Algorithms. Cambridge, UK: Cambridge University Press, 2003. ISBN: 9780521642989. Available online.


Mitchell, Tom. Machine Learning. New York, NY: McGraw-Hill, 1997. ISBN: 9780070428072.


You are responsible for the material covered in lectures (most of which will appear in lecture notes in some form), problem sets, as well as material specifically made available and indicated for this purpose. The weekly recitations/tutorials will be helpful in understanding the material and solving the homework problems.


For any use or distribution of these materials, please cite as follows:


Tommi Jaakkola, course materials for 6.867 Machine Learning, Fall 2006. MIT OpenCourseWare (http://ocw.mit.edu), Massachusetts Institute of Technology. Downloaded on [DD Month YYYY].


Sample calendar entries:

  • Problem set 3 due
  • Problem set 4 out
  • Probabilistic inference
  • Guest lecture on collaborative filtering

