Machine Learning Systems (Fall 2019)

Course Description

The recent success of AI has been due in large part to advances in hardware and software systems. These systems have enabled training increasingly complex models on ever larger datasets. In the process, they have also simplified model development, enabling the rapid growth of the machine learning community. These new hardware and software systems include a new generation of GPUs and hardware accelerators (e.g., TPU and Nervana), open source frameworks such as Theano, TensorFlow, PyTorch, MXNet, Apache Spark, Clipper, Horovod, and Ray, and a myriad of systems deployed internally at companies, to name just a few. At the same time, we are witnessing a flurry of ML/RL applications to improve hardware and system designs, job scheduling, program synthesis, and circuit layouts.

In this course, we will describe the latest trends in systems design to better support the next generation of AI applications, as well as applications of AI to optimize the architecture and performance of systems. The format of this course will be a mix of lectures, seminar-style discussions, and student presentations. Students will be responsible for paper readings and for completing a hands-on project. For projects, we will strongly encourage teams that contain both AI and systems students.

New Course Format

A previous version of this course was offered in Spring 2019. The format of this second offering is slightly different. Each week will cover a different research area in AI-Systems. The Monday lecture will be presented by Professor Gonzalez and will cover the context of the topic as well as a high-level overview of the reading for the week. The Friday lecture will be organized around a mini program committee meeting for the week's readings. Students will be required to submit detailed reviews for a subset of the papers and lead the paper review discussions. The goal of this new format is to build mastery of the material, develop a deeper understanding of how to evaluate and review research, and hopefully provide insight into how to write better papers.

Course Syllabus

This is a tentative schedule. Specific readings are subject to change as new material is published.


( 1 )

Introduction and Course Overview

This lecture will be an overview of the class, requirements, and an introduction to the history of machine learning and systems research.

( 2 )

Holiday (Labor Day)

There will be no class, but please sign up for the weekly discussion slots.

( 3 )

Big Ideas and How to Evaluate ML Systems Research

Additional Machine Learning Reading

Additional Systems Reading

Open Debate about the Field

( 4 )

Machine Learning Life-cycle

This lecture will discuss the machine learning life-cycle, spanning model development, training, and serving. It will outline some of the technical machine learning and systems challenges at each stage and how these challenges interact.

( 5 )

Discussion of Papers on Machine Learning Life-cycle

( 6 )

Database Systems and Machine Learning

In the previous lecture we saw that data and feature engineering is often the dominant hurdle in model development. Database systems are often the source of data and the platform in which feature engineering takes place. This lecture will cover some of the big ideas in database systems and how they relate to work on machine learning in databases.

  • Lecture slides: [pdf, pptx]
  • Project Proposal Sign-up doc. You must be enrolled in the class or on the waitlist to access this document. Please add any projects you are thinking about starting and list yourself as interested in anyone else’s projects.
( 7 )

Discussion of Database Systems and Machine Learning

( 8 )

Machine Learning Frameworks and Automatic Differentiation

This week we will discuss recent developments in model development and training frameworks. While there is a long history of machine learning frameworks, we will focus on frameworks for deep learning and automatic differentiation. In class we will review some of the big trends in machine learning framework design and the basic ideas in forward- and reverse-mode automatic differentiation.
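As a small taste of the reverse-mode ideas covered this week, here is a minimal sketch of scalar automatic differentiation. It is not taken from any of the assigned frameworks; the `Value` class and all names are illustrative.

```python
class Value:
    """A scalar that records the operations applied to it (illustrative sketch)."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # upstream Values in the graph
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += v.grad * g

# d(x*y + x)/dx = y + 1 and d(x*y + x)/dy = x
x, y = Value(3.0), Value(4.0)
z = x * y + x
z.backward()
```

The key point for the readings: one forward pass builds the graph, and a single reverse traversal yields gradients with respect to every input, which is what makes training large models tractable.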

Project proposals are due next Monday

( 9 )

Machine Learning Frameworks and Automatic Differentiation

Update: Two of the readings were changed to reflect a focus on deep learning frameworks. The previous readings on SystemML and KeystoneML have been moved to optional reading.

( 10 )

Distributed Model Training

This week we will discuss developments in distributed training. We will quickly review the statistical query model pushed by early map-reduce machine learning frameworks and then discuss advances in parameter servers and distributed neural network training.
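The parameter server pattern discussed above can be sketched in a few lines. This is a toy, single-process illustration of one synchronous round (pull, compute, averaged push); all names are made up for this sketch, and real systems shard parameters across many servers and overlap communication with computation.

```python
class ParameterServer:
    """Toy synchronous parameter server (illustrative sketch)."""
    def __init__(self, dim):
        self.weights = [0.0] * dim

    def pull(self):
        return list(self.weights)  # workers fetch the current model

    def push(self, grads, lr=0.1):
        # apply an (already averaged) gradient update
        for i, g in enumerate(grads):
            self.weights[i] -= lr * g

def worker_grad(weights, batch):
    # gradient of mean squared error for a linear model pred = w . x
    dim = len(weights)
    grads = [0.0] * dim
    for x, y in batch:
        err = sum(w * xi for w, xi in zip(weights, x)) - y
        for i in range(dim):
            grads[i] += 2 * err * x[i] / len(batch)
    return grads

# Synchronous rounds: every worker pulls, computes on its data shard,
# and the server averages all pushes before the next round begins.
server = ParameterServer(dim=2)
shards = [[((1.0, 0.0), 2.0)], [((0.0, 1.0), 3.0)]]  # one data point per worker
for _ in range(100):
    w = server.pull()
    grads = [worker_grad(w, shard) for shard in shards]
    avg = [sum(g[i] for g in grads) / len(grads) for i in range(2)]
    server.push(avg)
```

The asynchronous variants in the readings relax exactly the barrier shown here: workers push stale gradients without waiting, trading statistical efficiency for hardware efficiency.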

Project Proposals Due!

  • One-page project description due at 11:59 PM. Check out the suggested projects. Submit a link to your one-page Google document containing your project description to this Google form. You only need one submission per team, but please list all the team members' email addresses. You can also update your submission if needed.
( 11 )

Discussion on Distributed Model Training

( 12 )

Prediction Serving

Until recently, much of the focus of systems research was on model training. However, there has been growing interest in addressing the challenges of prediction serving. This lecture will frame those challenges and cover some of the recent advances.

( 13 )

Unfortunately, class was canceled and so the PC Meeting has been moved to Monday. Note that early project presentations are also due next Friday.

( 14 )

Discussion on Prediction Serving

( 15 )

Project Presentations

( 16 )

Finish Project Presentations and Start Model Compilation

This week we will explore the process of compiling/optimizing deep neural network computation graphs. This reading will span both graph level optimization as well as the compilation and optimization of individual tensor operations.

( 17 )

Discussion of Model Compilation

( 18 )

Unfortunately, due to the power outage, lecture is canceled today. To make up for lost lecture(s) and accommodate our guest speakers, we will skip the overview lecture this week and start with the PC meeting on Machine Learning Applied to Systems. However, this will put a little extra pressure on the neutral presenters to provide additional context. We will then cover the discussion on machine learning hardware the following Monday.

( 19 )

Discussion of Machine Learning Applied to Systems Day 1

( 20 )

Hardware Acceleration for Machine Learning

This lecture will be presented by Kurt Keutzer and Suresh Krishna who are experts in processor design as well as network and architecture co-design.

  • Guest lecture slides: [pdf, pptx]
( 21 )

Discussion of Hardware Acceleration for Machine Learning

( 22 )

(11/11) Administrative Holiday

( 23 )

Discussion of Machine Learning Applied to Systems Day 2

( 24 )

Learning with Adversaries

This week we will discuss machine learning in adversarial settings. This includes secure federated learning, differential privacy, and adversarial examples.

( 25 )

Discussion on Learning with Adversaries

( 26 )

Autonomous Driving

Autonomous vehicles will likely transform society in the next decade and are fundamentally AI-enabled systems. In this lecture we will discuss the AI-Systems challenges around autonomous driving.

( 27 )

(11/29) Holiday (Thanksgiving)

( 28 )

Discussion on Autonomous Driving

Everyone must do one of the readings (you pick).

( 29 )


( 30 )

(12/6) RRR Week

( 31 )

(12/9) RRR Week

( 32 )

(12/16) Poster Presentations

( 33 )

(12/20) No Class

Don’t forget to submit your final reports. As noted on Piazza, the final report should be 6-pages plus references (2-column, 10pt font, unlimited appendix). Please submit your report using this form:

Submit Your Report Here

You only need to submit the project once per team. The write-up should discuss the problem formulation, related work, your approach, and your results.


Detailed candidate project descriptions will be posted shortly. However, students are encouraged to find projects that relate to their ongoing research.


Grades will be largely based on class participation and projects. In addition, we will require weekly paper summaries submitted before class.