SAMPL is an interdisciplinary machine learning research group exploring problems spanning multiple layers of the system stack including deep learning frameworks, specialized hardware for training and inference, new intermediate representations, differentiable programming, and various applications. We are part of the Paul G. Allen School of Computer Science & Engineering at the University of Washington. Our group is a collaboration between researchers from Sampa, Syslab, MODE, and PLSE.



Parameter Box, Parameter Hub and Parameter Link

Parameter Server for Efficient Distributed Deep Neural Network Training in Clusters, Datacenters, and Public Clouds

Read more »

Relay High-Level Intermediate Representation (IR)

High-level IR for optimizing machine learning models.

Read more »

VTA Deep Learning Accelerator

Hardware/Software Deep Learning Acceleration Stack

Read more »

Sequential Model Specialization

Fast Video Classification via Adaptive Cascading of Deep Models

Read more »

TVM Stack

TVM: An Automated End-to-End Optimizing Compiler for Deep Learning

Read more »


XGBoost

A Scalable Tree Boosting System

Read more »


Model, data, and computing are the three pillars that support machine learning. Advances in these three factors enabled breakthroughs in the past, and we believe they will enable significant advances in the future as well. Specialized hardware architectures such as GPUs and TPU-like accelerators fuel the ever-growing computing demand of machine learning workloads. To build scalable systems that benefit from emerging hardware architectures and cope with ever-growing available data, we need to address new challenges in scheduling, networking, storage, and programming abstraction. Importantly, future models and learning algorithms need to be co-designed with the hardware, and system-level factors need to inform the design of the hardware-software stack. We need to build common, reusable infrastructure that works across hardware backends, and use learning to make systems smarter. These challenges and research questions span multiple areas of computer science. Hence, we formed SAMPL (System, Architecture, Machine learning, and Programming Language), a joint research group to conduct this cross-stack research. We focus on co-designing systems, hardware, and learning models, and we build novel architectures, programming abstractions, and learning systems to enable future intelligent systems. We strive both to build tools usable by the general community and to explore new directions in machine learning systems.



Luis Ceze
Tianqi Chen
Assistant Professor - CMU
Thierry Moreau
Affiliate Assistant Professor
Matthai Philipose
Affiliate Professor
Alex Ratner
Assistant Professor
Zachary Tatlock
Associate Professor


Past Undergraduate Students

Lianmin Zheng Interned in 2018. Now a Ph.D. student at UC Berkeley.


Past Ph.D. Students

Josh Fromm Ph.D., 2019. Now at OctoML.
Liang Luo Ph.D., 2020. Now at Facebook Research.
Haichen Shen Ph.D., 2018. Now at AWS AI.
Seungyeop Han Ph.D., 2016. Now at Rubrik, Inc.