I’m a Computer Science graduate student at the University of British Columbia (UBC) advised by Prof. Alexandra Fedorova and co-advised by Prof. Ivan Beschastnikh. I primarily work on systems research for machine learning and parallel computing. I’m a member of NSS Lab at UBC.
My recent research projects include optimizing distributed machine learning, analyzing the performance of deep learning training workloads, and debugging machine learning systems. I also have experience with instruction-prefetching optimizations for multi-core machines and virtual machines. Before returning to academia, I worked as a software engineer for five years at various companies in India.
- “Why Should I Trust You?” Explaining the Predictions of Any Classifier
- Beyond Data and Model Parallelism for Deep Neural Networks
- Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training
- Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding