I’m a first-year PhD student at the University of Toronto, supervised by Prof. Gennady Pekhimenko. I completed my Master’s in Computer Science at the University of British Columbia under the supervision of Prof. Alexandra Fedorova and Prof. Ivan Beschastnikh. I primarily work on systems research for machine learning and distributed/parallel computing.
My recent research projects include optimizing distributed machine learning and analyzing the performance of deep learning training workloads. I also have experience with instruction prefetching optimizations for multi-core machines and virtual machines. Before returning to academia, I worked as a software engineer for five years at various companies in India.
- “Why Should I Trust You?” Explaining the Predictions of Any Classifier
- Beyond Data and Model Parallelism for Deep Neural Networks
- Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training
- Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding