Divyansh Jhunjhunwala
About Me
Hi! I am Divyansh, a fourth year PhD candidate in the Electrical and Computer Engineering department at Carnegie Mellon University, advised by Dr. Gauri Joshi.
My research interests lie broadly in distributed optimization and machine learning, in particular federated learning. Given the multi-disciplinary nature of problems in federated learning, my research often draws on ideas from related areas such as optimization theory, transfer learning, model fusion, and the theory of over-parameterized neural networks.
In the summers of 2022 and 2023, I interned at IBM Research, working with Dr. Shiqiang Wang on problems in federated learning.
Prior to CMU, I completed my Bachelor of Technology (B.Tech) in Electronics and Electrical Communication Engineering at IIT Kharagpur, where I received the Institute Silver Medal for graduating with the highest CGPA in my department.
Email / Google Scholar
FedFisher: Leveraging Fisher Information for One-Shot Federated Learning
Divyansh Jhunjhunwala, Shiqiang Wang, Gauri Joshi
International Conference on Artificial Intelligence and Statistics (AISTATS) 2024
A preliminary version appeared at Federated Learning and Analytics workshop at ICML 2023
Propose FedFisher, an algorithm for one-shot federated learning that learns the global model using just one round of communication, with novel theoretical guarantees for two-layer over-parameterized ReLU networks.
FedExP: Speeding up Federated Averaging via Extrapolation
Divyansh Jhunjhunwala, Shiqiang Wang, Gauri Joshi
International Conference on Learning Representations (ICLR), 2023 (Spotlight, top 25% of accepted papers)
We present FedExP, a method that adaptively determines the server step size in federated learning based on the dynamically varying pseudo-gradients observed throughout training.
Maximizing Global Model Appeal in Federated Learning
Yae Jee Cho, Divyansh Jhunjhunwala, Tian Li, Virginia Smith, Gauri Joshi
Transactions on Machine Learning Research (TMLR), 2024
Propose the MaxFL algorithm to explicitly maximize the fraction of clients that are incentivized to use the global model in federated learning.
FedVARP: Tackling the Variance Due to Partial Client Participation in Federated Learning
Divyansh Jhunjhunwala, Pranay Sharma, Aushim Nagarkatti, Gauri Joshi
Uncertainty in Artificial Intelligence (UAI), 2022
Propose the FedVARP algorithm to tackle the variance caused by only a subset of clients participating in each round of federated training.
Leveraging Spatial and Temporal Correlations in Sparsified Mean Estimation
Divyansh Jhunjhunwala, Ankur Mallick, Advait Gadhikar, Swanand Kadhe, Gauri Joshi
Advances in Neural Information Processing Systems (NeurIPS), 2021
Introduce notions of spatial and temporal correlations among client vectors and show how they can be exploited to efficiently estimate the mean of a set of vectors in a communication-limited setting.
Adaptive Quantization of Model Updates for Communication-Efficient Federated Learning
Divyansh Jhunjhunwala, Advait Gadhikar, Gauri Joshi, Yonina C. Eldar
International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021
Propose an adaptive quantization strategy that achieves both communication efficiency and a low error floor by changing the number of quantization levels over the course of federated training.