selected scholarly activity
conferences
- Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples. International Conference on Algorithmic Learning Theory, Vol. 237. 2024
- Sample-Optimal Locally Private Hypothesis Selection and the Provable Benefits of Interactivity. Proceedings of Machine Learning Research. 4240-4275. 2024
- Adversarially Robust Learning with Tolerance. International Conference on Algorithmic Learning Theory, Vol. 201. 115-135. 2023
- On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences. Advances in Neural Information Processing Systems. 2023
- Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models. Proceedings of Machine Learning Research. 1018-1040. 2023
- Benefits of Additive Noise in Composing Classes with Bounded Capacity. Advances in Neural Information Processing Systems. 2022
- Private and polynomial time algorithms for learning Gaussians and beyond: Extended Abstract. Proceedings of Machine Learning Research. 1075-1076. 2022
- On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians. Algorithmic Learning Theory, Vol. 132. 2021
- Black-box Certification and Learning under Adversarial Perturbations. International Conference on Machine Learning, Vol. 119. 2020
- On the Sample Complexity of Learning Sum-Product Networks. International Conference on Artificial Intelligence and Statistics, Vol. 108. 4508-4518. 2020
- Disentangled behavioral representations. Advances in Neural Information Processing Systems. 2019
- Nearly tight sample complexity bounds for learning mixtures of Gaussians via sample compression schemes. Advances in Neural Information Processing Systems. 2018
- Online Nearest Neighbor Search in Binary Space. Proceedings of the IEEE International Conference on Data Mining (ICDM). 853-858. 2017
- Clustering with Same-Cluster Queries. Advances in Neural Information Processing Systems. 2016
- A Dimension-Independent Generalization Bound for Kernel Supervised Principal Component Analysis. Feature Extraction: Modern Questions and Challenges. 19-29. 2015
journal articles
- Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models. 2023
- Benefits of Additive Noise in Composing Classes with Bounded Capacity. 2022
- Adversarially Robust Learning with Tolerance. 2022
- Private and polynomial time algorithms for learning Gaussians and beyond. 2021
- Privately Learning Mixtures of Axis-Aligned Gaussians. Advances in Neural Information Processing Systems. 34. 2021
- Near-optimal Sample Complexity Bounds for Robust Learning of Gaussian Mixtures via Compression Schemes. Journal of the ACM. 67:1-42. 2020
- On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians. 2020
- Online Nearest Neighbor Search Using Hamming Weight Trees. IEEE Transactions on Pattern Analysis and Machine Intelligence. 42:1729-1740. 2020
- Sample-Efficient Learning of Mixtures. arXiv:1706.01596 [cs]. 32:2679-2686. 2017
- Representation Learning for Clustering: A Statistical Framework. arXiv:1506.05900 [cs, stat]. 82-91. 2015
other
preprints
- Sample-Efficient Private Learning of Mixtures of Gaussians. 2024
- Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples. 2023
- On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences. 2023
- Disentangled behavioral representations. 2019
- Near-optimal Sample Complexity Bounds for Robust Learning of Gaussian Mixtures via Compression Schemes. 2017
- Sample-Efficient Learning of Mixtures. 2017