Yin Tat Lee
2020 – today
- 2024
- [j12]Manru Zong, Yin Tat Lee, Man-Chung Yue:
Short-step methods are not strongly polynomial-time. Math. Program. 207(1): 733-746 (2024) - [c85]Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin:
Differentially Private Synthetic Data via Foundation Model APIs 2: Text. ICML 2024 - [c84]Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang:
Convex Minimization with Integer Minima in Õ(n^4) Time. SODA 2024: 3659-3684 - [c83]Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye:
Improving the Bit Complexity of Communication for Distributed Convex Optimization. STOC 2024: 1130-1140 - [i93]Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin:
Differentially Private Synthetic Data via Foundation Model APIs 2: Text. CoRR abs/2403.01749 (2024) - [i92]Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye:
Improving the Bit Complexity of Communication for Distributed Convex Optimization. CoRR abs/2403.19146 (2024) - [i91]Marah I Abdin, Sam Ade Jacobs, Ammar Ahmad Awan, Jyoti Aneja, Ahmed Awadallah, Hany Awadalla, Nguyen Bach, Amit Bahree, Arash Bakhtiari, Harkirat S. Behl, Alon Benhaim, Misha Bilenko, Johan Bjorck, Sébastien Bubeck, Martin Cai, Caio César Teodoro Mendes, Weizhu Chen, Vishrav Chaudhary, Parul Chopra, Allie Del Giorno, Gustavo de Rosa, Matthew Dixon, Ronen Eldan, Dan Iter, Amit Garg, Abhishek Goswami, Suriya Gunasekar, Emman Haider, Junheng Hao, Russell J. Hewett, Jamie Huynh, Mojan Javaheripi, Xin Jin, Piero Kauffmann, Nikos Karampatziakis, Dongwoo Kim, Mahoud Khademi, Lev Kurilenko, James R. Lee, Yin Tat Lee, Yuanzhi Li, Chen Liang, Weishung Liu, Eric Lin, Zeqi Lin, Piyush Madan, Arindam Mitra, Hardik Modi, Anh Nguyen, Brandon Norick, Barun Patra, Daniel Perez-Becker, Thomas Portet, Reid Pryzant, Heyang Qin, Marko Radmilac, Corby Rosset, Sambudha Roy, Olatunji Ruwase, Olli Saarikivi, Amin Saied, Adil Salim, Michael Santacroce, Shital Shah, Ning Shang, Hiteshi Sharma, Xia Song, Masahiro Tanaka, Xin Wang, Rachel Ward, Guanhua Wang, Philipp Witte, Michael Wyatt, Can Xu, Jiahang Xu, Sonali Yadav, Fan Yang, Ziyi Yang, Donghan Yu, Chengruidong Zhang, Cyril Zhang, Jianwen Zhang, Li Lyna Zhang, Yi Zhang, Yue Zhang, Yunan Zhang, Xiren Zhou:
Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone. CoRR abs/2404.14219 (2024) - 2023
- [c82]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. COLT 2023: 2399-2439 - [c81]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. COLT 2023: 4504-4569 - [c80]Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng:
Automatic Prompt Optimization with "Gradient Descent" and Beam Search. EMNLP 2023: 7957-7968 - [c79]Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian:
ReSQueing Parallel and Private Stochastic Convex Optimization. FOCS 2023: 2031-2058 - [c78]Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian:
Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. ICLR 2023 - [c77]Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang:
Learning threshold neurons via edge of stability. NeurIPS 2023 - [c76]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Private Convex Optimization in General Norms. SODA 2023: 5068-5089 - [c75]Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang:
Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. STOC 2023: 1904-1917 - [i90]Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian:
ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023) - [i89]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. CoRR abs/2302.06085 (2023) - [i88]Yangsibo Huang, Daogao Liu, Zexuan Zhong, Weijia Shi, Yin Tat Lee:
kNN-Adapter: Efficient Domain Adaptation for Black-Box Language Models. CoRR abs/2302.10879 (2023) - [i87]Sébastien Bubeck, Varun Chandrasekaran, Ronen Eldan, Johannes Gehrke, Eric Horvitz, Ece Kamar, Peter Lee, Yin Tat Lee, Yuanzhi Li, Scott M. Lundberg, Harsha Nori, Hamid Palangi, Marco Túlio Ribeiro, Yi Zhang:
Sparks of Artificial General Intelligence: Early experiments with GPT-4. CoRR abs/2303.12712 (2023) - [i86]Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang:
Convex Minimization with Integer Minima in Õ(n^4) Time. CoRR abs/2304.03426 (2023) - [i85]Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng:
Automatic Prompt Optimization with "Gradient Descent" and Beam Search. CoRR abs/2305.03495 (2023) - [i84]Yiran Wu, Feiran Jia, Shaokun Zhang, Hangyu Li, Erkang Zhu, Yue Wang, Yin Tat Lee, Richard Peng, Qingyun Wu, Chi Wang:
An Empirical Study on Challenging Math Problem Solving with GPT-4. CoRR abs/2306.01337 (2023) - [i83]Suriya Gunasekar, Yi Zhang, Jyoti Aneja, Caio César Teodoro Mendes, Allie Del Giorno, Sivakanth Gopi, Mojan Javaheripi, Piero Kauffmann, Gustavo de Rosa, Olli Saarikivi, Adil Salim, Shital Shah, Harkirat Singh Behl, Xin Wang, Sébastien Bubeck, Ronen Eldan, Adam Tauman Kalai, Yin Tat Lee, Yuanzhi Li:
Textbooks Are All You Need. CoRR abs/2306.11644 (2023) - [i82]Yuanzhi Li, Sébastien Bubeck, Ronen Eldan, Allie Del Giorno, Suriya Gunasekar, Yin Tat Lee:
Textbooks Are All You Need II: phi-1.5 technical report. CoRR abs/2309.05463 (2023) - [i81]Ruoqi Shen, Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Yuanzhi Li, Yi Zhang:
Positional Description Matters for Transformers Arithmetic. CoRR abs/2311.14737 (2023) - [i80]Harsha Nori, Yin Tat Lee, Sheng Zhang, Dean Carignan, Richard Edgar, Nicolò Fusi, Nicholas King, Jonathan Larson, Yuanzhi Li, Weishung Liu, Renqian Luo, Scott Mayer McKinney, Robert Osazuwa Ness, Hoifung Poon, Tao Qin, Naoto Usuyama, Chris White, Eric Horvitz:
Can Generalist Foundation Models Outcompete Special-Purpose Tuning? Case Study in Medicine. CoRR abs/2311.16452 (2023) - 2022
- [j11]Yin Tat Lee, Santosh S. Vempala:
Geodesic Walks in Polytopes. SIAM J. Comput. 51(2): 17-400 (2022) - [c74]Sivakanth Gopi, Yin Tat Lee, Daogao Liu:
Private Convex Optimization via Exponential Mechanism. COLT 2022: 1948-1989 - [c73]Yin Tat Lee, Santosh S. Vempala:
The Manifold Joys of Sampling (Invited Talk). ICALP 2022: 4:1-4:20 - [c72]Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang:
Differentially Private Fine-tuning of Language Models. ICLR 2022 - [c71]Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions. NeurIPS 2022 - [c70]Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. NeurIPS 2022 - [c69]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. NeurIPS 2022 - [c68]Xuechen Li, Daogao Liu, Tatsunori B. Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta:
When Does Differentially Private Learning Not Suffer in High Dimensions? NeurIPS 2022 - [c67]Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye:
Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. SODA 2022: 124-153 - [c66]Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford:
Computing Lewis Weights to High Precision. SODA 2022: 2723-2742 - [c65]Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford:
Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556 - [i79]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. CoRR abs/2202.01908 (2022) - [i78]Sivakanth Gopi, Yin Tat Lee, Daogao Liu:
Private Convex Optimization via Exponential Mechanism. CoRR abs/2203.00263 (2022) - [i77]Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye:
Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. CoRR abs/2205.01562 (2022) - [i76]Xuechen Li, Daogao Liu, Tatsunori Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta:
When Does Differentially Private Learning Not Suffer in High Dimensions? CoRR abs/2207.00160 (2022) - [i75]Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian:
Private Convex Optimization in General Norms. CoRR abs/2207.08347 (2022) - [i74]Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye:
Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. CoRR abs/2208.03811 (2022) - [i73]Arun Jambulapati, Yin Tat Lee, Santosh S. Vempala:
A Slightly Improved Bound for the KLS Constant. CoRR abs/2208.11644 (2022) - [i72]Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala:
Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. CoRR abs/2210.07219 (2022) - [i71]Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang:
Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. CoRR abs/2211.11860 (2022) - [i70]Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian:
Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. CoRR abs/2212.01539 (2022) - [i69]Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang:
Learning threshold neurons via the "edge of stability". CoRR abs/2212.07469 (2022) - 2021
- [j10]Michael B. Cohen, Yin Tat Lee, Zhao Song:
Solving Linear Programs in the Current Matrix Multiplication Time. J. ACM 68(1): 3:1-3:39 (2021) - [j9]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee:
Kernel-based Methods for Bandit Convex Optimization. J. ACM 68(4): 25:1-25:35 (2021) - [j8]Yin Tat Lee, Man-Chung Yue:
Universal Barrier Is n-Self-Concordant. Math. Oper. Res. 46(3): 1129-1148 (2021) - [j7]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee:
Metrical Task Systems on Trees via Mirror Descent and Unfair Gluing. SIAM J. Comput. 50(3): 909-923 (2021) - [c64]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Structured Logconcave Sampling with a Restricted Gaussian Oracle. COLT 2021: 2993-3050 - [c63]Janardhan Kulkarni, Yin Tat Lee, Daogao Liu:
Private Non-smooth ERM and SCO in Subquadratic Steps. NeurIPS 2021: 4053-4064 - [c62]Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz:
Numerical Composition of Differential Privacy. NeurIPS 2021: 11631-11642 - [c61]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. NeurIPS 2021: 18812-18824 - [c60]Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat:
Fast and Memory Efficient Differentially Private-SGD via JL Projections. NeurIPS 2021: 19680-19691 - [c59]Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Minimum cost flows, MDPs, and ℓ1-regression in nearly linear time for dense instances. STOC 2021: 859-869 - [c58]He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Reducing isotropy and volume to KLS: an o*(n^3ψ^2) volume algorithm. STOC 2021: 961-974 - [c57]Sally Dong, Yin Tat Lee, Guanghao Ye:
A nearly-linear time algorithm for linear programs with small treewidth: a multiscale representation of robust central path. STOC 2021: 1784-1797 - [i68]Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Minimum Cost Flows, MDPs, and 𝓁1-Regression in Nearly Linear Time for Dense Instances. CoRR abs/2101.05719 (2021) - [i67]Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat:
Fast and Memory Efficient Differentially Private-SGD via JL Projections. CoRR abs/2102.03013 (2021) - [i66]Janardhan Kulkarni, Yin Tat Lee, Daogao Liu:
Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps. CoRR abs/2103.15352 (2021) - [i65]Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz:
Numerical Composition of Differential Privacy. CoRR abs/2106.02848 (2021) - [i64]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. CoRR abs/2106.05480 (2021) - [i63]Yin Tat Lee, Santosh S. Vempala:
Tutorial on the Robust Interior Point Method. CoRR abs/2108.04734 (2021) - [i62]Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang:
Differentially Private Fine-tuning of Language Models. CoRR abs/2110.06500 (2021) - [i61]Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford:
Computing Lewis Weights to High Precision. CoRR abs/2110.15563 (2021) - [i60]Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford:
Faster Maxflow via Improved Dynamic Spectral Vertex Sparsifiers. CoRR abs/2112.00722 (2021) - 2020
- [j6]Yin Tat Lee, Marcin Pilipczuk, David P. Woodruff:
Introduction to the Special Issue on SODA'18. ACM Trans. Algorithms 16(1): 1:1-1:2 (2020) - [c56]Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford:
Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47 - [c55]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. COLT 2020: 2565-2597 - [c54]Yin Tat Lee, Swati Padmanabhan:
An $\widetilde{\mathcal{O}}(m/\varepsilon^{3.5})$-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. COLT 2020: 3069-3119 - [c53]Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song:
A Faster Interior Point Method for Semidefinite Programming. FOCS 2020: 910-918 - [c52]Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930 - [c51]Yin Tat Lee:
Convex Optimization and Dynamic Data Structure (Invited Talk). FSTTCS 2020: 3:1-3:1 - [c50]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer:
Network size and size of the weights in memorization with two-layers neural networks. NeurIPS 2020 - [c49]Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian:
Acceleration with a Ball Optimization Oracle. NeurIPS 2020 - [c48]Marek Eliás, Michael Kapralov, Janardhan Kulkarni, Yin Tat Lee:
Differentially Private Release of Synthetic Graphs. SODA 2020: 560-578 - [c47]Sébastien Bubeck, Bo'az Klartag, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Chasing Nested Convex Bodies Nearly Optimally. SODA 2020: 1496-1508 - [c46]Sally Dong, Yin Tat Lee, Kent Quanrud:
Computing Circle Packing Representations of Planar Graphs. SODA 2020: 2860-2875 - [c45]Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song:
Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788 - [c44]Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian:
Positive semidefinite programming: mixed, parallel, and width-independent. STOC 2020: 789-802 - [c43]Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong:
An improved cutting plane method for convex optimization, convex-concave games, and its applications. STOC 2020: 944-953 - [c42]Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Strong self-concordance and sampling. STOC 2020: 1212-1222 - [i59]Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song:
Solving Tall Dense Linear Programs in Nearly Linear Time. CoRR abs/2002.02304 (2020) - [i58]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. CoRR abs/2002.04121 (2020) - [i57]Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian:
Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent. CoRR abs/2002.04830 (2020) - [i56]Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian:
Acceleration with a Ball Optimization Oracle. CoRR abs/2003.08078 (2020) - [i55]Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong:
An Improved Cutting Plane Method for Convex Optimization, Convex-Concave Games and its Applications. CoRR abs/2004.04250 (2020) - [i54]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer:
Network size and weights size for memorization with two-layers neural networks. CoRR abs/2006.02855 (2020) - [i53]Ruoqi Shen, Kevin Tian, Yin Tat Lee:
Composite Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2006.05976 (2020) - [i52]He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Reducing Isotropy and Volume to KLS: An O(n^3ψ^2) Volume Algorithm. CoRR abs/2008.02146 (2020) - [i51]Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang:
Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. CoRR abs/2009.01802 (2020) - [i50]Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song:
A Faster Interior Point Method for Semidefinite Programming. CoRR abs/2009.10217 (2020) - [i49]Yin Tat Lee, Ruoqi Shen, Kevin Tian:
Structured Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2010.03106 (2020) - [i48]Sally Dong, Yin Tat Lee, Guanghao Ye:
A Nearly-Linear Time Algorithm for Linear Programs with Small Treewidth: A Multiscale Representation of Robust Central Path. CoRR abs/2011.05365 (2020)
2010 – 2019
- 2019
- [j5]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié:
Optimal Convergence Rates for Convex Distributed Optimization in Networks. J. Mach. Learn. Res. 20: 159:1-159:31 (2019) - [c41]Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Near-optimal method for highly smooth convex optimization. COLT 2019: 492-507 - [c40]Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang:
A near-optimal algorithm for approximating the John Ellipsoid. COLT 2019: 849-873 - [c39]Alexander V. Gasnikov, Pavel E. Dvurechensky, Eduard Gorbunov, Evgeniya A. Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives. COLT 2019: 1392-1393 - [c38]Yin Tat Lee, Zhao Song, Qiuyi Zhang:
Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. COLT 2019: 2140-2157 - [c37]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong:
Faster Matroid Intersection. FOCS 2019: 1146-1168 - [c36]Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn:
Adversarial examples from computational constraints. ICML 2019: 831-840 - [c35]Ruoqi Shen, Yin Tat Lee:
The Randomized Midpoint Method for Log-Concave Sampling. NeurIPS 2019: 2098-2109 - [c34]Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Complexity of Highly Parallel Non-Smooth Convex Optimization. NeurIPS 2019: 13900-13909 - [c33]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee:
Metrical task systems on trees via mirror descent and unfair gluing. SODA 2019: 89-97 - [c32]C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee:
A Nearly-Linear Bound for Chasing Nested Convex Bodies. SODA 2019: 117-122 - [c31]Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Competitively chasing convex bodies. STOC 2019: 861-868 - [c30]Michael B. Cohen, Yin Tat Lee, Zhao Song:
Solving linear programs in the current matrix multiplication time. STOC 2019: 938-942 - [i47]Yin Tat Lee, Swati Padmanabhan:
An Õ(m/ε^3.5)-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. CoRR abs/1903.01859 (2019) - [i46]Yin Tat Lee, Zhao Song, Qiuyi Zhang:
Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. CoRR abs/1905.04447 (2019) - [i45]Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang:
A near-optimal algorithm for approximating the John Ellipsoid. CoRR abs/1905.11580 (2019) - [i44]Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford:
Complexity of Highly Parallel Non-Smooth Convex Optimization. CoRR abs/1906.10655 (2019) - [i43]Ruoqi Shen, Yin Tat Lee:
The Randomized Midpoint Method for Log-Concave Sampling. CoRR abs/1909.05503 (2019) - [i42]Yin Tat Lee, Aaron Sidford:
Solving Linear Programs with Sqrt(rank) Linear System Solves. CoRR abs/1910.08033 (2019) - [i41]Sally Dong, Yin Tat Lee, Kent Quanrud:
Computing Circle Packing Representations of Planar Graphs. CoRR abs/1911.00612 (2019) - [i40]Aditi Laddha, Yin Tat Lee, Santosh S. Vempala:
Strong Self-Concordance and Sampling. CoRR abs/1911.05656 (2019) - [i39]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong:
Faster Matroid Intersection. CoRR abs/1911.10765 (2019) - 2018
- [j4]Yin Tat Lee, He Sun:
Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. SIAM J. Comput. 47(6): 2315-2336 (2018) - [c29]Yin Tat Lee, Aaron Sidford, Santosh S. Vempala:
Efficient Convex Optimization with Membership Oracles. COLT 2018: 1292-1294 - [c28]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Laurent Massoulié, Yin Tat Lee:
Optimal Algorithms for Non-Smooth Distributed Optimization in Networks. NeurIPS 2018: 2745-2754 - [c27]Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, James R. Lee, Aleksander Madry:
k-server via multiscale entropic regularization. STOC 2018: 3-16 - [c26]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran:
The Paulsen problem, continuous operator scaling, and smoothed analysis. STOC 2018: 182-189 - [c25]Ankit Garg, Yin Tat Lee, Zhao Song, Nikhil Srivastava:
A matrix expander Chernoff bound. STOC 2018: 1102-1114 - [c24]Yin Tat Lee, Santosh S. Vempala:
Convergence rate of riemannian Hamiltonian Monte Carlo and faster polytope volume computation. STOC 2018: 1115-1121 - [c23]Yin Tat Lee, Santosh S. Vempala:
Stochastic localization + Stieltjes barrier = tight bound for log-Sobolev. STOC 2018: 1122-1129 - [c22]Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li:
An homotopy method for ℓp regression provably beyond self-concordance and in input-sparsity time. STOC 2018: 1130-1137 - [i38]C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee:
A Nearly-Linear Bound for Chasing Nested Convex Bodies. CoRR abs/1806.08865 (2018) - [i37]Yin Tat Lee, Santosh S. Vempala:
The Kannan-Lovász-Simonovits Conjecture. CoRR abs/1807.03465 (2018) - [i36]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee:
Metrical task systems on trees via mirror descent and unfair gluing. CoRR abs/1807.04404 (2018) - [i35]Michael B. Cohen, Yin Tat Lee, Zhao Song:
Solving Linear Programs in the Current Matrix Multiplication Time. CoRR abs/1810.07896 (2018) - [i34]Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Competitively Chasing Convex Bodies. CoRR abs/1811.00887 (2018) - [i33]Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke:
Chasing Nested Convex Bodies Nearly Optimally. CoRR abs/1811.00999 (2018) - [i32]Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn:
Adversarial Examples from Cryptographic Pseudo-Random Generators. CoRR abs/1811.06418 (2018) - [i31]Yin Tat Lee, Zhao Song, Santosh S. Vempala:
Algorithmic Theory of ODEs and Sampling from Well-conditioned Logconcave Densities. CoRR abs/1812.06243 (2018) - 2017
- [j3]Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford:
Single Pass Spectral Sparsification in Dynamic Streams. SIAM J. Comput. 46(1): 456-477 (2017) - [j2]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee:
Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SIAM J. Comput. 46(3): 890-910 (2017) - [c21]Yin Tat Lee, Santosh Srinivas Vempala:
Eldan's Stochastic Localization and the KLS Hyperplane Conjecture: An Improved Lower Bound for Expansion. FOCS 2017: 998-1007 - [c20]Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié:
Optimal Algorithms for Smooth and Strongly Convex Distributed Optimization in Networks. ICML 2017: 3027-3036 - [c19]Sébastien Bubeck, Yin Tat Lee, Ronen Eldan:
Kernel-based methods for bandit convex optimization. STOC 2017: 72-85 - [c18]Yin Tat Lee, He Sun:
An SDP-based algorithm for linear-sized spectral sparsification. STOC 2017: 678-687 - [c17]Yin Tat Lee, Santosh S. Vempala:
Geodesic walks in polytopes. STOC 2017: 927-940 - [c16]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
Subquadratic submodular function minimization. STOC 2017: 1220-1231 - [i30]Yin Tat Lee, He Sun:
An SDP-Based Algorithm for Linear-Sized Spectral Sparsification. CoRR abs/1702.08415 (2017) - [i29]Yin Tat Lee, Aaron Sidford, Santosh S. Vempala:
Efficient Convex Optimization with Membership Oracles. CoRR abs/1706.07357 (2017) - [i28]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran:
The Paulsen Problem, Continuous Operator Scaling, and Smoothed Analysis. CoRR abs/1710.02587 (2017) - [i27]Yin Tat Lee, Santosh Srinivas Vempala:
Convergence Rate of Riemannian Hamiltonian Monte Carlo and Faster Polytope Volume Computation. CoRR abs/1710.06261 (2017) - [i26]Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee, Aleksander Madry:
k-server via multiscale entropic regularization. CoRR abs/1711.01085 (2017) - [i25]Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li:
An homotopy method for 𝓵p regression provably beyond self-concordance and in input-sparsity time. CoRR abs/1711.01328 (2017) - [i24]Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford:
Leverage Score Sampling for Faster Accelerated Regression and ERM. CoRR abs/1711.08426 (2017) - 2016
- [j1]Yin Tat Lee, Ka Chun Lam, Lok Ming Lui:
Landmark-Matching Transformation with Large Deformation Via n-dimensional Quasi-conformal Maps. J. Sci. Comput. 67(3): 926-954 (2016) - [c15]Sébastien Bubeck, Yin Tat Lee:
Black-box Optimization with a Politician. ICML 2016: 1624-1631 - [c14]Zeyuan Allen Zhu, Yin Tat Lee, Lorenzo Orecchia:
Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver. SODA 2016: 1824-1831 - [c13]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee:
Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. SODA 2016: 1848-1861 - [c12]Michael B. Cohen, Yin Tat Lee, Gary L. Miller, Jakub Pachocki, Aaron Sidford:
Geometric median in nearly linear time. STOC 2016: 9-21 - [c11]Rasmus Kyng, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Daniel A. Spielman:
Sparsified Cholesky and multigrid solvers for connection laplacians. STOC 2016: 842-850 - [i23]Sébastien Bubeck, Yin Tat Lee:
Black-box optimization with a politician. CoRR abs/1602.04847 (2016) - [i22]Yin Tat Lee, Santosh S. Vempala:
Geodesic Walks on Polytopes. CoRR abs/1606.04696 (2016) - [i21]Michael B. Cohen, Yin Tat Lee, Gary L. Miller, Jakub Pachocki, Aaron Sidford:
Geometric Median in Nearly Linear Time. CoRR abs/1606.05225 (2016) - [i20]Sébastien Bubeck, Ronen Eldan, Yin Tat Lee:
Kernel-based methods for bandit convex optimization. CoRR abs/1607.03084 (2016) - [i19]Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
Subquadratic Submodular Function Minimization. CoRR abs/1610.09800 (2016) - [i18]Yin Tat Lee, Santosh S. Vempala:
Eldan's Stochastic Localization and the KLS Hyperplane Conjecture: An Improved Lower Bound for Expansion. CoRR abs/1612.01507 (2016) - 2015
- [c10]Yin Tat Lee, Aaron Sidford:
Efficient Inverse Maintenance and Faster Algorithms for Linear Programming. FOCS 2015: 230-249 - [c9]Yin Tat Lee, He Sun:
Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. FOCS 2015: 250-269 - [c8]Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization. FOCS 2015: 1049-1065 - [c7]Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford:
Uniform Sampling for Matrix Approximation. ITCS 2015: 181-190 - [i17]Yin Tat Lee, Aaron Sidford:
Efficient Inverse Maintenance and Faster Algorithms for Linear Programming. CoRR abs/1503.01752 (2015) - [i16]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee:
Improved Cheeger's Inequality and Analysis of Local Graph Partitioning using Vertex Expansion and Expansion Profile. CoRR abs/1504.00686 (2015) - [i15]Sébastien Bubeck, Yin Tat Lee, Mohit Singh:
A geometric alternative to Nesterov's accelerated gradient descent. CoRR abs/1506.08187 (2015) - [i14]Yin Tat Lee, Richard Peng, Daniel A. Spielman:
Sparsified Cholesky Solvers for SDD linear systems. CoRR abs/1506.08204 (2015) - [i13]Zeyuan Allen Zhu, Yin Tat Lee, Lorenzo Orecchia:
Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver. CoRR abs/1507.02259 (2015) - [i12]Yin Tat Lee, He Sun:
Constructing Linear-Sized Spectral Sparsification in Almost-Linear Time. CoRR abs/1508.03261 (2015) - [i11]Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong:
A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization. CoRR abs/1508.04874 (2015) - [i10]Rasmus Kyng, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Daniel A. Spielman:
Sparsified Cholesky and Multigrid Solvers for Connection Laplacians. CoRR abs/1512.01892 (2015) - 2014
- [c6]Yin Tat Lee, Aaron Sidford:
Path Finding Methods for Linear Programming: Solving Linear Programs in Õ(√rank) Iterations and Faster Algorithms for Maximum Flow. FOCS 2014: 424-433 - [c5]Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford:
Single Pass Spectral Sparsification in Dynamic Streams. FOCS 2014: 561-570 - [c4]Jonathan A. Kelner, Yin Tat Lee, Lorenzo Orecchia, Aaron Sidford:
An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations. SODA 2014: 217-226 - [i9]Yin Tat Lee:
Probabilistic Spectral Sparsification In Sublinear Time. CoRR abs/1401.0085 (2014) - [i8]Yin Tat Lee, Ka Chun Lam, Lok Ming Lui:
Large Deformation Registration via n-dimensional Quasi-conformal Maps. CoRR abs/1402.6908 (2014) - [i7]Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, Aaron Sidford:
Single Pass Spectral Sparsification in Dynamic Streams. CoRR abs/1407.1289 (2014) - [i6]Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, Aaron Sidford:
Uniform Sampling for Matrix Approximation. CoRR abs/1408.5099 (2014) - 2013
- [c3]Yin Tat Lee, Aaron Sidford:
Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems. FOCS 2013: 147-156 - [c2]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Shayan Oveis Gharan, Luca Trevisan:
Improved Cheeger's inequality: analysis of spectral partitioning algorithms through higher order spectral gap. STOC 2013: 11-20 - [c1]Yin Tat Lee, Satish Rao, Nikhil Srivastava:
A new approach to computing maximum flows using electrical flows. STOC 2013: 755-764 - [i5]Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Shayan Oveis Gharan, Luca Trevisan:
Improved Cheeger's Inequality: Analysis of Spectral Partitioning Algorithms through Higher Order Spectral Gap. CoRR abs/1301.5584 (2013) - [i4]Jonathan A. Kelner, Lorenzo Orecchia, Yin Tat Lee, Aaron Sidford:
An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations. CoRR abs/1304.2338 (2013) - [i3]Yin Tat Lee, Aaron Sidford:
Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems. CoRR abs/1305.1922 (2013) - [i2]Yin Tat Lee, Aaron Sidford:
Matching the Universal Barrier Without Paying the Costs : Solving Linear Programs with Õ(sqrt(rank)) Linear System Solves. CoRR abs/1312.6677 (2013) - [i1]Yin Tat Lee, Aaron Sidford:
Following the Path of Least Resistance : An Õ(m sqrt(n)) Algorithm for the Minimum Cost Flow Problem. CoRR abs/1312.6713 (2013)