Kiran Vodrahalli





I am a Computer Science Ph.D. student at Columbia University, focusing on theoretical computer science, with particular interests in machine learning, algorithms, and statistics. My thesis topic is resource-efficient (sparse / low-memory / low-rank) machine learning. I am fortunate to be advised by Professor Daniel Hsu and Professor Alex Andoni. I have been supported by an NSF Graduate Research Fellowship.


Other Info:

For more details, either check out this website or see my [CV].


I am currently looking for postdoctoral and research positions starting in Fall 2022. I am interested in applied and theoretical research in machine learning, or some mix thereof (see my [CV]), and would like to focus on one of the following topics in my next role:

Memory Bounded and Resource Efficient Learning

  • memory bounds and resource efficiency in
    • bandit/reinforcement learning
    • language modeling and understanding
    • representation learning

Learning and Algorithmic Game Theory (see The Platform Design Problem)

  • game theoretic analysis of bi-level environment design
    and applications to internet economics and reinforcement learning
  • manipulation of learning agents and strategic behavior of learning agents in response to manipulation
  • data collection mechanisms and privacy/fairness considerations
  • new concepts of equilibria in learning in games and multi-agent learning settings
  • incentives and strategic behavior in machine learning

Fusing Logic and Learning (see Learning and Planning with Logical Automata)

  • outfitting large blackbox generative models/policies with interpretable controls
  • interpretable machine learning via interactive learning
  • learning the “grammar” of action sequences (e.g., learning the rules of the road)

Please email me if you think we might have some aligned interests!

Research Interests

My primary area of research is theoretical computer science: in particular, provably resource-efficient algorithms for fitting statistical models in various settings (“algorithmic statistics”, “foundations of machine learning”, “learning theory”, etc.). Some of my work in this direction has focused on computationally efficient, low-sample-complexity algorithms for learning functions with sparse descriptions. I am also interested in algorithms and optimization theory, and have worked on applications of machine learning in several fields, including neuroscience, natural language understanding, economics, and robotics.

Currently, I am particularly focused on designing algorithms and proving lower bounds for memory-bounded learning and optimization problems. I am also applying ideas from machine learning (online learning, learning in games, reinforcement learning) and bi-level optimization to understand the computational and statistical issues associated with the economics of the online firm, as well as the associated privacy, ethics, and fairness concerns (see my recent paper The Platform Design Problem).


In Fall 2021 and early Spring 2022 I was also a part-time student researcher at Google Brain.

In Summer 2021 I was a research intern at Google Brain, where I worked on principled resource-efficient methods for training deep neural networks.

In Summer 2019 I visited the Simons Institute at Berkeley for the Foundations of Deep Learning program.

I graduated from Princeton with an A.B. in Mathematics with honors in 2016 and an M.S.E. in Computer Science in 2017, where I was lucky to have Professor Sanjeev Arora and Professor Ken Norman as thesis advisors. I was a member of Sanjeev Arora’s Unsupervised Learning Group (also a part of NLP @ Princeton and ML Theory @ Princeton), where I studied provable methods for machine learning, focusing in particular on natural language understanding. I was also a member of Ken Norman’s Computational Memory Lab at the Princeton Neuroscience Institute, where I applied machine learning to fMRI analysis methods.