People of ACM - Corinna Cortes

March 28, 2023

You initially received a Master of Science degree in Physics from the University of Copenhagen before earning a PhD in Computer Science from the University of Rochester. What prompted you to make the switch from physics to computer science?

The late 1980s were the early days of neural networks: Hopfield networks, Boltzmann machines, associative memory, and more. Physicists were coming to the area from spin glass theory, and my advisor John Hertz introduced me to it. A large part of the work we did was computer simulations, so it seemed natural to continue my studies in Computer Science. On my way to Rochester, John Hertz arranged for me to have a summer internship at Bell Labs in the Adaptive Systems Research Group, headed by Larry Jackel. It was a very exciting time and a great department. There, I met Sara A. Solla, Yann LeCun, Patrice Simard, Leon Bottou, Isabelle Guyon, and later Vladimir Vapnik. I ended up spending most of my graduate days at Bell Labs in this group, and I developed Support Vector Machines with Vladimir Vapnik. After graduation, I stayed at Bell Labs and started work on large-scale data mining.

Will you tell us a little about Google Research in New York City?

I founded Google Research in NYC in 2004. Google was still very young and not many professors or students knew we had a research team in NYC. It was small, fewer than a handful of people, but we wrote papers and went to conferences. That was slightly different from the research group at the headquarters in Mountain View, California. Google had not gone public yet, and out in California it was all hands on deck. Our physical distance from the mothership allowed us to do the kind of work we were hired to do: create the algorithms and techniques needed to power the next version of Google. As an industrial research organization, all our problems are rooted in real Google needs and we have lots of product impact. But we are always aiming for the general solution, and to date we are the research team with the largest number of publications per member. Our team spans a wide area of Computer Science disciplines including Speech Recognition, Market Algorithms, Theory and Algorithms, Optimization, Computer Vision, Information Retrieval and of course Machine Learning. I'm also responsible for teams around the world—from Paris to Sao Paulo to Mountain View—whose research is also very exciting and covers a broad spectrum.

You are known for your foundational work on Support Vector Machines (SVMs), which are among the most frequently used algorithms in machine learning. What are SVMs and why have they been so important to the field?

Support Vector Machines are machine learning algorithms that benefit both from a very solid theory with generalization guarantees and from the fact that training them reduces to a convex optimization problem. This is in stark contrast to Deep Learning, which has no clear generalization guarantee yet and is a highly non-convex optimization problem. Additionally, the margin theory originally developed for SVMs has since inspired a large body of work in theoretical machine learning and the design of many other effective algorithms.
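For concreteness, here is the soft-margin SVM training problem in its usual textbook form (a standard sketch, not a formulation quoted from the interview): minimizing the squared norm of w maximizes the margin that the generalization guarantees are stated in terms of, the slack variables ξ_i absorb classification errors, and both the objective and the constraints are convex, so there are no spurious local minima.

```latex
\min_{w,\,b,\,\xi}\;\; \frac{1}{2}\|w\|^{2} \;+\; C\sum_{i=1}^{n}\xi_{i}
\qquad \text{subject to} \qquad
y_{i}\bigl(w\cdot\Phi(x_{i}) + b\bigr) \;\ge\; 1-\xi_{i},
\qquad \xi_{i}\ge 0,\quad i=1,\dots,n.
```

The kernel enters through the dual of this problem, where the feature map Φ only ever appears in inner products Φ(x_i)·Φ(x_j), which can be replaced by a kernel evaluation K(x_i, x_j).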

SVMs were originally designed for classification problems, but Vladimir Vapnik and I also worked on regression, and later many algorithms such as clustering and principal component analysis were also “kernelized.” The idea behind SVMs is that while points for a two-group classification problem may not be separable in the input space, they may be separable in a high-dimensional space obtained implicitly through a positive definite symmetric kernel. To allow for classification errors even in this high-dimensional space, every point is equipped with a “slack” variable that measures how far it falls on the wrong side of the margin. The Support Vectors are the points closest to the decision surface, or the points with errors. The SVM solution is expressed solely in terms of the support vectors, and the support vectors can be viewed as a compressed representation of the training data.
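To make that description concrete, here is a minimal Python sketch using scikit-learn's SVC on a synthetic two-class problem; the dataset, the RBF kernel, and the parameter values are illustrative choices, not something discussed in the interview:

```python
# Minimal kernel-SVM sketch: two interleaved half-moons are not
# linearly separable in the input space, but an RBF kernel separates
# them in an implicitly defined high-dimensional space.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)

# C trades off a wide margin against the total slack (allowed errors).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

# The learned decision function depends only on the support vectors:
# the points on the margin or the points with nonzero slack.
print(f"{len(clf.support_vectors_)} support vectors out of {len(X)} points")
print(f"training accuracy: {clf.score(X, y):.3f}")
```

The support-vector count is typically a small fraction of the training set, which is one way to see the support vectors as a compressed representation of the data.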

Many experts say one goal for Machine Learning should be to develop a solid mathematical theory as its foundation, so that we can better guide its development. Do you predict any progress on this front in the near future?

I would hope so, but it is not clear how much progress one can make with Deep Learning: it is a very non-convex formulation, and the theory of generalization still has a long way to go. However, I have been truly impressed with the capabilities of these very large language models. Great progress is being made in speeding up their training and making them more robust to input variations. They do tend to “hallucinate” and sometimes create text and images that sound and look very convincing, but that are factually wrong and physically impossible. There's a lot of work being done by talented people to ground outputs in reality and ensure models perform responsibly and efficiently.

How do you train for a marathon? Do you run every day?

Not every day; I take Fridays off. And I also sometimes skip Mondays. I always follow a training plan that varies with the day of the week. On Tuesdays and Thursdays I do speed training, and on Saturdays I do a long run of up to 23 miles. On days I don’t run, I go to the gym. If I don’t use my body, I can get a bit irritated.

Corinna Cortes is a Vice President of Google Research in New York City, where she works on a broad range of theoretical and applied large-scale machine learning problems. She has published numerous articles on topics including supervised learning by classification and data mining.

Her honors include receiving the ACM Paris Kanellakis Theory and Practice Award (jointly with Vladimir Vapnik) for her contributions to the theoretical foundations of support vector machines (SVMs) and the AT&T Science and Technology Medal for her work on data mining in very large data sets. Cortes was recently named an ACM Fellow for theoretical and practical contributions to machine learning, industrial leadership, and service to the field. Outside of her work in computing, Cortes is a competitive runner and has completed 17 New York City Marathons among other races.