Student Highlight: Blake Woodworth

Sixth-year PhD student Blake Woodworth began his studies at TTIC in 2015, an exciting time in his field, when machine learning was progressing rapidly. “That was kind of exciting for me. As an undergrad, I’d taken a random assortment of math and computer science classes, but I took this one machine learning class that I thought was really cool. We made working robots, and it was exciting, so that was how I got into it,” said Woodworth.

He recently completed his thesis defense, the final part of the PhD program at TTIC. He began writing his thesis in late February, but the research that went into it is mostly the product of the last two years. He was inspired to explore optimization by working with his advisor, Professor Nati Srebro. He had barely heard of optimization before coming to TTIC, but he ended up being especially intrigued by the field, specifically by the oracle complexity of convex, non-convex, and distributed optimization problems.

His thesis centers on basic theoretical questions in distributed optimization, such as how parallelism can be used to optimize objectives. “Some of the more interesting and relevant questions are quite hard, and it turns out, there are some very basic questions that we don’t really have a handle on. Many people in the field spend a lot of time trying to analyze distributed optimization, and surprisingly, a lot of the theory that has existed so far has fallen short of beating some very simple baselines,” he said.

Essentially, optimization is about making machine learning problems easier and more efficient to solve. It is the mathematical basis for machine learning, which centers on creating a model, finding relevant data, and training the model on that data. Solving an optimization problem means finding the parameters that fit the data by minimizing the loss function (a function that measures how far the model’s predictions are from the desired outcome).
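To make the idea concrete, here is a minimal sketch (my own illustration, not taken from Woodworth’s work) of what “minimizing a loss function” looks like: a linear model is fit to synthetic data by gradient descent on a squared-error loss. The data, learning rate, and step count are all arbitrary choices for the example.

```python
# Illustrative sketch: fitting a linear model by minimizing a
# squared-error loss with plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: targets generated from a hidden linear rule plus noise.
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

def loss(w):
    """Mean squared error: how far the model's predictions are from the data."""
    return np.mean((X @ w - y) ** 2)

def grad(w):
    """Gradient of the loss with respect to the parameters w."""
    return 2 * X.T @ (X @ w - y) / len(y)

w = np.zeros(3)          # start from an arbitrary parameter setting
for _ in range(500):     # gradient descent: repeatedly step downhill on the loss
    w -= 0.1 * grad(w)

print(loss(w))           # a small loss means the parameters fit the data
```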

“I think the idea is to identify how you should use something like parallelism for optimization. One of the main results I found is that there’s a trade-off between using parallelism at all, versus using sequential computation on each parallel machine. What this means is that you should, if you intend to use parallelism, try not to be too aggressive with what you’re dealing with on each machine. You want to be conservative and try to aggregate across machines,” said Woodworth.
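As an illustration of the kind of trade-off he describes (this sketch is mine, not the algorithm analyzed in his thesis), consider a simple periodic-averaging scheme on a least-squares objective: each of M machines takes K gradient steps on its own shard of data, then the parameters are averaged across machines. The number of local steps K is the “aggressiveness” knob; the data sizes and learning rate below are arbitrary choices for the example.

```python
# Illustrative sketch: local gradient steps with periodic averaging
# across M simulated machines, on a shared least-squares objective.
import numpy as np

rng = np.random.default_rng(1)
d, n_per_machine, M = 5, 100, 4

true_w = rng.normal(size=d)
# Each machine holds its own shard of (X, y) data.
shards = []
for _ in range(M):
    X = rng.normal(size=(n_per_machine, d))
    y = X @ true_w + 0.1 * rng.normal(size=n_per_machine)
    shards.append((X, y))

def local_grad(w, X, y):
    """Gradient of the squared-error loss on one machine's shard."""
    return 2 * X.T @ (X @ w - y) / len(y)

def periodic_averaging(K, rounds=50, lr=0.05):
    """K local gradient steps per machine, then average parameters."""
    w = np.zeros(d)
    for _ in range(rounds):
        local_models = []
        for X, y in shards:
            w_m = w.copy()
            for _ in range(K):          # K local steps before communicating
                w_m -= lr * local_grad(w_m, X, y)
            local_models.append(w_m)
        w = np.mean(local_models, axis=0)  # aggregate: average across machines
    return w

for K in (1, 10):  # K = 1 is conservative; larger K is more aggressive locally
    w = periodic_averaging(K)
    print(K, np.linalg.norm(w - true_w))
```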

Ultimately, he chose to come to TTIC for his PhD because of his passion for machine learning. When he was first applying to graduate programs, he wasn’t entirely sure what he wanted to do; Woodworth was also accepted to a statistics program at another university, as well as a theoretical computer science program. In the end, along with the more personalized and focused nature of TTIC’s research community and its location in Chicago, it came down to choosing machine learning.

“I very much like the people at TTIC; I made a lot of friends and had a lot of fun with them. I think something I like about TTIC is how focused it is. Everyone is working in generally the same area, so we all more or less understand each other. Whenever we go to talks, we are on the same page. I know people in other departments where someone comes in and gives a talk on a subject you’ve never heard of, and there’s not much of a point in going because you have no idea what they’re talking about. That’s never really the case here; we all have similar backgrounds,” he said.

His advice to students in the earlier stages of the PhD program is to pursue what they’re truly interested in, rather than what happens to be popular at the moment. He believes that conducting good research that you enjoy is more important than chasing publications, and that people who enjoy the work they are doing will be far happier in the long run. “I also think that, personally, there’s a limit to how much useful thinking I can do in a day. There’s no reason to try to do more than that, because it doesn’t help. I think people should take it easy on hours worked; you don’t have to go all the time,” said Woodworth.

As a PhD student, he has already received several awards and recognitions, including an NSF Graduate Research Fellowship in 2017, a Google Research PhD Fellowship in 2019, and the Best Student Paper Award at COLT 2019 with co-authors Dr. Dylan Foster, Ayush Sekhari, Prof. Ohad Shamir, Prof. Nathan Srebro, and Prof. Karthik Sridharan. After graduating, he plans to pursue a career in academia: Woodworth hopes to become a professor and to continue focusing on machine learning and optimization.