Professor Ruoyu Sun takes interdisciplinary approach to optimization
New ISE faculty member Ruoyu Sun hopes his research can add to the understanding of algorithms used in fields such as artificial intelligence and big data.
Sun, who joined ISE in spring 2017 as an assistant professor, is primarily interested in studying optimization.
Sun comes to Illinois from Facebook Artificial Intelligence Research, where he studied neural networks. Before that, he worked at Stanford University studying large-scale optimization algorithms.
Neural networks are computational models for solving problems, loosely inspired by the way the human brain works. Also known as deep learning, they are widely used in artificial intelligence.
“I think artificial intelligence is the future,” Sun says.
His interest in this area led to his current research in non-convex optimization, an emerging field of study concerned with problems that are harder to solve than their convex counterparts.
In convex optimization problems, the local optimal solution is the global one, and the problem can be solved efficiently. But in non-convex optimization problems, multiple local optimal solutions exist.
“The goal of optimization is trying to find the minimizer or maximizer of the function — maximize profit or minimize cost. But usually this is only easy when the function is convex,” Sun says. “When it’s non-convex, there’s no general algorithm to solve it globally. We need to develop new tools to understand that.”
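The difficulty Sun describes can be seen in a few lines. The sketch below (a made-up toy function, not drawn from Sun's work) runs plain gradient descent on a simple non-convex function: the same algorithm started from two different points lands in two different minima, only one of which is global.

```python
def f(x):
    # A simple non-convex function with two local minima
    return x**4 - 3 * x**2 + x

def grad(x):
    # Derivative of f
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x, lr=0.01, steps=5000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Same algorithm, two starting points, two different answers:
left = gradient_descent(-2.0)   # settles near x ≈ -1.30, the global minimum
right = gradient_descent(2.0)   # settles near x ≈ 1.13, a worse local minimum
```

Both runs stop at a point where the gradient vanishes, but only one finds the true minimizer; understanding when and why simple algorithms nonetheless succeed on such landscapes is the kind of question Sun's research addresses.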
In his PhD thesis, Sun established a theory for low-rank matrix completion problems, a mathematical model for recommendation systems used by Facebook and Google.
He hopes to extend this analysis to more general non-convex optimization problems.
In doing so, he hopes to help researchers in artificial intelligence and machine learning, since both fields rely on non-convex optimization problems.
A better understanding of these problems could lead to better algorithms that can be used in these fields.
“The understanding part is missing,” Sun says. “I think there is a need there.”
Another area he works on is large-scale optimization, which has become a popular topic as problem sizes grow in the age of big data.
“Large-scale is another major challenge in big data and artificial intelligence,” he says. “Large-scale convex problems are already quite difficult to solve, and large-scale non-convex problems are even harder.”
One approach he is particularly interested in is the alternating direction method of multipliers, or ADMM, a method invented four decades ago that has received renewed interest in recent years.
“Previous research is focused on unconstrained problems, but constrained problems are a lot more difficult,” Sun says. “ADMM is a natural candidate for such problems.”
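As one concrete, textbook instance of the method (not code from Sun's research), ADMM can solve the lasso problem by splitting it into two easy subproblems linked by the constraint x = z, with a dual variable enforcing the link:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_admm(A, b, lam=0.1, rho=1.0, iters=500):
    """Solve min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z  via ADMM."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                     # scaled dual variable for x = z
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # x-update: ridge-like solve
        z = soft_threshold(x + u, lam / rho)           # z-update: shrinkage
        u = u + x - z                                  # dual update on the constraint
    return z
```

Each iteration alternates two cheap updates — a linear solve and an elementwise shrinkage — which is what makes the method attractive for large-scale constrained problems.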
Sun has an interdisciplinary approach when it comes to his research, and he believes that different fields can learn a lot from each other.
“We all have the same goal; for example, to make big data useful for everyone,” he says. “I would like to have a multi-disciplinary view.”
He emphasizes this in his course, Advanced Topics in Continuous Optimization, which focuses on large-scale and non-convex optimization problems.
“The research problems in many fields are very similar in essence,” he says. “I’m hoping to see the different fields learn from each other. While I’m still grounded in optimization, I want to borrow ideas from other fields, and contribute to them as well.”