Professor Ruoyu Sun takes interdisciplinary approach to optimization
New ISE faculty member Ruoyu Sun hopes his research can add to the understanding of algorithms used in fields such as artificial intelligence and big data.
Sun, who joined ISE in spring 2017 as an assistant professor, is primarily interested in studying optimization.
Previously, he worked for Facebook Artificial Intelligence Research studying neural networks, which are computational models, inspired by the way the human brain works, for solving problems. Before that, he worked at Stanford University studying large-scale optimization algorithms.
Neural networks are the core of deep learning and are widely used in artificial intelligence.
“I think artificial intelligence is the future,” Sun said.
His interest in this area led to his current research in non-convex optimization, an emerging field of study concerned with problems that are much harder to solve than their convex counterparts.
In convex optimization problems, any local optimal solution is also the global one, and the problem can be solved efficiently. In non-convex optimization problems, by contrast, multiple local optimal solutions can exist, and standard algorithms may get stuck at a suboptimal one.
“The goal of optimization is trying to find the minimizer or maximizer of the function — maximize profit or minimize cost. But usually this is only easy when the function is convex,” Sun said. “When it’s non-convex, there’s no general algorithm to solve it globally. We need to develop new tools to understand that.”
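The contrast Sun describes can be seen in a toy experiment (a minimal sketch; the functions, step size, and iteration count here are illustrative, not drawn from his research). On a convex function, gradient descent reaches the global minimizer from any starting point; on a non-convex function, the starting point decides which local minimum it finds.

```python
def gradient_descent(grad, x0, lr=0.01, steps=2000):
    """Plain gradient descent on a 1-D function; returns the final iterate."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Convex: f(x) = x^2 has a single minimizer at x = 0,
# so gradient descent finds it from any starting point.
convex_grad = lambda x: 2 * x
print(gradient_descent(convex_grad, 5.0))    # ~0 regardless of start

# Non-convex: f(x) = x^4 - 3x^2 + x has two distinct local minima;
# which one gradient descent finds depends on where it starts.
noncvx_grad = lambda x: 4 * x**3 - 6 * x + 1
print(gradient_descent(noncvx_grad, -2.0))   # converges to one local minimum
print(gradient_descent(noncvx_grad, 2.0))    # converges to a different one
```

No general-purpose algorithm certifies that either of the two non-convex answers is the global minimum, which is exactly the gap Sun's research targets.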
In his PhD thesis, Sun established a theory for the low-rank matrix completion problem, a mathematical model behind the recommendation systems used by companies such as Facebook and Google.
He hopes to extend this analysis to more general non-convex optimization problems.
In doing this, he hopes to help those in the field of artificial intelligence and machine learning, as these fields involve non-convex optimization problems.
A better understanding of these problems could lead to better algorithms that can be used in these fields.
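Matrix completion is itself a non-convex problem of the kind Sun studies: recover a low-rank matrix (think of a ratings table with most entries missing) from the few entries observed. The sketch below, with illustrative sizes, step size, and data (not the algorithm from Sun's thesis), factors the matrix as a low-rank product and runs gradient descent on the observed entries only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-2 "ratings" matrix (users x items); a toy stand-in
# for a recommendation system. All sizes here are illustrative.
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Observe only about half of the entries, at random.
mask = rng.random((m, n)) < 0.5

# Non-convex formulation: write M ~ U @ V and minimize the squared
# error over observed entries by gradient descent on U and V.
U = 0.1 * rng.standard_normal((m, r))
V = 0.1 * rng.standard_normal((r, n))
lr = 0.02
for _ in range(3000):
    R = mask * (U @ V - M)                      # residual on observed entries only
    U, V = U - lr * R @ V.T, V - lr * U.T @ R   # gradient step on both factors

err = np.linalg.norm(U @ V - M) / np.linalg.norm(M)
print(f"relative recovery error: {err:.3f}")
```

Despite the objective being non-convex, gradient descent from a small random start recovers the full matrix, including the unobserved entries, to small error; explaining when and why this happens is the kind of theory Sun's thesis developed.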
“The understanding part is missing,” Sun said. “I think there is a need there.”
Another area he works in is large-scale optimization, which has become a popular topic as problem sizes grow in the age of big data.
“Large-scale is another major challenge in big data and artificial intelligence,” Sun said. “Large-scale convex problems are already quite difficult to solve, and large-scale non-convex problems are even harder.”
One approach he is particularly interested in is the alternating direction method of multipliers, or ADMM, a method invented four decades ago that has received renewed interest in recent years.
“Previous research is focused on unconstrained problems, but constrained problems are a lot more difficult,” Sun said. “ADMM is a very natural candidate for such problems.”
It was shown recently that ADMM with three blocks may diverge. Surprisingly, however, numerical experiments show that a simple trick, randomly permuting the order of the block updates, makes ADMM converge. Sun and his collaborators proved the first convergence result for this variant of ADMM.
“We hope this research can open the door for understanding randomly permuted ADMM, a promising approach for constrained problems,” he said.
Sun has an interdisciplinary approach when it comes to his research, and he believes that different fields can learn a lot from each other.
“We all have the same goal; for example, to make big data useful for everyone,” he said. “I would like to have a multi-disciplinary view.”
He emphasizes this in his course, Advanced Topics in Continuous Optimization, which focuses on large-scale and non-convex optimization problems.
“The research problems in many fields are very similar in essence,” he said. “I’m hoping to see the different fields learn from each other. While I’m still grounded in optimization, I want to borrow ideas from other fields, and contribute to them as well.”