In recent years, optimization has been recognized as a crucial component of Data Science disciplines such as machine learning, data mining, and statistics. Due to the high dimensionality of real-world data sets and models, it is often imperative to follow the principle of parsimony, in the form of either sparsity or minimal rank of solutions. This principle can be formulated in terms of optimization models, which are often very difficult to solve. Our research plan focuses on these key challenges facing the field of optimization and its applications in Data Science. To help guide our research, we selected three important problems that are in need of improved optimization models and algorithms: matrix completion, sparse inverse covariance estimation, and dictionary learning. These problems offer a focal point for a rich set of applications, ranging from image processing and text mining to predictive control and recommender systems. State-of-the-art algorithms are often slow and do not parallelize well. Our goal is to develop optimization algorithms that are tailored to these classes of problems.
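For concreteness, two standard convex formulations of this parsimony principle from the literature (illustrative only, not necessarily the exact models studied in this project) are nuclear-norm matrix completion and the graphical lasso for sparse inverse covariance estimation:

```latex
% Nuclear-norm matrix completion (standard literature formulation):
% recover a low-rank matrix X from entries M_{ij} observed on an index
% set Omega, using the nuclear norm ||X||_* as a convex surrogate for rank.
\min_{X \in \mathbb{R}^{m \times n}} \; \|X\|_*
\quad \text{subject to} \quad X_{ij} = M_{ij}, \;\; (i,j) \in \Omega

% Graphical lasso (standard literature formulation): estimate a sparse
% inverse covariance (precision) matrix Theta from the sample covariance S,
% with an l1 penalty that promotes sparsity in the off-diagonal entries.
\min_{\Theta \succ 0} \; -\log\det\Theta + \operatorname{tr}(S\Theta)
  + \lambda \|\Theta\|_1
```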
| Status | Finished |
| --- | --- |
| Effective start/end date | 5/1/14 → 10/31/17 |