    Convex and nonconvex optimization geometries

    Name: Li_mines_0052E_11804.pdf
    Size: 5.887 MB
    Format: PDF
    Author: Li, Qiuwei
    Advisor: Tang, Gongguo
    Date issued: 2019
    Keywords: convex optimization; nonconvex optimization; tensor decomposition; landscape analysis; atomic norm; strict saddles
    URI: https://hdl.handle.net/11124/173275
    Abstract
    Many machine learning and signal processing problems are fundamentally nonconvex. One way to solve them is to transform them into convex optimization problems (a.k.a. convex relaxation), which constitutes a major part of my research. Although the convex relaxation approach is elegant in that it can achieve information-theoretically optimal sample complexity and minimax denoising rates, it is not efficient for high-dimensional problems. Therefore, the second major part of my research focuses directly on the nonconvex formulations of these problems, with a particular interest in understanding their optimization landscapes. The third part develops optimization algorithms with provable guarantees that can efficiently navigate these nonconvex landscapes and achieve global optimality. The final part applies alternating minimization algorithms to general tensor recovery and clustering problems.

    Part 1: Convex Optimization. In this part, we apply convex relaxations to several popular nonconvex problems in signal processing and machine learning (e.g., line spectral estimation and tensor decomposition) and prove that solving the resulting convex relaxations returns the globally optimal solutions of the original nonconvex formulations.

    Part 2: Nonconvex Optimization. In this part, we study the nonconvex optimization landscapes of several low-rank matrix optimization problems with general objective functions, which cover a large number of popular problems in signal processing and machine learning. In particular, we develop mild conditions under which these general low-rank matrix optimization problems have a benign landscape: all second-order stationary points are globally optimal solutions, and all saddle points are strict saddles (i.e., the Hessian has a negative eigenvalue).

    Part 3: Algorithms. In this part, we develop optimization algorithms with provable second-order convergence for general nonconvex and non-Lipschitz problems. We also resolve an open problem on the second-order convergence of alternating minimization algorithms, which are widely used in practice to solve large-scale nonconvex problems because of their simple implementation, fast convergence, and strong empirical performance. These second-order convergence guarantees, together with the fact (see Part 2) that a large class of nonconvex optimization problems have a benign landscape (all second-order stationary points are global minima), ensure that the proposed algorithms find global minima for that class of nonconvex problems.

    Part 4: Applications. In this part, we apply the alternating minimization algorithms to several popular applications in signal processing and machine learning, e.g., the low-rank tensor recovery problem and spherical principal component analysis (PCA).
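    To make the alternating minimization scheme of Parts 3 and 4 concrete, the following is a minimal sketch, not the thesis's algorithm: it alternates exact least-squares updates on the nonconvex rank-r factorization objective f(U, V) = ||M - U V^T||_F^2, the simplest instance of the low-rank problems described above. The function name, parameters, and test problem are illustrative assumptions.

        import numpy as np

        # Hypothetical illustration of alternating minimization for
        #     minimize_{U, V}  || M - U V^T ||_F^2,
        # where each half-step is an exact (convex) least-squares solve.
        def alternating_minimization(M, rank, iters=100, seed=0):
            m, n = M.shape
            rng = np.random.default_rng(seed)
            U = rng.standard_normal((m, rank))
            V = rng.standard_normal((n, rank))
            for _ in range(iters):
                # Fix U, update V: V^T solves the least-squares system U V^T ~ M.
                V = np.linalg.lstsq(U, M, rcond=None)[0].T
                # Fix V, update U: U^T solves the least-squares system V U^T ~ M^T.
                U = np.linalg.lstsq(V, M.T, rcond=None)[0].T
            return U, V

        # Usage: recover a random rank-2 matrix; the relative residual is near zero.
        rng = np.random.default_rng(1)
        M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
        U, V = alternating_minimization(M, rank=2)
        print(np.linalg.norm(M - U @ V.T) / np.linalg.norm(M))

    Although the joint objective is nonconvex, each subproblem with one factor fixed is an ordinary least-squares problem and is solved exactly, which is what makes alternating minimization simple to implement and fast in practice, as the abstract notes.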
    Rights: Copyright of the original work is retained by the author.
    Collections: 2019 - Mines Theses & Dissertations
