
SURF: Announcements of Opportunity

Below are Announcements of Opportunity posted by Caltech faculty and JPL technical staff for the SURF program.

Each AO indicates whether or not it is open to non-Caltech students. If an AO is NOT open to non-Caltech students, please DO NOT contact the mentor.

Announcements of Opportunity are posted as they are received. Please check back regularly for new AO submissions! Remember: this is just one way to go about identifying a suitable project and/or mentor; the SURF website offers more tips on finding a mentor.

Announcements for external summer programs are listed on a separate page.

New for 2021: Students applying for JPL projects should complete a SURF@JPL application instead of a "regular" SURF application.

Students pursuing opportunities at JPL must be U.S. citizens or U.S. permanent residents.



Project:  Convex Relaxations for Signomial and Polynomial Optimization
Disciplines:  Applied and Computational Mathematics, Computer Science
Mentor:  Adam Wierman, Professor, (EAS), adamw@caltech.edu
Mentor URL:  http://rsrg.cms.caltech.edu/
AO Contact:  Riley Murray, rmurray@caltech.edu
Background:  There are many types of optimization problems that are intractable to solve, even numerically. If we restrict our attention to certain classes of such problems, we can devise convex relaxations. Here, “convex” roughly means “easy to solve numerically”, and “relaxation” means that the new problem’s optimal value bounds the original problem’s unknown optimal value, so it can be used to certify the quality of candidate solutions.
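
A minimal sketch of this bounding property (generic notation, not taken from the AO): if the original, intractable minimization problem is (P) and its relaxation is (R),

\[
(P):\; p^\star = \min_{x \in S} f(x), \qquad
(R):\; r^\star = \min_{x \in T} g(x), \qquad
T \supseteq S, \;\; g \le f \text{ on } S,
\]

then

\[
r^\star \;\le\; p^\star \;\le\; f(\bar{x}) \quad \text{for any feasible } \bar{x} \in S,
\]

so the computable value \(r^\star\) brackets the unknown optimum \(p^\star\) together with the value of any candidate feasible point.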

Whether these techniques can be used in practice boils down to how well we can solve these convex problems. This is an issue because even “easy” optimization problems (like linear programming) become hard if we have hundreds of millions of variables. In addition, in some situations, it’s fine to wait minutes or hours for an optimization algorithm to run, while in other settings we need a solution in a few seconds.

This motivates a question: for a given type of convex relaxation, what methods can we use to find feasible but possibly sub-optimal solutions in far less time than solving the problem exactly? Assuming we have such a candidate method, we further ask: (Q1) what are its theoretical properties?, and (Q2) what new possibilities does it create for practitioners? Answering the latter question often involves significant amounts of programming and experimental work.
Description:  The larger goal of this project is to advance our understanding of signomial and polynomial optimization. Usually, a signomial is thought of as a generalized polynomial with arbitrary real exponents; this generalization comes at the expense of only considering signomials as functions of positive input variables. Both signomial and polynomial optimization are intractable in the formal sense of NP-Hardness.

Our approach to these problems is through the “sums of arithmetic-geometric exponentials” or “SAGE” convex relaxation paradigm. The word “exponentials” in the SAGE acronym reflects how signomials can be equivalently reparameterized in terms of exponential functions. Thinking of signomials in terms of exponential functions (rather than polynomials with real exponents) has led to breakthroughs in both signomial and polynomial optimization.
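
To make the two viewpoints concrete, here is a small Python/NumPy sketch (the coefficients and exponents below are arbitrary illustrative values, not taken from the AO or from the sageopt package): the same signomial is evaluated once as a generalized polynomial in positive variables x, and once in the exponential-function parameterization obtained by substituting x = exp(y).

    # Illustrative only: arbitrary coefficients c and real exponent vectors A.
    import numpy as np

    c = np.array([3.0, -2.0, 0.5])        # signomial coefficients
    A = np.array([[1.5, -0.7],            # real (non-integer) exponent vectors
                  [0.3,  2.0],
                  [0.0,  0.0]])

    def sig_as_generalized_poly(x):
        # f(x) = sum_j c_j * prod_i x_i ** A[j, i], defined for x > 0
        return np.sum(c * np.prod(x ** A, axis=1))

    def sig_as_exponential(y):
        # substituting x = exp(y) gives f = sum_j c_j * exp(<A[j], y>)
        return np.sum(c * np.exp(A @ y))

    x = np.array([1.3, 0.8])
    print(sig_as_generalized_poly(x))      # the two evaluations agree ...
    print(sig_as_exponential(np.log(x)))   # ... since y = log(x)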

With current technology, the convex relaxations arising in SAGE can be solved for problems with up to about 10^7 variables, but only with particular software, and solve times can be long. Even with the best available software, numerical issues sometimes cause a solver to fail outright and return an error (this can happen even for small convex problems). This motivates exploring how alternative types of convex optimization can be used to carry out the abstract mathematical idea behind SAGE. Recent theoretical work has identified one such alternative approach. The specific goal of this SURF is to follow up on that recent development, in terms of the standard pair of questions (Q1) and (Q2) given earlier.
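
As a generic illustration of the practitioner-facing side of this issue (cvxpy and the solvers named below are common open-source tools, not tools prescribed by the AO), a small convex problem can simply be retried with a second solver when the first one fails with an error:

    # Hypothetical fallback pattern; assumes the cvxpy, ecos, and scs packages are installed.
    import cvxpy as cp
    from cvxpy.error import SolverError

    x = cp.Variable(3, nonneg=True)
    objective = cp.Minimize(cp.sum_squares(x - 1) + cp.sum(cp.exp(x)))
    problem = cp.Problem(objective, [cp.sum(x) <= 2])

    for solver in [cp.ECOS, cp.SCS]:       # try solvers in order of preference
        try:
            problem.solve(solver=solver)
            break
        except SolverError:
            continue                       # fall through to the next solver

    print(problem.status, problem.value)

This kind of failure-and-retry workflow is part of what motivates exploring alternative convex formulations of SAGE.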
References:
1. Examples of signomial and polynomial optimization, via the “sageopt” Python package: https://rileyjmurray.github.io/sageopt/examples/examples.html
2. An earlier paper that introduces the variant of SAGE to be addressed in this project: https://drive.google.com/file/d/1E0Axf1wC410DfNtOKSDtxb6Ruf5UugOz/view?usp=sharing
3. The paper with the recent theoretical development: https://arxiv.org/abs/2006.06811
4. Background on the relevant convex optimization (Sections 1, 2, 4, and 5): https://docs.mosek.com/modeling-cookbook/index.html
Student Requirements:  This project is only open to Caltech students and is likely best suited for a junior (3rd-year, rising senior) student. Relevant concrete skills include exposure to (convex) optimization, experience with Python software development (especially mathematical software), and advanced mathematical coursework (e.g., a nonempty subset of ACM 104/105/106/107/113). Expertise in an area of science or engineering outside of computer science is a plus.
Programs:  This AO can be done under the following programs:

  Program    Available To
  SURF       Caltech students only

Each program's page provides program information and application requirements.

