SURF: Announcements of Opportunity
Below are Announcements of Opportunity posted by Caltech faculty and JPL technical staff for the SURF program. Additional AOs for the Amgen Scholars program can be found here.
Specific GROWTH projects being offered for summer 2019 can be found here.
Each AO indicates whether or not it is open to non-Caltech students. If an AO is NOT open to non-Caltech students, please DO NOT contact the mentor.
Announcements of Opportunity are posted as they are received. Please check back regularly for new AO submissions! Remember: This is just one way that you can go about identifying a suitable project and/or mentor.
Announcements for external summer programs are listed here.
Students pursuing opportunities at JPL must be
U.S. citizens or U.S. permanent residents.
|Project:||Holographic Examination for Life-like Motility (HELM)|
|Disciplines:||Computer Science, any physical science (especially biology)|
|Mentor URL:||https://ml.jpl.nasa.gov/people/mandrake.shtml (opens in new window)|
|Background:||Imagine you fly to Europa, melt some ice, and stare at the liquid under a microscope. Or thick, salty Martian brines. Or the plumes shot out from Enceladus. How would you know what you see in that liquid is alive? You can test for DNA or proteins that mean life on Earth, and there are instruments for that. But what if, instead, you just looked at the movies of particles in the liquid and asked, "Is it moving like life?" Any human could watch and tell you what that meant, but try to write a program to measure it. Instead, we will use humans to provide labels like "This object here is moving like life" and "This object is not." Then we will train a machine learning program to take measurements of the particles and learn, statistically, what the human labelers are talking about. This is HELM, an algorithm that will fly to Europa, Mars, and other worlds looking for signs of life wiggling in the water. It also will be used extensively on Earth, rapidly identifying potentially dangerous organisms in our beach ocean water, lakes, streams, and even within our own blood.|
|Description:||We are developing the above algorithms in Python while keeping our memory and compute requirements very small, so that we can fit on rad-hardened space hardware in future missions. But to train our algorithm, we need a LOT of labels. These are made by humans staring in detail at real data movies taken of known living and non-living things, so that the ML methods can figure out how to distinguish life from unlife. This position will require helping us label many such images, a batch each day. You will be embedded in our development team and watch the growth of a real ML system designed for space use, be exposed to real-world ML considerations, challenges, and solutions, and get real science-data analysis experience. But don't be fooled... it takes a lot of work to provide these labels in an accurate, careful manner. You will work at JPL in the Machine Learning & Instrument Autonomy group with flexible hours, meet the group, and make connections across JPL. Further, ML and tracking research opportunities will abound in addition to your labeling duties, as we are considering not one but many potential algorithms, optimized for everything from a tiny calculator-like computer to the latest in High Performance Space Computing (HPSC), which is basically like your own laptop.|
Req: Python programming (skilled)
Req: Statistics (uncertainty)
Req: Image-based filtering / machine learning
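To give a flavor of the approach described above, here is a minimal sketch of learning "is it moving like life?" from human labels. It is purely illustrative and not the HELM algorithm itself: it assumes particle tracks are available as lists of (x, y) positions, summarizes each track with two hand-picked motion features (mean step speed and path straightness), and classifies new tracks by nearest class centroid. The feature choices, class names, and classifier are all assumptions made for this example.

```python
import math

def track_features(track):
    """Summarize a particle track [(x, y), ...] with two motion features:
    mean step speed and straightness (net displacement / total path length)."""
    steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
    total = sum(steps)
    net = math.dist(track[0], track[-1])
    mean_speed = total / len(steps)
    straightness = net / total if total > 0 else 0.0
    return (mean_speed, straightness)

def train_centroids(labeled_tracks):
    """Average the feature vectors of each human-labeled class
    (here, hypothetical labels 'motile' vs 'non-motile')."""
    sums, counts = {}, {}
    for track, label in labeled_tracks:
        f = track_features(track)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def classify(track, centroids):
    """Assign a track the label of the nearest class centroid in feature space."""
    f = track_features(track)
    return min(centroids, key=lambda lab: math.dist(f, centroids[lab]))

# Toy data: a "swimmer" moves purposefully; a "drifter" jitters in place.
swimmer = [(i * 1.0, 0.0) for i in range(10)]
drifter = [(0.1 * (i % 2), 0.1 * ((i // 2) % 2)) for i in range(10)]
centroids = train_centroids([(swimmer, "motile"), (drifter, "non-motile")])
print(classify(swimmer, centroids))  # -> motile
```

In the real system, the measurements would be richer and the classifier far more carefully chosen and validated, but the core loop is the same: humans supply labels, tracks become feature vectors, and the model learns the statistical boundary between them.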
|Location / Safety:||Project building and/or room locations: . Student will need special safety training: No.|
This AO can be done under the following programs: