Experience


R&D Researcher and Developer

Texas Spacecraft Laboratory

Jan 2019 – Present Austin, TX

I serve as the research lead for the Seeker R&D project, which aims to achieve on-orbit pose determination using a monocular camera on commercial off-the-shelf (COTS) hardware.

Our team architected and developed an end-to-end machine learning training pipeline that runs entirely on AWS, enabling rapid training and validation of numerous models.

Mars 2020 GDS Software Engineer (Part-time)

NASA Jet Propulsion Laboratory

Nov 2018 – Present Remote

I work as an Academic Part-Time Software Engineer on the Mars 2020 Common Software Services team.

I develop tools that improve the efficiency of the team's cloud services.


Mars 2020 GDS Software Engineering Intern

NASA Jet Propulsion Laboratory

May 2018 – Aug 2018 Pasadena, CA

I interned with the Common Software Services team for the Mars 2020 mission.

I developed tools that improved the efficiency of the team's cloud services.


Machine Learning/Computer Vision Lead

Texas Spacecraft Laboratory

Sep 2017 – May 2018 Austin, TX

I served as the Machine Learning and Computer Vision lead for the Seeker Vision project, which developed a low-cost intelligent sensor for NASA JSC's Seeker-1 CubeSat mission.

I led efforts to develop, adapt, and test convolutional neural network (CNN) architectures and computer vision algorithms for the space environment. Our sensor detects, localizes, and returns relative bearing estimates for a target spacecraft. The computer vision system was integrated onto the Seeker-1 CubeSat and launched on Cygnus NG-11 in April 2019; it will be deployed from the ISS in July 2019 to carry out its mission.


Software Engineering Intern

METECS

May 2017 – Aug 2017 Houston, TX

I worked as a Software Engineering Intern on the RFID-Enabled Autonomous Logistics Management (REALM) team at NASA's Johnson Space Center.

I designed, implemented, and tested a voice user interface for an inventory tracking system deployed on the International Space Station.

Projects

A NASA Johnson Space Center-funded mission using open-source machine learning and computer vision tools for autonomous vision-based navigation in space. The final system launched on Cygnus NG-11 in April 2019 with a mission date of July 2019.

A computer vision pipeline to identify highway lane boundaries in a video.

Predicting Twitter emoji usage with neural networks.

Bran is a CLI that automatically creates and connects via SSH to an EC2 instance based on user-supplied settings. It also sets up prerequisites and installs the ravenML machine learning CLI.

What Am I Looking At? (WAILA) is a real-time object detection Android app that identifies objects using TensorFlow and a model trained on the COCO dataset. WAILA lets users learn more about a detected object and save it as a memory.

Skills

Python

Java

Go

Computer Vision

AWS

Machine Learning

Docker

Git

TensorFlow

Contact

  • Austin, TX