Home

In 2016, I graduated with a degree in Computer Engineering from Michigan Technological University after years of robotics research, internships, and exploring the Upper Peninsula. Since then, I have been working as an engineer and researcher advancing robotic perception systems.

The following sections provide an overview of my professional career through the lens of sensor fusion.

Camera-Lidar Fusion

While in college, under the guidance of Dr. Timothy C. Havens, I designed and hand-built a drone-mounted data-collection system around a Hokuyo lidar like this one. We used the drone and its sensor kit to map bridges in the Metro Detroit area like this one.

Continuing that work, I added a camera to the same data-collection system to help our lab test a camera-lidar sensor fusion approach for pose estimation. Additional documentation and imagery from this project can be found here.

Following college, I was part of the first intern class at Uber ATC (later Uber ATG, since acquired by Aurora), where we built an end-to-end self-driving system based purely on cameras. After the internship, I transitioned to full-time to continue this work.

After some time at Uber, I joined Argo AI as one of the original employees. Getting back to my camera-lidar roots, I started there by working with our team to develop C++ onboard infrastructure for everything from our monocular object detector to low-level lidar firmware.

I furthered my mapping work at Argo on a team which patented a method for building cm-precision ground-height maps using a Gaussian process, Poisson surface reconstruction, and several other hacks. My role was to implement the Gaussian-process piece of the pipeline using the GPflow package.
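The Gaussian-process step can be sketched in plain NumPy (GPflow wraps the same underlying math in a higher-level API); the RBF kernel, lengthscale, and synthetic lidar returns below are illustrative assumptions, not the patented pipeline:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=2.0, variance=1.0):
    """Squared-exponential kernel over 2-D ground coordinates."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Sparse "lidar ground returns": (x, y) positions with measured heights z.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 10.0, size=(50, 2))
z = 0.1 * np.sin(xy[:, 0]) + 0.05 * xy[:, 1] + rng.normal(0.0, 0.01, 50)

noise = 1e-4  # observation-noise variance (roughly cm-scale std)
K = rbf_kernel(xy, xy) + noise * np.eye(len(xy))
alpha = np.linalg.solve(K, z)

# Interpolate a dense ground-height grid from the sparse returns.
gx, gy = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
grid = np.stack([gx.ravel(), gy.ravel()], axis=-1)
z_pred = rbf_kernel(grid, xy) @ alpha  # GP posterior-mean heights
```

The GP gives a smooth, noise-aware height surface from scattered returns; a surface-reconstruction pass can then turn that into map geometry.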

I then spent a significant amount of my time working with Argo’s fantastic lidar team to develop algorithms for Argo’s custom Geiger-mode lidar. Following my infra work, I was promoted to tech lead of a team of engineers to ship Argo’s first deep-learning-based lidar object detector, which we successfully launched in five different cities simultaneously.

Stereo

Towards the end of my time at Argo I was part of a small team which pushed the state-of-the-art in the field of stereo depth estimation by developing a novel approach for deep stereo vision on high-resolution images in real-time. Our work was published at CVPR in 2019.

In addition to my algorithmic work on stereo, I also modified the auto-exposure for our cameras so that both cameras in the stereo pair exposed with the same settings at the same time, which is needed for high-quality stereo correspondences.
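One way to picture that change: instead of each camera running its own auto-exposure loop, compute a single exposure from the pair's combined brightness and apply it to both, so matching pixels keep matching intensities. The proportional controller below is a hypothetical sketch under that assumption, not the actual camera firmware:

```python
import numpy as np

def shared_exposure(img_left, img_right, current_exp_ms, target=0.5):
    """Compute ONE new exposure for both stereo cameras from their
    combined mean brightness (hypothetical proportional controller)."""
    mean = 0.5 * (img_left.mean() + img_right.mean()) / 255.0
    # Step the shared exposure toward a mid-gray target; both cameras
    # then receive the identical value, keeping the pair photometrically
    # matched for correspondence search.
    return current_exp_ms * target / max(mean, 1e-3)

# Both cameras receive the same value, so the pair stays matched.
left = np.full((480, 640), 128, dtype=np.uint8)
right = np.full((480, 640), 128, dtype=np.uint8)
new_exp = shared_exposure(left, right, current_exp_ms=10.0)
```

A darker scene drives the shared exposure up for both cameras at once, rather than letting the two loops drift apart.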

Radar

After a while, I transitioned to Waymo (formerly the Google Self-Driving Car Project), where I build deep radar algorithms for object detection using our imaging radar.

I currently lead the radar-trucking perception working group: a cross-functional team of engineers across hardware, software, and systems who work through issues with our current radar while pushing on designs for the next generation of hardware. Our working group has successfully launched several deep radar-based object detection, field-of-view, and sensor fusion models into production, and has generalized several of these models across newer radars as they come online.

Camera-Radar Fusion

Bringing together all of my previous work, I helped publish CramNet, a novel camera-radar early-fusion object detector, at ECCV ‘22. Our work helped pave the way for long-range, principled, and efficient camera-radar fusion using dense imagery that is robust to various error modes.

In General…

Throughout my career, I have gained diverse experience across multiple organizations and domains. I consider myself to be a robotics engineer with a specialization in building machine learning algorithms for custom-built sensors. I am passionate about advancing perception technology and welcome opportunities to discuss research and collaboration. For more information about my background and experience, please refer to the links below or view my Resume.