For my senior thesis project, I led a team of six in developing a novel heterogeneous, distributed platform for autonomous 3D construction. This project led to a journal publication in IEEE Robotics and Automation Letters (RA-L) and won an award at WPI. The platform is composed of two types of robots acting in a coordinated and complementary fashion: (i) a collection of communicating smart construction blocks behaving as a form of growable smart matter, capable of planning and monitoring their own state and the construction progress; and (ii) a team of inchworm-inspired builder robots designed to navigate and modify the 3D structure, following the guidance of the smart blocks. In addition to leading the team, I was responsible for developing the distributed construction algorithm, several control algorithms used onboard the robots, and a custom simulator for viewing the progress of the swarm.

See the publication here.

My experience with swarm (multi-agent) robotics also includes working as a developer for NEST Labs at WPI, where I contributed to two projects. The first involved developing a ranking algorithm to distribute tuples across a robotic swarm network for exploring an unknown environment while maximizing the coordination and utility of the swarm. This project addresses a major issue in multi-agent systems: how to share information across a network. The algorithm used a neural network to optimize the distribution of information between robots based on parameters such as how much data each robot was currently storing and its position relative to the swarm. The key insight was that robots storing a lot of data should be used less for exploration, while robots with plenty of free storage should be used more. The second project involved developing a VR app to control a swarm in autonomously pushing items to locations specified in the app. I helped develop the VR app used in this project, as well as helped convert the project into the Buzz programming language.
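As a sketch of that insight — not the actual NEST Labs algorithm, which used a neural network — a hand-written ranking score might weigh free storage against proximity to the swarm (function name and weights are hypothetical):

```python
def exploration_score(storage_used, storage_capacity,
                      dist_to_centroid, w_load=0.7, w_dist=0.3):
    """Rank a robot for exploration duty: robots with free storage
    and close to the swarm centroid score higher (weights illustrative)."""
    free_fraction = 1.0 - storage_used / storage_capacity
    proximity = 1.0 / (1.0 + dist_to_centroid)  # maps [0, inf) -> (0, 1]
    return w_load * free_fraction + w_dist * proximity

# A nearly full robot ranks below an empty one at the same position.
full = exploration_score(90, 100, dist_to_centroid=2.0)
empty = exploration_score(10, 100, dist_to_centroid=2.0)
assert empty > full
```

Sorting the swarm by this score would send lightly-loaded robots out to explore while heavily-loaded ones hold data for the network.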

 

In addition to these projects, I also worked on a multi-agent robotics project for NASA JPL called the PUFFER project, details of which can be found under Work Experience.

 

Description of skills:

• Developed ranking algorithm to distribute tuples across a robotic swarm network for exploring an unknown environment while maximizing the coordination and utility of the robot swarm.

• Developed VR app to control a swarm in autonomously pushing items to locations specified in the app.

 
 

My experience with robots meant for space includes competing with a team of 10 in the Battle of the Rockets competition to deploy a robot from a rocket capable of reaching 1,000 feet. I started as a software developer, and by the second year I was running the entire school team, leading 20 students to a second-place finish. The first year of competition involved deploying a robotic lander from the rocket; the lander had to land successfully, open itself up on the ground, autonomously take a panoramic picture, and transmit the picture to a nearby ground station. The second year involved deploying a robotic rover from the rocket, which had to land successfully, navigate along rough terrain for several feet, turn, and take a panoramic picture to be transmitted to a ground station.

Description of Skills:

  • Designed rocket and simulated trajectories using OpenRocket software.

  • Fabricated model of rocket for testing purposes using Kraft Phenolic.

  • Designed lander in SolidWorks, in a team of 3, to be deployed from the rocket.

  • Wrote Preliminary Design Review (PDR) and Critical Design Review (CDR) of project and presented before judges.

  • Machined wooden fins for rocket and nylon frame for lander using a laser cutter and CNC.

  • Assembled electronics of lander and rocket.

  • Programmed control system to navigate over rough terrain and transmit telemetry.

  • Programmed MSP430 microcontroller with XBee for communication with the ground station, and programmed an Arduino for peripherals including temperature sensors, humidity sensors, light sensors, and an on-board GPS, all in C/C++. Programmed altimeter for the rocket using Raven 3 software.

  • Assisted in programming the ground station using Python.

 

 

Robotic Lander

Lander Deployment

Rocket at Competition

Ground Station

Design of autonomous rover


Payload deployment mechanism


Design of Lander Folding Mechanism with Layout for Inner Electronics (made in SolidWorks)

Design of Lander Camera (made in SolidWorks)

Demonstration of self-righting mechanism for robotic lander

Altitude rocket launch at competition, just under 2000 feet

Official launch at Battle of the Rockets Competition in Maryland

Lander deployment from rocket at competition

Download PDF of PDR Presentation:

WPI Goatbusters 2016 Preliminary Design Review (PDR)

Download PDF of CDR Presentation:

WPI Goatbusters 2016 Critical Design Review (CDR)

 

My experience with self-driving and flying cars comes from several personal projects involving lane detection, object tracking for following other cars, Kalman filtering for tracking people, SLAM for navigating the environment, sensor fusion combining LIDAR and radar data, and path planning on a road and between buildings, culminating in a self-driving car and a flying car each capable of navigating its respective environment. For both projects, I used the Udacity simulators, which allowed me to develop code quickly without needing actual hardware. These personal projects have allowed me to develop large-scale robot systems from the ground up. My goal is to one day deploy both projects on a real self-driving car and a real flying car.

 

Description of Skills:

• Developed a self-driving car utilizing the Udacity Self-Driving Car Simulator and ROS.
• Utilized both behavioral cloning and deep learning for control of the car.
• Programmed path planning algorithm capable of changing lanes on the road amidst other cars.

• Developed Kalman filtering algorithm in C++ for tracking the motion of people. 

• Utilized OpenCV for lane detection and object tracking algorithms. 
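The Kalman filtering mentioned above can be illustrated with a minimal constant-velocity filter. The original was written in C++; this Python sketch uses illustrative noise parameters, not the project's real tuning:

```python
import numpy as np

def kalman_step(x, P, z, dt=0.1, q=0.01, r=0.5):
    """One predict/update cycle of a constant-velocity Kalman filter.
    x: state [position, velocity], P: covariance, z: position measurement.
    q and r are illustrative tuning values."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
    H = np.array([[1.0, 0.0]])             # we measure position only
    Q = q * np.eye(2)                      # process noise covariance
    R = np.array([[r]])                    # measurement noise covariance
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a person walking at 1 m/s from clean position measurements.
x, P = np.array([0.0, 0.0]), np.eye(2)
for t in range(1, 50):
    x, P = kalman_step(x, P, np.array([0.1 * t]))
assert abs(x[1] - 1.0) < 0.1  # velocity estimate converges near 1 m/s
```

Even though only position is measured, the filter recovers the walker's velocity, which is what makes it useful for predicting where a tracked person will be next.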

 
Flying car simulator


 

My work with surgical robotics has involved developing the code for a surgical continuum robot shown above for image-guided surgery. I also wrote a proposal to the NIH for a new type of tele-operated control of continuum robots using virtual reality.

Description of Skills:

  • Calculated differential kinematics for concentric-tube cannula robot in MATLAB.

  • Performed point registration using fiducial markers in CT scan for guiding robot through human larynx.

  • Developed NIH proposal for VR-based control of continuum robots.

My work with bio-inspired robots has involved working on RoboDog for two years, first as a research assistant and currently as the project lead for a team of four. The purpose of this project was to design an all-terrain, low-cost robotic quadruped. Most of the robotic quadrupeds seen in industry, such as Boston Dynamics' BigDog and the MIT Cheetah, cost several million dollars. This quadruped, affectionately known as RoboDog, was built for under $1,000 thanks to a parallel elastic system that stores energy effectively, reducing strain on the motors and thus the need for expensive mechanisms and electronics. RoboDog has many applications, from exploring rocky terrain to searching for survivors in collapsed buildings and other disasters to serving as a walking table or pack mule. I have implemented several different control algorithms for this project in MATLAB and C++.

Description of Skills:

  • Designed various versions of legs for quadruped, assembled and tested models using wood, plastic, and metal.

  • Modeled legs of quadruped and simulated motion of various designs in both SolidWorks and Creo. Used MATLAB to determine mathematical models for motion of legs.

  • Performed various testing on components of legs to establish experimental data, including force testing of all legs.

  • Machined components for legs using a CNC machine, drill press, and vertical band saw. 3D printed components for the quadruped, including mounts for motors, potentiometers, and elastics.

  • Assembled quadruped and installed electrical components, including soldering wires and using op-amps to check signal integrity.

  • Connected Arduino to quadruped and wrote C/C++ code for basic leg motion.

  • Presented research at several different events, including Cambridge Science Festival and WPI's TouchTomorrow.

 

 

Montage of Getting RoboDog to Walk

Initial test of prototype leg

 

I have a deep interest in artificial intelligence, specifically reinforcement learning. In addition to the robotics-based artificial intelligence described in other projects, much of my reinforcement learning work involves completing OpenAI Gym challenges, such as landing a lunar module on the moon, teaching a robotic agent to walk, and training an agent capable of playing (and winning) the game Bomberman. I have used Convolutional Neural Networks (CNNs) for many of my computer vision projects, such as classifying traffic signs for my self-driving car project. I also have an interest in generative models such as Generative Adversarial Networks (GANs), and have used them for toy projects such as generating realistic cartoon characters and generating text.

One notable project I created was SharkNet, a binary classifier built on the YOLOv3 architecture to detect sharks in images, with the goal of stopping illegal finning on the open seas. In addition to developing the neural network to classify images of sharks, I investigated the use of neural style transfer to preprocess noisy images by transferring the style of one image onto another.

 
 

My experience with industrial robotics comes from spending a great deal of time in my school's machine shops (including student-teaching a machining course), as well as building several industrial robots. These include a factory robot capable of replacing “nuclear rods,” and a homemade 3-degree-of-freedom (DOF) robot arm for which I developed MATLAB code to sort colored objects, dynamically track and follow objects, and, most importantly, play (and win at) tic-tac-toe. I have further experience programming the 6DOF Kuka KR210 robot in ROS to autonomously sort colored objects into bins. For the nuclear reactor challenge, the robot needed to listen for Bluetooth commands from a human operator and then autonomously replace nuclear rods as instructed. For this robot, I designed a 4-bar linkage mechanism in SOLIDWORKS capable of acquiring and placing the fuel rods, and programmed an Arduino control system to listen for Bluetooth commands, navigate to the various rods, and extract and replace them. For both the 3DOF and 6DOF robot arms, I implemented forward and inverse kinematics for both joint-space and task-space control, and generated trajectories between task-space coordinates for smooth motion between setpoints. Both projects used computer vision to identify and track colored objects in the robots' workspaces. For the 3DOF robot, I was recognized by my professor for using a classical artificial intelligence algorithm to have the robot arm play tic-tac-toe with a human.
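As a simplified illustration of the kinematics work described above, here is analytic forward and inverse kinematics for a 2-link planar arm — a reduced stand-in for the 3DOF arm, with hypothetical link lengths:

```python
from math import acos, atan2, cos, sin, hypot

def ik_2link(x, y, l1=0.1, l2=0.1):
    """Analytic inverse kinematics for a 2-link planar arm
    (one elbow configuration); link lengths are illustrative."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = acos(c2)  # elbow angle from the law of cosines
    theta1 = atan2(y, x) - atan2(l2 * sin(theta2), l1 + l2 * cos(theta2))
    return theta1, theta2

def fk_2link(theta1, theta2, l1=0.1, l2=0.1):
    """Forward kinematics: joint angles -> end-effector position."""
    x = l1 * cos(theta1) + l2 * cos(theta1 + theta2)
    y = l1 * sin(theta1) + l2 * sin(theta1 + theta2)
    return x, y

# Round trip: FK(IK(p)) reproduces the target point.
t1, t2 = ik_2link(0.12, 0.08)
x, y = fk_2link(t1, t2)
assert hypot(x - 0.12, y - 0.08) < 1e-9
```

Task-space control then reduces to running IK on each point of a generated trajectory and commanding the resulting joint angles.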

 

Description of Skills:

  • Utilized HID communication between MATLAB in Ubuntu and Nucleo microcontroller.

  • Implemented forward and inverse kinematics for both joint and task space level control.

  • Generated trajectories between task space coordinates for smooth motion between setpoints.

  • Recognized by professor for use of artificial intelligence to have the robot play a board game with a human.

  • Programmed 6DOF Kuka KR210 robot in ROS to autonomously sort colored objects into bins.

  • Implemented velocity-based (Jacobian) inverse kinematics control of robot for trajectory generation.

  • Designed, built, and programmed an autonomous robot to complete a mock nuclear reactor challenge.

  • Designed 4-bar linkage mechanism in SOLIDWORKS to acquire and place “fuel rods.”

  • Programmed Arduino for control system of robot (PID) as well as for point-to-point navigation.
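The velocity-based (Jacobian) inverse kinematics in the bullets above can be sketched as resolved-rate control, shown here on a 2-link planar arm as a reduced stand-in for the 6DOF KR210 (all dimensions illustrative):

```python
import numpy as np

L1, L2 = 0.1, 0.1  # link lengths (illustrative)

def fk(q):
    """Forward kinematics of a 2-link planar arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    """Analytic task-space Jacobian d(fk)/dq."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def resolved_rate_step(q, x_des, gain=0.5):
    """One velocity-level IK step: dq = pinv(J) @ (scaled task error)."""
    dq = np.linalg.pinv(jacobian(q)) @ (gain * (x_des - fk(q)))
    return q + dq

# Iterate toward a reachable target from a non-singular start.
q = np.array([0.3, 0.5])
target = np.array([0.12, 0.08])
for _ in range(30):
    q = resolved_rate_step(q, target)
assert np.linalg.norm(fk(q) - target) < 1e-6
```

Using the pseudoinverse makes the same step work for redundant arms, where the Jacobian is no longer square.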

 
 

My experience with search and rescue robots involves designing a firefighting robot in a team of 3. The challenge was to have the robot enter an unknown environment (such as an indoor space), then identify and extinguish a fire. Such a robot could be useful for extinguishing flames in buildings, homes, and warehouses where sprinkler systems are not installed or are not feasible. To test its effectiveness, the robot was evaluated in a maze containing a burning candle.

To develop this robot, I utilized ROS, the ROS Navigation stack, and a LIDAR in Ubuntu to control a Turtlebot performing SLAM within the maze. Once the map had been constructed, the Turtlebot performed path planning to navigate between points. I was recognized by my professor for developing and using a pure pursuit planner to generate smooth movements between points.
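The core of a pure pursuit planner like the one mentioned above can be sketched in a few lines: transform a lookahead point into the robot frame and command the arc curvature that passes through it (frame conventions and units here are illustrative):

```python
from math import cos, sin

def pure_pursuit_curvature(pose, goal, lookahead=0.5):
    """Curvature command steering a differential-drive robot toward a
    lookahead point. pose = (x, y, heading in rad); positive curvature
    turns left (CCW heading convention)."""
    x, y, theta = pose
    # Transform the goal point into the robot's local frame.
    dx, dy = goal[0] - x, goal[1] - y
    local_y = -sin(theta) * dx + cos(theta) * dy  # lateral offset
    # Pure pursuit arc: curvature = 2 * lateral offset / lookahead^2.
    return 2.0 * local_y / (lookahead ** 2)

# A goal straight ahead needs no turning; one to the left curves left.
assert abs(pure_pursuit_curvature((0, 0, 0), (0.5, 0.0))) < 1e-12
assert pure_pursuit_curvature((0, 0, 0), (0.3, 0.3)) > 0
```

Because the commanded path is always a circular arc through the lookahead point, the resulting motion is smooth rather than the stop-and-turn behavior of naive point-to-point control.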

The other component of this project involved identifying and extinguishing a fire or open flame. My contribution was designing, building, and programming the robot capable of doing this. I developed an algorithm to locate a flame within the maze using an I2C IR camera along with robot position data. As demonstrated in the video below, the robot was able to successfully navigate the maze, extinguish the flame, and return home (i.e., to a charging station).

 

Description of skills: 

• Utilized ROS, ROS Navigation stack, and LIDAR in Ubuntu to control a Turtlebot to perform SLAM within a maze.
• Performed path planning to navigate between points after the map had been constructed.
• Recognized by professor for development and use of a pure pursuit planner to generate smooth movements between points. 

• Developed algorithm to locate a flame within the maze with an I2C IR camera and robot position data. 
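One way to sketch the flame-localization idea above — an illustration, not the actual implementation — is to convert the IR camera's pixel detection into a bearing and project it from the robot's pose (camera parameters and the fixed range are hypothetical):

```python
from math import cos, sin, radians

def flame_world_position(robot_x, robot_y, robot_heading, pixel_x,
                         image_width=32, fov_deg=60.0, flame_range=1.0):
    """Project an IR camera flame detection into map coordinates.
    Pixels right of image center rotate the bearing clockwise; the
    range is assumed fixed here rather than measured."""
    offset = (pixel_x - image_width / 2) / image_width  # -0.5 .. 0.5
    bearing = -radians(offset * fov_deg)                # CW for +offset
    angle = robot_heading + bearing
    return (robot_x + flame_range * cos(angle),
            robot_y + flame_range * sin(angle))

# A detection at image center places the flame directly ahead.
fx, fy = flame_world_position(0.0, 0.0, 0.0, pixel_x=16)
assert abs(fx - 1.0) < 1e-9 and abs(fy) < 1e-9
```

In practice, fusing bearings from several robot poses (or intersecting the bearing ray with the SLAM map's walls) removes the need for an assumed range.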

 
 

My experience with humanoid robots comes from working with NASA's Valkyrie robot and a Boston Dynamics ATLAS robot (affectionately named Warner). For these projects, I worked as a developer-in-training, assisting graduate students in programming a virtual robot to complete simulated tasks for a NASA challenge.

The first project involved programming a robot to complete simulated tasks for use on the International Space Station and Mars. These tasks included operating various machinery (e.g., aligning a communications array using pitch and yaw knobs). This project involved using ROS and Gazebo to model the Valkyrie robot and perform motion planning. I was responsible for the robot's vision processing, having it identify LED buttons and other objects of interest using OpenCV. This project relied heavily on Scrum and Agile development and required me to work with a large team.

The second project involved developing an algorithm for mapping human joints onto the ATLAS robot (WARNER) using a Microsoft Kinect. This project, known as “Shadow Motion,” employed skeletal tracking and was incorporated into ROS and Gazebo for simulation on a NASA Valkyrie robot model. I assisted in research for a Master's student's thesis on skeletal mapping; unfortunately, time constraints forced me to leave the project before we could test the work on our ATLAS robot.
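The joint-mapping idea can be sketched by computing the angle at a tracked skeleton joint from three Kinect points and using it as a target for the corresponding robot joint; the actual Shadow Motion mapping was more involved, and this helper is hypothetical:

```python
from math import acos, degrees

def joint_angle(a, b, c):
    """Angle at joint b formed by skeleton points a-b-c (e.g. shoulder,
    elbow, wrist), usable as a target for the robot's elbow joint."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = sum(p * p for p in v1) ** 0.5
    n2 = sum(p * p for p in v2) ** 0.5
    # Clamp to guard against floating-point drift outside [-1, 1].
    return acos(max(-1.0, min(1.0, dot / (n1 * n2))))

# A straight arm maps to ~180 deg, a right-angle bend to ~90 deg.
straight = joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))
bent = joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0))
assert abs(degrees(straight) - 180.0) < 1e-9
assert abs(degrees(bent) - 90.0) < 1e-9
```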

Description of Skills:

  • Programmed a virtual robot as a developer-in-training, working with graduate students on a NASA challenge to complete simulated tasks for use on the International Space Station and Mars.

  • Used the Robot Operating System (ROS) and Gazebo to model the robot and perform motion planning in C++.

  • Wrote and tested OpenCV code for perceiving LED buttons appearing in different locations on a screen, as well as other objects of interest.

  • Developed algorithm for mapping human joints onto the ATLAS robot using a Microsoft Kinect.

  • Employed skeletal tracking and incorporated it into ROS and Gazebo for simulation on a NASA Valkyrie robot model; assisted in research for a Master's student's thesis on skeletal mapping.

  • Reviewed code and merge requests in GitLab for the different tasks the robot must complete.

 
 

A mix of some of the competition robots I have built in addition to side projects I have worked on, either for classes or for fun.

Description of Skills:

  • Programmed robot in C to complete autonomous tasks, and wired/connected all pneumatics and electronics.
  • Designed and modeled robot in SolidWorks as part of a two-person team.
  • Manufactured and constructed robot using steel, aluminum, and fiberglass as part of a two-person team.
  • Operated robot in monthly competitions as part of a two-person team.
  • Designed, programmed, and implemented various sensors and electrical circuits.

 

Report for WPI class Robotics Engineering 1001 Final Project

 

 

Robot for VEX Skyrise Competition

Game of robot freeze tag

Robot for Savage Soccer competition at WPI

A robot drawing a spiral in WPI Robotics Lab

Demonstration of robot for RBE 1001 nuclear disposal, uses custom line tracker sensor

Custom built line follower sensor, uses two op amps and a photoresistor to detect differences between light and dark

Programming "By Your Command" on an LCD screen with an Arduino. Fun project inspired by Battlestar Galactica

Demonstration of four-bar for WPI RBE 1001 Robotics Competition

 

For my Eagle Scout Project, I constructed a shelter for the Nashua Dog Owners Group, accumulating a total of 423 service hours. The structure gives people a place to find shelter from the sun and the elements while their dog(s) play in the surrounding park, and provides a spot to post information about the park and the Nashua Dog Owners Group.

 

 
 

For this project, I led a team of 4 in the design and construction of an ultrasonic smart cane that expands the range of spatial awareness for the visually impaired and uses buzzers and lights to make cars and pedestrians more aware of the user. After researching the shortcomings of the traditional white cane, including its inability to detect overhanging objects such as trees or air conditioners on buildings, I decided to modernize the traditional white cane to address these deficiencies, while creating something that could fit over an existing white cane so users would not need to buy a new one. In addition to increasing the user's awareness of their surroundings, we felt it was important to make drivers, cyclists, and other people more aware of the user, so we employed buzzers and LEDs that attract attention when the user crosses the street. The cane is currently a prototype, and the design is being improved.

Description of Skills:

  • Designed and built circuits for smart cane that included soldering of all components.
  • Programmed ultrasonic sensors, vibration motors, and buzzers in C++.
  • Modeled cane in SolidWorks and employed MakerBot software to 3D print housing for electronics on cane. 
  • Presented research and prototype to WPI professors and fellow students.

 

 

 

Demonstration of smart cane

Second demonstration of smart cane

 

This project was part of a NASA Challenge to design and construct an asteroid anchoring device using pneumatic and electric power sources. The device will be tested in NASA's Neutral Buoyancy Lab.

Description of Skills:

  • Researched pneumatic motors and pneumatic regulators to incorporate best practices into final design based on cost and specifications.
  • Modeled device in SolidWorks on a team of 2.
  • Solely responsible for creating outreach plan for club to educate and serve the local community. 

 

 

 
 

Some of the club projects I have worked on include the WPI Design, Build, Fly team, which designed and fabricated an unmanned radio-controlled aircraft using wood, carbon fiber tubing, and shrink wrap, as well as the NASA Centennial CubeQuest Competition. For the latter, I worked on a team designing flight-qualified small satellites capable of advanced operations near and beyond the moon as part of the NASA CubeSat Launch Initiative. These small satellites, known as CubeSats, are popular with universities and small businesses for their low cost and durability. Part of my role on the team was researching the feasibility of mapping low Earth orbit (LEO) radiation.

Description of Skills:

  • Managed propulsion of the aircraft, including researching battery, motor, and propeller configurations, on a team of six.

  • Tested Nickel-Cadmium and Nickel-Metal Hydride batteries for efficiency and durability.

  • Created foam model of aircraft to test propeller configurations and wing shapes.

  • Served as part of guidance and navigation committee and electrical committee.

  • Tested sun sensor and wrote a program to track the position of the sun at all times.

  • Tested reaction wheels and assisted in writing a program to adjust the satellite's position in space and establish safeguards in the event of an anomaly.

 
Different propeller configurations


Prototype of wing made for testing
