Research Old

Learning What Humans Feel During Robotic Assistance

My current research focuses on helping robots learn what humans physically feel during robotic assistance. Robots that understand how their actions physically affect people can provide safer, more effective assistance. Using a long short-term memory (LSTM) network, we have shown how a robot can use the sensor readings at its gripper to infer hundreds of forces that a person feels across their body during robot-assisted dressing.
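The mapping from gripper measurements to felt forces can be sketched as a sequence model. Below is a minimal, illustrative LSTM forward pass in plain NumPy, with toy dimensions and random weights rather than the trained model: a 6-D force/torque sequence is rolled through a single LSTM cell, and the final hidden state is linearly mapped to a vector of estimated forces at points across the body.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: input x, hidden state h, cell state c; W, U, b
    hold the stacked parameters for the four gates."""
    z = W @ x + U @ h + b                  # stacked gate pre-activations
    H = h.size
    i = 1 / (1 + np.exp(-z[:H]))           # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))        # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))      # output gate
    g = np.tanh(z[3*H:])                   # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def predict_body_forces(seq, W, U, b, W_out, b_out):
    """Run the LSTM over a sequence of gripper readings and map the
    final hidden state to estimated forces across the body."""
    H = b.size // 4
    h, c = np.zeros(H), np.zeros(H)
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return W_out @ h + b_out

# Toy sizes: 6-D force/torque input, 32 hidden units, 192 body points
rng = np.random.default_rng(0)
D, H, P = 6, 32, 192
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
W_out, b_out = rng.normal(size=(P, H)), np.zeros(P)
forces = predict_body_forces(rng.normal(size=(50, D)), W, U, b, W_out, b_out)
```

With random weights the predicted values are meaningless; the point is only the shape of the computation, from a time series of gripper readings to one force estimate per body point.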

Simple Social Robots for Human-Robot Interaction Education

Socially interactive robots are becoming increasingly prevalent in our daily lives. As a result, there is growing interest in expanding exposure to human-robot interaction throughout academia. However, robots can be costly to purchase or build for an entire class of students. We designed Pypr Robots, a set of affordable social robots that draw on low-cost prototyping techniques from human-computer interaction. Pypr Robots cost only ~$55, yet they are highly customizable and can be built in just a few hours without prior robotics experience, expensive tools, or 3D printers. Find out more about this project on GitHub.

Multimodal Execution Monitoring for Anomaly Detection During Robot Manipulation

As part of the NSF Summer Undergraduate Research Experience (SURE) Robotics program at Georgia Tech, I worked on assistive robotics in the Healthcare Robotics Lab. There, I helped design a probabilistic anomaly detection approach for multimodal sensory input to improve the safety of robotic systems that assist with difficult daily activities, such as feeding. We modeled the spatiotemporal dynamics of the sensory information with hidden Markov models and classified anomalous events using a dynamic local detection threshold.
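The detection logic can be illustrated with a simplified stand-in: score each multimodal observation with a likelihood model (a diagonal Gaussian here, in place of the HMM) and flag an anomaly when the score drops below a threshold computed dynamically from the recent likelihood history. The model, window size, and threshold multiplier below are illustrative assumptions, not the parameters of the actual system.

```python
import numpy as np

def fit_gaussian(train):
    """Per-dimension mean and std estimated from normal executions."""
    return train.mean(axis=0), train.std(axis=0) + 1e-6

def log_likelihood(x, mu, sigma):
    """Log-likelihood of one observation under a diagonal Gaussian."""
    return -0.5 * np.sum(((x - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))

def detect_anomalies(seq, mu, sigma, window=5, k=3.0):
    """Flag time steps whose log-likelihood falls below a dynamic
    threshold set from the recent history: mean - k * std."""
    lls, flags = [], []
    for x in seq:
        ll = log_likelihood(x, mu, sigma)
        recent = lls[-window:]                  # past scores only
        if len(recent) >= window:
            thr = np.mean(recent) - k * np.std(recent)
            flags.append(ll < thr)
        else:
            flags.append(False)                 # warm-up period
        lls.append(ll)
    return flags

# Nominal execution with one injected anomalous spike at step 10
obs = np.zeros((20, 3))
obs[10] = np.array([8.0, 8.0, 8.0])
flags = detect_anomalies(obs, mu=np.zeros(3), sigma=np.ones(3))
```

Because the threshold adapts to the recent likelihood history rather than being fixed globally, the detector can stay sensitive in quiet phases of a task while tolerating noisier ones.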

UW-La Crosse Quadcopter Lab

I worked alongside Professor Martin Allen to establish a departmental quadcopter lab at UW-L. In the lab, I explored control algorithms for meeting the performance goals of high-end quadcopters using less expensive, more accessible hardware, such as Crazyflie Nano Quadcopters that fit in a person's palm.
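A generic example of the kind of feedback control involved is a PID loop. The sketch below holds a simulated altitude at a setpoint; the gains and the unit-mass toy dynamics (gravity and motor dynamics omitted) are illustrative assumptions, not the lab's actual controller.

```python
class PID:
    """Textbook PID controller with an Euler-integrated integral term."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy altitude-hold loop: thrust acts directly as vertical acceleration
pid = PID(kp=2.0, ki=0.5, kd=1.2, dt=0.02)
z, vz = 0.0, 0.0                      # altitude (m) and vertical velocity
for _ in range(1000):                 # 20 simulated seconds
    accel = pid.update(1.0, z)        # target altitude: 1 m
    vz += accel * 0.02
    z += vz * 0.02
```

After the loop, `z` has settled near the 1 m setpoint. On real hardware like the Crazyflie, loops of this shape run per axis (roll, pitch, yaw, thrust) at a much higher rate.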

Wearable Device Security

Wearable devices such as Google Glass and Microsoft HoloLens raise security concerns because of their integrated first-person cameras. Malware on such a device can quietly access the camera and publicize everything a user does. As part of the TRUST REU program at UC Berkeley, I explored ways to improve security for wearable devices with Professor Dawn Song. While at Berkeley, I designed a solution that uses convolutional neural networks to visually detect sensitive or private information within a camera's view and heighten security on the camera to prevent malicious activity.
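The core idea can be sketched as a small convolutional classifier: convolve the frame with learned filters, pool, and output a probability that the view contains sensitive content. The NumPy forward pass below uses random weights and toy dimensions purely for illustration; an actual system would use a trained, much deeper network.

```python
import numpy as np

def conv2d(img, kernels):
    """Valid 2-D convolution of a single-channel image with a filter bank."""
    kh, kw = kernels.shape[1:]
    H, W = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((kernels.shape[0], H, W))
    for k, ker in enumerate(kernels):
        for i in range(H):
            for j in range(W):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def classify_frame(img, kernels, w, b):
    """Tiny CNN: conv -> ReLU -> global average pool -> logistic output,
    giving the probability that a frame contains sensitive content."""
    feat = np.maximum(conv2d(img, kernels), 0)   # conv + ReLU
    pooled = feat.mean(axis=(1, 2))              # global average pooling
    return 1 / (1 + np.exp(-(w @ pooled + b)))   # sigmoid probability

rng = np.random.default_rng(1)
frame = rng.random((16, 16))                     # stand-in camera frame
kernels = rng.normal(size=(4, 3, 3))
p = classify_frame(frame, kernels, rng.normal(size=4), 0.0)
```

When the predicted probability crosses a policy threshold, the device could blank the camera feed or deny camera access to untrusted applications.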

On Ramp to Parallel Computing

This project aims to make parallel computing easier to use and learn by providing an interactive interface for launching parallel programs on a variety of cluster and parallel architectures. I worked closely with Professor Samantha Foley at UW-L to design a lightweight backend infrastructure that handles MPI and build settings, along with a modular, mobile-responsive web interface. Our system guides users through launching a parallel program with descriptions, tutorials, pre-built programs, and preconfigured launch options.
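On the backend side, a launch boils down to assembling a launcher invocation from the settings the interface collects. The sketch below is a hedged illustration, not the project's actual code: the function name and example program are hypothetical, and the flags follow the Open MPI style of `mpirun`.

```python
import shlex

def build_mpi_command(program, nprocs, hostfile=None, env=None):
    """Assemble an mpirun command line from user-selected settings;
    returns an argument list suitable for subprocess.run."""
    cmd = ["mpirun", "-np", str(nprocs)]
    if hostfile:
        cmd += ["--hostfile", hostfile]          # Open MPI host list
    for key, val in (env or {}).items():
        cmd += ["-x", f"{key}={val}"]            # export env var to ranks
    cmd += shlex.split(program)                  # program plus its args
    return cmd

# Hypothetical pre-built program chosen from the web interface
cmd = build_mpi_command("./heat_diffusion 1024", nprocs=8,
                        hostfile="hosts.txt", env={"OMP_NUM_THREADS": "2"})
# -> ['mpirun', '-np', '8', '--hostfile', 'hosts.txt',
#     '-x', 'OMP_NUM_THREADS=2', './heat_diffusion', '1024']
```

Building the invocation as a list (rather than one shell string) keeps user-supplied settings from being interpreted by a shell, which matters when the launch request arrives from a web form.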