Hello again! It’s the end of week 5 and the research project has come a long way.
The background of the project lies in robotics and computer vision. Numerous studies have pursued the similar goal of creating meal-assistance robots. For example, Tanaka et al. successfully created a meal-assistance robot that brought a cup to a user's face using facial recognition, which provides an example of how a simple meal-assistance robot can operate. Ultimately, the system developed as part of this research will handle food instead of drink and will eliminate issues that arise when the camera captures multiple faces. Another project similar to this research was conducted by Bhalla et al., who created a wheelchair-mounted meal-assistance robot. That work also used a JACO arm, along with a camera similar to the Kinect. However, their system could only handle round foods, such as apples and oranges, whereas this research will enable users to pick up utensils.
The specific application of the technology we are developing is autonomously feeding the user of the JACO 2 robot arm. Because the joystick is difficult to control when making minute movements, our goal is to create a system that makes the JACO arm feed the user automatically. For our purposes, we have decided that the signal for the robot arm to feed the user will be the user holding their mouth open for a short period of time. Using a Kinect camera, we have had considerable success creating a program that reliably reports when the user's mouth is open. Controlling the robot has proven to be one of the more difficult aspects of the project, as inefficiencies in trajectory planning have become apparent.
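The "mouth open for a short period of time" signal boils down to a small dwell-time check over per-frame readings. Here is a minimal sketch of that idea, assuming a boolean mouth-open flag arrives with each frame from the Kinect face-tracking stream; the class name, parameter names, and threshold value are illustrative, not the project's actual code.

```python
class MouthOpenTrigger:
    """Fires once when the mouth has been held open for hold_seconds.

    Hypothetical sketch: per-frame mouth-open booleans and timestamps
    are assumed to come from the Kinect face-tracking stream.
    """

    def __init__(self, hold_seconds=1.5):
        self.hold_seconds = hold_seconds
        self._open_since = None  # timestamp of the first open-mouth frame

    def update(self, mouth_open, timestamp):
        """Feed one frame; returns True on the frame the trigger fires."""
        if not mouth_open:
            self._open_since = None  # any closed-mouth frame resets the timer
            return False
        if self._open_since is None:
            self._open_since = timestamp
        if timestamp - self._open_since >= self.hold_seconds:
            self._open_since = None  # fire once, then re-arm
            return True
        return False
```

Resetting on every closed-mouth frame keeps brief mouth movements (talking, chewing) from triggering the feeding routine.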
My part in all of this has been focused on obtaining signals from the Kinect camera, specifically on ensuring the Kinect is receiving signals from the correct face. The Kinect can track up to six people at a time, which could prove problematic if it receives signals from a person other than the user of the JACO arm. This problem has been addressed by having the computer accept signals only from the closest person, using the xyz coordinates of each tracked user. To choose the closest, the distance formula gives the straight-line distance from the camera to each person, and the face at the shortest distance is selected as the correct one to collect information from. This method has worked rather well and will most likely be used in the final product.
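The closest-face selection described above can be sketched in a few lines. This is a minimal illustration, assuming each tracked face comes with xyz coordinates in meters relative to the camera; the function name and data layout are hypothetical stand-ins for the Kinect SDK's tracking data, not its actual API.

```python
import math

def closest_face(faces):
    """Return the id of the tracked face nearest the camera.

    faces: list of (face_id, (x, y, z)) tuples, camera at the origin.
    The distance formula sqrt(x^2 + y^2 + z^2) gives the straight-line
    distance from the camera to each person.
    """
    def distance(entry):
        _, (x, y, z) = entry
        return math.sqrt(x * x + y * y + z * z)
    return min(faces, key=distance)[0]
```

For example, with one person a meter away and bystanders further back, only the near face would be selected for mouth-state tracking.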
Besides meal assistance, this technology will have other applications once finished. For example, it would not be difficult to adapt the product to brush the user's teeth. A broader application could be total control of the robot using facial expressions instead of the joystick.
H. Tanaka, Y. Sumi and Y. Matsumoto, "Assistive robotic arm autonomously bringing a cup to the mouth by face recognition," 2010 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 34-39, 2010.
T. Bhalla, D. Fox, R. Nayeem, T. Rhodes and M. Warner, "Design and Validation for Control Interfaces for Anna," Worcester Polytechnic Institute, pp. 14-26, 2015.