When we last left off, Evan and I were trying to port the robot's guidance system from the bigger board we wrote it on to the smaller board in our little robot. Our hopes were high: if it worked, a simple three-direction scheme would let the little one follow you around, like a baby if you will.
Unfortunately, we failed; the baby board had no way to properly debug the video data. Still, when our first presentation came around, we did manage to get the robot to follow us in a straight line, so that was nice.
Our ultimate goal is to create a system that moves an arm to a mouth detected by the camera. That meant our next step was to get the BeagleBone board to detect faces for us.
Alas, that didn't work either. On top of some other issues, the delay in the video stream, before any processing was even applied, was already too long. Research is full of unexpected surprises.
This brings us up to speed with our current efforts. Thanks to Evan, we have facial detection (Haar cascades) working, and all the math in the algorithm worked out. The last piece is developing a method to virtually reconstruct a visual cue, which we're about to finish tomorrow.
Back in the dorms, I'm having a lot of fun getting to know my REU group. Card games, board games, a county fair, a cat café, a season of Stranger Things, local beers, an escape room, you name it, we did it. Well, not really, but we did all those things so far. They taught me a bunch of new card games like Euchre, Golf, and Pitch, to name a few. I especially enjoy learning about them and their backgrounds. They're really cool people. Take Charles on stage with the ventriloquist at the St. Croix County Fair in the picture below, for example!