If you’re engrossed in a good book, the last thing you want to do is get up to fetch your drink. With Saral Tayal’s Raspberry Pi-based personal assistant robot, you no longer have to: the bot, designed to resemble a Mars rover, can retrieve it for you using only voice commands.
The device uses a webcam to observe its environment and a built-in microphone to accept voice commands. When it hears the wake word “robot,” it responds to spoken instructions — including “follow me” to trail its human and await further orders, “coco cup” to fetch your cup, or “banana” to procure a banana. More command words can be added as needed. Vision processing to locate the objects is handled with the help of the Google Coral Accelerator; with inference offloaded to the Coral, the Raspberry Pi 3 spends only around 30% of its processing power on the task while still achieving roughly 10 frames per second of object recognition.
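The wake-word-then-command flow described above can be sketched in a few lines of Python. This is an illustrative assumption of how the dispatch logic might look, not Tayal’s actual code; the function name `parse_transcript` and the action labels are hypothetical, and the command phrases are the ones mentioned in the article.

```python
# Hypothetical sketch of the wake-word + command dispatch described above.
WAKE_WORD = "robot"

# Spoken phrases mapped to robot actions (action labels are illustrative).
COMMANDS = {
    "follow me": "FOLLOW",
    "coco cup": "FETCH_CUP",
    "banana": "FETCH_BANANA",
}

def parse_transcript(transcript):
    """Return an action if the transcript starts with the wake word
    and contains a known command phrase; otherwise None."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None  # ignore speech not addressed to the robot
    remainder = text[len(WAKE_WORD):]
    for phrase, action in COMMANDS.items():
        if phrase in remainder:
            return action
    return None

print(parse_transcript("Robot, coco cup"))  # FETCH_CUP
print(parse_transcript("coco cup"))         # None (no wake word)
```

In a real build, `transcript` would come from a speech-to-text engine fed by the microphone, and each returned action would trigger the corresponding vision-and-drive routine.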
Mechanically, the rover drives its two treads with a pair of DC motors. As it approaches an object, an ultrasonic sensor confirms the object is within reach, and a pair of gripping claws closes around it to scoop it up. The robot then delivers the desired object to you — no effort required!
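The range check is straightforward if the project uses an HC-SR04-style ultrasonic sensor (an assumption — the article doesn’t name the exact part). Such sensors report the round-trip time of an ultrasonic ping, so one-way distance is the echo time multiplied by the speed of sound, divided by two. A minimal sketch, with an assumed gripping threshold:

```python
# Sketch of an ultrasonic range check, assuming an HC-SR04-style sensor
# that reports the round-trip echo time of an ultrasonic ping.
SPEED_OF_SOUND_CM_S = 34300  # ~343 m/s in air at 20 °C

def echo_to_distance_cm(echo_seconds):
    """Convert an echo pulse width (seconds) to one-way distance in cm."""
    return echo_seconds * SPEED_OF_SOUND_CM_S / 2

def in_gripping_range(echo_seconds, max_cm=10.0):
    """True when the object is close enough for the claws to close on it.
    max_cm is an assumed threshold, not a value from the article."""
    return echo_to_distance_cm(echo_seconds) <= max_cm

# A 583 microsecond echo corresponds to roughly 10 cm:
print(round(echo_to_distance_cm(0.000583), 1))  # 10.0
```

On the actual robot, the echo time would be measured by timing the sensor’s echo pin with the Pi’s GPIO, and the drive loop would stop the motors once `in_gripping_range` returns true.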