As humans, we quickly recognize furniture and step around it. We see pets moving and take care not to bump into them.
For robots, this is a difficult task. Robots have no intuitive understanding of rooms, furniture, or pets; they need to learn how to navigate and avoid obstacles.
Aido handles navigation using a mix of Wi-Fi fingerprinting (to identify which room he is in) and object recognition (to get his bearings and avoid obstacles).
How Aido senses what’s around
- During setup, Aido fingerprints Wi-Fi signal strength across the house as the primary marker to identify different rooms.
- Aido also uses IR sensors and object recognition to build a topographic map of your home, including elements like furniture, doors, and the structure of rooms. This map is stored in a navigation database.
- Using this map, Aido judges the space between obstacles and, if it is wide enough for him to traverse, plots a path between them.
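The room-identification step above can be illustrated with a nearest-neighbour match over Wi-Fi signal strengths. This is a minimal sketch, not Aido's actual implementation: the access-point names, RSSI values, and the Euclidean distance metric are all assumptions.

```python
import math

# Hypothetical fingerprint store: room name -> {access point: mean RSSI in dBm}.
# The AP names and readings below are illustrative only.
FINGERPRINTS = {
    "Kitchen":     {"ap-1": -40, "ap-2": -70, "ap-3": -85},
    "Living Room": {"ap-1": -60, "ap-2": -45, "ap-3": -75},
    "Bedroom":     {"ap-1": -80, "ap-2": -65, "ap-3": -50},
}

def distance(scan, fingerprint, missing=-100.0):
    """Euclidean distance between two RSSI vectors; an access point
    absent from either side is treated as a very weak signal."""
    aps = set(scan) | set(fingerprint)
    return math.sqrt(sum(
        (scan.get(ap, missing) - fingerprint.get(ap, missing)) ** 2
        for ap in aps
    ))

def locate(scan):
    """Return the room whose stored fingerprint is nearest to the live scan."""
    return min(FINGERPRINTS, key=lambda room: distance(scan, FINGERPRINTS[room]))

print(locate({"ap-1": -42, "ap-2": -72, "ap-3": -83}))  # Kitchen
```

In practice a fingerprinting system would average many scans per spot and handle access points appearing or disappearing, but the core idea is this nearest-match lookup.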
Image and object recognition are crucial for an autonomous robot like Aido. Aido needs to identify items while navigating and decide in real time how to avoid them as he moves.
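One simple way a robot can make that real-time decision is by comparing consecutive camera frames: if enough pixels change, something is moving. The sketch below is a toy illustration of frame differencing on tiny grayscale frames; the threshold values and helper names are assumptions, and a real system would use a vision library on full camera frames.

```python
def moving_pixels(prev_frame, frame, threshold=25):
    """Count pixels whose brightness changed by more than `threshold`
    between two consecutive grayscale frames (nested lists of 0-255 values)."""
    return sum(
        1
        for prev_row, row in zip(prev_frame, frame)
        for p, q in zip(prev_row, row)
        if abs(p - q) > threshold
    )

def something_is_moving(prev_frame, frame, min_pixels=4):
    """A crude motion flag: enough pixels changed between frames."""
    return moving_pixels(prev_frame, frame) >= min_pixels

# Two tiny 3x3 "frames": a bright blob shifts one column to the right.
f1 = [[200, 0, 0], [200, 0, 0], [200, 0, 0]]
f2 = [[0, 200, 0], [0, 200, 0], [0, 200, 0]]
print(something_is_moving(f1, f2))  # True: 6 pixels changed
```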
Indoor navigation and obstacle avoidance
- You can set up waypoints by taking Aido to various spots at home. For example, you can take Aido to your kitchen and mark the spot as ‘Kitchen’. Aido stores this entry along with a Wi-Fi fingerprint.
- Aido builds a path between waypoints to navigate. He judges the space between obstacles. If it is sufficient for him to traverse, he plots a path between them.
- Aido constantly does a frame-by-frame evaluation for moving objects like pets or people. If Aido encounters a moving object, he waits for it to get out of the way before proceeding.
- Aido has sophisticated edge detection built in that stops him when he encounters a drop-off such as a staircase, rather than spinning uselessly at the edge.
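The gap-judging step above, where Aido decides whether the space between obstacles is wide enough to traverse, can be sketched as a simple clearance check. The robot width, safety margin, and function names here are assumptions for illustration, not Aido's published specifications.

```python
ROBOT_WIDTH = 0.35  # metres; an assumed footprint, not Aido's real spec
MARGIN = 0.05       # assumed safety margin on each side

def can_pass(left_x, right_x):
    """True if the gap between two obstacles (x-coordinates of their
    inner edges, in metres) fits the robot plus a margin on each side."""
    gap = right_x - left_x
    return gap >= ROBOT_WIDTH + 2 * MARGIN

def plan_step(gaps):
    """Pick the widest traversable gap, or None if every gap is too narrow.
    `gaps` is a list of (left_x, right_x) pairs."""
    passable = [g for g in gaps if can_pass(*g)]
    return max(passable, key=lambda g: g[1] - g[0]) if passable else None

# A 0.3 m gap is too narrow; a 0.6 m gap is traversable.
print(plan_step([(0.0, 0.3), (1.0, 1.6)]))  # (1.0, 1.6)
```

When `plan_step` returns None, a robot following this logic would have to wait or reroute, which matches the behaviour described above for moving obstacles like pets and people.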