The new simulation platform, Habitat 2.0, announced by the social media giant Facebook Inc., will let AI researchers teach machines to navigate photo-realistic 3D virtual environments, informs Joydev. An exclusive for Different Truths.
After gaining a foothold in construction, manufacturing, healthcare, and numerous other industries, robots will now be trained to do your household chores too. Amazed? You have good reason to be. The new simulation platform, Habitat 2.0, announced by the social media giant Facebook Inc., will let AI researchers teach machines to navigate photo-realistic 3D virtual environments and to interact with objects just as they would in an actual kitchen, dining room, or other commonly used space.
Habitat 2.0 is an extended version of the original Habitat platform that the company rolled out in 2019. The announcement moves Facebook closer to achieving “embodied AI,” the technology that might one day allow robots to perform everyday tasks. Facebook believes Habitat 2.0 is one of the fastest publicly available simulators of its kind, letting researchers test future innovations in a human-like virtual setting before they ever set foot in reality.
With a fully interactive 3D data set of indoor spaces for training virtual robots, AI researchers can build virtual agents that easily and reliably perform household tasks like stocking the fridge, loading the dishwasher, and fetching and returning objects on command. This would be of great help to visually impaired people, or to those occupied with other important work.
ReplicaCAD: Interactive Digital Twins of Real Spaces
Facebook Reality Labs’ Replica, a set of 3D scans of indoor environments released earlier, has been rebuilt as ReplicaCAD, which serves as Habitat 2.0’s new data set. In ReplicaCAD, the previously static 3D scans have been converted into individual 3D models with physical parameters, collision proxy shapes, and semantic annotations. The new data set enables virtual training and manipulation of different objects while avoiding accidents like running into walls when trying to move through a space too quickly.
The existing library of 18 3D scans has been expanded to over 110 living-area layouts, with nearly 100 objects that add realistic clutter and let robots “interact” with doors and other elements.
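For readers curious how this looks in practice, here is a minimal sketch of loading a ReplicaCAD scene with physics enabled through habitat-sim’s Python API. The dataset path and scene name are illustrative, and exact attribute names may vary between habitat-sim versions.

```python
# Minimal sketch: load a ReplicaCAD scene in habitat-sim with physics on.
# Paths and scene names below are illustrative, not canonical.
import habitat_sim

sim_cfg = habitat_sim.SimulatorConfiguration()
# Point the simulator at the ReplicaCAD scene dataset (hypothetical path).
sim_cfg.scene_dataset_config_file = "data/replica_cad/replicaCAD.scene_dataset_config.json"
sim_cfg.scene_id = "apt_0"      # one of the living-area layouts
sim_cfg.enable_physics = True   # required for interactive objects

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Advance the physics simulation by one frame (1/60 of a second).
sim.step_physics(1.0 / 60.0)
```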
Habitat 2.0 Simulator
The Habitat 2.0 Simulator builds on the expanded capabilities of Habitat-Sim, the previously launched flexible, high-performance 3D simulator with configurable agents, multiple sensors, and generic 3D data set handling. It prioritizes speed and performance over a wider range of simulation capabilities, allowing the research community to test new approaches and iterate more effectively.
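Those configurable agents and sensors are set up in code. Below is a hedged sketch of attaching an RGB camera to an agent and stepping it through a scene, based on habitat-sim’s Python API around the Habitat 2.0 release; class names (e.g., CameraSensorSpec) and the scene path are assumptions that may differ in other versions.

```python
# Sketch: configure an agent with an RGB camera and take one step.
import habitat_sim

rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "rgb"                             # key for this sensor's observations
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [480, 640]                  # height x width

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_spec]

sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/replica_cad/apt_0.glb"   # illustrative path
sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Issue a discrete action and read back the camera frame.
obs = sim.step("move_forward")
rgb_frame = obs["rgb"]  # image array keyed by the sensor's uuid
```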
The Habitat 2.0 Simulator can simulate a Fetch robot interacting in ReplicaCAD scenes at 1,200 steps per second (SPS), surpassing existing platforms that typically run at 10 to 400 SPS. Habitat 2.0 achieves 8,200 SPS (273× real-time) running multi-process on a single GPU and nearly 26,000 SPS (850× real-time) on a single node with 8 GPUs.
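To make those throughput figures concrete, here is a rough sketch of how SPS could be measured with a simple timing loop. This is an illustration, not Facebook’s benchmarking methodology; the 1/30-second step size is inferred from the article’s own numbers (8,200 SPS ≈ 273× real-time implies roughly 30 Hz steps).

```python
# Sketch: time physics steps to estimate steps-per-second (SPS) throughput.
# Assumes a `sim` object created as in the earlier sketches.
import time

def measure_sps(sim, num_steps=1000, dt=1.0 / 30.0):
    start = time.perf_counter()
    for _ in range(num_steps):
        sim.step_physics(dt)  # advance the physics simulation by dt seconds
    elapsed = time.perf_counter() - start
    sps = num_steps / elapsed
    # Real-time factor = simulated seconds per wall-clock second.
    # At 30 Hz steps, 8,200 SPS works out to roughly 273x real time.
    print(f"{sps:,.0f} SPS ({sps * dt:,.0f}x real-time)")
    return sps
```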
The Future
The Habitat 2.0 Simulator, working with the ReplicaCAD data set, powers a new library of household assistive tasks called the Home Assistant Benchmark (HAB). HAB covers general tasks like cleaning the fridge, setting the table, and tidying the house, which break down into robot skills such as navigating, picking, placing, and opening fridge doors and cabinet drawers.
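In practice, researchers would drive such tasks through habitat-lab, the training framework layered on top of the simulator. The sketch below shows the generic environment loop habitat-lab’s examples use, with a random policy as a placeholder; the task config path is hypothetical, and actual rearrangement configs ship with the library.

```python
# Hedged sketch: run a HAB-style task episode in habitat-lab.
# The config path is hypothetical; real task configs come with habitat-lab.
import habitat

config = habitat.get_config("configs/tasks/rearrange/pick.yaml")  # hypothetical path
env = habitat.Env(config=config)

observations = env.reset()
while not env.episode_over:
    # Sample a random action as a stand-in for a trained policy.
    observations = env.step(env.action_space.sample())
env.close()
```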
In a nutshell, the platform can train robots to do all these household chores in a completely virtual setting, cutting down costs as well as time. Once the robots are trained and ready for use, life on Earth could become quite amazing. Just wait for the technology to deliver!
Images from http://sites.google.com/view/habitat2 and Visuals by Different Truths