Facebook AI Research is making available AI Habitat, a simulator for training embodied AI agents, such as home robots, to operate in environments meant to mimic typical real-world settings like an apartment or office.
Understanding what to do when you say “Can you check if my laptop is in the other room and, if it is, can you bring it to me?” requires a home robot to draw together multiple forms of intelligence.
Embodied AI research can help robots navigate indoor environments by marrying AI systems for computer vision, natural language understanding, and reinforcement learning.
“Habitat-Sim achieves several thousand frames per second (fps) running single-threaded, and can reach over 10,000 fps multi-process on a single GPU, which is orders of magnitude faster than the closest simulator,” a dozen AI researchers said in a paper about Habitat. “Once a promising approach has been developed and tested in simulation, it can be transferred to physical platforms that operate in the real world.”
Facebook Reality Labs, formerly named Oculus Research, is also open-sourcing Replica, a data set of photorealistic 3D indoor environments, such as a retail store and an apartment, that resemble the real world. AI Habitat works with Replica but also with other embodied AI research data sets, such as Matterport3D for indoor environments.
Simulated data is commonly used in AI to train robotic systems, create reinforcement learning models, and power AI systems ranging from Amazon Go to enterprise applications of few-shot learning. Simulations allow control over the environment and reduce the costs that come with collecting real-world data.
AI Habitat was introduced in an effort to create a unified environment and address standardization of embodied AI research across the robotics and AI community. In a similar push, Facebook also released PyTorch Hub, a platform for reproducibility of AI models, earlier this week.
“We aim to learn from the successes of previous frameworks and develop a unifying platform that combines their desirable characteristics while addressing their limitations. A common, unifying platform can significantly accelerate research by enabling code re-use and consistent experimental methodology. Moreover, a common platform enables us to easily carry out experiments testing agents based on different paradigms (learned vs. classical) and generalization of agents between datasets,” said Facebook.
In addition to the Habitat simulation engine, the Habitat API provides a library of high-level embodied AI algorithms for things like navigation, instruction following, and question answering.
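For illustration, the short Python loop below shows the basic shape of the Habitat API, modeled on the examples published with the project; the PointNav config path and random-action sampling here are assumptions that may vary between releases.

```python
import habitat

# Load a task configuration; this PointNav config path follows the
# project's published examples and may differ between releases.
config = habitat.get_config("configs/tasks/pointnav.yaml")
env = habitat.Env(config=config)

observations = env.reset()
while not env.episode_over:
    # Sample a random action from the task's action space; a trained
    # policy would instead choose actions based on the observations.
    observations = env.step(env.action_space.sample())

env.close()
```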
Researchers found that “learning outperforms SLAM if scaled to an order of magnitude more experience than previous investigations” and that only agents with depth sensors generalize well across datasets.
“AI Habitat consists of a stack of three modular layers, each of which can be configured or even replaced to work with different kinds of agents, training techniques, evaluation protocols, and environments. Separating these layers differentiates the platform from other simulators, whose design can make it difficult to decouple parameters in order to reuse assets or compare results,” the paper reads.
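As a rough sketch of that decoupling, the snippet below evaluates a trivial hand-written agent against a task benchmark without touching simulator internals; the habitat.Benchmark and habitat.Agent interfaces follow the project's published examples, while the action name and config path are assumptions.

```python
import habitat
from habitat.core.agent import Agent


class ForwardOnlyAgent(Agent):
    """A deliberately trivial agent: it ignores its observations and
    always moves forward, just to exercise the agent interface."""

    def reset(self):
        pass

    def act(self, observations):
        # The action format and name are assumptions based on the
        # default PointNav action space; they may vary by release.
        return {"action": "MOVE_FORWARD"}


# The benchmark layer wraps environment setup and metric computation,
# so any object implementing the Agent interface can be swapped in.
benchmark = habitat.Benchmark("configs/tasks/pointnav.yaml")
metrics = benchmark.evaluate(ForwardOnlyAgent(), num_episodes=10)
print(metrics)  # e.g. success and SPL metrics for the navigation task
```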
AI Habitat is the latest Facebook AI initiative to use embodied AI research, and follows research to train an AI agent to navigate the streets of New York with 360-degree images and to get around an office by watching videos.
Facebook VP and chief AI scientist Yann LeCun told VentureBeat the company is interested in robotics because the opportunity to tackle complex tasks attracts the top AI talent.
AI Habitat is the most recent example of tech giants attempting to deliver a robotics creation platform for AI developers and researchers. Microsoft introduced a robotics and AI platform in limited preview last month, while Amazon’s AWS RoboMaker, which draws on Amazon’s cloud and AI systems, made its debut in fall 2018.
How AI Habitat works is detailed in an arXiv paper written by researchers from Facebook AI Research, Facebook Reality Labs, Intel AI Labs, the Georgia Institute of Technology, Simon Fraser University, and the University of California, Berkeley.
AI Habitat will be showcased in a workshop next week at the Conference on Computer Vision and Pattern Recognition (CVPR) in Long Beach, California.
In other recent contributions to the wider AI community, Facebook AI research scientist Mike Lewis and AI resident Sean Vasquez introduced MelNet, a generative model that can imitate music and the voices of people like Bill Gates.
Major object detection AI systems from Google, Microsoft, Amazon, and Facebook are less likely to work for people in South America and Africa than for people in North America and Europe, and less likely to work for households making less than $50 a month.
Facebook VP of AR/VR Andrew Bosworth earlier this week said new Portal devices — the first after the video chat devices were introduced in October 2018 — will make their public debut this fall.
Facebook also announced plans to open an office with 100 new AI roles in London.
This post by Khari Johnson originally appeared on VentureBeat.