User interfaces for computers have evolved over the years, from the introduction of the keyboard and mouse on the personal computer, to touchscreens on mobile devices, to natural voice recognition. However, the same cannot be said for robots or drones—until now.
Dr. Allen Yang, executive director at the Center for Augmented Cognition at the University of California, Berkeley, believes that augmented reality is the next evolution of interfaces for robots and drones.
“The interface for robots has not changed for the past fifty years. The emerging market of AR may provide new ways for human users to more intuitively interact with robots,” said Dr. Yang in an interview with NextReality.
Based in the College of Engineering, the Center for Augmented Cognition started as a division of the Berkeley Robotics and Intelligent Machines lab. The center supports research by faculty and students in applying augmented and virtual reality technologies to human cognition modeling, human-computer interaction, and human-robot collaboration.
Last week, UC Berkeley announced that a corporate gift from virtual reality entertainment company Immerex would facilitate the opening of a new lab and collaboration space for the Center for Augmented Cognition.
“We expect the new lab will make a positive impact in strengthening Berkeley’s leading role in Silicon Valley and globally on the innovation of disruptive technologies that connect information, people and society,” said S. Shankar Sastry, dean of the College of Engineering, as well as a co-director of the Center for Augmented Cognition, in a press release.
Named the Immerex Virtual Reality Lab, the space will give students access to advanced equipment for developing their projects. In addition, the corporate gift will fund fellowships for student support and AR/VR classroom renovations.
In this new lab, we will be able to create synergies among Berkeley’s various programs in AR/VR and connect researchers and students to Immerex’s industry-leading technologies and channels in the emerging virtual reality global market.
Among the projects students and faculty at the center are undertaking is a method for controlling drones with augmented reality. Immersive Semi-Autonomous Aerial Command System (ISAACS) is an open-source project that uses Microsoft HoloLens as the command interface for a drone fleet, though it could eventually be applied to robots as well.
The ISAACS program is led by Dr. Yang, who brings expertise in image recognition, along with Dean Sastry and Prof. Claire J. Tomlin, both of whom have studied drone control and safety for the past ten years.
“What we’ve proposed with the ISAACS program is to combine the new capability to understand humans with the new capability to understand robots using control theory and create a new kind of interface,” Dr. Yang told NextReality. “That interface has to be intuitive, so you don’t have to learn how to program robots, and it also has to be immersive, the reason being that the robots are physically sharing the same space with humans.”
According to Dr. Yang, AR allows for a more natural connection between human and machine by facilitating two-way communication. Through AR and machine learning, robots can understand commands and intent from humans by voice, gestures, and posture. At the same time, humans can better understand robots by seeing a robot's field of view from a first-person perspective, as well as having information such as battery level or current task displayed in the same view.
In addition to the AR-based interface, ISAACS also coordinates localization and visualization between operator and vehicle and provides a framework for vehicle safety assurance. Using a real-time simultaneous localization and mapping (SLAM) solution, ISAACS can localize 3D coordinates between the drones and the HoloLens operator. ISAACS also guards against accidents caused by human error by connecting to a drone's low-level controller and by optimizing the operator's situational awareness through the AR interface.
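As a rough illustration of what localizing 3D coordinates between drone and operator involves (this is a generic sketch of frame alignment, not the ISAACS code itself, and the poses below are hypothetical), a shared world frame estimated by SLAM lets the system express a drone's pose in the headset's frame using homogeneous transforms:

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical poses in a shared world frame, as a SLAM system might estimate.
world_from_drone = make_transform(np.eye(3), [10.0, 2.0, 5.0])
world_from_headset = make_transform(np.eye(3), [0.0, 0.0, 1.5])

# Express the drone's pose in the operator's (headset) frame, so an AR
# interface could render status information at the drone's apparent position.
headset_from_world = np.linalg.inv(world_from_headset)
drone_in_headset = headset_from_world @ world_from_drone

print(drone_in_headset[:3, 3])  # drone position relative to the operator
```

With both rotations set to identity, the result is simply the difference of the two translations; a real system would fold in full headset and drone orientations the same way.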
The goal is to make sure that robots can work for normal people. Those are the people who might not necessarily have a computer science or electronic engineering background.
The team has already impressed the business community. Last year, ISAACS was named among the inaugural recipients of the Microsoft HoloLens Research Grant. Another stakeholder, DJI, provides the ISAACS team with test drones for their lab.
Though ISAACS is not yet a commercially ready platform, it offers the industry an opportunity to share knowledge for potential future developments that include these safety and natural-interface features.
“Their purpose as part of the open research is to motivate open research for academia and the industry to collaborate,” said Dr. Yang. “Our industry partners, by working with us, can learn from our experience and learn from our research.”