When the pandemic made long-distance business travel difficult or impossible in 2020, many turned to the remote collaboration tools that were available. Video conferencing replaced business meetings. But what about the missed connection of being around others, rather than sitting in a corner of a laptop screen? That’s where the OhmniLabs telepresence robot comes in. The tablet computer mounted on a mobile tripod is operated entirely by a remote user logging in over the internet. Able to communicate through the tablet microphone and speakers while independently navigating the space, it offers the user free rein to explore.
Because users can steer the machine into its charging station before logging out, the robot is mostly self-sufficient. The initial model—designed for offices, hospitals and factories—was reviewed by ENR at our New York City office earlier this year. ENR editors were able to log in and roam the office, chatting with coworkers despite being scattered across the U.S.
ENR found the robot surprisingly adept at getting around and joining conversations. This is by design, according to Thuc Vu, founder and CEO of OhmniLabs. “People have deployed our robot in manufacturing settings to allow customers and others [to] dive in and monitor remotely, for auditing and inspection purposes,” he says. The OhmniLabs robot is built on a modular platform with 3D-printed parts, and Vu says the telepresence model is just the first use case the company has brought to market.
In terms of functionality, talking with someone piloting the robot can be more engaging than talking with a portrait in the corner of a normal video call. In practice, the ability to roam, paired with a camera sharp enough to let the user read whiteboards and other screens, adds a genuinely collaborative dimension.
While the robot is not yet rugged enough to navigate a construction jobsite, Vu has not ruled that out, and he says outdoor versions have already been tested with customers. Deploying the robot for remote inspection of construction sites is a use case customers have asked for, he says, with some even requesting that extensive sensor packages be added.
“We’ve got a long way to go to develop the library of modules for [the robot],” says Vu. That includes computer vision, “so the robot can recognize objects in the real world.”
But there’s also the uncanny effect of having someone pilot the robot around, reacting to people and engaging in conversation. “On Zoom, you’re a fly on the wall … people forget you’re there,” Vu says. “But as a robot you can drive around and feel engaged.”