• The RFM-1 language model has 8 billion parameters and was trained on images and videos collected by warehouse robots running the company’s software.
  • RFM-1’s capability to understand natural language instructions empowers warehouse robots to learn and perform new tasks through clear English prompts.

Covariant, a robotics startup, recently introduced RFM-1, a large language model designed to let industrial robots ask for guidance when they cannot complete a task on their own.

The artificial intelligence model is versatile and applicable to a range of tasks. According to Covariant, RFM-1 can generate short videos showing industrial robots performing activities such as moving merchandise between boxes. The startup says these videos can be used to help industrial robots plan their actions.

The company Covariant, formerly known as Embodied Intelligence Inc., has raised more than USD 100 million from investors including Index Ventures. It offers software that drives robotic arms and other warehouse automation systems. The software shortens the time needed to program robots for merchandise-handling tasks, which previously required substantial time and custom code.

The recently launched RFM-1 language model has 8 billion parameters and was trained on images and videos collected by warehouse robots running the company’s software. Covariant augmented the training data with information from the robots’ built-in pressure sensors and other components, along with data sourced from the public web.

Numerous research datasets are available for training AI models for robotics. However, according to Covariant, most of the footage in those datasets shows warehouse automation systems being used in laboratories. The video used to train RFM-1 was captured by robots deployed in production environments, which makes the data more valuable to the AI model.

Equipped with natural language processing capabilities, RFM-1 enables warehouse robots to learn new tasks through straightforward English instructions. For instance, a worker could instruct a robot to retrieve items from a pallet and place them onto a nearby conveyor belt.
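To make the idea concrete, the sketch below shows one simple way an English instruction could be turned into a structured robot command. This is purely illustrative: the function name, regular expression, and command format are assumptions for this example, not Covariant's actual API.

```python
import re

def parse_instruction(prompt: str) -> dict:
    """Hypothetical sketch: map a simple pick-and-place prompt to a
    structured command. Not Covariant's real interface -- just an
    illustration of turning English into machine-readable actions."""
    # Look for a "from <source> ... <destination>" shape in the prompt.
    match = re.search(
        r"from (?:the |a )?(.+?) (?:and place|onto|to) "
        r"(?:them |it )?(?:onto |on |to )?(?:the |a )?(.+)",
        prompt,
    )
    if not match:
        raise ValueError(f"Could not parse instruction: {prompt!r}")
    source, destination = match.groups()
    return {
        "action": "pick_and_place",
        "source": source.strip(),
        "destination": destination.strip("."),
    }

# Example: the instruction from the article.
cmd = parse_instruction(
    "Retrieve items from a pallet and place them onto a nearby conveyor belt"
)
```

In a real system the language model itself would do this mapping far more robustly; the point is only that the output of the prompt is a structured action the robot controller can execute.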

Traditionally, programming warehouse automation systems for new tasks required writing custom code, a slow and labor-intensive process. According to Covariant, letting workers assign tasks through natural language prompts could streamline the management of robot fleets.

The company asserts that a robot using RFM-1 not only understands instructions but can also ask for them. In a blog post, Covariant researchers explained: “If a robot is having trouble picking a particular item, it can communicate that to the robot operator or engineer. Furthermore, it can suggest why it has trouble picking the item. The operator can then provide new motion strategies to the robot, such as perturbing the object by moving it or knocking it down, to find better grasp points.”

RFM-1 extends beyond language processing tasks. As per Covariant, it also functions as a video generator.

RFM-1 can produce short videos showing a robot carrying out various activities, like carrying goods. According to the company, these clips facilitate “online decision-making through planning” for automation systems. When a robot is given a new task, it can utilize RFM-1 to create a video demonstrating how it should be completed and then watch the video to determine the best course of action.
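The “decision-making through planning” pattern described above can be sketched as a simple loop: propose candidate action sequences, have a world model predict the outcome of each, and execute the best one. The code below is a toy illustration of that pattern only; the function names and the numeric scorer are stand-ins, since in RFM-1's case the prediction is a generated video clip rather than a number.

```python
def plan_with_world_model(task, candidates, predict_outcome):
    """Hypothetical sketch of planning with a predictive world model:
    score each candidate action sequence by its predicted outcome and
    return the best one. Not Covariant's actual system."""
    best_plan, best_score = None, float("-inf")
    for plan in candidates:
        # With a video model, this step would generate a clip of the
        # robot attempting the plan; here we abstract it as a score.
        score = predict_outcome(task, plan)
        if score > best_score:
            best_plan, best_score = plan, score
    return best_plan

# Toy example: two candidate plans, scored by a stand-in function
# that simply prefers the shorter sequence.
candidates = [
    ["approach", "grasp", "lift", "move", "release"],
    ["approach", "grasp", "regrasp", "lift", "move", "release"],
]
best = plan_with_world_model(
    "move item to tote", candidates, lambda task, plan: -len(plan)
)
```

The design point is that the planner never needs to act in the real world while deliberating; it evaluates candidate behaviors against the model's predictions and commits only to the winner.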

In the coming months, Covariant will begin distributing RFM-1 to customers’ warehouse robots. Over the longer term, the company intends to develop more sophisticated AI models capable of automating a wider variety of robot-configuration tasks. To support that development, Covariant will gather more training data.

The company’s researchers specified, “We expect to scale up our data collection speed by at least a factor of 10 through the robots coming into production soon.”