The US Army Research Laboratory (ARL) has introduced ground-based orientation and object-recognition technology for robotic combat systems. With voice command recognition, the US military will be able to form mixed human–robot teams in which robots perform functions similar to those of conventional soldiers.
The first successful experiments with a system of this type took place in October 2019, the MIT Technology Review writes. The system consists of a computing unit built around a quad-core Intel Core i7 processor, a lidar, and electro-optical cameras.
The device is based on a neural network that can build a map of the terrain and the objects on it, assign them semantic labels, and update the map using voice commands. For example, headquarters may command the robot, "See what happens behind the building." If there are several "building" labels on the map, the robot will first ask which one is meant, and only then execute the command.
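The disambiguation step described above can be sketched as a simple lookup over labeled map objects. This is a hypothetical illustration only: the function names and data structures are invented for the example and are not ARL's actual implementation.

```python
# Illustrative sketch of resolving a voice-command target against a
# semantically labeled map. All names here are hypothetical.

def resolve_target(semantic_map, label):
    """Return all map objects whose semantic label matches."""
    return [obj for obj in semantic_map if obj["label"] == label]

def handle_command(semantic_map, label):
    matches = resolve_target(semantic_map, label)
    if not matches:
        return f"No '{label}' on the map."
    if len(matches) > 1:
        # Ambiguous reference: ask the operator which instance is meant.
        ids = ", ".join(obj["id"] for obj in matches)
        return f"Which {label}? Candidates: {ids}"
    # Exactly one match: execute the command against that target.
    return f"Moving behind {matches[0]['id']}"

terrain = [
    {"id": "building-1", "label": "building"},
    {"id": "building-2", "label": "building"},
    {"id": "truck-1", "label": "truck"},
]
print(handle_command(terrain, "building"))  # ambiguous: asks which building
print(handle_command(terrain, "truck"))     # unambiguous: executes
```

The key design point is that the robot never guesses between equally plausible targets; an ambiguous label triggers a clarification request back to the operator.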
In the first tests, the military evaluated the system on a small four-wheeled Husky robot fitted with a manipulator arm. The robot had to clear the area, remove debris from the road, conduct reconnaissance, and independently determine which objects it could grasp with its manipulator.
Two of the three tasks were given to the robot by voice command. While executing one of the commands, the device crashed and had to be rebooted, so that task is considered only partially completed.