A delivery robot, crossing an intersection on a green light, unexpectedly failed to yield to an official motorcade, forcing one of the convoy vehicles to stop. The incident raises pointed questions about flaws in the robot's algorithmic design and its ability to handle unusual traffic situations.

A recent incident in Moscow saw a robot courier fail to give way to a motorcade with flashing lights, effectively cutting into the procession. The robot was on a pedestrian crossing while the traffic signal was green in its favor, but a traffic police inspector had already blocked the intersection to clear the way for the motorcade. The police escort vehicle accompanying the Aurus sedan managed to pass, but the limousine was forced to stop. The episode has sparked online discussion, with many asking whether the autonomous rover would yield to emergency vehicles such as ambulances or fire trucks in similar situations.
According to experts, in typical circumstances such a robot would indeed yield to emergency vehicles, since it is trained to do so. Ekaterina Rodina, Director of External Development at the Automacon Group of Companies and author of the Telegram channel «Rodina i Roboty», offers an analysis of what likely happened in the widely circulated video:
«This situation isn't a display of robot 'folly,' but a clear example of gaps in the autonomous system's learning logic. Several vulnerabilities have been exposed: issues with the perception sensors and software, the decision-making module, and insufficient testing. For engineers, this is valuable data and an excellent training opportunity: based on this incident, they can identify the areas that need serious improvement. Naturally, the perception system and the recognition of special signals are critical. Neural-network-based computer vision algorithms should have been trained extensively on images from all angles, specifically including flashing lights of this kind and the various types of audible signals. Apparently there was either not enough training data or too few examples from these particular perspectives. The system did not assign high priority to the recognized special signals and failed to connect them with the rule to yield the right of way. Behavior prediction also needs further refinement: the system must not merely detect objects but anticipate their trajectories and intentions. The robot likely saw one vehicle pass quickly and did not predict that a convoy would follow, or that the next vehicle would continue at high speed without expecting obstacles; it interpreted the situation as 'danger averted.' The algorithm probably operated on the simple, strict rule 'green traffic light means movement is permitted,' and that rule could not be overridden by the higher-priority rule that an object with special signals is on the road and must be given way. In such a case, the priority unequivocally shifts. It is also fair to say something in the robot's defense. This situation is, to put it mildly, a rare event: a motorcade speeding through an intersection, whether on a red or green light, is infrequent compared to the thousands of routine crossings. That highlights the inherent difficulty of simulation: such scenarios are hard to model in a virtual environment and hard to train for. The robot essentially encountered a unique combination of factors: a pedestrian crossing, a traffic police car that had just passed, and a following vehicle with flashing lights. The algorithm was unable to adapt quickly and make the correct decision.»
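To make the priority argument concrete, here is a minimal, hypothetical sketch of the rule arbitration Rodina describes. The robot's actual software is not public, so everything here (the Observation fields, the confidence threshold, the decide function) is an illustrative assumption, not the vendor's implementation; it only shows how a yield-to-special-signals rule can override a green-light rule, and how a low-confidence detection lets the override fail.

```python
from dataclasses import dataclass

# Hypothetical, simplified world state as a planner might see it.
@dataclass
class Observation:
    light_is_green: bool
    emergency_vehicle_detected: bool   # flashing lights / siren recognized
    emergency_confidence: float        # perception confidence, 0.0 to 1.0

def decide(obs: Observation, confidence_threshold: float = 0.6) -> str:
    """Return 'PROCEED' or 'YIELD' by checking rules in priority order.

    The key point from the incident: the yield-to-special-signals rule
    must be evaluated *before* the green-light rule, so it can override it.
    """
    # Priority 1: special signals win, even on a green light.
    if obs.emergency_vehicle_detected and obs.emergency_confidence >= confidence_threshold:
        return "YIELD"
    # Priority 2: the ordinary traffic-light rule.
    if obs.light_is_green:
        return "PROCEED"
    return "YIELD"

# The failure mode the expert describes: flashing lights recognized with
# low confidence, so the override never fires and the green-light rule
# decides the action.
print(decide(Observation(light_is_green=True,
                         emergency_vehicle_detected=True,
                         emergency_confidence=0.3)))  # -> PROCEED (the bug)
```

Under these assumptions, the fix is not a new rule but better perception: raising the detector's confidence on rare inputs (motorcades, unusual flashing-light angles) is what lets the already-present override take effect.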
A human driver who failed to yield to a motorcade would face a fine of 10,000 rubles or a driver's license suspension of six months to a year. So far there has been no official word on any liability for the robot's owner.
