High Drama on the Streets: Autonomous Cruise Vehicle’s Two Near Misses with Kids, a Day Apart!
On the busy streets of San Francisco, two alarming incidents were reported involving self-driving vehicles from Cruise, General Motors’ autonomous-vehicle division, which were accused of nearly hitting children in separate close calls just one day apart. While neither incident caused physical harm, they have sparked intense debate about the safety of autonomous vehicles and their ability to interact effectively with unpredictable elements of the environment, such as children.
The first incident took place when a self-driving Cruise vehicle reportedly stopped abruptly as a child ran across the road. The report suggests that the child may have run into the car’s path unexpectedly, prompting the vehicle to halt suddenly in response. An autonomous vehicle’s built-in sensors and algorithms are designed to detect obstructions in its path, and the child’s sudden movement would have registered as a hazard.
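The stop-on-detection behavior described above can be illustrated with a simple time-to-collision check. This is a minimal sketch for readers, not Cruise’s actual software; the function names and the 1.5-second threshold are hypothetical assumptions.

```python
# Hypothetical emergency-stop decision based on time-to-collision (TTC).
# This is an illustrative sketch, not Cruise's actual logic.

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither party changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # obstacle is not closing in; no collision projected
    return distance_m / closing_speed_mps

def should_emergency_stop(distance_m: float, closing_speed_mps: float,
                          ttc_threshold_s: float = 1.5) -> bool:
    """Brake hard when the projected impact falls inside the safety threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s

# A child darting out 6 m ahead while the car closes at 8 m/s (TTC = 0.75 s)
print(should_emergency_stop(6.0, 8.0))   # True: stop immediately
print(should_emergency_stop(30.0, 8.0))  # False: 3.75 s of margin remains
```

Real perception stacks fuse lidar, radar, and camera detections before any such decision, but the core trade-off is the same: a lower threshold means fewer abrupt stops, while a higher one buys more margin for unpredictable pedestrians.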
The second incident occurred just one day later. This time, a self-driving Cruise vehicle was involved in another near miss, with a child who was walking along the sidewalk. According to eyewitness accounts, the car came dangerously close to the child while turning a corner, nearly mounting the pavement during the turn, once again shaking public confidence in the vehicle’s ability to safely navigate crowded urban areas.
Several key concerns have been raised by these encounters. Central among them is the question of how well autonomous vehicles can predict, respond to, or even prevent unexpected actions by pedestrians, particularly vulnerable groups like children, who are smaller and less predictable in their movements. Despite being outfitted with advanced sensors, lidar systems, and high-resolution cameras, autonomous cars still struggle with the complexities of human behavior, which a human driver might react to based on experience and instinct.
Furthermore, both incidents shed light on the vehicles’ path planning and decision-making. They suggest that, despite being pre-programmed with navigation maps and traffic rules, Cruise’s self-driving cars may struggle with complex urban traffic situations. The difficulty lies not only in a potential over-reliance on lidar and camera systems, but also in the sophisticated algorithms that govern the vehicles’ movements and their understanding of real-world traffic scenarios.
While Cruise vehicles undergo extensive testing and follow protocols designed to ensure safety, these incidents have underscored the importance of continued refinement and improvement. It is important to note, however, that autonomous vehicles in general have better safety records per mile driven than human-operated vehicles. Furthermore, many companies are continually working to improve their autonomous systems with machine-learning techniques, so they can better handle unpredictable situations like these.