Driverless cars were found to be more dangerous than humans in some of the most common driving conditions, a new study has revealed.
Autonomous vehicles were deemed safer than humans – except at dusk, dawn or when the car is turning, researchers at the University of Central Florida found.
Generally, driverless cars are involved in fewer accidents than their human-driven counterparts.
But during low-light conditions – specifically at dusk or dawn – robot cars were found to be five times more likely to end up in an accident.
And when turning, robot cars were almost twice as likely to be involved in an accident as human-driven cars, researchers found.
The study comes as driverless cars are cruising the roads in many countries, including the US and China.
But there have already been several crashes involving autonomous cars in the US, so the technology appears to be far from perfect.
A Tesla in self-driving mode crashed into a police car in Fullerton, California, last week, ABC 7 News reports.
CCTV footage shows how the Tesla sped into the police vehicle at a junction.
The driver admitted enabling his car’s self-drive mode ‘while using his cell phone, a clear violation of responsible driving practices and California law,’ Fullerton Police said in a statement.
And last year, robotaxi firm Cruise had its permit suspended in California after a pedestrian was dragged under one of its vehicles in San Francisco.
However, the automated cars were back on the road in May after the General Motors-owned company resumed testing, The Verge reports.
Another Tesla was blamed for causing an eight-car pileup in California in November 2022.
The Department for Transport has said self-driving cars could launch in the UK by 2026, following new legislation that passed in May.
It said the law aims to improve road safety by ‘reducing human error, which contributes to 88% of road collisions,’ and to create jobs, according to Sky News.
An expert on automation safety said the UK is ‘two or three decades’ away from fully machine-driven cars, known as level 5 automated vehicles.
Saber Fallah, professor of safe AI and autonomy, told Metro.co.uk: ‘Each of these levels have different risks. When the Department for Transport says they will be on the road in the next two years, which level of self-driving are they talking about?
‘Level 5 is still a dream to be driven with humans on the road, especially in the UK and Europe.’
Teslas use level 2 advanced automation, which means humans still have ‘full responsibility for driving the car,’ he explained.
He said the main challenge for fully-automated cars in the UK was safety and the machine’s inability to ‘think about a situation and make decisions’ like a responsible human.
‘It’s not just other drivers but cyclists, pedestrians and all road users. The car should have some sort of cognitive ability to negotiate with these road users.
‘In central London or city centres, I don’t think they are safe,’ he added.
Transport Secretary Mark Harper told the BBC in December that the technology worked following a roll-out in California where cars ‘without a safety driver, so in full autonomous mode’ were already in action.
He said the technology could ‘actually improve safety on the roads.’
The researchers from the University of Central Florida used data from the National Highway Traffic Safety Administration (NHTSA) on 2,100 accidents in California involving cars with some level of automated self-driving or driver assistance technology.
Mohamed Abdel-Aty and Shengxuan Ding gathered data on more than 35,000 accidents involving human drivers between 2016 and 2022.
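For illustration only, the sort of comparison behind a figure like ‘five times more likely at dawn or dusk’ can be sketched as the share of crashes occurring under a given condition in each dataset, with a ratio above one suggesting that condition is over-represented among driverless-car crashes. The snippet below uses made-up toy records and field names and is not the researchers’ actual methodology, which involves far more detailed matching of the crash data.

```python
# Hypothetical sketch only: compare how often crashes happen under one
# lighting condition in two datasets (driverless vs human-driven).

def share_of_crashes(crashes, condition):
    """Fraction of crashes in a dataset that happened under the given condition."""
    matching = sum(1 for crash in crashes if crash["lighting"] == condition)
    return matching / len(crashes)

# Toy records with invented values; real figures would come from crash reports.
av_crashes = [{"lighting": "dawn_dusk"}, {"lighting": "daylight"}, {"lighting": "daylight"}]
human_crashes = [{"lighting": "daylight"}] * 9 + [{"lighting": "dawn_dusk"}]

av_share = share_of_crashes(av_crashes, "dawn_dusk")
human_share = share_of_crashes(human_crashes, "dawn_dusk")

# A ratio above 1 means dawn/dusk crashes are over-represented in the AV dataset.
print(f"Dawn/dusk share: AV={av_share:.2f}, human={human_share:.2f}, "
      f"ratio={av_share / human_share:.1f}")
```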