The quiet revolution in autonomous vehicles has brought us to a junction in the road. Established car manufacturers such as Tesla, Volvo and Audi are changing lanes into the autonomous vehicle market, while unlikely entrants such as Apple, Amazon and Samsung are also making moves in this lucrative industry. However, as with other socially transformative technologies such as social media, there is a danger that the law could be left behind - this time by autonomous vehicles.
In response, the Scottish Law Commission, in partnership with the Law Commission of England and Wales, has begun a consultation on the potential regulation of autonomous vehicles, including self-driving cars, buses and trains. The consultation considers a wide range of issues, including: the licensing of autonomous vehicles; how autonomous vehicles interact with our current transport system; and, perhaps most importantly, what obligations fall on the owner of a privately owned autonomous vehicle involved in an accident.
The current position is that anyone who uses a motor vehicle on the road must take out compulsory third-party motor insurance, covering the driver’s liability if they injure or cause the death of another road user, or cause damage to property. However, when an accident is caused by an autonomous car through no fault of the human driver, who is left to pick up the pieces?
The Government identified this issue in 2016 and subsequently introduced the Automated and Electric Vehicles Act 2018. Under this legislation, where an accident is caused by an automated vehicle driving itself, the vehicle is insured at the time of the accident, and a person is injured or property is damaged as a result, the insurer of the vehicle is liable for any damages arising from the accident.
This may mean that in accidents involving autonomous vehicles, insurance companies will vigorously defend claims and investigate other angles, such as contributory negligence by another party to the accident. Insurers may argue that the autonomous vehicle was defective, and head down the road of a product liability claim against the manufacturer; alternatively, they may seek to pin the blame on the human driver of the autonomous vehicle.
This argument came to a head in America in 2016, when a Tesla driven in auto-pilot mode collided with a tractor and trailer that had pulled into a junction across the road, killing the driver of the Tesla. Arguments were raised on both sides of the fence, with many quick to point the finger at the manufacturer for failing to stop the car when the tractor and trailer pulled into the junction. However, investigators found that the vehicle had warned the driver several times to disengage the auto-pilot mode and drive the car manually.
A more recent case, in 2019, saw a man charged with dangerous driving after he was reportedly caught sleeping in his Tesla while it travelled at over 85mph on a busy road in Alberta, Canada. The vehicle reportedly accelerated to over 90mph when police engaged it. Despite the many advances in autonomous vehicle technology, it is clear that the human ‘driver’ of the car is still considered responsible for ensuring their own safety and that of other road users around them. Tesla has always maintained that its vehicles are not fully autonomous, likely to protect its position on liability.
In any future case involving an autonomous vehicle, one question is likely to be asked: would the accident have happened if a human had been driving the car? It is fair to say that autonomous vehicle technology has come on leaps and bounds in the last few years; however, some argue that human intuition may override any probability calculations made by an autonomous vehicle. The consultation by the Scottish Law Commission and the Law Commission of England and Wales warns that this is a tricky path to head down. If the objective standard is one of human fault in an accident, it will be difficult to apply to autonomous vehicles, given the difference between the two operating systems. The consultation considers the following: “Take an example in which a person cycling without lights at night is hit by an automated vehicle. A human driver would have had difficulty in seeing the unlit cyclist. If the appropriate comparator claim is with a hypothetical human driver, the cyclist would bear considerable responsibility for the accident. However, the failure to have lights might have very little effect on an automated vehicle equipped with LIDAR” (Light Detection and Ranging, a remote sensing method that uses pulsed laser light to measure the distance to a target).
It is clear that we have a long road ahead of us, both in the roll-out of autonomous vehicle technology and in its regulation. Either way, there are clear benefits to the technology, which may greatly improve many people’s lives.
Further information on road traffic accidents can be found here.