I don't really see why this is any different to any other modern tech. People are always dying or being injured due to manufacturing errors and bugs. Just look at the recent Boeing disasters, or the guy who died while on Tesla 'Autopilot'. Big Pharma drug accidents, and so on. Yes, there may well be cases of driverless cars being taken off the road or limited due to bugs, but I don't think that will halt progress — each incident will just add to the test cases that need to be incorporated in future.

Totally agree, however I do foresee a major problem that has been repeated elsewhere.
And logically it does not make sense. Let's move forward a few years:
1: 100 people die in cars due to human error = perfectly acceptable / a price worth paying
2: 50 people die in cars due to computer/software errors = totally and utterly unacceptable
Despite 50 fewer people being killed, that won't matter a monkey's. The media, government, public, etc. will go ape over such numbers.
Mothers on TV in tears over how Apple/Google/Tesla's software killed their son/daughter/partner.
Cars will be deactivated until the manufacturers can prove the software has been improved.
How do we get over this?
Or do you think I'm wrong about this scenario, at least for many, many years to come?