
The Car That Knew Too Much

About The Car That Knew Too Much

The inside story of the groundbreaking experiment that captured what people think about the life-and-death dilemmas posed by driverless cars.

Human drivers don't find themselves facing such moral dilemmas as "should I sacrifice myself by driving off a cliff if that could save the life of a little girl on the road?" Human brains aren't fast enough to make that kind of calculation; the car is over the cliff in a nanosecond. A self-driving car, on the other hand, can compute fast enough to make such a decision: to do whatever humans have programmed it to do. But what should that be? This book investigates how people want driverless cars to decide matters of life and death.

In The Car That Knew Too Much, psychologist Jean-François Bonnefon reports on a groundbreaking experiment that captured what people think cars should do in situations where not everyone can be saved. Sacrifice the passengers for pedestrians? Save children rather than adults? Kill one person so many can live? Bonnefon and his collaborators Iyad Rahwan and Azim Shariff designed the largest experiment in moral psychology ever: the Moral Machine, an interactive website that has allowed people (eventually, millions of them, from 233 countries and territories) to make choices within detailed accident scenarios. Bonnefon discusses the responses (reporting, among other things, that babies, children, and pregnant women were most likely to be saved), the media frenzy over news of the experiment, and scholarly responses to it.

Boosters for driverless cars argue that they will be in fewer accidents than human-driven cars. It's up to humans to decide how many fatal accidents we will allow these cars to have.

  • Language: English
  • ISBN: 9780262548557
  • Format: Paperback
  • Pages: 176
  • Published: 3 February 2026
  • Dimensions: 133x0x203 mm
  • Weight: 368 g



