![Who would sacrifice one person in order to save five? – Global differences when it comes to making moral decisions | Max Planck Institute for Human Development](https://www.mpib-berlin.mpg.de/940660/original-1653032146.jpg?t=eyJ3aWR0aCI6MTIwMCwiaGVpZ2h0Ijo2MjgsImZpdCI6ImNyb3AiLCJvYmpfaWQiOjk0MDY2MH0%3D--1bc10b1944f67cae37c85d15ac7ab5781f49b704)
Who would sacrifice one person in order to save five? – Global differences when it comes to making moral decisions | Max Planck Institute for Human Development
![Should a self-driving car kill the baby or the grandma? Depends on where you're from. | MIT Technology Review](https://wp.technologyreview.com/wp-content/uploads/2018/10/m.i.tsmartcarchoicescolo01-9.jpg?w=2760?crop=0px,148px,2760px,1552px&w=2760px)
Should a self-driving car kill the baby or the grandma? Depends on where you're from. | MIT Technology Review
![Michael Shermer on Twitter: "Trolley Problem test: subjects given chance to flip switch & divert train to side track to kill 1 worker & save 5. Only 2 of 7 did—opposite ratio](https://pbs.twimg.com/media/DV9a4b3VMAEl7dQ.jpg)
Michael Shermer on Twitter: "Trolley Problem test: subjects given chance to flip switch & divert train to side track to kill 1 worker & save 5. Only 2 of 7 did—opposite ratio
![Evil AI Cartoons on Twitter: "Quiz: Is this not a Trolley Problem?👇🧵 https://t.co/DFrBF5mEh3" / Twitter](https://pbs.twimg.com/media/FMSJTMuWUAI4SMs.png)
Evil AI Cartoons on Twitter: "Quiz: Is this not a Trolley Problem?👇🧵 https://t.co/DFrBF5mEh3" / Twitter