
Hi everyone, I was listening to some tech news when I came across the Moral Machine test.

The test works like this: you're put in situations where a self-driving car has to kill one group of people to save another, and you decide who dies.

At the end they show you the overall results and how your answers compare to everyone else's, and guess what, there's a gender preference, look at this

The test can be taken here; aside from the gender thing it's actually interesting in its own right: http://moralmachine.mit.edu/

EDIT: On second thought, I should also mention that, as you can see, I showed a strong gender bias too, even though I applied no gender criteria at all; I simply decided that the people in the car should die rather than the pedestrians. So maybe the overall results were affected by a criterion like that as well.