Though the technology behind self-driving cars is still in development, they could soon be available to the public. Whether they should be remains an open question. Part of being a responsible driver is the ability to make moral decisions when presented with a potential problem; in a self-driving car, however, that moral decision-making is left to the computer. But how does the computer determine what is right? Settling on a universally accepted moral code instantly becomes more complicated.

To gauge the public’s perspective on moral decisions made by machine intelligence, such as self-driving cars, researchers built an online game called “Moral Machine.” On average, people who played the game believed that the moral thing for the self-driving car to do was spare the young over the old, humans over animals, and the many over the few. However, researchers did find variation among the responses based on particular cultural systems, beliefs, and values. For example, respondents in Latin America, France, Hungary, and the Czech Republic preferred sparing the young over the old, whereas those in Asia and the Middle East favoured sparing the old over the young, reflecting the importance those cultures place on respect for elders.

Of course, the study should not be taken as the final answer as to what is universally moral. Before any final decisions are made, more research needs to be done and more discussions need to be held globally so that the best decisions, programming, and policies can be made.