Cool Thesis of the Week: Randomness as Fairness
Economic theory expects certain things of people: They will behave predictably, act to maximize utility, and, given the choice between getting money and getting none, take the cash. However, says Justin Stewart ’12, there are well-known cases in which people regularly reject an offer of free money, against all the predictions of economics.
Justin, of Los Angeles, California, is writing his thesis, “Randomness as Fairness: An Exploration of the Ultimatum Game,” on one example of this unexpected behavior. His Math-Economics thesis, which he is writing with Professors John Rorke and Albyn Jones, examines the “ultimatum game,” a game in which two participants interact with each other to decide how to split a given amount of money—or walk away from the money entirely, leaving both participants with nothing. Participants routinely make decisions “very very different from what you’d predict should happen” with standard economic theory, instead opting for strategies they know will leave them with less money. Along with this, Justin says he has a novel finding: People tend to perceive situations generated by randomness as inherently more fair.
Economic theory doesn’t match economic reality.
In the ultimatum game, participants are paired up, and one, the “proposer,” is given a certain amount of money—in Justin’s case, ten dollars. The proposer offers to the “responder” a way the two can split the money—for example, six dollars to the proposer and four to the responder—and the responder can either accept or reject the offer. If the responder accepts, the participants walk away with the proposed amounts, and the game is over. If the responder rejects the offer, neither participant gets any money at all, and the game is over. If the assumptions of economic theory are right, Justin says, and participants are utility-maximizing “homo economicus,” responders should accept any offer that gives them any positive amount of money: They would end up with more money that way. The proposer, anticipating this, should offer the responder the lowest possible amount of money, leaving more for themselves.
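The payoff rules above are simple enough to sketch in a few lines of code. This is a minimal illustration of the game's structure, using the ten-dollar stake from Justin's experiment; the function name and setup are my own, not from the thesis.

```python
STAKE = 10  # dollars given to the proposer, as in Justin's experiment

def ultimatum_payoffs(offer, accepted):
    """Return (proposer_payoff, responder_payoff) for one round.

    offer: dollars the proposer offers the responder (0..STAKE).
    accepted: whether the responder accepts the proposed split.
    A rejected offer leaves both players with nothing.
    """
    if not 0 <= offer <= STAKE:
        raise ValueError("offer must be between 0 and the stake")
    if accepted:
        return STAKE - offer, offer
    return 0, 0

# A purely self-interested "homo economicus" responder accepts any
# positive offer, so theory predicts the proposer offers the minimum:
print(ultimatum_payoffs(1, accepted=True))   # proposer keeps 9, responder gets 1
print(ultimatum_payoffs(4, accepted=False))  # rejection: both get nothing
```

The interesting finding, of course, is that real participants do not play the way the last two lines predict.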
However, says Justin, “that doesn’t always happen.” “Economic theory,” he explains, “doesn’t match economic reality.” Instead, proposers offer “on average nearly 40% of the stake,” while responders reject any offer that doesn’t give them at least 30% of the total. Justin says there are two explanations for the behavior of proposers. One hypothesis claims they “have a preference for fairness,” meaning they value fairness enough to sacrifice their own financial gain. An alternative is that they are in fact “behaving strategically” to maximize their monetary intake, but fear punishment from the responder if they offer too little, so they offer enough to keep their partner happy.
Justin devised an experiment with the goal of “refining our understanding of fairness concerns,” which asked the question, “How does the introduction of randomness affect people’s bargaining decisions?” He asked volunteers what they would offer to another person in the ultimatum game, and what amount would be the minimum offer they would accept. He also asked what they would offer if the decision were made for the responder by a coin toss, and what would be the minimum amount they would accept from a computer that generated the amount randomly. Even when a computer was making the choices, participants knew they would still be paid according to what it decided.
Justin found that people offered more to human responders than to computers. From this he gathered that proposers were, to some extent, worried about being rejected by human responders—though they also often offered some money to computers, meaning they also cared about fairness. Justin also found that people were willing to accept less from a computer than they were from another person. What’s more, the gap in how much people offered was larger than the gap in how much they were willing to accept. There is no strategic benefit to a responder in rejecting an offer, Justin says, so he reasoned that the difference between the two minimum acceptable offers could be taken to represent “how much they value fairness.” Because people were more willing to accept random offers than offers made by other people, Justin arrived at his “randomness as fairness” hypothesis: If people see a situation as random, he claims, they are more likely to view it as “fair.” And if each participant’s valuation of fairness is assumed to be constant, then whatever portion of the gap between offers to a human and offers to a computer is not explained by the gap in acceptable offers can be taken to represent their strategic considerations.
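The decomposition described above can be made concrete with some back-of-the-envelope arithmetic. The dollar figures below are invented for illustration, not taken from the thesis; only the structure of the calculation follows Justin's reasoning.

```python
# Hypothetical averages for one participant (illustrative numbers only).
offer_to_human      = 4.0  # offer when a human responder decides
offer_to_computer   = 2.5  # offer when a coin toss decides for the responder
min_accept_human    = 3.0  # smallest offer accepted from a person
min_accept_computer = 2.0  # smallest offer accepted from a computer

# Rejecting an offer costs the responder money and gains nothing
# strategically, so the gap in minimum acceptable offers is read as
# how much the participant values fairness.
fairness_component = min_accept_human - min_accept_computer

# Assuming the fairness valuation is constant for the participant, the
# part of the offer gap it does not explain is read as strategic
# caution: fear of being rejected by a human responder.
strategic_component = (offer_to_human - offer_to_computer) - fairness_component

print(fairness_component)   # fairness valuation, in dollars
print(strategic_component)  # strategic consideration, in dollars
```

With these made-up numbers, a dollar of the extra generosity toward humans would reflect fairness and fifty cents would reflect strategy.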
Even though it is played under synthetic conditions, Justin says that “the ultimatum game is a model of reality.” In markets where a seller has a monopoly, buyers will sometimes not pay even what they think a product is worth, if they feel that the seller is charging an unfair price. Justin’s hypothesis therefore has implications for the real world in markets of varying sizes and degrees of removal from the personal level: “In cases in which we view the price-setting mechanism as randomly determined, we may feel less that the choice we have to make is ‘intrinsically unfair.’”
Do you have or know of a thesis that compels attention? Just want to see your face in the Quest? Email firstname.lastname@example.org with “Cool Thesis” in the subject line.