How do you quantify something as complex and personal as humor? University of Alberta researchers have developed a mathematical method of doing just that — and it might not be quite as personal as we think.
“This really is the first paper that’s ever had a quantifiable theory of humor,” says U of A psychology professor Chris Westbury, lead author of the recent study. “There’s quite a small amount of experimental work that’s been done on humor.”
The idea for the study was born from earlier research in which test subjects with aphasia were asked to review letter strings and determine whether they were real words or not. Westbury began to notice a trend: participants would laugh when they heard some of the made-up non-words, like snunkoople.
It raised the question — how can a made-up word be inherently funny?
Westbury hypothesized that the answer lay in the word’s entropy — a mathematical measure of how ordered or predictable it is. Non-words like finglam, with uncommon letter combinations, are lower in entropy than other non-words like clester, which have more probable combinations of letters and therefore higher entropy.
“We did show, for example, that Dr. Seuss — who makes funny non-words — made non-words that were predictably lower in entropy. He was intuitively making lower-entropy words when he was making his non-words,” says Westbury. “It essentially comes down to the probability of the individual letters. So, if you look at a Seuss word like yuzz-a-ma-tuzz and calculate its entropy, you would find it is a low-entropy word because it has improbable letters like Z.”
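The study ties this intuition to a formal calculation over letter probabilities. As a rough illustration only (the paper's exact formula and letter-frequency tables are not reproduced here), the sketch below scores a non-word by averaging a Shannon-style contribution for each of its letters, using approximate English letter frequencies; by this toy measure, strings built from rare letters such as z come out with lower scores, in line with the article's use of "low entropy." The frequency table and the letter_entropy function are illustrative assumptions, not the researchers' actual method.

```python
# Illustrative sketch only: not the study's actual entropy measure.
# It averages each letter's -p*log2(p) contribution, using rough English
# letter frequencies, so non-words full of rare letters score lower.

import math

# Approximate relative frequencies of letters in English text (percent).
LETTER_FREQ = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7, 's': 6.3,
    'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8, 'u': 2.8, 'm': 2.4,
    'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0, 'p': 1.9, 'b': 1.5, 'v': 1.0,
    'k': 0.8, 'j': 0.15, 'x': 0.15, 'q': 0.10, 'z': 0.07,
}
TOTAL = sum(LETTER_FREQ.values())


def letter_entropy(word: str) -> float:
    """Average -p*log2(p) contribution of the letters in `word`.

    Every letter probability here is well below 1/e, so rarer letters
    contribute smaller values and rare-letter strings score lower overall.
    """
    letters = [ch for ch in word.lower() if ch in LETTER_FREQ]
    if not letters:
        raise ValueError("word contains no alphabetic characters")
    total = 0.0
    for ch in letters:
        p = LETTER_FREQ[ch] / TOTAL   # probability of this letter
        total += -p * math.log2(p)    # its entropy contribution
    return total / len(letters)       # normalize by word length


if __name__ == "__main__":
    for w in ["clester", "finglam", "snunkoople", "yuzzamatuzz"]:
        print(f"{w:12s} {letter_entropy(w):.4f}")
```

With these approximate frequencies, clester scores highest of the four examples and yuzz-a-ma-tuzz lowest, matching the ordering described in the article, though the study's own measure is more involved than this sketch.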
Inspired by the reactions to snunkoople, Westbury set out to determine whether it was possible to predict what words people would find funny, using entropy as a yardstick.
For the first part of the study, test subjects were asked to compare two non-words and select the option they considered to be more humorous. In the second part, they were shown a single non-word and rated how humorous they found it on a scale from 1 to 100.
“The results show that the bigger the difference in the entropy between the two words, the more likely the subjects were to choose the way we expected them to,” says Westbury, noting that the most predictable subject chose the expected word 92 percent of the time. “To be able to predict with that level of accuracy is amazing. You hardly ever get that in psychology, where you get to predict what someone will choose 92 percent of the time.”
This nearly universal response says a lot about the nature of humor and its role in human evolution. Westbury points to a well-known 1929 study by the Gestalt psychologist Wolfgang Köhler in which test subjects were presented with two shapes, one spiky and one round, and were asked to identify which was a baluba and which was a takete. Almost all the respondents intuited that takete was the spiky object, suggesting a common mapping between speech sounds and the visual shape of objects.
The reasons for this may be evolutionary. “We think that humor is personal, but evolutionary psychologists have talked about humor as being a message-sending device. So, if you laugh, you let someone else know that something is not dangerous,” says Westbury.
He uses the example of a person at home who believes they see an intruder in their backyard. This person might then laugh when they discover the intruder is simply a cat instead of a cat burglar. “If you laugh, you’re sending a message to whoever’s around that you thought you saw something dangerous, but it turns out it wasn’t dangerous after all. It’s adaptive.”
The idea of entropy as a predictor of humor aligns with a 19th-century theory from the German philosopher Arthur Schopenhauer, who proposed that humor is a result of an expectation violation, as opposed to a previously held theory that humor is based simply on improbability. When it comes to humor, expectations can be violated in various ways.
In non-words, expectations are phonological (we expect them to be pronounced a certain way), whereas in puns, the expectations are semantic. “One reason puns are funny is that they violate our expectation that a word has one meaning,” says Westbury. Consider the following joke: Why did the golfer wear two sets of pants? Because he got a hole in one. “When you hear the golfer joke, you laugh because you’ve done something unexpected — you expect the phrase ‘hole in one’ to mean something different, and that expectation has been violated.”
The study may not be about to change the game for stand-up comedians — after all, a silly word is hardly the pinnacle of comedy — but the findings may be useful in commercial applications, such as in product naming.
“I would be interested in looking at the relationship between product names and the seriousness of the product,” notes Westbury. “For example, people might be averse to buying a funny-named medication for a serious illness — or it could go the other way around.”
Finding a measurable way to predict humor is just the tip of the proverbial iceberg. “One of the things the paper says about humor is that humor is not one thing. Once you start thinking about it in terms of probability, then you start to understand how we find so many different things funny. And the many ways in which things can be funny.”
The findings were published in the Journal of Memory and Language.