castleofargh
Sound Science Forum Moderator
J.E. (Hans) Korteling, Alexander Toet, in Encyclopedia of Behavioral Neuroscience, 2nd edition, 2022
Abstract
Cognitive biases are systematic cognitive dispositions or inclinations in human thinking and reasoning that often do not comply with the tenets of logic, probability reasoning, and plausibility. These intuitive and subconscious tendencies are at the basis of human judgment, decision making, and the resulting behavior. Psychological frameworks consider biases as resulting from the use of (inappropriate) cognitive heuristics that people apply to deal with data-limitations, from information processing limitations, or from a lack of expertise. Neuro-evolutionary frameworks provide a more profound explanation of biases as originating from the inherent design characteristics of our brain as a neural network that was primarily developed to perform basic physical, perceptual and motor functions, and which also had to promote the survival of our hunter-gatherer ancestors.
A. Wilke, R. Mata, in Encyclopedia of Human Behavior (Second Edition), 2012
Heuristics and Biases: A Short History of Cognitive Bias
In the early 1970s, Amos Tversky and Daniel Kahneman introduced the term ‘cognitive bias’ to describe people's systematic but purportedly flawed patterns of responses to judgment and decision problems.
In this hobby, we could take confirmation bias as a meaningful example. Some dictionary wrote this:
The fact that people are more likely to accept or notice information if it appears to support what they already believe or expect.
The general idea is clear enough, it can be found in all common definitions, and we all know this most common and very human behavior. Our own desire for a certain outcome becomes a significant influence on how we think, how we search for answers, and what we're willing to accept as true when information is presented to us.
A more correct outcome would obviously require relying only on facts pertaining to the matter at hand.
Now I'll just present a few experiments showing how something apparently unrelated still manages to affect our behavior or experience:
One group of people is tasked with counting money while another group counts M&M's. Then the participants go to another room one by one, and on the way someone drops what he or she was carrying. More people from the M&M's group stop to help. Over other similar tests, it was found that dealing with money will, on average, make people more selfish. Money manages to affect decisions that aren't related to financial concerns. It's not logical, but it sometimes happens to some people.
A similar experiment was done on people waiting for the bus, another on kids, and the trend remained. At the same time, various experiments have shown that counting our money makes us feel good and more willing to take on challenges.
People gamble, and while anybody with a basic education in statistics knows when successive results are statistically independent (random sequences where previous results do not affect the following ones), most people will still "see" patterns in the results and want to bet based on those discovered patterns, when nothing in principle or in practice validates that behavior.
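That independence is easy to check for yourself. Here's a minimal Python sketch (my own illustration, not one of the cited experiments) that simulates a fair coin and shows that the flip right after a streak of three heads is no more likely to be heads than any other flip:

```python
import random

random.seed(0)  # fixed seed just for reproducibility
flips = [random.random() < 0.5 for _ in range(200_000)]  # fair coin: True = heads

# Collect the outcome that follows every streak of exactly three-plus heads.
# A "pattern" bettor expects these flips to be special; independence says no.
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]

# Both rates come out close to 0.5: the streak tells you nothing.
print(f"overall heads rate:       {sum(flips) / len(flips):.3f}")
print(f"heads rate after 3 heads: {sum(after_streak) / len(after_streak):.3f}")
```

Any perceived "hot" or "cold" run in such a sequence is pattern-matching by the brain, not structure in the data.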
The Monty Hall problem. You have 3 doors, you pick one, then one of the 2 other doors is opened (with no prize behind it). You can decide to keep your initial choice or switch to the remaining door.
Statistics clearly tell us to switch, since switching wins two times out of three, but most people had trouble letting go of their original gut-feeling choice and would mostly stick with the door they first picked.
They knew nothing about where the prize was, they had no particular reason to stay with the same door, and the stats actually tell us to switch. But something inside people's heads made them reluctant to let go of what they had already decided was the answer, no matter how arbitrary that initial choice was.
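If the two-out-of-three figure feels wrong (it does to almost everyone at first), a quick simulation settles it. This is a minimal Python sketch of my own, not taken from any of the studies mentioned:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the player wins."""
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

def win_rate(switch: bool, trials: int = 100_000) -> float:
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

print(f"stay:   {win_rate(False):.3f}")  # comes out near 1/3
print(f"switch: {win_rate(True):.3f}")   # comes out near 2/3
```

The intuition behind the result: your first pick is right 1/3 of the time, so the prize is behind one of the other two doors 2/3 of the time, and the host's reveal funnels all of that 2/3 onto the single remaining door.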
People working on matching fingerprints were given extra information about the prints in the form of a DNA result: sometimes saying the DNA already matched, sometimes saying it didn't, and sometimes saying the result was inconclusive (none of it actually tied to the real fingerprint match, so as to cover a variety of scenarios). The result of that experiment showed how the DNA information, which should absolutely not have affected their own work, did affect it. It also revealed something found in various areas of study: the harder it was to match the fingerprints, the more the examiners were influenced by the DNA information. Simply said, when the thing we're trying to determine is hard to get at, the brain assigns more significance to other, less related sources of information to make up for the missing data. Some people describe this as the brain answering an easier but wrong question instead of the hard but correct one.
Next, consider a particular encounter with one person clearly identified as belonging to a certain group (an ethnicity, a country, fans of a football team, members of Sound Science...). If that encounter went particularly badly (or really well), it can easily lead someone to expect all members of that group to behave similarly, to the point where he might predict the behavior and motivations of people he doesn't know, just from that one initial encounter.
It's a prevalent tendency in the world. It isn't logical or right, but it's in us. From racism based on anecdotes, to the a priori attached to owning a Samsung phone instead of an iPhone. Our brain wants to simplify things into convenient groups and patterns. The complicated world becomes simpler, easier to handle, at the cost of being somewhat wrong, or very wrong, about too many things to count.
A group is given one of two numbers by spinning a wheel (one small, one big). Then they're asked to estimate the number of African countries that are members of the United Nations. People given the small number gave, on average, a smaller estimate than those who got the bigger number from the wheel.
Similarly, a businessman negotiating a selling price knows the value of opening with a number above what he actually hopes to sign at, because it creates a mental reference, and the buyer will feel satisfied simply from getting below that first anchor.
Maybe the most significant thing about biases is that we think we're not susceptible to them. Many experiments demonstrate that all too well. Researchers ran bias tests on a group, then asked the participants to estimate how biased they thought they had been in those experiments. After that, they showed the participants results demonstrating how biased they actually were in those tests.
Then, for the lolz, the participants were asked again to estimate their bias, and on average they still claimed to be less biased than they had just been shown to be.
How is this actually related to audio? Well, of course this was all a clever plot to sell you blind testing and measurements. You shouldn't trust yourself as much as you do, so how about some more love for evidence-based data and for tests that try to remove some of our biases?
It's also to have something to show people who get mad when we suggest they might be biased (as we all are). Because while we mean "remember that you're only human", they often think we just called them stupid, and as you can expect, that really helps nobody.
If I made big mistakes (I'm not a psycho-anything), if some of the experiments I describe are not what I make of them, or if they got debunked by later work, let me know. I'll look into it, correct my mistake, then deny it ever existed, like one should. ^_^
The abstract quoted at the top, found via the wiki, is there to illustrate that whatever rationale you previously had for why you weren't biased might have missed a few dozen scenarios. Brains are complex.