Help me recall the name of this mass media theory/phenomenon


Civilitas

I am trying to recall the name of a phenomenon I read about somewhere a few years ago. It's to illustrate a point in a presentation I'm making to a high school class. Nutshell: you find information from a well-known source that you know to be wrong, but then you go on to trust that source on other information you can't verify.

I have been struggling to find the name of it but keep failing. This is a pretty diverse and intelligent group and maybe someone here can ID it. Here's how it was described to me in a classic example.

Situation: You are a genuine expert in your narrow field: accounting, medicine, air traffic control, harbour pilot, etc. You are reading your morning paper (recall those days?), and you see an article about an accident or controversy in your area of expertise. You can immediately tell from the "facts" related that the reporter has made a complete hash of it and that the truth, while hinted at, is WAY more complex and hidden than what they are getting at, or even the opposite.

So you dismiss it as useless. But then you go on reading the rest of the paper, still generally trusting its articles and opinions on subjects you don't know anything about in depth - when clearly your ONE moment of insight showed that to be a bad bet.

There is an actual name or specific moniker for this cognitive dissonance wrt authority. Can anyone recall it, or recognize what I'm talking about?

NOTE: this has nothing at all to do with the current media criticism, etc. I'm trying to give really concrete examples for a class on epistemology and to show how "appeal to authority" is a more widely used way of knowing than we think it is.
 
Greetings,
Mr. C. Holy cow. I had to look up epistemology and I'm still not sure...
https://plato.stanford.edu/entries/epistemology/


Perhaps the word you're looking for is marketing.

Oh man, I just read that and it gives me a headache. That is so badly written.

I just say it's "the study of how we know what we know." Which is pretty clear vs. that page-long first paragraph there... ;) I'm not a "philosopher" but have some familiarity, and this is high school, not Yale Divinity School we are talking about...

But this is really actually interesting common-sense stuff that's good to be aware of, ways we delude ourselves. Like the sunk cost fallacy, etc.
 
I know exactly what you mean, but also don't know the phrase. Not sure I ever heard it called something.

In weather forecasting it is still a problem to this day. It manifested itself very obviously in the early days of weather satellite pictures, 1980's.

A weather forecaster would look at a surface analysis (which may or may not be correct in terms of frontal boundary placement), look at a satellite picture for the same time, and see that in reality the frontal boundary was significantly more advanced (or retarded) than depicted on the surface analysis.

Then he/she would look at the 24-hour (or some other future time period) forecast position and use that incorrect forecast position to write his/her forecast (usually TAFs, Terminal Aerodrome Forecasts), even though it was obvious (because of the satellite pic) that the surface analysis was wrong and therefore any forecast derived from that analysis would be wrong. :facepalm:

I still see amateurs doing the same thing when they use Windy or any weather data. :eek:
If the weather is not as expected today, then there is no point in looking to the future for that particular model run.

I look at Windy every day, but I only look at the current weather. Then if and only if the current weather is behaving as forecast, I'll look at some point in the future that I have an interest in.
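
In code terms, that habit is basically a gate on trusting the model run. Here's a toy Python sketch of it; the dictionaries, the tolerances, and the helper names are all made up for illustration and are not any real Windy (or other weather) API:

```python
# Toy sketch of the "verify the present before trusting the future" rule.
# All values, tolerances, and function names here are invented for
# illustration; this is not a real Windy (or any other weather) API.

def forecast_is_tracking(forecast_now, observed_now,
                         wind_tol_kt=2.0, dir_tol_deg=10.0):
    """Does the model's picture of 'right now' match what I actually see?"""
    wind_ok = abs(forecast_now["wind_kt"] - observed_now["wind_kt"]) <= wind_tol_kt
    dir_ok = abs(forecast_now["wind_dir"] - observed_now["wind_dir"]) <= dir_tol_deg
    return wind_ok and dir_ok

def usable_outlook(forecast_now, observed_now, forecast_later):
    """Only bother with the future forecast if the present is being forecast well."""
    if forecast_is_tracking(forecast_now, observed_now):
        return forecast_later   # this model run has earned some trust
    return None                 # ignore this run and wait for the next one

# Made-up example values:
now_forecast = {"wind_kt": 12.0, "wind_dir": 250.0}
now_observed = {"wind_kt": 13.0, "wind_dir": 255.0}
later_forecast = {"wind_kt": 20.0, "wind_dir": 270.0}

print(usable_outlook(now_forecast, now_observed, later_forecast))
```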

I think the issue is we humans have a tendency to believe what we see. Therefore, whether it's a weather chart or a financial spreadsheet, if it's printed, it looks more believable. :D
 
Cognitive dissonance ?
 
Deference to authority? The context is different from your examples, but the same sort of dysfunction.
 
Selective Unsupported Reliance?
 
Cognitive dissonance ?

Close....I think it is called "cognitive bias"!!! Google it...pretty interesting stuff. We touch on it in decision making in my line of work.
 
Greetings,
Do you mean that guy I see on TV selling Preparation H really doesn't know what he's talking about? But, but...he's wearing a lab coat.


 
"You are a genuine expert in your narrow field: accounting, medicine, air traffic control, harbour pilot, etc"


An expert is one that knows more and more about less and less, until he finally knows Everything about Nothing!
 
Deference to authority? The context is different from your examples, but the same sort of dysfunction.

Yes, this is the general concept, though generically called "appeal to authority."

This particular situation is when you STILL do that, though you have distinct empirical evidence invalidating it.

I still can't find the reference to this. I spent a little time this afternoon chasing after it, and at least found this amusing site. RT would likely love this:

https://youarenotsosmart.com/

It's a one-man kind of operation about these kinds of fallacies, with podcasts as well as essays/posts. Amusing and informative.

I'm glad Richard above validated that I'm not making this up and that he knows of it; it's a genuinely studied/labeled example of a logical fallacy - I just can't remember it!
 
Close....I think it is called "cognitive bias"!!! Google it...pretty interesting stuff. We touch on it in decision making in my line of work.

I believe that Cognitive Bias is a group of behavioral biases. Within them is one called 'Anchoring' or 'Focalism'.

Anchoring or focalism is a cognitive bias where an individual relies too heavily on an initial piece of information offered (considered to be the "anchor") when making decisions.

https://en.wikipedia.org/wiki/Anchoring

Ironically, I was just discussing this concept with my brother over drinks last night.

The idea is that your first piece of information is presented and accepted as fact, and as such does not get critical review. However, each piece of contradictory information, regardless of whether it is true, is subject to massive amounts of scrutiny and is often rejected, failing to displace the primary piece of information.

This is, by and large, what drives the popularity of religion.
 
The idea is that your first piece of information is presented and accepted as fact, and as such does not get critical review. However, each piece of contradictory information, regardless of whether it is true, is subject to massive amounts of scrutiny and is often rejected, failing to displace the primary piece of information.

This is, by and large, what drives the popularity of religion.


Explains socialism and "global warming" and the claim to be settled science, both a religion.
 
You can add "second hand smoke" to your list, FF.
 
Anchoring or focalism is a cognitive bias where an individual relies too heavily on an initial piece of information offered (considered to be the "anchor") when making decisions.

There is a very amusing bit about the anchoring effect that I just ran across on that website I linked above. Link:

https://youarenotsosmart.com/2010/07/27/anchoring-effect/

Briefly: People are about to bid in an auction. They are told to write down the items (shirt, iron, etc.). At the end of the list, they are told to write down the last two digits of their SSN.

Those with high numbers consistently bid more/overpay. The mere visual reminder of "high" numbers anchors their thinking (biases it) in that direction.
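
Just to make the mechanism concrete for the class, here's a toy simulation of that setup in Python. All of the numbers (base valuation, anchor weight, noise) are invented purely for illustration; this only shows the shape of the effect described in that post, not the actual experimental data.

```python
import random

# Toy simulation of the SSN-anchoring auction described above.
# Base valuation, anchor weight, and noise are invented for illustration;
# they are not the actual experimental data.

random.seed(42)

def simulated_bid(anchor, base_value=50.0, anchor_weight=0.3):
    """Model a bid as a base valuation pulled partway toward the anchor."""
    noise = random.gauss(0, 5)
    return (1 - anchor_weight) * base_value + anchor_weight * anchor + noise

# "Low" group wrote down SSN digits 00-49, "high" group 50-99.
low_group = [simulated_bid(random.randint(0, 49)) for _ in range(100)]
high_group = [simulated_bid(random.randint(50, 99)) for _ in range(100)]

print(f"average bid, low-SSN group:  {sum(low_group) / len(low_group):6.2f}")
print(f"average bid, high-SSN group: {sum(high_group) / len(high_group):6.2f}")
```

With the anchor term baked into the toy model, the high-SSN group's average bid comes out consistently higher, which is the pattern the post describes.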
 
I think the phrase you are looking for is the "Gell-Mann amnesia effect".
It is the belief that a source can be trusted on other statements even when the one statement about which you have personal knowledge and experience shows the source to be in major error.
See, for example, most news publications.
JohnS
 
Possibly related to another condition I heard about, Rectal Iris, in which no amount of information—regardless of source—contradicting deeply seated beliefs is tolerated?
 
Don't know the name of it, but have stopped letting myself be misled, at least when it comes to reading the paper, or listening to the "news." Every single time that they talk about something that I know and understand in depth, they make at least one major mistake that throws the whole point of the story off. Every. Single. Time.


So now I always assume that -- in every newspaper story that I read, and every news report that I hear or see -- there is at least one major mistake, and because of it their conclusions are probably wrong.


Yeah. I really am THAT cynical!
 
One of the things that has given me a low opinion of news reports was thirty years of reading news reports of events that I participated in and barely recognized.
 
"Rectal Iris, in which no amount of information—regardless of source—contradicting deeply seated beliefs is tolerated?"

Especially when the belief is akin to a religion, like socialism.

Explains why so many command economies are a Utopia.
 
"Rectal Iris, in which no amount of information—regardless of source—contradicting deeply seated beliefs is tolerated?"

Especially when the belief is akin to a religion, like socialism.

Explains why so many command economies are a Utopia.

Or the nearly religious belief that anyone who chooses not to wrap themselves in the flag as a litmus test for patriotism must be a socialist.
 
We all have the same problem to some degree. "We don't know what we don't know"
What we believe we know, we believe to be fact. Otherwise we would have to accept that we are full of poop and don't really know all that much. Not a pretty picture.
 
I think the phrase you are looking for is the "Gell-Mann amnesia effect".
It is the belief that a source can be trusted on other statements even when the one statement about which you have personal knowledge and experience shows the source to be in major error.
See, for example, most news publications.
JohnS

JohnS knocks it out of the park. Lots of smart people and I have struggled with this for a week.

Not to say this shuts the convo down; there is a lot of wisdom here.

This from Northern Spy is really good stuff:


A lot of good wisdom and discussion here.
 
Mixed message from your example

What you are describing is the HALO EFFECT, whereby the reader’s halo for the publication is positive and the one article that is read and known to be inaccurate DOES NOT alter the perception of the publication, so the reader continues to read additional content in the publication!!!

The halo effect can be positive OR negative!

If you are focusing only on the person who is reading the article in your example, not the publication, I would have said that the reader rejected the article, not the publication source, based on the reader’s PARADIGM....and it is an example of the “paradigm effect” leading the reader to reject the article/author and NOT the publication.

It’s not dissonance as someone else has suggested.

I have a Ph.D. and years of industry and academic experience.
 