The Average Person Is Wrong

This article deals with the fundamental beliefs that shape how we view the world around us. Some of the top-level categories of belief are: Religious, Political, Scientific, Gender-Roles and Economics. Let's consider Religious beliefs: most people hold religious beliefs that directly contradict those of most (70%?) of the rest of the world. Therefore (assuming that contradictions cannot exist in what is true), at least 70% of the world's people are wrong about their religious beliefs. The same goes for political beliefs, scientific beliefs, beliefs about the proper roles of men and women and, of course, beliefs about economics.

Economic Beliefs

This matter of our "beliefs about economics" is what we want to focus on. There are a couple of ways to look at this. One has to do with value. In the stock market, everybody wants to make money, so they invest in whatever they believe has the lowest price but the highest value. Those whose beliefs turn out to have been correct make money, and those whose beliefs turn out to have been wrong lose money.

Another type of economic belief has to do with the worthwhile-ness of investing our time and energy in education (and in which specific field of education). Another is the belief about the worthwhile-ness of looking for a job (given what a person believes to be the overall positivity or negativity of the economic environment). And then there is the belief about the worthwhile-ness of putting in a full day's effort once a job has been lined up.

A wide divergence in economic beliefs

As we saw in the case of the religious, political and scientific fields, the beliefs that people hold to be universal truths are (roughly) 70% provably wrong. (At least in the sense that they contradict one another, and we are assuming here that two contradictory universal beliefs cannot both be correct.) For example, let's take the case of one of the most commonly held religious beliefs: the belief about Hell. Either the majority of people are going to be eternally tormented in the flames of Hell, or they are not. It's hard to see how Hell could both exist and not exist.

Similarly, there are also wide contradictions in the beliefs held regarding the universal economic truths about the worthwhile-ness of Working, Looking for work, and Studying to improve job skills. (And these wide contradictions in beliefs exist even after you factor in the specifics of any one person's or region's economic profile.)

We all just happened to be right … about everything 

One universal fallacy that pretty much everyone holds (hence it is universal) is that "I am generally correct about my fundamental beliefs, and those who disagree with me are generally incorrect." But of course that is absurd. We cannot all simultaneously have the privilege of standing upon the great Mountain of Truth when we generally, and strongly, disagree with one another.

The only logical conclusion that the "average" person (there is no such person, by the way) can rationally draw is that he, the average person, is most likely wrong about at least 70% of his fundamental beliefs (or some other large number which directly correlates to the number of mutually exclusive belief camps).
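
To make the arithmetic concrete, here is a minimal sketch in Python (the camp sizes are invented for illustration, since no real survey data is cited here): if the belief camps are mutually exclusive, then at most one camp can be right, so at least everyone outside the largest camp must be wrong.

    # Minimal sketch: a lower bound on the fraction of people who must be
    # wrong, given mutually exclusive belief camps. Camp shares are invented.
    camp_shares = {"camp_a": 0.30, "camp_b": 0.25, "camp_c": 0.25, "camp_d": 0.20}

    # At most one camp can be correct, so everyone outside the largest camp
    # is guaranteed to be wrong (the largest camp may be wrong too).
    min_fraction_wrong = 1.0 - max(camp_shares.values())
    print(f"At least {min_fraction_wrong:.0%} of people must be wrong.")  # 70%

Note that this is only a lower bound: the largest camp could itself be wrong, so the true error rate can only be higher.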

The nice thing about this is that, by recognizing this mathematical truth, we become more willing and able to critically re-examine our cherished core beliefs.

A disease of the belief system?

Here at Get Help Get Active, we believe that a Mind-Based Habit of Low Self-Exertion (MBH-LSE) is sometimes caused by false beliefs about the worthwhile-ness of Working, Looking for work, and Studying to improve one's job skills. Now, if we can indeed be confidently suspicious of our beliefs (70% of the time, on average), then we will be in a much better position to change our beliefs.

So what about you? What are your beliefs about the worthwhile-ness of Working, Looking for work, and Studying to improve your job skills? In your own opinion, is it possible that you are suffering from MBH-LSE? Is there a part of you that suspects that some of your core beliefs may be wrong? If so, then please consider taking advantage of the 100% free help that is available. Get help and get active!


40 Responses to The Average Person Is Wrong

  1. NHS says:

    It's not 80%, I think. It sort of depends on the specific category. It could be higher or lower.

  2. NHS says:

    Why Do Bosses Exist? To teach you what you need to do, and then to make sure that you did it. That's it. About the second part: that can only be done during the Working phase of WSL (Working, Studying, Looking). This also bears on the matter of the Self-Starter (SS). Self-starters do not need that second function that the boss provides, and they will be okay in the S and L phases of WSL. But non-self-starters will be lost in S and L.

    What is a Self-Starter? What is a non-self-starter? Which kind of person are you?

  3. NHS says:

    Voluntary Brainwashing, learned helplessness and the perception of sloth

    Defining sloth as a sickness of the belief system. Yet the victim … (that is an odd way to look at it) … anyway, the victim bears some responsibility. That is, to the extent that he sees that his beliefs are false or disingenuous, he is responsible for putting in the effort to attempt to reverse them.

  4. NHS says:

    Which Truths Do You Focus On?

    There are many different truths. Some are true in the short term and some are true in the long term. Example: is the earth flat or round? From the short-distance perspective it is provably flat; from the long distance, round. People of the lie. Focus requires exertion. Pleasant truths require less exertion to focus on than do unpleasant truths. Perhaps "pleasant" is the wrong word? Near-term-payoff truths, rather than pleasant ones, require less exertion to focus on.

  5. Leader says:

    Mutually Exclusive Core Beliefs – What are the top-level belief categories? It is difficult to divide that up. Proposal: there are religious, political, scientific, gender-roles, and sociological beliefs. So there are five. And each one could have several sub-questions. And each of those sub-questions could have several answers. So there would be wide variations if we were to compare people.

    And generally folks get these views and do not change them throughout their lives. Teacher4 could not name a single belief that he had changed in his life.

    Intellectual arrogance and historical repetition, regarding the self-perception that "I live in the 'modern era,' in which ancient superstitions and dogmas have finally been spotted and corrected," even though the last few stragglers of them were only spotted and corrected a few years ago.

  6. Leader says:

    What Is Confirmation Bias? The Mountain of Truth. I was lucky enough to have been born there. I agree 95% with my parents, but that is only a coincidence.

  7. PAL says:

    How Wrong Is The Average Person?

    I'm working on an article about Mutually Exclusive Fundamental Beliefs (MEFB). My assumptions are that there are several categories in which our beliefs are clustered, including: Religion, Science, Politics, Gender-Roles, Social-Norms and Economics. For example, some people believe in God and some are atheists. Some people believe that the speed of light is an unbreakable barrier and some people do not. In politics, there is a wide spectrum of different viewpoints, each of which generally considers each of the *other* viewpoints to be wrong. I consider these to be examples of MEFB.

    I am interested in finding a way of measuring "how much" people (on average) disagree about fundamental beliefs. Since I view any two MEFB as logically incompatible with objective truth, I conclude that, in a case where two different people each hold a different MEFB, at least one of them *must* be wrong. Ultimately, I would like to be able to extrapolate this simple "at least one person must be wrong" example to the full range of MEFB (in Religion, Science, Politics, Economics, etc.) in order to know how wrong an *average* person must logically be.

    The reason that I want to know how wrong the average person must be about their MEFBs is that I am thinking that, by seeing the mathematical truth of the degree of *average* error, folks will be more willing to critically re-examine their fundamental beliefs.

    I became interested in this question because I believe that laziness can be propelled by false economic beliefs. For example, a person who believes that America is the so-called "Land of Opportunity" will naturally be compelled to seek out employment. Alternatively, if a person believes that America has little or no opportunity, then they will naturally be disinclined either to look for work or to study to improve their career skills.

    It seems to me as though the answer to the question "How wrong must the average person logically be, due to the incompatibility of MEFBs?" must have occurred to many other people in the past. So I did some searching around on the internet to see if there was a neat answer already available to this question, but I did not see a clear one. I wonder if any of my friends might be able to guide me in this area?

    • PAL says:

      Generally, everyone believes that he is at (or at least near) the peak of the Mountain of Truth.

    • Leader says:

      It should be a tree rather than a matrix, since many sub-questions will be irrelevant if you disagree at a higher branch.

    • Leader says:

      The safest case is to have no beliefs. In that case you can be wrong about nothing. If you have only one belief, then you can be wrong about either only one belief, or about 100% of your beliefs (a matter of perspective?). As your mutually exclusive belief tree grows, then, to the extent that your beliefs overlap other such belief trees, your belief tree puts you into a group of folks who are typically wrong. I don't want to say "on average"; rather, the idea is that we are just examining the beliefs of a person who has been randomly drawn from the group of which you are a member. Always recall that you may very well be 100% right about all of your beliefs.

    • Synthesis says:

      Mutually Exclusive States of Universal Truth (MESUT) – most people have beliefs that have a positive-negative form. For example, I believe that the Earth is 6000 years old (positive), and I believe that people who think it is millions of years old are wrong (negative). So this belief forms a MESUT against all other beliefs as to the age of the earth.

      • Synthesis says:

        This pos-neg distinction is important because there are so many people who perceive this discussion at the level of "no one is wrong; it's just a matter of a difference of opinion." So if indeed no one ever has an incorrect belief, then the final benefit of being able to question your beliefs and replace unhealthy beliefs with healthy ones will be short-circuited.

      • Synthesis says:

        You need to acknowledge that that alternative view exists and welcome it in a friendly, cordial and respectful manner. Unfortunately, people who hold that view will not be able to benefit from the grand conclusion of MESUT, but that is okay. At least if you are respectful to them, then you will still be on speaking terms.

      • Synthesis says:

        Mutually Exclusive Claims of Universal Truth – not states.

  8. PAL says:

    Computing the APE Number

    Let's start out by only studying a comparison of Mutually Exclusive Claims of Universal Truth (MECUT) in one area, such as Religious beliefs. And only take into account those people who use a POS-NEG form for their belief (i.e., "My belief re this question is correct and all other beliefs are wrong"). That needs to be up front. Of course, that would leave out all of the many shades of grey in human belief trees, but at least it would have the nice effect of providing a simplified first-swipe attempt to determine the Average Percentage of Error in MECUT. The APE Number. Hmmm … the irony …
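
    A minimal sketch of that first swipe (the respondents and camps below are invented placeholders, not survey data):

        from collections import Counter

        # First-swipe APE estimate for one belief category (e.g., Religious).
        # Each respondent reports a camp and whether the belief takes the
        # POS-NEG form ("my camp is right and all others are wrong").
        respondents = [
            {"camp": "A", "pos_neg": True},
            {"camp": "A", "pos_neg": True},
            {"camp": "B", "pos_neg": True},
            {"camp": "B", "pos_neg": False},  # excluded: no POS-NEG claim
            {"camp": "C", "pos_neg": True},
        ]

        # Keep only the POS-NEG believers, as proposed above.
        camps = Counter(r["camp"] for r in respondents if r["pos_neg"])
        total = sum(camps.values())

        # At most one camp can be right, so the error rate is at least the
        # share of everyone outside the largest camp.
        ape = 1.0 - max(camps.values()) / total
        print(f"APE (lower bound) for this category: {ape:.0%}")  # 50%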

    Should all beliefs weigh the same for purposes of comparison?

    One important challenge will be the question of "weighting" beliefs. That is, are some beliefs more fundamental than others? For example, take two people who agree that Book #1 is the only divine book (a doctrinal point which they each consider to be of very high importance), but who disagree about something that they each consider to be a doctrinal point of lesser importance. So, if you just consider those two beliefs (agreed re the divine-ness of Book #1, but disagreed re the minor doctrinal point), would it be fair to say that the two disagree 50% of the time?

    How to define a difference when there are so many missing nodes?

    Another challenge has to do with comparing people who have missing nodes in their belief trees. For example, person A considers Book #1 to be the only holy book (as a MECUT). Also, person B considers Book #2 to be the only holy book (again, as a MECUT). So, it is easy to compare this top-level node re "list all holy books." But suppose person A has (for any reason) never developed any secondary beliefs re the specific doctrines contained in Book #1. Meantime, person B *has* developed an intricate belief tree re the specific doctrines in Book #2. Now, how can we compare the beliefs of person A and person B, since there are so many missing or non-corresponding nodes between the two belief trees?

    How to compare beliefs which have different intensity / certain-ness

    When Person A holds a belief of MECUT #1 re question #1, but he is less than certain of his belief, then how can his belief be compared with that of Person B, who holds a belief of MECUT #2 re question #1 with 100% certainty? Also, there is a difference between the intensity with which we hold a belief and the certainty with which we hold a belief. The degree of intensity answers the question: to what extent does the belief in question motivate me in my thoughts and actions? The difference between intensity and certain-ness reminds me of the question of top-level importance versus a minor doctrinal point of lesser importance.
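
    Here is one toy way to confront all three challenges at once (weighting, missing nodes, and certainty). Everything in it is an assumed convention for illustration: each belief carries an invented weight and certainty, only nodes present in both trees are scored, and disagreement is discounted by both holders' certainty.

        # Toy comparison of two belief trees, flattened into dicts.
        # Each entry: question -> (answer, weight, certainty). Values invented.
        person_a = {
            "holy_book":   ("Book #1", 0.8, 1.0),
            "minor_point": ("yes",     0.2, 0.6),
        }
        person_b = {
            "holy_book":   ("Book #1", 0.7, 1.0),
            "minor_point": ("no",      0.3, 1.0),
            "extra_point": ("yes",     0.1, 0.5),  # missing from A: not scored
        }

        def disagreement(a: dict, b: dict) -> float:
            """Weighted, certainty-discounted disagreement over shared nodes."""
            shared = a.keys() & b.keys()
            if not shared:
                return 0.0  # no comparable nodes; could also report "unknown"
            score = total = 0.0
            for q in shared:
                ans_a, w_a, c_a = a[q]
                ans_b, w_b, c_b = b[q]
                w = (w_a + w_b) / 2         # average the two parties' weightings
                total += w
                if ans_a != ans_b:
                    score += w * c_a * c_b  # discount by both certainties
            return score / total

        print(f"Disagreement: {disagreement(person_a, person_b):.0%}")  # 15%

    Under the naive counting described above, these two people would disagree 50% of the time; once weight and certainty are factored in, the figure drops to 15%.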

    Stating the question versus estimating the answer

    Arriving at the APE Number would probably require a very expensive survey and analysis. However, an estimate of the APE Number is free, easy, and instructive. That is what the article should really focus on: getting the reader to think about the implications, plus an outline for how the actual study might, in the future, be carried out.

    • Synthesis says:

      There are three kinds of claims of truth: 1) regional (might just apply to the person making the claim – as in an opinion), 2) universal but not necessarily exclusive of other truths and 3) MECUT.

      First you need to define a belief tree, then the three kinds of beliefs, then define the subset of beliefs from belief trees that are going to be compared.
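
      A minimal sketch of those definitions (the names and structure here are illustrative assumptions, not a settled design):

          from dataclasses import dataclass, field
          from enum import Enum

          class ClaimKind(Enum):
              REGIONAL = 1   # applies only to the claimant, like an opinion
              UNIVERSAL = 2  # universal, but not exclusive of other truths
              MECUT = 3      # mutually exclusive claim of universal truth

          @dataclass
          class BeliefNode:
              question: str
              answer: str
              kind: ClaimKind
              weight: float = 1.0  # relative importance within the tree
              children: list["BeliefNode"] = field(default_factory=list)

          def comparable_subset(root: BeliefNode) -> list[BeliefNode]:
              """Collect only the MECUT nodes: the subset compared across people."""
              nodes = [root] if root.kind is ClaimKind.MECUT else []
              for child in root.children:
                  nodes.extend(comparable_subset(child))
              return nodes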

      Btw, Arizona’s claim that no one is ever wrong can be rewritten as: Everyone’s claim of truth is true except for those people who make MECUT (who are all wrong in the sense of the negative sides of their claims) – with the exception of this MECUT that I am making right now.

    • Synthesis says:

      People argue for one (or a combination) of these three reasons: 1) incompatible MECUT, 2) a misunderstanding or 3) they just like to argue (either due to pathology or a recreational inclination.)

    • PAL says:

      Are ALL Beliefs Actually MECUT?

      There are people who hold the belief that everyone’s belief about the name of God is correct. Such people sometimes consider their position to be more enlightened than the retrograde belief that God has only one specific name. But this idea is actually just another MECUT. Here is a thought experiment to prove it:

      Imagine that we have 5 people, each of whom has a belief about the name of God. Persons 1-3 each hold the view that they alone know the name of God, and that those who disagree with them are wrong. So they each have the Positive/Negative belief couple that defines what is usually thought of as a MECUT. The fourth person's belief is that persons 1-3's Positive beliefs are wrong. (In this case, #4 believes that the name of God is unknowable.) The fifth person holds the "enlightened" view that all names of God are correct. So he considers himself to be more peaceful and tolerant, in that he does not claim that other people are wrong. But the enlightened-ness is premature, since it overlooks the Negative beliefs of persons 1-4 (which it contradicts) and the Positive belief of person 5 himself. A chart of the Positive and Negative beliefs of each of the five would illustrate this.

      See MECUT versus Pantheism.

    • PAL says:

      Levels of Belief and Disagreement

      The AI belief model could have an Enforcement-Imperative Variable (EIV) for each belief. The EIV answers the question: How important is it to me that I should push back against people who disagree with this particular belief? So, for example, the values of the EIV might range as follows:

      Possible Values of the Enforcement-Imperative Variable:

      1 = It is completely unimportant to me that the disagree-er disagrees.
      2 = I get aggravated by the disagree-er, but I can easily ignore them.
      3 = I speak up and tell the disagree-er that he is wrong.
      4 = I raise my voice when I confront the disagree-er.
      5 = I threaten to attack the disagree-er.
      6 = I attack the disagree-er.
      7 = I threaten to kill the disagree-er.
      8 = I kill the disagree-er.

      So far as peaceful human interactions are concerned, we really only need to concern ourselves with EIV 4-8. And MECUT could take this into account in the final production of the APE number. This is because, for the vast, vast, vast majority of our beliefs, we agree with one another and/or consider any disagreement as benign. For example, we all agree (in our belief) that the laws of physics are essentially constants throughout the universe. And the laws of mathematics.
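
      As a sketch (the enumeration and the sample beliefs are invented for illustration), the scale maps naturally onto an ordered enumeration, which makes the "only EIV 4-8 matter" filter a one-liner:

          from enum import IntEnum

          class EIV(IntEnum):
              UNIMPORTANT = 1      # disagreement is completely unimportant to me
              AGGRAVATED = 2       # aggravated, but I can easily ignore it
              SPEAK_UP = 3         # I tell the disagree-er that he is wrong
              RAISE_VOICE = 4      # I raise my voice in confrontation
              THREATEN_ATTACK = 5
              ATTACK = 6
              THREATEN_KILL = 7
              KILL = 8

          # Hypothetical beliefs tagged with an EIV; only EIV >= 4 would feed
          # into an APE computation concerned with peaceful coexistence.
          beliefs = [
              ("the laws of physics are constant", EIV.UNIMPORTANT),
              ("the name of God", EIV.RAISE_VOICE),
              ("who rightfully owns this land", EIV.THREATEN_ATTACK),
          ]
          contentious = [(b, v.name) for b, v in beliefs if v >= EIV.RAISE_VOICE]
          print(contentious)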

      There are only a few beliefs that we actually do tend to disagree over, and they come in a few categories. There are metaphysical questions: what are the name and the specific characteristics of God? And there are ownership questions: who should, by all "rights," own a specific property?

      So the point is that we agree about a practically infinite number of things, but the things that we disagree over are the ones over which we become violent.

  9. Doug says:

    You have touched upon a fascinating subject! When I have attempted to discuss this, especially with church people, it is generally avoided and I am shunned. This is why I strive to learn the larger truths that the Bible carries for all time. The insignificant stuff that most churches hold so dear is virtually worthless in my opinion.

    Why? It has changed over the last 2000 years and it will change in the next 2000 years.

  10. AUH says:

    A Chain Of Opposing Explanations

    Administrator update via: [link]

    Suppose that there is an argument. Side A claims that A is the truth. Side B claims that B is the truth. All of the people are divided into these two camps supporting either Side A or Side B. Suppose also that there is an ascending chain of arguments and counter-arguments each of which supports the claims (alternately) of Side A and then Side B. As you start up the chain and hear the first argument (supporting Side A), you understand the clear truth of the argument, so you naturally support Side A…

    Then, proceeding up the chain, you hear the response of Side B, and it is so clear and convincing that it causes you to switch your support to Side B. But when you hear the counter-argument from Side A, you switch back to Side A. This alternating back and forth continues for an unknown number of steps. At each stage the argument is clear and convincing, and it neatly addresses all of the points made by the lower step on the opposing side, perhaps as a paradigm shift, or perhaps just by increasing the total question-applicable information.

    Let's call the ascending steps A1, B1, A2, B2, … up to An, Bn. So, in this case the final truth favors B, but the chain could just as easily end with An, in which case the final truth would favor A.

    A "Weak Proponent" (WP) and a "Strong Proponent" (SP) of belief X are people who are, respectively, only familiar with the supporting claims of X up to levels i and j – i.e., arguments Xi and Xj, where j > i.
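
    A toy model of that dynamic (the depths and the "deepest unanswered rung wins" rule are illustrative assumptions): number the rungs so that A's arguments sit at odd positions and B's at even positions, and say that the side whose proponent can cite the deeper rung is the one left standing.

        # Toy model of the Chain of Opposing Explanations.
        # Rung positions alternate: A1=1, B1=2, A2=3, B2=4, ...

        def debate(depth_a: int, depth_b: int) -> str:
            """Which side appears to win when a proponent of A knows A-arguments
            up to level depth_a and a proponent of B knows B-arguments up to
            level depth_b? The deeper rung goes unanswered."""
            return "A" if (2 * depth_a - 1) > (2 * depth_b) else "B"

        # A weak proponent of A (level 1) meets a strong proponent of B (level 3):
        print(debate(1, 3))  # "B" -- the WP of A is stumped, as described below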

    So there is a phenomenon, which occurs fairly regularly, in which a WP of A argues with an SP of B, and the WP of A finds himself stumped by an argument that he cannot "top."

    At this point, the reasonable thing for the WP of A to do would be to say: “Hmm … maybe you are right about the truth of B. But let me do some more research and then get back to you about this question.”

    But, instead, what often happens is that the WP of A recalls (vaguely) the teaching that he had previously heard from an SP of A. The WP of A then contents himself with referring back to the thinking and explaining of said SP of A (which the WP of A cannot, himself, clearly recall or understand). And then he (the WP of A) further contents himself with the belief that said SP of A (if he were here) would prove himself to be a *stronger* proponent of A than is the currently present SP of B. And that is where things stay.

    When a child is born he is, presumably, an unbiased judge of things. But, because he is also at the zero level of the COOE, anyone will be higher up the chain than he is. So, the child naturally comes to embrace and fully believe whatever he is told.

    In practice, most people are at or relatively near the bottom of the COOE in terms of being proponents of their beliefs, so anyone who has spent even a little bit of time investigating the holes in their beliefs will be a stronger proponent of the opposing belief.

  11. AUH says:

    How about a tall and skinny chart showing the COOE in action?

    Also point out that people often reach a certain point, as they ascend the COOE (in support of their belief), at which they learn a certain "Indisputable Truth" (IT). At that point they will lose interest in any further argument with the opposition, so that even a relatively strong proponent of the opposition will be dismissed (in the mind of the weak proponent) by reference to the particular IT, which makes all further arguments pointless. Of course, if the IT is itself in error, then the person will be forever locked in by their reliance upon it.

    There is also a sense in which a person's "switch-ability" loses its power to switch sides. You will notice that this does not necessarily happen in a sporting event. In other words, suppose that you start by believing that Team A is going to win and Team B will lose. Say it's basketball, for example. Then Team B has a run of points and takes the lead. At that point betting people will tend to lean towards Team B as the likely winner. But then, if there is a run of points for Team A, the same bettors will tend to switch their expectations. And these turnovers can happen multiple times during a single game. Interestingly, this would make the game a very exciting one.

    But, when it comes to assessing the overall worthwhile-ness of policy positions, when we are hearing from a line of ever-ascending strong proponents of policy A versus policy B, we tend to form our opinions fairly early on and then stick to them. Perhaps the sporting analogy and the ascending (and opposing) proponent appeals do not occur at the same level of human perception: it is very easy to judge "who is ahead" in the sports case (by just looking at the scoreboard), but it is much more difficult to judge "who is ahead" in the COOE case.

  12. PAL says:

    There is a famous quote to the effect that the more certain we are about a particular topic, the more likely we are to be wrong. Or maybe it is that the more willing we are to fight for a principle, the less certain we actually are of the principle? It gets back to the idea that nearly perfect uncertainty can suddenly veer over into nearly perfect certainty.

    The way to rewrite this article is to simply focus on the MECUT which have been held throughout time regarding the true name of God. Although it may raise some hackles, it is, mathematically and politically speaking, the clearest example of how the Average Person Is Wrong.

  13. PAL says:

    The Gambler’s Fallacy as a Case of MECUT

    A good example of how the average person is definitely wrong occurs in the case of the population of gamblers. Many (not all, but many) gamblers report that they have a feeling that they are going to win a big prize or "hit it big." But, of course, most do not win. So we see that the population of gamblers contains a mathematically provable high density of people who hold false beliefs.

    By definition, the large number of contradictory beliefs held by the overall population of gamblers (where each one believes that he will win an unusually large prize) forms a clear example of Mutually Exclusive Claims of Universal Truth (MECUT). The Average Percentage of Error (the APE Number) is precisely computed by the businesses which provide gambling services. In fact, the success of a gambling enterprise revolves around its ability to accurately determine the APE among its clientele.
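
    As a minimal illustration (the lottery numbers below are invented, not any real game's odds), the operator's version of the APE computation is just the complement of the win probability, and the house edge follows from the expected value of a ticket:

        # Hypothetical lottery: 1,000,000 tickets at $1, one $500,000 jackpot.
        tickets_sold = 1_000_000
        jackpot = 500_000.0
        ticket_price = 1.00

        p_win = 1 / tickets_sold
        ape = 1 - p_win                   # share of "I will win big" beliefs that are false
        expected_value = p_win * jackpot  # expected payout per ticket
        house_edge = ticket_price - expected_value

        print(f"APE among jackpot-believers: {ape:.4%}")                # 99.9999%
        print(f"Expected value of a $1 ticket: ${expected_value:.2f}")  # $0.50
        print(f"House keeps, on average: ${house_edge:.2f} per ticket") # $0.50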

    One objection to this idea may be that the population of gamblers is not representative of the population at large. For example, it may perhaps be that gamblers are more likely than the average person to suffer from some type of physical or mental illness or brain injury? (I am just guessing.)

    Is There Similarity Between Religious Belief and the Gambler’s Belief?

    And this “feeling” that the gambler has that he is going to win big is (?) eerily similar to the religious feeling that people hold in the case of the MECUT of the name of God. Isn’t that interesting?

    This article (about APE) could be introduced by this old saying:

    “The fact that there are people who are willing to die (or to kill) for a certain cause has never been a good way to measure the actual merit of that cause.” Anon.

  14. Using AI as a Proxy to Derive Weighted Belief Trees in order to Estimate APE among MECUT

    Each person's mind holds a Weighted Belief Tree (WBT). And it is a tree, not a forest, because the most fundamental belief is the same for everyone. It is: "The world is real." All other beliefs hinge on this one.

    The problem that we encounter when trying to estimate the APE among MECUT is that we do not have a clear way to determine the percentage weight which belongs to an individual's specific MECUT. For example, consider the case of the belief about the name of God, or the belief that a gambler has that he is more likely to win than other lottery players. In both of these cases, if you were to ask believers what percentage of the overall weight of their belief tree hangs on that specific belief, then all they could do would be to provide a very rough and difficult-to-gauge estimate.

    AI might be able to solve this problem (of determining the percentage weight of any particular belief in a WBT) by inferring the particular weights and structure of a person's WBT, i.e., by attempting to replicate the individual's choices and behaviors based on an assumed WBT. Then, after many trials, we would (presumably) land on a numerically accurate WBT which would reflect the actual WBT of the individual.

    At that point it would just be a simple matter of extracting the weight of the MECUT in question, determining the percentage size of the group that holds the most common belief, and then subtracting that number from 100%. The result would be a numerically accurate estimate of the APE for that particular MECUT.
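
    A minimal sketch of that fit-then-extract idea (the behavioral record, the one-weight choice model, and the grid search are all invented for illustration; a real WBT fit would involve far more structure):

        # Invented behavioral record: (perceived_opportunity, took_action) pairs.
        observed = [(0.9, 1), (0.2, 0), (0.7, 1), (0.4, 0), (0.6, 1)]

        def predict(belief_weight: float, opportunity: float) -> int:
            """Toy choice model: act whenever the opportunity exceeds the doubt
            implied by the assumed weight on 'effort is worthwhile'."""
            return 1 if opportunity > (1.0 - belief_weight) else 0

        def mismatch(belief_weight: float) -> int:
            return sum(predict(belief_weight, opp) != act for opp, act in observed)

        # Crude grid search: keep whichever assumed weight best replicates the
        # individual's observed choices.
        best = min((w / 100 for w in range(101)), key=mismatch)
        print(f"Inferred weight for 'effort is worthwhile': {best:.2f}")  # 0.41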

    In order to combine all possible MECUT to estimate the overall top level APE, various mathematical approaches could be employed. But in any case, we would still end up with a consistent and profoundly meaningful number.

    The work of constructing sample WBT has probably already been done. The next step would be to find that work and then extract from it the set of all MECUT in order to produce an estimate of the APE!

    BTW, the WBT idea would be useful in the case of comparing contradictory beliefs such as "The system is set up to keep the little guy down" versus "God helps those who help themselves." One problem is that we tend to get fixated on certain non-adaptive nodes in the WBT. That principle is nicely encapsulated in the saying: "It is rare that we need to be taught anything new, but it is very common that we need to be reminded of what we already know." Refocusing.

  15. MECUT As Game Theory

    Approaching this problem at the level of "universal truth" has been a problem because it arouses unnecessary territorial animus. A better approach might be to think of it as an online game. Players make decisions based on global truths and personal truths. At least this way the model can be established in order to be able to compare the MECUT weighted belief trees (WBT), and to distinguish the MECUT WBT from the personal-truths WBT.

  16. PAL says:

    EDITOR: changed paragraphs to use even edge alignment on 5-16-2015.

    06-10-2015 = Does Everyone Exert Themselves Equally?
    06-10-2015 = What is the Dollar Value of Structure?
    06-11-2015 = Who Owns Your Paid Training?
    07-11-2015 = One Deadline And Two Surprise Offers
    What is Work?
    Sex and Work
    Financial Incentives
    Big Feedback and Little Feedback
    Can I Read a Book?
    Dealing With Discouragement
    The GHGA Display Board for Conferences
    What Does the Bible Really Say?
    The PAL Pamphlet
    I Want to do it, But I don’t Like to do it
    (start with next older)

  17. masi says:

    thank you for the reminder not to believe everything that I think.

  18. ZEG says:

    The more I read this article, the more startling the implications of what is proposed in its title become, because indeed the average person IS wrong. Two thoughts stem from that. The first question is: how is society/civilization able to still function, considering that the majority of its makeup is operating under false assumptions and beliefs? The second question is: who is the "average" person, and would they self-identify as such? We all like to think we're above average, at least in some aspects of life, but mathematically speaking not everyone can be above average. The broader implications of this ought to be observable to the average (there we go again with that word) social scientist.

    • PAL says:

      How Can Society Continue To Function?

      There are certain areas in which false beliefs are quickly exposed as false. For example, suppose that a person runs out of gas. This person is also mentally ill, so he is able to manufacture beliefs which any normal person would immediately identify as wrong. So, the person generates a belief within his mind that he can mix water and lemon juice and that the result will be an exact replacement for gasoline. So, he puts the mixture into his car. Very quickly, his false belief will be exposed. And, because of his economic failure, the water-and-lemon-juice idea will be unlikely to spread to other people.

      There are other areas in which false beliefs are very slow, or even impossible, to be exposed as false. Most religious beliefs fall into this category. Religious beliefs normally take the form: if you do act such-and-such, or believe doctrine such-and-such, or say words such-and-such, then you will be rewarded in a karma-like process. And the karma reward will occur at some possibly distant time, or perhaps it will only occur in the afterlife. But because there is no easy way to prove that a given future positive event was due to the religious observance, it becomes very easy to imagine that it (some possibly coincidental future positive event) was the result of the karma-like formula. And, of course, afterlife events are not currently provable.

      So one answer to the question: “How does society function when there are so many people who are operating under false beliefs?” is that, in many cases, the false beliefs are irrelevant to the day-to-day technical operations of society. And these technical operations (putting gas in the car, putting food on the table, etc) are only very rarely infected with false beliefs.

      Of course though, the discussion of false (and un-proveable and un-dis-proveable) beliefs typically generates a lot of animosity between people. This reminds me of an old saying to the effect that the willingness of a person to die (or kill) for a belief has never been a good way to measure the accuracy of that belief. (Interestingly, you never hear about people arguing over whether or not water + lemon juice can be used as a gasoline substitute.)

    • PAL says:

      Should Economic Brainwashing Be A Qualifying Condition For Disability?

      Let's imagine that we had an indisputable way to scientifically verify that there actually are a lot of good job opportunities available to anyone in a given region.

      But a certain person has been brainwashed into believing that there are *not* any good (or even passable) job opportunities in that same region. Furthermore, his brainwashing has been so complete that it includes an entire supporting network of (false) beliefs that effectively reinforce and strengthen one another. The result of this brainwash-induced network of supporting beliefs is that any attempt to challenge his beliefs with facts and proofs is quickly discarded (by the false belief network) as transparent lies.

      So, in keeping with this person’s false perceptions (which he believes to be true), he will naturally conclude that it does not make any sense for him to look for work.

      Now, here is the question: Could such a person (or his representative advocate) fairly claim that the brainwashed victim’s indelibly ingrained perceptions act as a kind of handicap?

      A Brainwash-Induced Network of Self-Reinforcing and Self-Proving (But Generally False) Beliefs

      BTW, this question is not actually all that far-fetched. In fact, many Democrats and Republicans would describe each other’s political perceptions as a result of a brainwash-induced network of self-reinforcing, self-proving and seemingly self-consistent (but false) beliefs.

      Mutually conflicting religious belief systems are another example of the same phenomenon. In other words, advocates of Religious Belief System A, will generally consider advocates of Religious Belief System B as being victims of a brainwash-induced network of self-reinforcing, self-proving and seemingly self-consistent (but false) beliefs.

      Does brainwashing exist? Yes, everyone would agree to that. It is the act of working to alter a person's belief system. Usually it is done for some nefarious end, but it could just as well be an unfortunate byproduct of someone's good intentions, as in the case of well-meaning people who (unknowingly) teach false religious beliefs to children.

      So what is Economic Brainwashing? It is the act of teaching false economic beliefs. But do false economic beliefs really exist, or is it just a matter of a person's perspective whether something is true or false? That is a hard question to answer. The problem is actually similar to the case of Religious Belief System A's claims of universal truth versus the (mutually exclusive) claims of universal truth which are propounded by Religious Belief System B.

      Starts getting shaky at this point:

      Legal Triangulation Using MECUT?

      Perhaps, though, we can use this Mutually Exclusive Claims of Universal Truth (MECUT) quality, applied to 3 or more MECUT positions, to prove that economic brainwashing really does exist? Interestingly, while it would probably be very difficult to legally prove that some specific person was suffering from economic brainwashing, perhaps a court would be able to conclude that at least one person in a group of MECUT-holders was indeed suffering from economic brainwashing (due to the logical constraints imposed by the MECUT property)?

      One problem that would result from assenting to the idea that Economic Brainwashing constitutes a disability would be that people would attempt to become brainwashed, just so they could collect the disability payments.

      • Anonymous says:

        Cartoon idea: Above the person's mind is a geodesic sphere representing his mutually supporting beliefs (which have been arrived at via brainwashing). Someone attempts to communicate new ideas to him, but they bounce off as transparent lies.

        Also, this idea of a brainwash-induced network of self-reinforcing, self-proving and seemingly self-consistent (but false) beliefs should be combined with the next logical step, which is the “Three Questions of Due Diligence” concept: http://gethelpgetactive.org/medicine/#comment-2141

  19. IMTIM says:

    No One is Wrong, Right?

    (Admin update at this link.)

    Some people take the view that no one is “wrong”. They might further explain themselves by saying that different people just have different views, as in the case of the Blind Men and the Elephant. This is often said in terms of religious thinking.

    But we do not look at provable, testable things in that way. In the math class, students are either right or wrong. Right? (Perhaps we should be using different words in this scenario?)

    At Least Everyone is Equally Sincere

    But is there another sense in which students who receive poor school grades are not actually wrong? Yes, there is. They are not wrong in the sense that they do sincerely desire to know the truth about math.

    So let's imagine that the math test had this question: "Do you sincerely desire to know the truth about the value of 2 + 2?" Then everyone could answer: yes. In that sense they would all be correct. And that is the same sense in which all people who have differing religious or metaphysical beliefs are equally correct.

    Of course though, if and when some specific question from the untestable metaphysical realm moves into the testable realm of the physical, then, presumably, we could retroactively and fairly identify beliefs as either true or false.

    Does Everyone Want (With Equal Sincerity) to Know the Truth?

    One little side question comes out of this. Consider the example of the math test which has the question asking students if they sincerely want to know what 2 + 2 is. Previously we said that, yes, everyone could truthfully say that they all want to know the truth. But is that accurate? Do all people want (with equal sincerity) to know the truth?

    I suspect that different people have different levels of sincerity in their desire to know the truth. A lot of it comes down to mixed motives. For example, consider the case in which a person has been taught a certain religious truth from an early age. Then science comes along with evidence that disconfirms some specific religious belief. (In this scenario, not all of their beliefs are being disconfirmed by science – just some specific belief.)

    If the religious person perceives that their overall world view is being threatened by this new science-based truth, then will they be equally desirous of knowing the truth, compared to a person who does not perceive a similar threat to their worldview? Probably not.

    Is there a Bug in the System?

    I am thinking that, in the example of the person who perceives a threat to their world view, the "bug" in their thinking system may largely have to do with the fact that they have incorrectly assumed that each of their separate beliefs is "frozen" together with the others into a single perfect crystal. Hence, if any one of their beliefs turns out to have been wrong, then the whole structure is in danger of being shattered.

    Which reminds me of the doctrine of papal infallibility. Although, in fairness to the Catholics, the Achilles' heel of doctrinal infallibility is a common thread in many different religions and other non-religious institutions.

    Get Help and Get Active

    We can easily see when other people’s minds have been infected by false beliefs. But most people find it very difficult to believe that their own minds could have been affected by false beliefs.

    Is it possible that your mind has been affected by one or more untrue perceptions?

    The premise of Get Help Get Active is that, for some people, a constellation of false beliefs about a) the worthwhile-ness of making an effort and b) the ability of the individual to self-direct results in a state of Adversarial Manager Dependence (AMD). AMD causes a person to have an incorrect intuition about the need to take action in the face of responsibility. In many cases, this disinclination to take action on self-perceived responsibility results in sadness and poverty.

    Free help is available. It is actually even better than free. We will pay you to participate in a 5 week study of your own ability to self-control and self-direct. Click here to get started. Remember, you will never pay anything and your participation is confidential. Get help and get active!

    • PAL says:

      Comics:

      * 5 Blind Men and Elephant

      * Man taking math test with 2+2 = ? He thinks “I hate math”.

      * Teacher asks students what 2 + 2 equals. They shout out different answers. The teacher says that they are all correct (2016). Versus 1950: they all say 2 + 2 = 4. What has changed?

      * The perfect crystal: it contains a mixture of truth and error, but the people gathered around it are all repeating: "The crystal is perfect."

    • Anonymous says:

      The elephant story illustrates positive and negative beliefs: the men are right in their positive beliefs that the elephant is like a rope, a wall, a tree, or a snake, but they are wrong in their negative beliefs that the other men's beliefs are wrong. The Hindu perception of an avatar.

      • Anonymous says:

        Venn Diagram: the right and wrong beliefs of the five blind men. EXCELLENT. Needed for the MECUT article. They are right in their avatar perception. They are wrong to generalize/extend that perception to exclude others. To illustrate positive and negative belief frames.

  20. Anonymous says:

    Ethnocentricity is a Mutually Exclusive Claim of Universal Truth

    Mercifully, it is not as "hot" a topic as religious belief. It is the perfect example to use when explaining the APE Number! Plus, most people don't know it as a science-y term. Weave that into the "home team advantage."
