How much do we know about Stonehenge? Less than we think. And what has Stonehenge got to do with the Ice Age? More than we might think. This blog is mostly devoted to the problems of where the Stonehenge bluestones came from, and how they got from their source areas to the monument. Now and then I will muse on related Stonehenge topics which have an Ice Age dimension...
THE BOOK
Some of the ideas discussed in this blog are published in my new book called "The Stonehenge Bluestones" -- available by post and through good bookshops everywhere. Bad bookshops might not have it....
To order, click HERE
Tuesday 26 April 2011
On Pseudoscience
I seem to spend a lot of my time dealing with material that I consider to be pseudo-scientific, so I thought it worthwhile to reproduce a chunk of the Wikipedia entry on Pseudoscience. This is a very difficult area, and I have touched on it before in this blog -- entries can be found by using the search facility for "Popper" and "Occam's Razor." Of course, one person's science is another person's pseudoscience, and in the real world the border between the two is difficult to find. It would be nice if we could all follow faithfully the scientific precepts of Karl Popper, and pursue the ideal of scientific falsification. But psychology and even sociology -- and even politics -- come into the frame, and sometimes we get so attached to our theories (for reasons of self-esteem and reputation) that we become blind to their deficiencies, and refuse to see what others consider to be blindingly obvious. And as we all know, politics and even economics can maintain the "respectability" and the "value" of hypotheses (like the bluestone / human transport theory) which have really become redundant and which are not supported by actual field evidence. In the territory covered by this blog, all hypotheses are (or should be) based on published field evidence which is verifiable and maybe falsifiable -- but the problem is that some hypotheses are based on just one or two small pieces of field evidence, and are then formulated and even published by authors who simply refuse to accept the great mass of published material that tends to disprove what they claim to be true........
PSEUDO-SCIENCE
From Wikipedia
The following are some of the indicators of the possible presence of pseudoscience.
Use of vague, exaggerated or untestable claims
• Assertion of scientific claims that are vague rather than precise, and that lack specific measurements.[29]
• Failure to make use of operational definitions (i.e. publicly accessible definitions of the variables, terms, or objects of interest so that persons other than the definer can independently measure or test them).[30] (See also: Reproducibility)
• Failure to make reasonable use of the principle of parsimony, i.e. failing to seek an explanation that requires the fewest possible additional assumptions when multiple viable explanations are possible (see: Occam's razor)[31]
• Use of obscurantist language, and use of apparently technical jargon in an effort to give claims the superficial trappings of science.
• Lack of boundary conditions: Most well-supported scientific theories possess well-articulated limitations under which the predicted phenomena do and do not apply.[32]
• Lack of effective controls, such as placebo and double-blind, in experimental design.
Over-reliance on confirmation rather than refutation
• Assertions that do not allow the logical possibility that they can be shown to be false by observation or physical experiment (see also: falsifiability)[33]
• Assertion of claims that a theory predicts something that it has not been shown to predict.[34] Scientific claims that do not confer any predictive power are considered at best "conjectures", or at worst "pseudoscience" (e.g. Ignoratio elenchi)[35]
• Assertion that claims which have not been proven false must be true, and vice versa (see: Argument from ignorance)[36]
• Over-reliance on testimonial, anecdotal evidence, or personal experience. This evidence may be useful for the context of discovery (i.e. hypothesis generation) but should not be used in the context of justification (e.g. Statistical hypothesis testing).[37]
• Presentation of data that seems to support its claims while suppressing or refusing to consider data that conflict with its claims.[38] This is an example of selection bias, a distortion of evidence or data that arises from the way that the data are collected. It is sometimes referred to as the selection effect.
• Reversed burden of proof. In science, the burden of proof rests on those making a claim, not on the critic. "Pseudoscientific" arguments may neglect this principle and demand that skeptics demonstrate beyond a reasonable doubt that a claim (e.g. an assertion regarding the efficacy of a novel therapeutic technique) is false. It is essentially impossible to prove a universal negative, so this tactic incorrectly places the burden of proof on the skeptic rather than the claimant.[39]
• Appeals to holism as opposed to reductionism: Proponents of pseudoscientific claims, especially in organic medicine, alternative medicine, naturopathy and mental health, often resort to the "mantra of holism" to explain negative findings.[40]
Lack of openness to testing by other experts
• Evasion of peer review before publicizing results (called "science by press conference").[41] Some proponents of theories that contradict accepted scientific theories avoid subjecting their ideas to peer review, sometimes on the grounds that peer review is biased towards established paradigms, and sometimes on the grounds that assertions cannot be evaluated adequately using standard scientific methods. By remaining insulated from the peer review process, these proponents forgo the opportunity of corrective feedback from informed colleagues.[42]
• Some agencies, institutions, and publications that fund scientific research require authors to share data so that others can evaluate a paper independently. Failure to provide adequate information for other researchers to reproduce the claims contributes to a lack of openness.[43]
• Appealing to the need for secrecy or proprietary knowledge when an independent review of data or methodology is requested.[43]
Absence of progress
• Failure to progress towards additional evidence of its claims.[44] Terence Hines has identified astrology as a subject that has changed very little in the past two millennia.[45] (see also: Scientific progress)
• Lack of self-correction: scientific research programmes make mistakes, but they tend to eliminate these errors over time.[46] By contrast, theories may be accused of being pseudoscientific because they have remained unaltered despite contradictory evidence. Scientists Confront Velikovsky (1976, Cornell University Press) delves into these features in some detail, as does Thomas Kuhn's The Structure of Scientific Revolutions (1962), which discusses some of the items on the list of characteristics of pseudoscience.
• The statistical significance of supporting experimental results does not improve over time; results are usually close to the cutoff for statistical significance. Normally, experimental techniques improve or the experiments are repeated, and this gives ever stronger evidence. If statistical significance does not improve, this typically shows that the experiments have just been repeated until a success occurs due to chance variations.[citation needed] (A small simulation of this effect is sketched after this excerpt.)
Personalization of issues
• Tight social groups and authoritarian personality, suppression of dissent, and groupthink can enhance the adoption of beliefs that have no rational basis. In attempting to confirm their beliefs, the group tends to identify their critics as enemies.[47]
• Assertion of claims of a conspiracy on the part of the scientific community to suppress the results.[48]
• Attacking the motives or character of anyone who questions the claims (see Ad hominem fallacy).[49]
Use of misleading language
• Creating scientific-sounding terms in order to add weight to claims and persuade non-experts to believe statements that may be false or meaningless. For example, a long-standing hoax refers to water by the rarely used formal name "dihydrogen monoxide" (DHMO) and describes it as the main constituent in most poisonous solutions to show how easily the general public can be misled.
• Using established terms in idiosyncratic ways, thereby demonstrating unfamiliarity with mainstream work in the discipline.
Psychological explanations
Pseudoscientific thinking has been explained in terms of psychology and social psychology. The human proclivity for seeking confirmation rather than refutation (confirmation bias),[55] the tendency to hold comforting beliefs, and the tendency to overgeneralize have been proposed as reasons for the common adherence to pseudoscientific thinking. According to Beyerstein (1991), humans are prone to associations based on resemblances only, and often prone to misattribution in cause-effect thinking.
Lindeman argues that social motives (i.e., "to comprehend self and the world, to have a sense of control over outcomes, to belong, to find the world benevolent and to maintain one’s self-esteem") are often "more easily" fulfilled by pseudoscience than by scientific information.[56] Furthermore, pseudoscientific explanations are generally not analyzed rationally, but instead experientially. Operating within a different set of rules compared to rational thinking, experiential thinking regards an explanation as valid if the explanation is "personally functional, satisfying and sufficient", offering a description of the world that may be more personal than can be provided by science and reducing the amount of potential work involved in understanding complex events and outcomes.
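The point in the list above about statistical significance hovering near the cutoff can be illustrated with a short simulation. What follows is a minimal sketch of my own -- the coin-flip setup, the rough two-standard-error threshold and all the names in it are my assumptions for the illustration, not part of the Wikipedia text. It simply repeats a null "experiment" until chance alone throws up a result that would pass as "significant".

import random

# A minimal sketch (my own illustration, not part of the Wikipedia text).
# It repeats a null "experiment" -- a fair coin flipped 100 times -- until
# chance alone produces a deviation large enough to look "significant",
# mimicking the pattern described above: the evidence never strengthens,
# a "hit" just turns up eventually if you keep trying.

N_FLIPS = 100       # flips per experiment
THRESHOLD = 0.10    # roughly two standard errors for 100 fair-coin flips (~ p < 0.05)

def one_experiment():
    """Run one null experiment and return the observed proportion of heads."""
    heads = sum(random.random() < 0.5 for _ in range(N_FLIPS))
    return heads / N_FLIPS

def repeat_until_significant(max_tries=10000):
    """Repeat the experiment until the result strays past the threshold by chance."""
    for attempt in range(1, max_tries + 1):
        p_hat = one_experiment()
        if abs(p_hat - 0.5) > THRESHOLD:
            return attempt, p_hat
    return max_tries, None

attempt, p_hat = repeat_until_significant()
print(f"Chance 'significance' reached on attempt {attempt}: observed proportion {p_hat}")

Run it a few times and a chance "success" typically appears within a few dozen attempts, always sitting just past the cutoff -- which is exactly the pattern the excerpt warns about when significance never improves.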
3 comments:
Your reproduction of Robin Heath & John Michell's map from "The Measure Of Albion", showing their rectangle connecting Stonehenge, Lundy Island, Caldey Island and the Preselis, reminds me how seductive pseudo-science can be. I came across this book in a public library by serendipity and couldn't put it down for quite a while. It appeals to the individual's psychological desire for a lost collective wisdom, a lost Understanding, in a similar way as astrology does to some people, including members of my wider family.
Quite agree, Tony. It's very seductive, and especially dangerous when the pseudo-science practitioner trots out loads of wonderful mathematics and astronomical calculations which cannot be instantly challenged or seriously questioned -- and so tend to leave people in awe and wonderment!
Robin markets his books quite aggressively (well, like the rest of us he has to make a living) -- but I have crossed swords with him on many occasions!
Brian,
I just want to make it clear, for the record, that I am not part of this debate you are having with Robert on 'pseudoscience'. I believe in 'falsifiability' as I believe in Occam's Razor.
Everything that I put forth is both 'falsifiable' and 'naturally simple'. You will not find in any of my explanations the supernatural or the superhuman. And I am always open to engaging with others in an earnest and honest conversation.
Kostas