“I really seriously do not understand how anyone can believe materialism.”

Barry Arrington has a post on UD, If My Eyes Are a Window, Is There Anyone Looking Out?, which contains some fairly standard arguments for dualism. What caught my eye were the comments. Here is a sample:

How do you argue against someone who does not see the obvious self-contradiction in the statements -

“I choose to believe in materialism”. “I choose not to believe in free will.”

To believe these statements takes denial of basic logic.

I think some people just like to believe and say counterintuitive things because they think it sets them apart from the average Joe. Believing that I am just a machine has got to be the MOST counterintuitive thing I could possibly believe. It’s possible that you are just a machine, I can’t be sure about that, but I’m sure I’m not and, and as you say, any theory that provides the wrong answer to this question will not get serious consideration from my (average) mind.

This subject goes to the heart of the questions I’ve been puzzling on recently and has resulted in the few comments I’ve made here at UD.
It starts with the question, “Why don’t people accept sound arguments?”
The answer appears to be that their worldview will not allow it.

It is ironic that comments like this appear just when Denyse is making post after post about how evolutionists/materialists/the liberal elite are suppressing intellectual freedom. This group cannot understand the materialist case, so they dismiss materialists as at best deluding themselves and at worst being deliberately deceptive because they want to show off. It amounts to "they are obviously wrong, so the only interesting question is why they say these things". This is not a good basis for debate. In Barry's case this has extended to banning arguments he believes to be "obviously" wrong from his debates (there are some UD regulars who do respect opposing views – most notably vj torley). What I cannot work out is how to engage with this mindset, or whether there is any point in trying to.

There are of course many responses to all the points that Barry makes. They have been covered many times, both in the blogosphere and in serious writing, and there seems little point in repeating them here. Interestingly, in one comment Billmaz raises some of these objections. He/she appears to be an IDist acting as Devil's Advocate and shows a real intellectual interest in seeing both points of view, so can hardly be dismissed as deluding him/herself. This should reduce the emotional baggage that gets in the way of any serious debate. It will be interesting to see to what extent Barry and the commenters engage with what Billmaz writes – after all, as far as they are concerned, he/she is obviously wrong.

A Response to Larry Correia on Gun Control

Amongst the furious debates on gun control in the USA following the Newtown massacre, there is one blog post by the novelist Larry Correia that is particularly popular with those who oppose gun control. At the time of writing it has had 1700 responses – many of them links from other blogs. What I can't find in all those responses (although it may be there) is a systematic critique of what he has written. That is a pity because, while I learned a lot and even sympathised with some of what he had to say, it is full of logical and factual errors.

The heart of the problem is that Correia sees the world as a fight between the good guys and the bad guys. As he explains, his career has been in guns, law enforcement, self-defence training and eventually writing adventure novels (I haven't read them, but the covers and titles suggest strongly that they are full of good guys killing bad guys). In his world we (the good guys) are under threat from bad guys with weapons, and the way to deal with this is for the good guys to kill the bad guys before they do too much harm. So for him the solution to Newtown is to arm teachers; gun-free zones are invitations to bad guys to attack defenceless victims; limitations on magazine size make it harder for the good guys to kill the bad guys; and so on.

He appears to be unable to conceive of an alternative, but alternatives do exist. Indeed, most affluent democracies are that alternative. It is the United States that is the outlier, having by some distance the highest rate of gun ownership in the world and a homicide rate that is four to five times that of comparable countries. The difference is even greater for "rampage" murders such as Newtown. For example, there has been one such event in the UK since 2000. There have been four events that clearly count as rampage murders in the USA in the last two years alone (Newtown, Aurora, Tucson and Oak Creek) and several others that might count. He wants to arm teachers. Wouldn't it be better if it were not necessary to arm teachers because the chances of a mad gunman attacking a school were so small they could be ignored? Guess what – for most comparable countries that is the case.

Correia does write about other countries but this section of his post is the most erroneous and misleading. For example, he writes of Australia:

Australia had a mass shooting and instituted a massive gun ban and confiscation …. As was pointed out to me on Facebook, they haven’t had any mass shootings since. However, they fail to realize that they didn’t really have any mass shootings before either.

Australia had many mass shootings in the 18 years prior to the gun ban (estimates vary from 13 to 16 depending on definitions) and just one possible event since (the event at Monash University where two students were killed).

Like many pro-gun US writers, he seizes on the fact that both Australia and England introduced very stringent gun control legislation in the late 1990s following massacres, and that since then violent crime has risen in both countries. But this is reading far too much into that specific legislation and those statistics. In both countries there was extensive gun control for decades before the 1990s. In England virtually no civilians have carried guns for self-defence since the war, and no sane criminal would have been concerned about the possibility that their victim was armed. Yes, violent crime increased in both countries in the late 90s and early 2000s, but this was part of a worldwide trend that had already started and has reversed in the latter part of the 2000s. This violent crime did not extend to homicide, which remained low – far, far lower than in the USA (in England the big increase was in assault – a very broad category which covers everything from a minor scuffle in a bar to a major mugging). The explanation of this pattern is debatable and complicated – but one thing that certainly did not cause it was criminals suddenly finding their victims could not defend themselves. The 1990s legislation was a reaction to specific rampage murders – important and horrific, but responsible for a negligible proportion of homicides. In the case of Australia it appears to have succeeded dramatically. In the case of the UK it is hard to tell, because such incidents were so rare in the first place.

What Correia appears not to understand is that the USA has a unique gun culture. By a gun culture I mean more than lax gun control laws or even the high availability of guns. I mean that the use of guns as defensive weapons is accepted and actively promoted. Other countries do not have advertising campaigns suggesting you need a gun to defend yourself. In England most people would be shocked if a friend declared they had a gun for self-defence, even if that gun were legal. Outside the USA people do own guns (even in England!) but they are for professional or sporting reasons. I believe this cultural difference is far more important than the difficulty of obtaining a gun legally. Even in England it is not that difficult to own a gun. The one rampage killing in England since the late 1990s legislation was done using legally obtained guns. Likewise, Norway's only rampage murder – the horrific Breivik massacre – was committed using legally obtained weapons (he even got formal training – much as Correia would like teachers to get). The reason there are far fewer rampage murders in England is that such murders are extremely hard to commit without guns, and people in England just don't think about guns as a realistic option for killing fellow citizens. Even in countries such as Switzerland, where gun ownership is famously high, guns are treated as instruments of sport, not as weapons. Although gun ownership is required for men of military age, there are strong restrictions on carrying guns. To quote a website that advises foreigners on living in Switzerland:

Strict legislation in Switzerland has made it extremely difficult to obtain a license to bear arms, and the trend is moving towards even stricter laws. For information purposes only, 400 people had a license to bear arms in the canton of Geneva in 1998. Only eight "survivors" still have authorization today. Understandable when you realize how little violent crime there is in Switzerland.

Crime is very, very low in Switzerland – but it is not because criminals are frightened of their victims defending themselves with a gun. Switzerland has extensive gun control – it is just that it applies to bearing arms, not owning them.

So, the USA is stuck in a unique culture where guns are closely linked to violence and self-defence. Other countries have far lower homicide rates and far lower rates of rampage killings (and very little debate about the need for gun control). But does that mean increased gun control would change that culture in the USA? Given the vast number of guns already in circulation, the almost religious belief in the Second Amendment, and the deep political divide it represents, maybe the best way forward is to accept that the culture cannot change and concentrate on making sure the good guys have the weapons they need?

The problem with this argument is that it turns crime prevention into an arms race. If the teachers/guards are armed then maybe they will kill the bad guys first, but the bad guys are the ones who take the initiative and can research their targets. What is to say he (it is almost always a male) will not simply increase his body armour and weapons to whatever level is necessary (remember the extensive kit used by the Aurora killer) and make sure he kills the armed guard/teacher first? In an arms race, the side that is waiting for a possible attack on one or two out of thousands of possible targets is always going to lose out to the side that can choose and survey its target and pick its time and weapons.

Laws are both the result of culture change and a contributor to it. They not only prevent people doing what society considers to be wrong; they also help to define what is wrong. Speed limits define what is an acceptable speed, even on roads where there is no possibility of being caught. People report income for taxation purposes even though there is no way they will be found out. Drink-driving and smoking legislation have changed our ideas about what is acceptable behaviour. Gun control laws can do something similar for guns. They can contribute to breaking that cultural link between guns and violence. If selling certain types of weapon is illegal then they will not be advertised and they will not be seen in gun shops. If carrying them is illegal then they will not be shown off at gun shows or other occasions. If someone has doubts about owning a powerful gun but feels peer pressure to have one, the law gives them a great excuse for not having one. Of course, laws by themselves are not sufficient: if they are not promoted and at least to some extent enforced, they will be meaningless. But if the USA really wants to do something about its high homicide rate, and particularly its rampage murders, then surely this has to be worth a try.

Continuation of discussion of Gpuccio’s challenge from TSZ

This post is a continuation of a debate that has been going on for some time on TSZ – moved here because TSZ is having technical problems.

These are my most recent comments repeated.


Gpuccio

More to the point – I think I have another example which would give you reason to refine your dFSCI process if you want to preserve 100% specificity. Before I do the work let me check the function is acceptable:

“The string identifies for each month over a period of 120 months whether the London monthly mean high temperature is above or below long-term average.”  

As I have given you the function before working out the string you can see that I am prespecifying it!


Gpuccio

With the other two functions, instead, relying only on an explicit, non contingent property, the computation of dFSI would not change in the prespecified or postspecified case. The target space and the search space remain the same in both cases.

That's false. For the other two examples, if they were post-specified, this would be something like taking the string, studying the papers it points to, and seeing what you can find that they have in common. As all papers have something in common (even if it is just a distinctive phrase somewhere in the text), the probability of success is 100%. That's why I suggest you simply amend the process to say no post-specified functions. Any function could potentially be post-specified.

But, to be complete, I would obviously ask what the period is (and in particular, if it is a future period or a past period whose values are already known), and what the long term average reference is.

I was thinking of the last 10 years – 2002 to 2012 – I could do a longer period but it would be tedious. I was going to use http://www.holiday-weather.com/london/averages/ for the averages. Although the values for 2002 to 2012 are known, I was not going to use them to generate the string. That's why I said "identify" rather than "predict". I will not even look at the actual temperatures until after I have generated the string – although I won't be able to resist checking it has worked when I have finished. The string will simply be a string of 120 bits, with 1 for above average and 0 for below average. I realise you want 500 bits, but looking up all that data would be really tedious, so I hope 120 will be sufficient to prove the case.
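For concreteness, here is a minimal sketch (in Python) of how such a string could be generated. The numbers in it are placeholders, not real data – the long-term averages and the monthly means would be copied from the source above:

```python
# A sketch only: the numbers below are illustrative placeholders, not real data.
# Real long-term averages and monthly means would be copied from the weather
# source discussed above.

long_term_avg = [7.2, 7.6, 10.3, 13.6, 17.2, 20.3,
                 22.6, 22.3, 19.3, 15.2, 10.6, 7.9]  # Jan..Dec mean high, deg C

# monthly_mean[year] is a list of 12 observed mean highs for that year;
# ten years of months gives the 120 bits.
monthly_mean = {year: [0.0] * 12 for year in range(2002, 2012)}  # fill with real data

bits = ""
for year in sorted(monthly_mean):
    for month in range(12):
        bits += "1" if monthly_mean[year][month] > long_term_avg[month] else "0"

print(len(bits))  # 120
print(bits)
```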

The Law of Conservation of Information and Bernoulli's Principle of Insufficient Reason

I have just been rereading Bernoulli's Principle of Insufficient Reason and Conservation of Information in Computer Search by William Dembski and Robert Marks. It is an important paper for the Intelligent Design movement, as Dembski and Marks make liberal use of Bernoulli's Principle of Insufficient Reason (BPoIR) in their papers on the Law of Conservation of Information (LCI), and many ID-friendly authors, including Dembski, use it extensively elsewhere. Without BPoIR many of the arguments presented for Intelligent Design would collapse. The point of Dembski and Marks' paper is to address some fundamental criticisms of BPoIR. I hope to show that they fail to do this.

For Dembski and Marks, BPoIR provides a way of determining the probability of an outcome given no prior knowledge. This is vital to the case for the LCI. The idea of the LCI is that outcomes such as genes or proteins can be seen as successful searches for limited targets in a large search space. The improbability of hitting that target without any prior information (determined using BPoIR) is the total amount of information in that outcome. They then go on to argue that the chances of finding any strategy for increasing the probability of hitting the target (i.e. reducing the information) are always so low that it is not possible to increase the overall probability of hitting the target. The more a strategy increases the probability of success, the lower the probability of finding that strategy. Or, as they put it, the information lost through introducing the strategy is always made up for by the information gained in finding the strategy, so that information is conserved, and overall the information based on using BPoIR without prior knowledge is preserved. Their proof of this also relies on BPoIR. So the LCI is deeply dependent on BPoIR.

Dembski and Marks are well aware that BPoIR has been severely criticised by philosophers and statisticians as eminent as J M Keynes. In particular, Keynes pointed out that BPoIR does not give a unique result. A well-known example is applying BPoIR to the specific volume of a given substance. If we know nothing about the specific volume, then someone could argue, using BPoIR, that all specific volumes are equally likely. But equally, someone could argue, using BPoIR, that all specific densities are equally likely. However, as one is the reciprocal of the other, these two assumptions are incompatible. This is an example based on continuous measurements, and Dembski and Marks refer to it in the paper. However, having referred to it, they do not address it. Instead they concentrate on examples of discrete measurements, where they offer a sort of response to Keynes' objections. What they attempt to prove is a rather limited point about discrete cases such as a pack of cards or a protein of a given length. It is hard to state their claim concisely – but I will give it a try.

Imagine you have a search space, such as a normal pack of cards, and a target, such as finding a card which is a spade. Then it is possible to argue by BPoIR that, because all cards are equal, the probability of finding the target with one draw is 0.25. Dembski and Marks attempt to prove that, in cases like this, if you decide to do a "some to many" mapping of this search space into another space, then you have only a 50% chance of creating a new search space where BPoIR gives a higher probability of finding a spade. A "some to many" mapping means some different way of viewing the pack of cards, such that not all of the cards need be considered and some of them may be considered more often than others. For example, you might take a handful out of the pack at random and then duplicate some of that handful a few times – and then select from what you have created.
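To make this concrete, here is a small simulation of one such mapping – my own illustration, not anything from Dembski and Marks' paper; the handful size and the number of duplications are arbitrary choices:

```python
import random

# A "some to many" mapping, roughly as described above: take a random handful
# from the pack, duplicate some cards, and treat the result as a new search
# space. The handful size and number of duplications are arbitrary choices.
SUITS = ["spades", "hearts", "diamonds", "clubs"]
DECK = [(rank, suit) for rank in range(1, 14) for suit in SUITS]

def mapped_space(deck):
    handful = random.sample(deck, random.randint(1, len(deck)))
    duplicates = [random.choice(handful) for _ in range(5)]
    return handful + duplicates

# Estimate how often the new space raises the probability of drawing a spade
# above the original 0.25.
trials = 100_000
better = 0
for _ in range(trials):
    space = mapped_space(DECK)
    p_spade = sum(1 for _, suit in space if suit == "spades") / len(space)
    if p_spade > 0.25:
        better += 1

print(better / trials)
```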

What is the significance of this? It is not totally clear. I think their point is that a "some to many" mapping is equivalent to changing the probability of the individual items in the search space from the BPoIR assumption of every outcome being equal to some other distribution of probabilities, e.g. making one amino acid more probable than another when building a protein. So they have attempted to prove that if, instead of assuming all amino acids are equal, you take another probability distribution at random, you have a 50% or less chance of happening on a probability distribution which improves your chance of meeting the target.

There are two problems with this.

1) It does not address Keynes' objection to BPoIR.

2) The proof itself depends on an unjustified use of BPoIR.

But before that it is worth commenting on the concept of no prior knowledge.

The Concept of No Prior Knowledge

Dembski and Marks' case is that BPoIR gives the probability of an outcome when we have no prior knowledge. They stress that this means no prior knowledge of any kind, and that it is "easy to take for granted things we have no right to take for granted" (they compare it to the physics concept of the nothing that preceded the big bang). However, there are deep problems associated with this concept. The act of defining a search space and a target implies prior knowledge. Consider finding a spade in a pack of cards. To apply BPoIR you need to know, at minimum, that a card can be one of four suits, that 25% of the cards have the suit of spades, and that the suit does not affect the chances of a card being selected. The last point is particularly relevant. BPoIR justifies us in claiming that the probabilities of two or more events are the same. But the events must differ in some respects (even if it is only a difference in when or where they happen) or they would be the same event. To apply BPoIR we have to know (or assume) that these differences are not relevant to the probability of the events happening. We must somehow judge that the suit of the card, the heads or tails symbol on the coin, or the choice of DNA base pair is irrelevant to the chances of that card, coin toss or base pair being selected. This is prior knowledge.

In addition, as Keynes pointed out, the more we try to dispense with assumptions and knowledge about an event, the more difficult it becomes to decide how to apply BPoIR. Another of Keynes' examples is a bag of 100 black and white balls in an unknown ratio of black to white. Do we assume that all ratios of black to white are equally likely, or do we assume that each individual ball is equally likely to be black or white? Either assumption is equally justified by BPoIR, but they are incompatible. One results in a uniform probability distribution for the number of white balls from zero to 100; the other results in a binomial distribution which greatly favours roughly equal numbers of black and white balls. To choose the correct assumption we would have to know more, for example the process by which the bag was filled.
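The incompatibility is easy to check numerically. Here is a short sketch (my own illustration) comparing the two assumptions for the probability of a roughly equal split:

```python
from math import comb

# Two incompatible applications of BPoIR to a bag of 100 black and white balls:
#  (a) every ratio of white to black is equally likely -> uniform on 0..100
#  (b) each ball is independently equally likely to be black or white -> binomial
N = 100

p_uniform = [1 / (N + 1) for _ in range(N + 1)]
p_binomial = [comb(N, k) / 2 ** N for k in range(N + 1)]

# Probability of a roughly equal split (40 to 60 white balls) under each:
print(sum(p_uniform[40:61]))   # about 0.21
print(sum(p_binomial[40:61]))  # about 0.96
```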

Now I will turn to the problems with the proof in Dembski and Marks’ paper.

The Proof does not Address Keynes' Objection to BPoIR

Even if the proof were valid (I believe I show below that it is not), it does nothing to show that the assumption of BPoIR is correct. All it shows is that if you make an assumption other than what Dembski and Marks believe follows from BPoIR, then you have a 50% or less chance of improving your chances of finding the target. The fact remains that there are many other assumptions you could make, and some of them greatly increase your chances of finding the target. There is nothing in the proof that in any way justifies assuming BPoIR or giving it any kind of privileged position.

But the problem is even deeper. Keynes' point was not that there are alternatives to using BPoIR – that's obvious. His point was that there are different, incompatible ways of applying BPoIR. For example, just as with the example of the black and white balls above, we might use BPoIR to deduce that all ratios of base pairs in a string of DNA are equally likely. Dembski and Marks do not address this at all. They point out the trap of taking things for granted, but fall foul of it themselves.

The Proof Relies on an Unjustified Use of BPoIR

The proof is found in appendix A of the paper and this is the vital line:

[equation image from Appendix A of the paper – not reproduced here]

This is the probability that a new search space created from an old one will include k members which were part of the target in the original search space. The equation holds true if the new search space is created by selecting elements from the old search space at random; for example, by picking a random number of cards at random from a pack. It uses BPoIR to justify the assumption that each unique way of picking the cards is equally likely. This can be made clearer with an example.

Suppose the original search space comprises just the four DNA bases, one of which is the target. Call them x, y, z and t. Using BPoIR, Dembski and Marks would argue that all of them are equally likely and therefore the probability of finding t with a single search is 0.25. They then consider all the possible ways you might take a subset of that search space. This comprises:

Subsets with:

  • no items
  • just one item: x, y, z and t
  • two items: xy, xz, yz, tx, ty, tz
  • three items: xyz, xyt, xzt, yzt
  • four items: xyzt

A total of 16 subsets.

Their point is that if you assume each of these subsets is equally likely (so the probability of any one of them being selected is 1/16), then 50% of them have a probability of finding t which is greater than or equal to the probability in the original search space (i.e. 0.25). To be specific, the new search spaces where the probability of finding t is greater than or equal to 0.25 are t, tx, ty, tz, xyt, xzt, yzt and xyzt. That is 8 out of 16, which is 50%.
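This count is easy to verify by enumeration – a small sketch (my own illustration):

```python
from itertools import combinations

# Enumerate all 16 subsets of {x, y, z, t} and count those in which the
# probability of drawing t is at least the original 0.25.
members = ["x", "y", "z", "t"]
subsets = [c for r in range(5) for c in combinations(members, r)]

favourable = [s for s in subsets if "t" in s and 1 / len(s) >= 0.25]
print(len(subsets), len(favourable))  # 16 subsets, 8 favourable -> exactly 50%
```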

But what is the justification for assuming each of these subsets is equally likely? It requires using BPoIR – the very principle the proof is meant to defend. And even if you grant the use of BPoIR, Keynes' concerns apply. There is more than one way to apply BPoIR, and not all of them support Dembski and Marks' proof. Suppose, for example, the subset was created by the following procedure:

  • Start with one member, selected at random, as the subset.
  • Toss a die:
    ◦ If it shows two or less, stop and use the current set as the subset.
    ◦ If it shows more than two, add another member, selected at random, to the subset.
  • Continue tossing until the die shows two or less or all four members are in the subset.

This gives a completely different probability distribution.

The probability of:

  • a single item subset (x, y, z, or t) = (1/3)/4 ≈ 0.083
  • a double item subset (xy, xz, yz, tx, ty, or tz) = (2/3)(1/3)/6 ≈ 0.037
  • a triple item subset (xyz, xyt, xzt, or yzt) = (2/3)²(1/3)/4 ≈ 0.037
  • the four item subset (xyzt) = (2/3)³ ≈ 0.296

So the combined probability of the subsets where the probability of selecting t is ≥ 0.25 (t, tx, ty, tz, xyt, xzt, yzt, xyzt) = 0.083 + 3×(0.037) + 3×(0.037) + 0.296 = 0.60 (to 2 decimal places), which is bigger than the 0.5 calculated using Dembski and Marks' assumption. In fact, using this method, the probability of getting a subset where the probability of selecting t is ≥ 0.25 can be made as close to 1 as desired by increasing the probability of adding a member. All such methods treat all four members of the set equally and are as justified under BPoIR as Dembski and Marks' assumption.
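For anyone who wants to check the arithmetic, here is a short sketch (my own illustration, using the exact fractions 1/3 and 2/3 rather than the rounded decimals above):

```python
# Exact calculation for the die-based procedure above: start with one random
# member; while the die shows more than two (probability 2/3) add another
# random member; stop on two or less (probability 1/3) or when all four are in.
p_stop, p_add = 1 / 3, 2 / 3

# Probability that the final subset has size s:
p_size = {1: p_stop,
          2: p_add * p_stop,
          3: p_add ** 2 * p_stop,
          4: p_add ** 3}  # forced stop once all four members are in

# Members are added at random, so a subset of size s contains t with
# probability s/4; and any subset containing t gives P(t) = 1/s >= 0.25.
p_favourable = sum(p * s / 4 for s, p in p_size.items())
print(f"{p_favourable:.2f}")  # 0.60
```

Raising the probability of adding a member towards 1 pushes this figure towards 1, as noted above.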

Conclusion

Dembski and Marks' paper places great stress on BPoIR being the way to calculate probabilities when there is no prior knowledge. But their proof itself includes prior knowledge. It is doubtful whether it makes sense to eliminate all prior knowledge; but if you attempt to eliminate as much prior knowledge as possible, as Keynes does, then BPoIR proves to be an illusion. It does not give a unique result, and some of the results are incompatible with their proof.

In Praise of Medicine

Three posts in one day – a record for me. This one is short but heartfelt.

Ten days ago I bashed myself in the face with a costume rail and broke a small bone in my face – the zygomatic arch. This is really a very small thing – very little pain, no blood, no bruising – just a noticeable dent. However, the only way to get your face straightened out is surgery, which I had last Saturday.

What got my attention was the sheer number of really pleasant, highly competent people working to correct this rather trivial problem which I inflicted on myself – surgeon, doctors, anaesthetists, nurses, health care workers, even the porters. I know it's their job, but everyone made me feel like they really cared and that I was in excellent hands. This happened to be the NHS, and therefore state funded, but no doubt the same would apply in most countries in the world. Sometimes I think we take health services for granted.

When is a Statement Scientific?

Some right wing religious commentators are upset about this article by biologist Greg Hampikian (to my embarrassment I got involved in one such debate and was overwhelmed by the level of feeling). What is interesting about this is the nature of the accusation. For example:

such utter, scientifically false rubbish that it leaves one gasping

An anti-ID biology professor who doesn’t even know the facts of life, let alone evolution

Junk Biology Promotes Uselessness of Men

In other words, commentators are not just disagreeing with the feminist message (I tend to agree with them) but implying that Hampikian has asserted something that is scientifically false. Furthermore, it is not just false, but obviously false. So, given Hampikian's qualifications, the only logical conclusion is that he is lying – and lying in a crude way that will easily be discovered.

When it comes down to it, there are three things that Hampikian asserted that have been called scientific errors:

  • That the father’s sperm does not merge with the mother’s egg.

I must confess this does seem wrong and I don't understand what he is getting at. The sperm is completely absorbed by the egg, which seems to be about as merged as you can get.

  • Women are both necessary and sufficient for reproduction, and men are neither.

Discussions of necessary and sufficient are always a bit tricky, as there are almost always some kind of implied conditions. Only a very special set of statements are necessarily true under all conditions – mathematical statements, statements that are true by definition, etc. Clearly, with current technology, male sperm is required for reproduction, and the only way we know to make such sperm is from a man. So women are not sufficient. I think perhaps Hampikian was referring to the fact that the sperm could have been created many months or even years before fertilisation, so it is not necessary to have a man around for conception. The other possibility is that he was referring to the fact that the technology exists to manufacture DNA and insert it into an egg – although we are a long way from manufacturing a complete set of male DNA, while we have no concept of how to manufacture an egg. It is a rash statement by Hampikian which is open to many interpretations – but this was an opinion piece, not a scientific paper.

  • Identifying “you” with the unfertilised egg in your mother’s womb.

This is the one that really got people excited.  They felt that this was clearly biologically wrong – that “you” were created when the egg was fertilised. 

It is easy to see why this is important to people with certain religious and political beliefs. They want to clearly identify the creation of an individual with fertilisation because they also believe that destroying the embryo at any point after fertilisation, i.e. early abortion, is tantamount to murder. Hampikian is identifying the individual with an even earlier stage in the process of development, but anything that blurs the link between fertilisation and the creation of an individual is contrary to some religious beliefs.

I don't want to dispute whether fertilisation is the point where an individual is created. I would only say that it is not a scientific question. If you believe in the soul and think that this is the point where the soul is linked to the embryo, then it is a theological question. If you are concerned about the rights of the embryo, it is a legal question. If you are concerned about the ethics of abortion or research on embryos, it is an ethical question. Or you may simply regard it as a matter of definition – how do you define "individual"? But none of these are scientific issues. Scientists know a lot about the chemistry and biology of reproduction and, with the possible exception of his statement about sperm not merging, everything that Hampikian wrote is consistent with what is known (as you would expect from a professor of biology).

What evidence do commentators produce to back up their assertion that this is a gross scientific error? There are quite a lot of assertions that it is obviously true and that anyone disagreeing is guilty of obfuscation – but this is not evidence. The only real evidence is extracts from biology textbooks about fertilisation, such as:

Human development begins at fertilization [with the joining of egg and sperm, which] form a single cell called a zygote. This highly specialized…cell marks the beginning of each of us as a unique individual.

and

a new, genetically distinct human organism is formed.

These need a bit of dissection. It is a scientific statement to say when a cell becomes genetically distinct and when development begins. But Hampikian does not say anything about either of these. He just writes about "you", which is not a scientific term. On the other hand, statements about what marks the beginning of us as individuals or human organisms are a matter of definition. These terms carry much more with them than science. When something becomes a human individual is very much an open debate and has many emotional, legal, religious and other connotations. Two scientists could agree on every detail of the chemistry and biology, they could make an identical film of what happens, and still differ as to whether to call the fertilised egg an individual. Indeed, some scientists have identified other (later) stages in the development process with the start of the individual, including syngamy, implantation in the uterus, and 14 days after fertilisation (when the embryo can no longer split into more than one individual). This was not the result of experiments or careful observation as to when the individual began. It was a definition based on a different interpretation of known facts, driven by concerns which are outside the science.

This whole thing needs putting in perspective. It is an opinion piece. The whole idea is to present interesting, novel and possibly controversial ways of looking at things. Hampikian is simply using the non-scientific word "you", with all its emotional connotations, to give such a view of your origins. Had he been trying to make a different point he might have written about how you used to be trillions of disparate atoms scattered about the earth which came together in an organised way. Some sociologists might talk about how you were the result of sociological forces that defined your role in society. Some psychologists might talk about the factors in your early childhood that made you into an individual human being. This is an op-ed piece, not a science textbook. Of course Hampikian knows the basics of human reproduction, as does every kid who has done school science. To suggest that he was trying to mislead the public on this is absurd.

Blogging and Jürgen Habermas


I am at the point of finishing an MSc dissertation on measuring the quality of political debate on the Web. As a result, I got to know about the philosopher Jürgen Habermas and in particular his concept of communicative rationality. It is a big concept, but an important part of it is identifying what is required for rational, constructive debate. Political theorists such as Dahlberg, Steenberg and Fishkin have built on these principles as part of the Deliberative Democracy movement and come up with more concrete criteria for "good" political debate (unfortunately all slightly different). As I studied them, I thought it would be interesting to see how they worked as criteria for blogging and on-line debate in general.

Here are Steenberg’s criteria which are as good as any:

I. Justification

Assertions are backed up with justifications, i.e. give evidence or at least an argument for what you assert – seems obvious but is often omitted.

II. Common Good

Arguments are for the common good and not for the benefit of particular citizens.  I guess most people at least pretend they are doing this.

III. Respect

Discussion is on the basis of respect for participants and their arguments. This is the one most commonly breached on the internet. In fact, I would say it is more often breached than not.

IV. Constructive politics

Discussion is constructive and attempts to find a mutually acceptable solution. I am not sure about this one – much internet debate is about examining ideas, and there is no requirement to come up with a solution at all.

V. Participation

All citizens affected by the deliberation are involved (presence) and have equal ability to express their views (voice). This is more to do with the initial set-up of the debate, but certainly things like moderation practice can effectively limit participation. Subtler things, like the sheer quantity or size of comments from one or two participants, can also drown out contrary opinions.

VI. Authenticity

Participants do not attempt to deceive each other. Clearly important – deception takes place all the time in internet debate, from a simple sock puppet to false credentials or straightforward lies.

Personally, I am going to make a bigger effort to stick to these principles. I really think they capture the foundation of rational debate and make blogging worthwhile. Which is why I have put them up here for public viewing. 🙂


