Wednesday, 5 November 2008

WRONG: Chuck Yeager was the first man to break the sound barrier

In 1947, the 24-year-old test pilot Chalmers “Slick” Goodlin didn’t become the first pilot to break the sound barrier. Though slated to be the first in the field, he – not unreasonably – wanted $150,000 in hazard pay from the Bell Aircraft Corporation if he was going to travel at 800 mph in a plane that had every chance of falling apart in mid-air. The US government took over the project instead and gave the job to Chuck Yeager, who, at regular US Air Force pay, came a lot cheaper than Goodlin.

There is evidence, however, to show that another pilot, George Welch, broke the sound barrier a couple of weeks before Yeager. Flying an F-86 at the same air base as Yeager’s experimental X-1, Welch was under strict instructions not to break the sound barrier in his aircraft before the X-1. He dived from 35,000 feet, however, and ground crews reported hearing a sonic boom. As it was in a gravity-assisted dive, rather than on the level, Yeager is in the record books and Welch is forgotten.

In 1964, NASA and the Federal Aviation Agency decided to further their knowledge of sonic booms by conducting a series of tests in Oklahoma called – honestly – Operation Bongo II. Specifically, they generated eight sonic booms a day, beginning at 7am, for a period of six months. Five years and 15,000 complaints later, the government lost a class action lawsuit brought by the people of Oklahoma City.

If you’re a dominatrix or a cowboy, or are close friends with one, you’ve probably experienced a sonic boom for yourself, albeit at a small scale – the crack of a whip is caused by the tip moving at supersonic speed.

Monday, 27 October 2008

WRONG: Red, blue and yellow are the primary colours

At school, you were probably taught in art classes that the primary colours – the ones you use to mix all other colours – were red, blue and yellow. Sorry to break this to you, but your teacher wasn’t telling the truth.

Primary colours, the building-block hues, aren’t a property of light itself but a consequence of the way our eyes work. Cells in the retina respond to red, green and blue; accordingly, everything we can see is made up of combinations of red, green and blue light. (Some animals have a fourth type of cell that gives them a fourth primary colour, probably close to the ultraviolet range. Bees can’t see red, but they can see “bee purple”, which isn’t purple at all, but a combination of yellow and ultraviolet. All this raises the question of why bees don’t attend more raves.)

TVs work on this principle. Shine a red light and a green light together, for example, and you’ll see yellow where they overlap. Add blue and you’ll get white where all three combine.
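If you fancy playing with the idea yourself, additive mixing can be sketched in a few lines of Python. This is just a toy illustration of the principle – nothing to do with how real screens are driven – treating each colour as a red-green-blue triple between 0 and 1:

```python
# Toy additive mixing: lights add channel by channel,
# capped at full intensity (1.0).
def add_light(*lights):
    return tuple(min(1.0, sum(light[i] for light in lights)) for i in range(3))

RED   = (1.0, 0.0, 0.0)
GREEN = (0.0, 1.0, 0.0)
BLUE  = (0.0, 0.0, 1.0)

print(add_light(RED, GREEN))        # (1.0, 1.0, 0.0) -- yellow
print(add_light(RED, GREEN, BLUE))  # (1.0, 1.0, 1.0) -- white
```

Red plus green really does come out as yellow, and all three together make white, just as on the screen.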

So much for light. Now back to art lessons, or the “subtractive” system of colour mixing. In the subtractive system, you don’t add light to make new colours, but start with white then filter it. White light bouncing off red paint appears red because all other wavelengths have been absorbed by the paint. That’s why black surfaces get hot in the sun: they absorb much of the light, then release some of that energy as heat.

Mix two paints together and the effect will always be darker than its constituents, because even fewer wavelengths will make it through the double filter. The three primary colours of the subtractive system (the “primary pigments”) are cyan, magenta and yellow (ie they are colours that can’t be made by mixing other colours, and together make up all other colours). Add cyan to magenta and you get blue. Magenta and yellow make red, and yellow and cyan make green. Most “full-colour” printing is composed of four translucent layers – C, M, Y and K (black). The K layer is only necessary because so much of the average page is type and because no one’s yet synthesised perfect primary inks.
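The same toy treatment works for the subtractive system (again, only a sketch – real inks behave far less tidily): each pigment acts as a filter on white light, so in red-green-blue terms the surviving light is the channel-by-channel product.

```python
# Toy subtractive mixing: start with white light and let each
# pigment filter it, so reflectances multiply channel by channel.
def mix_pigments(*pigments):
    light = [1.0, 1.0, 1.0]  # white light
    for p in pigments:
        light = [a * b for a, b in zip(light, p)]
    return tuple(light)

CYAN    = (0.0, 1.0, 1.0)  # absorbs red
MAGENTA = (1.0, 0.0, 1.0)  # absorbs green
YELLOW  = (1.0, 1.0, 0.0)  # absorbs blue

print(mix_pigments(CYAN, MAGENTA))   # (0.0, 0.0, 1.0) -- blue
print(mix_pigments(MAGENTA, YELLOW)) # (1.0, 0.0, 0.0) -- red
print(mix_pigments(YELLOW, CYAN))    # (0.0, 1.0, 0.0) -- green
```

Mix all three and you get (0.0, 0.0, 0.0) – black, in theory. In practice imperfect inks make a muddy brown, which is another reason printers keep that separate K layer.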

The only reason Miss Johnson gave you red and blue with your yellow paints in art class was that they’re a lot cheaper to produce than cyan and magenta, and while it seems obvious that red and yellow should make orange, it’s less immediately apparent that magenta and yellow will make red.

Wednesday, 22 October 2008

WRONG: Judges have to wear wigs and gowns

Court dress, like all aspects of the law, is governed by ancient and complicated traditions. In the case of wigs and gowns, it dates back to the 17th century, when every gentleman worth the name owned enough wigs and face-paint to shame a transvestite. The make-up, based on toxic lead oxide, actually caused hair loss, making wigs as much a necessity as a fashion statement. A wealthy nobleman wouldn’t have been seen dead without his Full-Bottomed periwig or a dildo (don’t get excited, a “dildo” was the name for a curly, detachable pigtail).

The rules for what judges should wear were codified officially in the Judges’ Rules of 1635.

According to Thomas Woodcock in his exhaustive history Legal Habits, the fashion for wigs was dying out in other professions by the 19th century, but the Full-Bottomed and Tie wigs remained in use by judges and barristers respectively (in England and Wales – the uniform and general legal tradition are slightly different in Scotland). The law moves very slowly, and is, by nature, conservative; one 19th-century judge, according to Woodcock, refused to recognise his own son at the bar because he was wearing a new style of wig.

Small alterations to the dress code have been introduced by daring judges over the years since the Rules were published, but more substantial changes only took place on the creation of new courts, such as when the Courts of Chancery, the Admiralty, Probate and Matrimonial Causes were combined to form the High Court in 1873-5. Even then, the changes tended to affect only the colour and cut of the gown.

Today, judges are not, in fact, obliged to wear the wig and gown. In hot weather, or when children are on trial or acting as witnesses (and are likely to be intimidated), court dress may be dispensed with at the judge’s discretion.

The Lord Chancellor commissioned a report on the possibility of radically changing court dress in 2003, but its findings were not acted upon. It seems the gravitas conveyed by the historical costume (embodying as it does centuries of tradition) isn’t considered worth sacrificing simply for modernisation’s sake. All societies make a distinction between “sacred” and “profane” contexts, with costume historically acting as a key indication of the difference between the two. If you doubt its significance, ask yourself this: does the Pope wear a funny hat?

Thursday, 16 October 2008

WRONG: Germany started World War One

Germany didn’t start the war, and despite what Baldrick would have you believe, neither did it begin when a bloke called Archie Duke shot an ostrich because he was hungry.

The early part of the 20th century was a good time for factions in Europe: communism, anarcho-syndicalism and a festival of less-popular -isms were battling for the support of the masses while, at the other end of the social scale, the competing imperial monarchies flexed their impressive moustaches in Africa and Asia. It was a family affair – Kaiser Wilhelm II of Germany and King George V of the United Kingdom were Queen Victoria’s grandchildren, while Czar Nicholas II of Russia had married one of her granddaughters, as had five other crowned heads of Europe (Sweden, Spain, Romania, Greece and Norway).

An unfortunate side effect of the struggle was an arms race, as no one nation could afford to get left behind. Germany and France were already at loggerheads over who owned the Alsace-Lorraine borderlands and Turkey and Russia were grumbling at each other across the Balkans.

As it was, the spark for this powder keg was provided by one of the century’s clumsiest assassination attempts. A gang of Serbian nationalists, probably associated with a terrorist group called the Black Hand, resented Austria-Hungary’s influence on their side of the border and hatched a plan to blow up Archduke Franz Ferdinand (uniquely in history, the inspiration for both a global war and a well-tailored indie band) during a state visit to Sarajevo. The first conspirator chickened out as the motorcade passed by. The second threw a bomb, which missed. In the ensuing panic, the other five failed to do anything at all.

That would have been that, except that Franz Ferdinand decided later to visit the blast casualties in hospital, and his driver, unaware of a change in plans, happened to take a wrong turn. As he backed up to return to the official route, he passed the seventh member of the conspiracy, Gavrilo Princip, who had gone to get his lunch. Presumably unable to believe his luck, the 19-year-old Princip shot Franz Ferdinand and his wife at point-blank range. Too young to be executed, he was imprisoned for life but died of TB in 1918.

To cut a long story short, Austria-Hungary bullishly used the assassination as an excuse to invade Serbia. Russia unexpectedly declared war against Austria-Hungary in support of Serbia. Germany had already agreed to side with Austria-Hungary, so declared war on Russia. France came out in support of Russia, so Germany declared war on them too.

Here’s where it gets really stupid. To get to France, Germany had to go through Belgium. The Belgians refused permission, but Germany ploughed through anyway, and that’s where Britain – to Germany’s surprise – got involved, thanks to a 75-year-old treaty designed to protect Belgium’s independence. The Turks (the Ottoman Empire) joined the Germans, the Italians joined the Allies in 1915 and Bulgaria joined the Austro-Hungarians. Even America stuck its oar in towards the end. Some 15 million deaths later, the Germans (and the Austro-Hungarians, etc) lost.

Monday, 6 October 2008

WRONG: Free speech is protected in the UK

Even in America, where the First Amendment is so well known it’s practically a celebrity, freedom of speech is limited. You may not directly incite the overthrow of the government, for example, though talking in broad terms about how a hypothetical revolution might, sort of, be a good thing, is okay.

In Britain, Article 10 of the European Convention on Human Rights (enshrined in domestic law as the Human Rights Act of 1998) protects “the right to freedom of expression”. However, the less-well-quoted second paragraph adds a series of get-out clauses regarding national security, public health and safety, crime, defamation, confidentiality and judicial impartiality. In other words, free speech is not free. Speech costs, and right here’s where you start paying.

Defamation is perhaps the harshest restriction. Those accused of it in the UK are, in a sense, guilty till proven innocent. The burden of proof is on the defendant. If you’re accused of defamation – and it’s not in question that you said what they say you said – you have to prove that what you said was not defamatory. There are four grounds for this:

1. What you said was true. Eg, Jeffrey Archer is a liar. (He was convicted of perjury, therefore he is uncontestably a liar.)

2. It was “fair comment”, or, in other words, merely an opinion. But to be fair comment, a jury must agree that it is responsible, constructive and informed. They must also determine that it was not made with malice. Eg, Jeffrey Archer is not a suitable candidate for the priesthood.

3. It was in the interest of the public (as opposed to being of interest to the public) and not motivated by malice. Traditionally the case for public interest has been hard to make, but would typically involve a cabinet minister doing something unethical.

4. It was said by a peer or MP in the Houses of Parliament in the course of parliamentary proceedings, or under oath by a witness in a court of law. Additionally, a third party reporting neutrally from either of those places may repeat the claims without being guilty of defamation. Neutral reporting is not necessarily a defence in other circumstances. Saying “According to Fred Bloggs, Jeffrey Archer is a bag of shit” is just as defamatory as Fred Bloggs’ original statement. So is “Allegedly, Jeffrey Archer is a bag of shit”. The only reason newspapers get away with playing the “allegedly” card is that it’s not usually worth the plaintiff’s time and effort pursuing in court everyone who repeats an allegation after the first instance.

Unless, of course, the original statement was made by someone with no funds, but repeated by a newspaper or magazine with deep pockets and a weak case.

Another exception to the freedom of speech is the legislation concerning incitement to racial and religious hatred – in the words of the Act, you may not “stir up hatred” against anyone on racial or religious grounds. But how do you define a religion or a race? And what, precisely, constitutes “stirring up”? Is it illegal to say, “I think everyone should hate Aum Shinrikyo members”, bearing in mind that fair comment is not a defence for incitement to religious hatred? (Aum Shinrikyo is the Japanese cult behind the sarin gas attacks on the Tokyo underground.)

What about, “I applaud anyone willing to slap a Norwegian”?

There are many more examples of “unfree” speech:

• Perjury: you cannot lie under oath in court. Well, you can, but you’re leaving yourself open to arrest. (Jeffrey.)
• Similarly, you can’t give a judge backchat or talk out of turn, or you’ll risk being in contempt of court.
• Judges can restrict court reporting. Some cases in Family Court or involving national security are heard in camera (ie privately), and the press may not report the names of rape or blackmail victims. The press may also be subject to a temporary ban on reporting certain proceedings, such as committals, at which a magistrate decides whether there is enough evidence for a jury trial.
• You can’t plagiarise someone or otherwise infringe their copyright, though you may quote someone else’s material in a published work within legally prescribed limits.
• Technically, you can’t blaspheme against the god of the Church Of England, though recently courts have tended to leave it to Him to punish transgressors.
• You can’t make a protest within a kilometre of Parliament without a permit.
• You can’t lie about your products in advertising material.
• Breach of confidence is illegal – you may not, for example, tell anyone about your time working as Jade Goody’s personal nutritionist if your contract with her expressly forbade telling tales.

As a postscript, please don’t slap Norwegians, they’re a lovely bunch of people. Except the black metal band Mayhem. Before they even released their first album, the singer, Dead, had lived up to his name by committing suicide and the bassist Count Grishnackh had killed the guitarist Euronymous by stabbing him 24 times. The replacement guitarist, Blasphemer, later fired the replacement singer Attila (his real name, somewhat improbably) by kicking him down the stairs and smashing his head twice into a wall. The second replacement singer, Maniac, accidentally fractured a fan’s skull when he threw a severed sheep’s skull into the audience. Drummer Hellhammer, meanwhile, is the reliable band member: when not on tour he works as a night watchman in a mental hospital. Snaps to Chris Campion of The Observer for digging up the details of this story. So to speak.

Monday, 29 September 2008

WRONG: Alexander Fleming discovered penicillin

Bacteria are creatures consisting of a single cell, usually about a thousandth of a millimetre wide. They reproduce themselves in the body, or indeed anywhere warm, moist and nutritious, though there are some astonishingly resilient germs out there (not to mention in there). Deinococcus radiodurans (“terrible berry that resists radiation”), for example, can survive a dose of radiation 500 times stronger than that which would kill a human. Geothermobacterium ferrireducens prefers boiling temperatures, while Leifsonia aurea likes nothing more than a sub-zero chill.

In 1928, Alexander Fleming famously noticed small areas of inhibited growth in a dish of staphylococcus, and closer inspection revealed the active agent to be penicillium mould, similar to the one that makes an encounter with gorgonzola cheese so memorable. After three years, however, Fleming gave up on his penicillium studies, believing the organism couldn’t exist long enough in the human body to be effective. Three years later he tried again, but still had difficulty persuading other researchers to help modify the mould for human use.

It wasn’t till 1942 that a patient was successfully treated with penicillin. Oxford University’s Howard Florey, Ernst Chain and Norman Heatley were the first to turn the mould into a medicine, though during the war they had to travel to Peoria, Illinois, to find a lab that could produce it on a workable scale (their seed mould came from a mouldy melon on the local street market).

All this, however, overlooks the work of three other scientists.

A Costa Rican toxicologist rejoicing in the name of Clodomiro Picado Twight reported his discovery of penicillium’s anti-bacterial properties to the Paris Academy Of Sciences over a series of experiments dating from 1915, long before Fleming. His work was largely ignored, and his original manuscripts were only discovered and published in 2000.

Earlier than Picado, Ernest Duchesne, a French doctor, noticed that Arab stable boys encouraged the growth of mould on saddles to help heal sores on the horses. At the age of 23, he conducted experiments that proved strains of penicillium would cure animals infected with typhoid and submitted his research to the Institut Pasteur in 1897. In what was clearly to become something of a habit among French academics, the institute ignored him completely.

Even Duchesne, however, wasn’t the first. As early as 1877 the eminent British physicist John Tyndall had noticed penicillium’s effect on bacteria. While demonstrating Pasteur’s theory of the existence of microbes, he observed in one experiment that, “The penicillium was exquisitely beautiful. In every case where the mould was thick and coherent, the bacteria died.” But being a physicist rather than a physician, Tyndall gave no further thought to his observation.

Fleming, Florey and Ernst Chain shared the Nobel prize in 1945 for their work on creating the therapeutic form of penicillin. Yet within two years, resistant strains of bacteria had already emerged. Thanks, Darwin!

Wednesday, 24 September 2008

WRONG: MI5 and MI6 exist

This isn’t a conspiracy theory: of course the intelligence services exist, just not under the names MI5 and MI6. These designations aren’t official titles – the former became the Security Service in 1931, and the latter didn’t have a name at all until 1994, because it didn’t officially exist. It was publicly acknowledged in the Intelligence Services Act of 1994, however, and took on its previously informal title, the Secret Intelligence Service.

The terms MI5 and MI6 are so popular, however, that the services even use those titles for their official (and surprisingly snazzy) websites. James Bond has a lot to answer for.

MI5 and MI6 are far from being the only MI (Military Intelligence) units in British history. By the end of World War II, there were a full 15 others: MI4 provided maps, MI7 governed propaganda, MI9 debriefed escaped PoWs and provided false documentation, while MI15 was concerned with aerial photography. These departments were later disbanded or merged, leaving just two (that we know of…) plus GCHQ, which is largely concerned with intelligence-gathering and the security of information (ie writing and breaking codes).

MI5 deals with covert domestic intelligence – in the words of the Security Service Act of 1989, “the protection of national security and, in particular, its protection against threats from espionage, terrorism and sabotage.” It is based at Thames House on Millbank in London, and is responsible to the Home Secretary. Fans of the BBC1 MI5 drama Spooks will be disappointed to learn that “Thames House” in that show is actually the Freemasons’ Hall on Great Queen Street.

MI5 files are gradually destroyed as they become obsolete, or are released into the National Archives if they are of historical interest. Recent releases include evidence that the leader of the British Union Of Fascists in the 1930s had, almost endearingly, written to Mussolini asking for a signed photo.

Another file highlights MI5’s suspicion that the black American singer Paul Robeson was a communist, while also expressing admiration for his voice after attending a concert. Swallows And Amazons author Arthur Ransome was a suspected Bolshevik, too, but then he did marry Trotsky’s secretary.

The overseas intelligence division MI6 was created in 1909 when Britain decided to establish a permanent secret service (though there have been British government spies in Europe since the time of Henry VIII at the very latest). Its first chief, Mansfield Cumming, was a naval officer who was enticed into the job by Admiral AE Bethell with the suitably enigmatic note: “My dear Mansfield Cumming […] You may perhaps like a new billet. I have something good I can offer you and if you would like to come and see me on Thursday about noon I will tell you what it is.”

Ever the workaholic, Cumming turned up for work a week early. Disappointingly for those who romanticise espionage, he was obliged to note in his diary, “Went to the office and remained all day but saw no one, nor was there anything to do.”

Cumming ran MI6 out of a succession of flats for his entire life (he worked such long hours that he preferred not so much to work from home as live at the office), but the Service has been based at London’s Vauxhall Cross since 1994. The gigantic, green-and-beige, Lego-style building is so far from being a secret headquarters that it featured in a mortar-attack sequence in the Bond movie The World Is Not Enough. A year later, life imitated art when terrorists (suspected to be the Real IRA) fired a rocket at the building from Vauxhall Bridge.

Thursday, 18 September 2008

WRONG: There are four gospels

The Bible isn’t carved in stone, so to speak – the books that make it up were decided at a series of synods and councils between the 4th and 15th centuries. The four you’ve probably heard of, Matthew, Mark, Luke and John, are the accepted “canon”, but there are many more that didn’t make the cut. The Gospel of Thomas, for example, is believed to date from the first century CE, and is a collection of Jesus’ sayings written by his (spiritual) “twin”, Thomas.

The Gospel Of Peter, also from the first century, is notable for presenting the cross that Jesus was crucified on as being able to speak. (It says “Yea”, which may not be the sermon on the mount, but is pretty good for a lump of wood.)

Other books of dubious origin from the second century or thereabouts include:

• The Gospel Of James, who claims to be Jesus’ step-brother.
• The Infancy Gospel of Thomas (nothing to do with the other Gospel of Thomas), one version of which records that a boy punched the infant Jesus, who responded by cursing him to death. When the neighbours complained, Jesus blinded them with his powers. Meek and mild, my arse.
• The Gospel of Judas, which doesn’t claim to be by Judas, but asserts that he betrayed Jesus under direct orders from him.
• The Gospel of Nicodemus, which includes a passage that purports to be Pontius Pilate’s report to the Emperor Claudius.
• The Gospel of Mary, which may refer to Mary Magdalene or the Virgin Mary.
• The Gospel of Pseudo-Matthew, an account of the Virgin Mary’s childhood.
• The Gospel of Philip, which suggests that Jesus married Mary Magdalene.

Most interesting of all is the Gospel of Eve, which is almost entirely lost. The only reason we know of it at all is thanks to the early church father Epiphanius, who quoted it and dismissed it as – get this – a heretical justification of oral sex. A tragedy for us all that it was lost.

The fact that there are only four canon gospels is itself largely the result of a second-century theologian’s insistence. Irenaeus of Lyons decreed that, “It is not possible that the Gospels can be either more or fewer in number than they are. For since there are four zones of the world in which we live, and four principal winds… it is fitting that she [the church] should have four pillars.” The four we have are merely the ones judged by the early church as most likely to be accurate accounts of Jesus’ life. See also WRONG: Jesus definitely existed.

Tuesday, 9 September 2008

WRONG: “Jedi” became officially recognised as a religion after the 2001 Census

People start religions all the time – some even manage it accidentally. During the Second World War, the natives of the Melanesian island of Tanna encountered black Westerners for the first time, and, more importantly, saw them enjoying large quantities of airlifted goods. Their only previous Western contact had been with white missionaries and planters and they came to believe, somewhat paradoxically, that if they rejected the white way of life, returning instead to their traditional ways, they too would be granted access to all the miraculous wealth of the West. To this day, the cultists believe that a god called John Frum (possibly a pidgin abbreviation of “John From America”) will one day come to make them rich. His personality appears to be a combination of the local deity Kerapenmun and recent import John The Baptist. If you think that’s odd, Tanna is also home to a cult of Prince Philip worshippers.

In the UK, however, religions don’t really have any special status outside of very specific legal contexts (employment discrimination law and religiously-motivated violence), so there’s little advantage in seeking official sanction. The government doesn’t particularly care what you believe.

390,000 Britons did indeed enter their religion as “Jedi” on the 2001 Census, believing they were mischievously forcing the government to acknowledge The Force as an official religion. But in the tradition of young Padawans since a Long Time Ago, they were reckless. All the Office for National Statistics was forced to acknowledge was that 0.7 per cent of the British population got a kick out of writing “Jedi” on an official piece of paper.

25 of the Jedi adherents were from the tiny Isles of Scilly, 28 miles off the coast of Cornwall – go, Scilly!

Wednesday, 3 September 2008

WRONG: Columbus discovered America

On October 12 1492, having spent two months on board a rancid, crowded ship crossing the Atlantic Ocean with no guarantee of a destination, the Genoese adventurer Christopher Columbus finally made landfall in America. But he wasn’t the first to arrive on the continent. He wasn’t even the first European to arrive on the continent. He didn’t, strictly speaking, arrive on the continent at all – he was on an island that the locals called Guanahani, and which he renamed San Salvador. (It’s now one of the Bahamas, though no one’s sure which one.)

Columbus didn’t actually find mainland America until August 1498, six years later, on his third voyage. He made landfall in Venezuela, describing the natives as being “very numerous, and for the most part handsome in person”. He never made it to North America, and he died convinced that he’d reached the Indies.

It’s often claimed that Columbus undertook his voyages to prove the earth was round. In fact, there wasn’t a lot of dissent on that subject among the learned, and Columbus actually came to believe that it wasn’t a sphere. “I have come to another conclusion regarding the earth,” he wrote to the King of Spain. “Namely, that it is not round as they describe, but of the form of a pear… or like a round ball, upon one part of which is a prominence like a woman’s nipple.” Sailors, eh?

While Columbus may have started the rush to exploit the New World, other explorers have a greater claim to the title The First European In America (the hunter-gatherers who crossed the Bering land-bridge and colonised America 25,000 years earlier, were, of course, Asian). John Cabot, or Giovanni Caboto to give him his proper name, was the first to map the North American coastline in 1497, placing him on the mainland a year earlier than Columbus.

Before the arrival of either of the notable explorers, however, the Vikings had settled Greenland and Newfoundland. In 1960 archeologists discovered the 1,000-year-old remains of a Norse village in L’Anse Aux Meadows in Newfoundland. Among the debris found on site were a bronze fastening-pin and a bone knitting needle, which makes you wonder a little about the Vikings’ reputation as hammer-swinging berserkers. (The Vikings didn’t have horned helmets either, by the way, that’s a myth.)

Norse sagas also recall Leif Ericsson’s arrival in Vinland (usually identified as Newfoundland) to preach Christianity. They mention lands called Markland (“wood land”) and Helluland (“stone land”) to its north.

Evidence is much sketchier for the claims made for other nations’ expeditions. Among those chancing their arms:

• Ireland – St Brendan was reported as having made a legendary journey to the “Isle Of The Blessed” in the sixth century CE.
• Polynesia – a cross-disciplinary team of anthropologists and biologists claims that their DNA studies of ancient chicken remains prove that the birds, which are native to South-East Asia, must have been introduced to South America by Polynesians no later than 1424. Their thesis is backed up by the discovery of sweet potatoes, an American vegetable, in archeological digs of pre-European Polynesian settlements.
• Australia – Similarities in skull types between Australian aborigines and prehistoric Brazilians have led some to speculate that aborigines somehow found their way to Brazil around 50,000 years ago, 25,000 years before the arrival of bands of settlers across the Bering land bridge from Asia.
• Mali – Drawing from a tradition of oral history and ancient Egyptian documents, the historian Gaoussou Diawara theorised that the Muslim emperor Abubakari II sailed to Brazil in 1312 with a fleet of 2000 small boats.
• China – Gavin Menzies, a retired British submarine commander, wrote a popular book claiming Admiral Zheng He made it to America in 1421 (he must have been practically neck-and-neck with the Polynesians if he did).
• Portugal – There is evidence to suggest skeletons found in Canada may be Portuguese, dating from 1424.

Tuesday, 26 August 2008

WRONG: Jesus definitely existed

Next time you’re in Japan, make a journey to the village of Shingo at the far north of Honshu island. You’ll find a road sign marked “Christ Grave”, leading to a tomb with a cross on it. Local legend has it that rather than die on the cross, Jesus fled through Russia to Japan and lived out his days in Shingo as a rice farmer with his wife Miyuko and his three daughters.

(The dead guy on the cross was apparently Jesus’ brother Isukiri, who sneaked up there when the Romans weren’t looking so that Jesus could escape.)

The Bible disagrees. As far as Christians are concerned, Jesus lived and died in Palestine roughly between 1 and 38 CE. (CE, the secular equivalent of AD, stands for Common Era. The dating of Jesus’ birth by those who accept he existed is pretty vague in its own right. Estimates by historians place it between 18BC and 1AD.)

The trouble is, there’s no proof outside of the Bible itself. The New Testament (the part of the Bible written after Christ) is an assemblage of books written at different times by different authors, and the authenticity of the ones claiming to be eyewitness testimony is suspect. Parts of the Gospels of Luke and Matthew, for example, contradict each other, while other parts appear to be copied from the Gospel of Mark. Mark, meanwhile, is suspiciously confused about Palestinian geography for a native.

Mark, incidentally, also mentions Jesus having sisters. Look it up – chapter six, verse three.

The earliest non-Biblical reference to Jesus was by Flavius Josephus, a Jewish historian born in 37CE. In his Antiquities Of The Jews, written about 93CE, he – improbably for an ultra-orthodox Pharisee – describes Jesus as being “the Christ”, or Messiah. Even The Catholic Encyclopedia acknowledges that, “The passage seems to suffer from repeated interpolations,” meaning it is very likely that the key sentences are a forgery, inserted into the text by 4th-century Christian translators.

Writing in 112CE, the Roman historian Tacitus described in his Annals Nero’s persecution of Christians 50 years earlier and mentioned their founder “Christus”. Errors in his description of Pilate imply, however, that his sources were Christians in Rome rather than official documents. The passage itself is open to question – the radical former clergyman Robert Taylor claimed that there was no evidence for it even existing in copies of the Annals before the 15th century. More pesky forgers.

Suetonius, author of The Twelve Caesars (120CE), referred to “disturbances at the instigation of Chrestus” among Jews in Rome, though Chrestus was a common Greek name and may simply have been a local troublemaker. Aside from these three, the dozens of Greek, Roman and Jewish historians writing at the supposed time of Christ or in the century after made no reference to him whatsoever.

As Bertrand Russell wrote, “Historically, it is quite doubtful whether Christ ever existed at all, and if he did we do not know anything about him.” Even the things we think we know are dubious. Belgian historian Franz Cumont, for example, uncovered the following details about Mithras, already an enormously popular deity among Romans and other gentiles by Jesus’ time. Mithras was worshipped as “The light of the world”, he was part of a holy trinity in a cosmology that invoked heaven and hell, and he would redeem his worshippers on the Day of Judgement. His birthday festival was on 25 December, and he took part in a last supper before he died and ascended to heaven. His worshippers underwent baptism, ritually consumed bread and wine on Sundays and celebrated a rite of rebirth in late March/early April. Quite a coincidence.

Tuesday, 19 August 2008

WRONG: You can still be executed in Britain for High Treason, Piracy and Arson in Her Majesty’s Shipyards

For some reason, this old saw is still lingering wherever pedants meet pubs.

A ridiculous number of offences have been punishable by death in Britain over the years. So many, in fact, that the statute books of the 18th century later became known as “The Bloody Code”. Alongside murder, you could be executed for treason, stealing from a shipwreck or from a rabbit warren, writing graffiti on Westminster Bridge, poaching, stealing letters, sacrilege, blacking-up your face at night (this was more a measure against robbery than minstrels), impersonating a Chelsea Pensioner (again, a measure against benefit frauds, rather than impressionists), cutting down young trees, being in the company of gypsies for a month and, remarkably, “strong evidence of malice” in 7-to-14-year-old children. According to the Lord Chief Justice, in 1801 a boy of 13 was executed for stealing a spoon.

Executions were popular public events. In 1807, 40,000 came to see the hanging of the murderers Owen Haggerty and John Holloway at the Old Bailey – so many that a sudden rush near a pie stall caused more than thirty spectators to be trampled to death. Given the circumstances, the irony may have been lost on them.

By 1861, various parliamentary reformers had managed to reduce the list of capital crimes to four: murder, high treason, arson in royal dockyards and piracy with violence. A century later, the Murder (Abolition Of The Death Penalty) Act of 1965 introduced a five-year moratorium on execution for murder that was made permanent in 1969. Outstanding death sentences were commuted, but it was too late for Gwynne Owen Evans (also known as John Walby) and Peter Allen, a pair of small-time thieves who stabbed a workmate of Evans’ to death during a robbery. They were simultaneously hanged in Manchester and Liverpool respectively, becoming the last people to be executed in Britain.

High treason ceased to be a capital crime in 1971 and the Crime And Disorder Act of 1998 put an end to the remaining two. In 1999, Jack Straw signed the 6th protocol of the European Convention On Human Rights, formally abolishing the death penalty, though the Convention does contain the proviso that any signatory state may employ it during time of war. In 2002, the penalty was abolished in the Turks and Caicos Islands in the West Indies, meaning that it is now impossible to be executed for any crime on British territory anywhere in the world, no matter whose rabbit warren you’ve been caught with your hand in.

Tuesday, 12 August 2008

WRONG: Inflation is a bad thing

Inflation, or the ongoing rise in prices that reduces the spending power of money, is the bane of every national treasury. For the uninitiated, it’s the mysterious tendency of the pound (or cruzeiro, or baht) in your pocket to buy you less and less each year. It’s measured by the Consumer Price Index, which tracks the cost of a large (and unlikely) notional shopping basket of goods. The basket for 2007 included:

•olive oil
•frozen pizza
•an electric fan
•a toothbrush
•chicken kievs
•a digital radio
•tracksuit bottoms
•wallpaper paste
•kiwi fruit
•nursing home fees
•motor oil
•a DVD recorder
•an acoustic guitar
•a hamster
•squash court hire
•fish and chips
•“corn-based snacks”
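Turning the basket into an index is simple arithmetic: price the same basket in two periods and compare the totals. Here is a minimal sketch – the three-item basket and all the prices are invented purely for illustration:

```python
# Toy CPI calculation: cost the same fixed basket of goods in two
# years and compare. Items and prices are invented for illustration.
def basket_cost(prices):
    return sum(prices.values())

prices_2006 = {"olive oil": 4.00, "frozen pizza": 2.50, "toothbrush": 1.50}
prices_2007 = {"olive oil": 4.20, "frozen pizza": 2.60, "toothbrush": 1.60}

# The index is conventionally scaled so the base year equals 100
cpi = 100 * basket_cost(prices_2007) / basket_cost(prices_2006)
inflation_pct = cpi - 100

print(f"CPI: {cpi:.1f}, inflation: {inflation_pct:.1f}%")  # CPI: 105.0, inflation: 5.0%
```

A real CPI also weights each item by its share of household spending; this sketch treats a toothbrush and a year of nursing home fees as equals, which the Office for National Statistics sensibly does not.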

The following is one very simple model of inflation (the “cost push” model, if you really want to know):

1. Everyone wants higher wages.
2. As wages go up, employers pass on the cost of the raises to consumers by putting up prices.
3. If prices are up, everyone wants higher wages.
4. Go to Stage 2.
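The loop above can be sketched in a few lines – the 3 per cent pass-through figure is invented, purely to show how the spiral compounds:

```python
# Toy cost-push spiral: wages rise, employers pass the cost on as
# higher prices, and higher prices prompt the next round of wage
# demands. The 3% annual pass-through rate is an invented figure.
wage, price_level = 100.0, 100.0
for year in range(5):
    wage *= 1.03          # stage 1: everyone wants higher wages
    price_level *= 1.03   # stage 2: employers put up prices
    # stages 3 and 4: higher prices trigger the next iteration

print(f"After 5 rounds: wages {wage:.1f}, prices {price_level:.1f}")
```

Because each round feeds the next, the rise is geometric rather than a fixed step each year – five rounds at 3 per cent lifts the level by about 15.9 per cent, not 15.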

Monetarists, who are generally followers of the economist Milton Friedman, would say inflation is the result of changes to the money supply. The more money there is in circulation, the less it is worth, as the German government of the 1920s discovered when they printed so many banknotes that people found it cheaper to burn them than buy firewood. In Hungary in July 1946, prices more than tripled every day, leading to the issue of the quite remarkable 100 million trillion (100,000,000,000,000,000,000) pengo note. It was worth about 10p, and meant the smaller denomination notes had the same value – and possibly the same use – as a sheet of toilet paper. The treasury, presumably clamping their hands over their ears and shouting “La la la la la,” printed an even larger 1 billion trillion pengo note, but it was never issued. Instead, sadly, the pengo was replaced with the forint, which doesn’t sound nearly as much like an animated penguin.

A similar case is occurring in Zimbabwe today – at the time of writing, the weekly national lotto prize stands at 1.2 quadrillion dollars.
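The speed of such collapses is easy to underestimate, because daily price rises compound. Assuming prices exactly triple each day (the real Hungarian figure was higher still), a quick check shows what a month of that does:

```python
# Daily tripling of prices, compounded over 30 days.
price = 1.0
for day in range(30):
    price *= 3

# 3**30 is roughly 2.06e14 - a two-hundred-trillion-fold rise in a month
print(f"{price:.3g}")
```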

It should be noted, however, that “money” does not only mean cash. It covers spending of all kinds.

Followers of John Maynard Keynes argue that money supply is only a small factor among the causes of inflation. (The fact that the names of the two most prominent 20th-century economists combine to form “Milton Keynes” is a coincidence – the Buckinghamshire new town’s name originated as a corruption of “Middleton” and a Norman landowning family called Cahaines.) Aggregate demand – a combination of government spending, public consumption, investment, and the export/import balance – is the key. Other schools of economic thought have different opinions, and so far there is no clear consensus on the exact causes of – or remedy for – inflation, though monetary controls have historically proved effective.

The unexpected conclusion that most economists have come to, however, is that inflation isn’t always a bad thing. As long as it is slow and wages keep pace with it, it’s a sign that the economy as a whole is growing, and it is controllable because it is predictable. Deflation – a sustained downward trend in prices – sounds great in principle, but actually reflects an unwillingness to spend, as in the Great Depression of the 1930s.

Monday, 4 August 2008

WRONG: JD Wetherspoon founded a chain of pubs

The name may conjure an image of a genial 19th-century Yorkshire brewer with muttonchop sideburns and a leather apron, but that’s just what they want you to think. Like Mr Kipling, JD Wetherspoon never existed. (Sara Lee, on the other hand, did. She was the daughter of Charles Lubin, the founder of Kitchens Of Sara Lee, a frozen baked-goods company that later became simply The Sara Lee Corporation. Sara Lee Schupf went on to become a philanthropist, supporting especially the cause of women in science.)

The chain was actually founded in 1979 by a 24-year-old law student called Tim Martin. The 6’6”, mullet-sporting Ulsterman took the name “Wetherspoon” from one of his teachers, while the “JD” – brilliantly – came from JD “Boss” Hogg in The Dukes Of Hazzard.

Martin supposedly modelled the chain itself on an essay by George Orwell about a fantasy pub called The Moon Under Water, which offered a quiet, drunkard-free atmosphere with friendly staff and no music. A reading of Orwell’s essay, however, demonstrates that something must have been lost in translation – he demanded beer in pewter or china mugs, open fires, liver-sausage sandwiches, strictly Victorian fittings and “motherly barmaids”.

Monday, 28 July 2008

WRONG: Wills must be drafted by solicitors

In order to make sure your worldly goods go to the right people you do not need to write, “I, Fred Bloggs, being of sound mind and body, do hereby declare…” on a fancy piece of parchment. Technically, a scribbled note on the back of an envelope carries the same legal force as a solicitor’s document.

It is wise, however, to get a solicitor’s help anyway to prevent the will being ambiguous should someone contest it. Also, there are points of law that may change your mind about what you leave to whom. Unmarried partners, for example, are entitled to nothing at all unless it is specifically allocated in a will.

The following are the strict legal requirements for your will to be valid:

1. It must be written down, and written voluntarily.
2. You must be over 18.
3. You must be of sound mind (though there’s little point in asserting, “I am of sound mind”, because how would you know if you weren’t?)
4. It must be signed in the presence of two credible witnesses who are not themselves beneficiaries in the will. (If they are, the will remains valid, but those witnesses forfeit anything left to them.)
5. You have to sign it in the witnesses’ presence, then they have to sign it.
6. It’s worth adding the date, too, for the sake of clarity.

Even if you fulfil all the requirements, your dependants can still successfully contest the will if they feel they have been unjustly ignored. Spouses, ex-spouses (if you/they have not remarried), children, step-children or partners who depend on you financially may all claim “reasonable financial provision” under the Inheritance Act of 1975. A court will decide whether Junior has a right to your money.

Outside of those provisos, you can do what you like. In 1926, a Toronto lawyer called Charles Vance Millar decided to leave his fortune to the Toronto woman who gave birth to the greatest number of children in the ten years after his death. Four women eventually scooped $125,000 each from “The Great Stork Derby” for having nine children each. Several women had more, but, rather coldly, were disqualified because some of the children were stillborn.

In 1930, Time magazine reported the case of a Mr TM Zink of Iowa, who made an even more eccentric bequest. He left $100,000 with the instruction that in 2005, when he expected the capital to be worth substantially more, a library should be built in his name – one where women would be banned not only from using the library, but also prohibited from working as staff and even from being represented as authors. He left his daughter $5 and his wife nothing at all. According to The People’s Almanac, the will was successfully contested.

Next to Mr Zink, William Shakespeare seems a positive charmer, having left to his wife “my second-best bed”.

Monday, 21 July 2008

WRONG: Your vote is secret

Next time you’re engaging in the democratic process, take a look at the back of your ballot paper – you’ll find it’s marked with a unique number. On its own, that number works as a measure against the counterfeiting of ballot papers (as does the “official mark”, a perforation made by the returning officer’s clerks when they hand you the paper). The number, however, corresponds to another on the counterfoil for each paper, which would be fine if that counterfoil did not also record your electoral number.

If you haven’t fallen asleep yet, this means that – in theory – your vote could very easily be traced back to you by The Man.

Don’t panic, though, because in practice it’s not legally possible for The Man to identify your vote except in extreme circumstances. The laws governing elections are very strict, and require that ballot papers are counted face up so that no one can read the number (though this would be very hard to enforce while the papers are removed from the ballot box). The papers – even the void ones – are then sealed in packets, as are the counterfoils, and returned to the Clerk Of The Crown, who stores them for a year before destroying them. Any deviation from these rules could result in six months in prison for the offending official.

Only the Speaker of the House Of Commons, the High Court or Crown Court can order the opening of the packets, and even then only if they already know that a vote has been fraudulently cast and that the result of the election may be in doubt. The vote-tracing procedure has not been employed in a Parliamentary election since 1911.

In 1998, a Home Affairs Select Committee recommended that the numbering system be abandoned, but its findings have not been adopted by the Electoral Commission.

Incidentally, since 1885 the office of Clerk Of The Crown In Chancery has been held by the Permanent Secretary to the Lord Chancellor. Another of his duties is to affix the Great Seal Of The Realm using the Queen’s unique silver matrix. Every document of state comes complete with one of these plastic (formerly wax) medallions embossed with the Queen’s mark and tied on with string. Elevations to the peerage are dark green, actions relating to the Royal family are blue, and the appointment of bishops is red.

Friday, 11 July 2008

WRONG: Mars is red

While Mars certainly isn’t green or blue, it’s no redder than, say, the Australian outback. Both are a drab, rusty brown, and Mars’ sky is a butterscotch-yellow. After powerful dust storms, iron-rich particles form a yellow-brown haze. This dust absorbs light of shorter wavelengths (the blues), and scatters the yellows and reds around the sky. You can see a similar effect in any smog-laden city, or at sunset (ie the red sky at night so popular with shepherds).

There is nevertheless controversy over what precise colour the Martian sky is. NASA was recently accused of “tweaking” the colour of their pictures to fit with people’s expectations. Their first pictures, from back in 1976, were certainly doctored to appear blue, but only because the scientists were unprepared for presenting their material to the press. “Several days after the first release,” said Imaging Team Leader Tim Mutch, “we distributed a second version, this time with the sky reddish. We smiled painfully when reporters asked us if the sky would turn green in a subsequent version.”

One point of interest is that Mars’ sunrise and sunset are actually blue. The dust scatters away most wavelengths, but the particles happen to be close in size to the wavelength of blue light, which is scattered forward along the line from the sun to the viewer, creating a blue haze around the rising or setting sun.

Even stranger, Blur's bass player Alex James is a keen astronomer and was one of the first non-academics to get involved in the Beagle Mars lander project, having asked his accountant to find a way for him to go to Mars. His accountant put him in touch with Professor Colin Pillinger, who was then trying to drum up support for his proposed project.

Thursday, 3 July 2008

WRONG: The Government runs the Stock Exchange

The first ever modern share was issued by the Dutch East India Company, a collection of spice merchants who banded together in 1602 to compete against the Spanish and Portuguese. The investors were mainly middle-class tradespeople like bakers, coopers and barber-surgeons (including, wonderfully, one Agneta Cock). Rather than receive their annual share of the company’s profits – the “dividend” – in cash, Mrs Cock and her fellow investors were paid in pepper, mace and nutmeg.

Modern shares are essentially subject to the same rules as those governing La Cock’s pioneering transaction: in return for your investment, you claim a dividend, and you can sell the share whenever you like for however much anyone is willing to pay.

The London Stock Exchange is a public limited company (in other words, you can buy shares in it, and its accounts are published). In 1698, a rowdy group of traders (plus ça change) was expelled from the Royal Exchange market, crossed the road and started dealing in Jonathan’s Coffee House, Change Alley. It was 75 years before they thought to build their own offices (even though Jonathan’s burned down in 1748), but they did, and prospered as a private company. Amazingly, it was a further 200 years (1973) before female traders were allowed to join the boys’ club. Its private investors voted in 2000 for it to become a public limited company, and if you fancy it, you can look up a stockbroker in the Yellow Pages today and buy some of their shares. Don’t let them pay you off in nutmeg.

Friday, 27 June 2008

WRONG: Germany surrendered to the Allies on May 7 1945

On 5 May 1945, a delegation of German military bigwigs flew to Brussels, had a spot of lunch, then continued by car to an old red schoolhouse in Rheims, France, that served as General Eisenhower’s headquarters. The intention was to surrender to the Allies.

Unfortunately, the officers, led by Admiral von Friedeburg, had not been given authority to sign by their superiors, so surrender was delayed while they sent a message – via British Army messengers – to the remains of the German government to get further instructions. (German High Command wanted to avoid having to surrender to the Soviets as well as the Western Allies, but Eisenhower demanded unconditional surrender.) General Alfred Jodl arrived the following day, but forced more delays until finally signing at 2.41am on 7 May. The ceasefire became effective at 11.01pm the following day.

And that would seem to be that – unless you want to be pedantic, which we do. First, there had been earlier surrenders. Himmler had proposed a conditional surrender in April, but Eisenhower replied that he wasn’t interested. Von Friedeburg had also signed a conditional surrender on behalf of the government on 4 May in front of Field-Marshal Montgomery, though it was later superseded by the unconditional one.

Then there were the Soviets to consider – Germany didn’t officially surrender to them until 9 May.

Yet all of this overlooks a point of order. While Admiral Doenitz, the German President (Hitler appointed him before his suicide, believing the Navy was the only branch of the military not to have betrayed him), had given his assent, it was only the military who had surrendered. Doenitz’s government, based in Flensburg, had not. So while the armed forces had surrendered, civilian Germany was still, in a sense, at war.

The Allies got around this by simply ignoring the Flensburg government. One term of the 7 May surrender was that “This act of military surrender,” (note the “military”) “…will be superseded by any general instrument of surrender imposed by, or on behalf of the United Nations and applicable to Germany and the German armed forces as a whole.” The Allies arrested the Flensburg government on 23 May, thereby dissolving the immediate problem, and subsequently created a new Allied Control Council, the power-sharing agreement between the Soviet Union, France, the UK and the USA. The Council’s Declaration Regarding The Defeat Of Germany And The Assumption Of Supreme Authority By Allied Powers signed on June 5 effectively eliminated the German government and cemented the surrender in law.

Thursday, 19 June 2008

WRONG: The Roman Empire fell to the Barbarian hordes

Edward Gibbon’s 18th-century History Of The Decline And Fall Of The Roman Empire is the classic work on, among other things, the history of the decline and fall of the Roman Empire. It ran to six volumes and was published over 13 years, so this blog entry may seem a little compact by comparison. The short version is that Rome wasn’t overrun by a tide of hairy Germans. Rather, if the Empire ended at all, it was handed over to a tide of relatively well-groomed Germans.

In the first century CE, the Empire stretched from Portugal to the Middle East, from England in the north to parts of Africa in the south. Gibbon’s thesis was that over the next few centuries the Romans became lazy, allowing their hired help too much of a free hand in the running of the unwieldy Empire. He singled out Christianity, and in particular the Christian afterlife, as a factor in weakening the Romans’ interest in the here and now. (He also argued that the Roman Catholic church had greatly inflated their accounts of martyrdom, for which various church critics hauled him over the coals. So to speak.) Other historians have blamed economic planning, plague and technological advances for the shift in the balance of power between Rome and its dominions.

While it’s tempting to date the end of the Empire to the fifth century CE, when the last emperor, Romulus Augustus, was deposed by the Germanic general Odoacer, it’s not as if Odoacer turned up with an army of Goths, beat down the gates of Rome and seized the throne. For one thing, Rome had long since been usurped as the capital city – by Odoacer’s time, the Empire’s business was conducted out of Ravenna.

The patrician diplomat Flavius Orestes promised a third of the Italian peninsula to the foederati, a division of the Roman army composed of assorted Huns, Goths and Scirians, if they deposed the Emperor Julius Nepos. When they succeeded, Orestes made his son Romulus emperor and reneged on the deal. Odoacer led the foederati in revolt. Then he beat down the gates with an army of Goths and seized the throne.

Even then the Goths only took the Western Empire. The Emperor Diocletian had previously divided the empire into two; Byzantium became the capital of the Eastern Empire, a body that continued in full force for another thousand years. The Western Empire, meanwhile, was resurrected in 800CE by Charlemagne and Pope Leo III as the Holy Roman Empire, and lasted until the time of Napoleon. All of which militates against a “fall” of the Roman Empire; a broader view is that it simply evolved into something else.

Byzantium, by the way, was renamed Constantinople in honour of the Emperor Constantine. After 1453 it was called Istanbul (and sometimes Konstantiniyye) by the Ottoman rulers. It has had many names during its long history: in Iceland, it’s still known by the Old Norse (ie Viking) name Mikligarðr, meaning Big City.

Friday, 13 June 2008

WRONG: Right-handers live longer than left-handers

Left-handers have not been treated well by language. The Romans considered right good and left bad, as indicated by their words for the two: dexter (right) led to dextrous, and sinister speaks for itself. In Old English, riht (cognate with the Latin rectus) denoted anything straight or true, and left was a word meaning weak or worthless. The French gauche still means both “left” and “socially clumsy”.

It was long believed that left-handers died younger, and many will still pass along the information as fact, but the death rate is actually more or less the same for both lefties and righties. The belief may have begun because many elderly left-handers were forced to become right-handed in childhood. Though the practice is more-or-less defunct, it artificially skewed the statistics and made it seem like there were more right-handers than lefties in the upper age brackets. Those viewing the statistics inferred wrongly that the lefties must have died earlier.

Having said that, it was recently demonstrated that the gene most closely associated with left-handedness also carries with it a slightly increased risk of schizophrenia. On the plus side, another study shows that left-handers are historically more likely to be high achievers, and yet another that lefty men are on average at least 15 per cent richer than right-handed men. The rule doesn’t appear to apply to women, which seems harsh.

Studies of stroke victims have opened a debate on the whole nature of left and right. In The Man Who Mistook His Wife For A Hat, Oliver Sacks reported the case of Mrs S, who had no concept of “left” at all. She ignored the food on the left half of her plate, made up only half her face and, if she lost something, turned round to the right until she found it. It wasn’t simply that she couldn’t see things on her left – she couldn’t even conceive of there being a “left” side to anything. Which begs the question of how she therefore managed to understand “right”.

Wednesday, 4 June 2008

WRONG: The internet was invented by a Brit

It’s widely believed that Sir Tim Berners-Lee invented the internet, but that’s not strictly true. He invented the World Wide Web, which lives on the internet. It’s the difference between inventing books and inventing paper.

So if Sir Tim invented the book, who invented the paper? In 1945, Vannevar Bush, Director of the US Office Of Scientific Research and Development, wrote an article for Atlantic Monthly called As We May Think, in which he proposed an electro-mechanical system called the Memex. This would allow for direct links between microfiched documents and give scholars access to “millions of fine thoughts”. (He presumably didn’t anticipate 2 Girls 1 Cup.) The article describes a hypothetical scholar, sat at his Memex researching the history of the bow and arrow in a way that sounds remarkably like Googling.

Seventeen years later, Joseph Licklider, a director at the Defense Department’s Advanced Research Projects Agency (ARPA), updated Bush’s idea in a series of memos articulating his vision of a “Galactic Network” of interlinked computers. The concept survived, but sadly for us all, the name did not.

In 1969, the first node of the first computer network – ARPANET – went live at UCLA. A small number of other agencies, including the British Post Office, joined in, creating the International Packet Switched Service in 1978. The movement continued to grow through the 80s into various educational and commercial networks, which were finally linked together using Vint Cerf and Bob Kahn’s technological advances (“TCP/IP”, or Transmission Control Protocol/Internet Protocol).

So much for the paper. Now the book: those in the know were able to transfer information directly between computers, and even e-mail each other, but the World Wide Web didn’t come into being until 1990 at CERN, the European Organisation for Nuclear Research. The Centre employed some 3000 staff, with research data stored on hundreds of separate computers and networks, much of it inaccessible to other employees. In late 1989, Tim Berners-Lee proposed an information-management system based on hypertext, an interlinking form of text that was a direct descendant of Vannevar Bush’s Memex ideas. The result, one year later, was the first browser, a prototype of applications like Microsoft’s Internet Explorer.

After toying with names like The Information Mine (which he abandoned after noting the initials) and Information Mesh, he decided to emphasise the global and decentralised nature of his project and called it WorldWideWeb. (He later renamed his program Nexus to distinguish it from the networked “large universe of documents” it enabled, ie the World Wide Web.) The first ever web address pointed to a page on CERN’s info server. Sadly the original page no longer exists, but a later copy reveals the prophetic first words to be, “The WorldWideWeb (W3) is a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents.”

At that point there were 600,000 internet users, most of them academics. Thanks in part to Berners-Lee’s astounding generosity in making the web free to all (he never made a penny from it directly, though by any normal commercial standards he would have been entitled to), within five years there were 40 million. According to Internet World Stats, at the time of writing that number is close to a billion and a half.

But wouldn’t we all have been happier surfing the Galactic Network?

For no reason at all, here are some genuine websites from the history of the internet whose creators didn’t think through their URLs:

• expertsexchange.com – wisely changed after a few years to experts-exchange.com
• therapistfinder.com – dedicated to tracking down therapists, not rapists
• whorepresents.com – for those seeking celebrities’ representation, not gifts for prostitutes
• powergenitalia.com – an Italian firm specialising in batteries that would also benefit from a hyphen: powergen-italia
• molestationnursery.com – home of the Mole Station Native Nursery, of course!

E-mail – and its use of the @ symbol – was invented in late 1971 by Ray Tomlinson of Bolt, Beranek and Newman Technologies. The first e-mail is lost to history, but Tomlinson recalls that, “Most likely the first message was QWERTYUIOP or something similar […] It is equally likely to have been ‘TESTING 1 2 3 4’. Probably the only true statements about that first email are that it was all upper case (shouted) and the content was insignificant and forgettable”.
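Tomlinson picked @ because it neatly separates the user from the machine they sit at, a convention every mail system has parsed ever since. A minimal sketch – the address itself is invented for illustration:

```python
# Split an e-mail address into user and host at the last '@',
# following Tomlinson's user@machine convention.
# The example address is invented.
def split_address(addr):
    user, _, host = addr.rpartition("@")
    return user, host

print(split_address("ray@bbn-tenexa"))  # ('ray', 'bbn-tenexa')
```

Splitting at the last @ rather than the first is the safe choice, since the host part of an address can never contain one.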

Friday, 30 May 2008

WRONG: Veins are blue because of deoxygenated blood

An amazing number of people seem to believe that deoxygenated blood is blue. In a way, it makes sense – veins (in pale skin) look blue, after all, and we were all taught in school that veins carry blood back to the heart and lungs to be refreshed with oxygen. But ask yourself this: have you ever seen blue blood? Anywhere? (Other than in a mollusc or a horseshoe crab, smartarse.)

It’s true that deoxygenated blood – blood returning from the organs and extremities – is a darker shade than “fresh” blood. The stuff you see when you cut your finger or donate blood may appear a deep, burgundy-red, compared to the bright letterbox-red familiar to surgeons and axe-murderers. (Though this is largely down to light conditions, too.)

Blood vessels near the surface are slightly translucent, as is the skin. Light from outside penetrates the skin and is reflected back out, but longer wavelengths (ie colours from the red end of the spectrum) are more likely to be blocked by the skin on the journey out again. As a result, what you see appears bluer than it otherwise would.

Optical physicists determined that the blueness of a vein depends on the depth of the blood vessel, its width and the blood content in the tissue around it. Others have speculated that the blue effect is also the product of a contrast with yellowy-pink skin, an optical illusion called the retinex effect.

But even the Queen bleeds red. The expression “blue blooded” in the sense of aristocratic is a direct translation of the Spanish term sangre azul, which referred to the nobility’s descent from “pure” Northern European stock rather than from Moorish ancestors, like the proles. Pale-skinned Northerners simply had more visible blue veins.

Having said all that, not everyone’s blood is red. In 2007 a 42-year-old Canadian man underwent surgery for tissue damage to his legs and surprised the surgeons by bleeding green. The rare condition, called sulfhaemoglobinaemia, was caused by sulphur in his migraine medication binding with his blood cells.

Friday, 23 May 2008

WRONG: King Cnut tried to hold back the tide

First, don’t let anyone get away with calling him Canute – that’s just an affectation to make his name easier to pronounce (and less dangerous to misspell). Properly called Knut, he was the son of the Danish invader Svein Forkbeard. He took control of Anglo-Saxon England in 1016, and was King not only of England but also of Denmark and parts of Norway and Sweden.

The story of him trying to hold back the tide is frequently misreported, though as the first account of it appeared in the 1120s (Historia Anglorum, Henry of Huntingdon), the entire event is open to question. Rather than ordering the sea back to prove his power, Knut, a Christian, wished to demonstrate that only God had true power, as evidenced by his own inability to turn the tide.

(King Xerxes of Persia was less humble in his dealings with the sea in the 5th century BCE. According to Herodotus, after his bridge of boats across the Hellespont was destroyed in a storm, he ordered that the sea be given 300 lashes, fettered and branded with an iron.)

After his tidal display, Knut supposedly never wore his crown again, placing it instead on a crucifix, saying, “Let all the world know that the power of kings is empty and worthless, and there is no king worthy of the name save Him by whose will heaven, earth and sea obey eternal laws.”

The other notable event (or more likely, story) from Knut’s reign was Lady Godiva’s naked ride through Coventry to protest her husband Leofric’s taxes. According to legend – and this part is almost certainly pure fiction – only one person dared watch, a tailor called Tom (the origin of Peeping Tom). He subsequently went blind, which doesn’t say much for Ms Godiva.

EDIT: Reader Ash suggests, "I would think it is quite complimentary to Ms Godiva, if you ask yourself why he went blind..."

Monday, 19 May 2008

WRONG: Cannibalism is illegal

In the spring of 1874, Alfred Packer, sometimes called Alferd, a small, bedraggled prospector with the moustache of a much bigger man, stumbled into the Los Pinos Indian Agency claiming to have survived four months in the mountains by eating rosebuds and pine sap. Strangely, he wasn’t hungry, and asked only for whiskey. (The thing about him having the moustache of a much larger man may have literally been true, given what followed.)

When an Indian scout found mysterious strips of meat along his back trail, the truth soon emerged: his group of six would-be prospectors had turned on each other as they starved and eaten the corpses in desperation. At Packer’s trial it was claimed the judge shouted, “Stand up, ya man-eatin' sonofabitch and receive your sentence! There was seven Democrats in Hinsdale County, but you… ya ate five of them!” Tragically for all historians, this outburst was actually the invention of a local joker who witnessed the trial.

Packer escaped from jail and lived on the run for nine years, but was eventually caught and sentenced to death. (The sentence was reversed by the Colorado Supreme Court and he was paroled in 1901. He died six years later.) Students at the University of Colorado continue to commemorate their state’s most popular cannibal at the Alferd Packer Grill (their genuine slogan: “Have a friend for lunch!”).

Cannibalism has been practised in many cultures and nations over the centuries, including Britain. A human femur, split lengthways with the marrow scraped out, was discovered in a cache of bones in Gloucestershire by University of Bristol archaeologists, who estimate it to date from the beginning of the Roman occupation. Maoris in New Zealand occasionally ate their enemies until the mid-19th century in an attempt to destroy their essence even beyond death, and as recently as the 1960s, the eating of one’s enemies was still commonly practised in New Guinea. Unofficially, it still is.

More recently still, several serial killers have claimed to have indulged (and when you’ve confessed to sexually-motivated murder, why would you lie about your diet?). Two of the strangest cases, though, highlighted the inadequacy of the law in regard to eating people. In 1981, Issei Sagawa, a short, apparently unappealing Japanese student of literature in Paris, shot and ate his classmate Renée Hartevelt, reputedly fulfilling a lifelong desire to eat a beautiful woman. The French judge considered him mentally incompetent to stand trial and deported him to Japan, but after 15 months in a Japanese institution, he was decreed sane, and therefore released without having been tried for murder. He lives in Japan, and is something of a celebrity.

The German computer technician Armin Meiwes became briefly famous in 2003 when he was convicted of manslaughter after eating the penis of Bernd Brandes, then stabbing him and freezing his remains for later meals. As Brandes, whom Meiwes solicited on the internet, was a willing participant in the first meal and the killing, the trial centred on the debate over whether, in German law, the death counted as murder or “killing on demand”, a legal definition designed to separate mercy killing from murder. The cannibalism itself was not at issue, because it was not illegal: the only charge prosecutors could bring regarding it was “disturbing the peace of the dead”. Meiwes was convicted of manslaughter, then retried and convicted of murder.

The law in Britain simply doesn’t extend to eating human flesh. Grievous bodily harm and murder are straightforward crimes that require no dietary sub-clauses, but what if the person being eaten makes the incision himself, consents to the eating and doesn’t die as a result of it? Crucially, no law will have been broken.

And what if the “meal” was dead already? This last point may sound frivolous and grotesque, but in Cambodia in 2002, two crematorium workers were released without charge despite eating parts of their “customers”. Technically they had committed no crime.

Thursday, 15 May 2008

WRONG: Splitting infinitives is bad

To begin with, a few definitions. If you know this bit, skip to paragraph six.

As English teachers like to say, verbs are “doing” words, and an infinitive is the form a “doing word” takes when it isn’t being done by anything in particular. It's the verb in its unused form.

In English infinitives begin with "to", eg to run, to explode, to burble. They are always, therefore, made up of two words, which makes them infinitive phrases, not infinitives. This is important, and we'll get to why in a sec.

An adverb, meanwhile, shades the meaning of a verb (eg, run quickly, explode messily, burble charmingly). They typically, but not always, end in -ly. (Often and never are a couple that don't.)

The problem arises when putting infinitive phrases and adverbs together. Splitting an infinitive means putting an adverb between to and run, explode, burble, etc. Captain Kirk’s to boldly go is the most famous example (in the galaxy). The sticklers demand either boldly to go or to go boldly.

But what’s wrong with splitting infinitives? Nothing. The sticklers are wrong. As long as the split infinitive makes your meaning clearer or your sentence read more easily, you are innocent of any grammatical crime. Writers of Middle English happily used to split their infinitives (or used to happily split them, I should say), but the practice fell out of favour during the Renaissance (though Shakespeare still felt justified in using one).

In later years it may have been that some grammar scholars, who based their prescriptive rules on those of Latin, felt that if a Latin infinitive couldn’t be split, being only one word (such as esse – to be), then neither should an English one. But as we've already seen, English doesn't have infinitives – it has infinitive phrases. So there. You can't split an infinitive, but you can split an infinitive phrase.

It is certainly the Latin-loving spoilsports we have to thank for the much-repeated (and completely wrong) belief that you can’t end a sentence with a preposition (eg with, to, from, by, over). The Right Reverend Robert Lowth, writing in 1762, declared it “an idiom which our language is strongly inclined to… but the placing of the preposition before the relative is more graceful as well as more perspicuous and agrees much better with the solemn and elevated style.” Well la-di-da, your Right Reverendship. Kingsley Amis, on the other hand, reckoned, “it is natural and harmless in English to use a preposition to end a sentence with.”

In either case, there are no rules of grammar, only traditions. Languages evolve with use, and the rules evolve with them. And the sticklers know the orifice up which their rules they can shove.

Monday, 12 May 2008

WRONG: The British Empire ended in 1947

John Dee, alchemist and adviser to Queen Elizabeth I, first mooted the idea of a British Empire in 1577, undeterred by the fact that Britain didn’t exist as a single political entity, and wouldn’t till 1707. Over the next four centuries Britain’s dominions grew to cover an area so great that it was claimed “the sun never set” on it. (“Because God doesn’t trust the British in the dark,” according to one spectator.)

In actual fact, after a close analysis of historical maps (well, a bit of scribbling on the back of an envelope and a glance at a map), I estimate that at the Empire’s height in the 1920s, the sun set on it for at least one hour 26 minutes a day. Most of it across Greenland.
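For the curious, that scribble can be reduced to a few lines of code. The model below is deliberately crude – equinox conditions, latitude ignored – in which the sun lights exactly 180° of longitude at any moment, so the whole Empire is dark only while that band fits inside the widest stretch of longitude containing no Imperial territory. The 201.5° gap is an assumed figure, reverse-engineered from my 1h 26m estimate, not a measured one.

```python
# Crude equinox model: the sun lights a 180-degree band of longitude,
# so the Empire is fully dark only while that band sits inside the
# widest territory-free gap. Gap sizes here are assumptions.

def darkness_minutes(widest_gap_deg):
    """Minutes per day with no territory in daylight."""
    return max(widest_gap_deg - 180, 0) / 360 * 24 * 60

print(darkness_minutes(201.5))  # a ~201.5-degree gap gives 86 minutes: 1 hour 26
print(darkness_minutes(150.0))  # any gap under 180 degrees never goes fully dark: 0
```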

George VI may have relinquished the title of Emperor shortly after Pakistan and India became independent in August 1947, but not having an Emperor doesn’t mean not having an Empire. There are no established rules, after all, about what constitutes an Empire and what doesn’t.

Let’s begin with what definitely isn’t the British Empire. The Commonwealth of Nations, an organisation that promotes “international understanding and world peace”, consists of 53 member nations (nearly a third of the world’s population), almost all of which were once British dominions, but which have since gained independence. No Empire there.

Jersey, Guernsey and the Isle of Man are Crown Dependencies of the United Kingdom with a strong degree of independence [see WRONG: The British Isles have only one monarch], but they’re not the British Empire either.

There are also 14 non-independent British Overseas Territories subject to British sovereignty (not that that makes them an Empire):

• Anguilla.
• Bermuda, which isn’t one island but 138 of them.
• The British Antarctic Territory, a wedge of Antarctica agreed by treaty in 1961 to be used only for peaceful, scientific purposes. In 1978 Emilio Palma became the first person known to have been born there. It must be assumed the birth was both peaceful and scientific.
• The British Indian Ocean Territory, a series of islands that were the site of a disgraceful land grab when Britain illegally ousted the entire native population of the Chagos Archipelago in 1967 in order to lease the land to the US Navy in return for a subsidy on Polaris submarines. The Chagossians remain homeless despite a High Court decision that their eviction was illegal.
• The British Virgin Islands. Not named for any supposed chastity on the part of the inhabitants, or even after the “virgin” Queen Elizabeth I (source of “Virginia”), the islands were somewhat fancifully named Santa Ursula Y Las Once Mil Virgenes by Columbus after the myth of St Ursula and the 11,000 virgins who were supposedly martyred alongside her by the Huns.
• The Cayman Islands.
• The Falkland Islands.
• Gibraltar, ceded to the British in perpetuity under the terms of the Treaty of Utrecht of 1713.
• Montserrat, a volcanic island in the Caribbean.
• The Pitcairn Islands, four South Pacific outposts originally settled by the mutineers of HMS Bounty and their Tahitian companions. Pitcairn became famous again in 2004 when seven islanders went on trial for 96 charges of rape and the indecent assault of minors. Such was the mistrust of outsiders at the time, visiting Animal Park presenter Ben Fogle was denounced as a spy and deported.
• St Helena, which also includes the “nearby” islands of Ascension and Tristan da Cunha, is an Atlantic island some 2,000 miles from the nearest landmass. St Helena is famous for being the place of Napoleon’s final exile and death, but it was also home to 6,000 PoWs during the second Boer War. Tristan da Cunha, meanwhile, is officially the most remote inhabited island on Earth. It is 1,750 miles from South Africa, 2,088 miles from South America and even 1,350 miles from St Helena. It is home to 269 citizens, who share only seven surnames and, it’s fair to assume, more than a few genes.
• South Georgia, a series of wildlife sanctuaries in the far south Atlantic staffed by scientists.
• The Sovereign Base Areas of Akrotiri and Dhekelia, two garrisons on the island of Cyprus.
• The Turks and Caicos Islands in the Caribbean.

So is there an Empire? Possibly. The one exception to the decolonisation trend of the 20th century occurred in 1955, when the Royal Navy claimed the barren North Atlantic outcrop of Rockall in the name of the Queen, mainly to prevent the Soviets grabbing it first and spying on the UK’s naval missile-testing programme. You don’t get more imperial than planting a flag in the name of the Sovereign. That soggy lump of granite, part of the County of Inverness according to the 1972 Island of Rockall Act, is pretty much the only bit of land that you could get away with calling the British Empire. Not really what John Dee had in mind, and, for the record, the sun most emphatically does set on it.

A point worth noting is that not all British land is British. Not only are embassies technically foreign soil, but other land can be voluntarily ceded by the State. During World War II, Winston Churchill temporarily designated a suite in Claridge’s Hotel as Yugoslav territory so that the queen in exile could give birth on home soil and thereby maintain her son’s claim to the throne. Among the other lands with overseas sovereignty are an acre of forest in Runnymede, Surrey, which was donated to America as a memorial to John F Kennedy, and Victor Hugo’s house in Guernsey, which belongs to the City of Paris.

Thursday, 8 May 2008

WRONG: Van Gogh only sold one work

It’s an appealing myth: the starving, passionate artist rising above the market’s total lack of interest in his work. In Van Gogh’s case, the starving and passionate bits are true (“These four days I have lived mainly on 23 cups of coffee,” he wrote in 1888, which at least explains his prodigious output), yet one of the most commonly repeated “facts” about him – that he only ever sold one piece – is not. The Red Vineyard was the only painting we know that Vincent absolutely definitely sold (to fellow artist Anna Boch), but whether he sold any others depends on your definition of “sold”, not to mention your definition of “paintings”.

You can rest easy that it does not depend on your definition of “he”.

We’ll take “sold” first. Does trading or barter count as a sale? Should the buyer have to part specifically with cash for it to count? There is strong evidence that Vincent exchanged art for paint and canvas, for example, and apparently even paid his doctor on at least one occasion with a painting.

Then there’s “paintings”. What about drawings in ink, made with a brush? Or watercolours? We know that Vincent sold the following:

• Five maps of Palestine, sold to his father for ten francs each.
• One “small drawing” (probably a watercolour) sold to Hermanus Tersteeg, his former boss at art dealers Goupil & Co, for 12 guilders.
• 12 pen studies of The Hague, commissioned by his Uncle Cornelis at a price of 30 guilders all in.

And that’s not all. There are hints of other sales:

• A portrait of a friend of the art dealer Julien Tanguy, sold for 20 francs and referred to in a letter from Vincent to his brother Theo.
• Vincent’s friend Gauguin claimed in his memoirs that he witnessed Vincent sell “a little still life, some rose shrimps on a rose paper” (possibly Still Life with Mussels and Shrimps) to a canvas dealer in Paris for five francs. Gauguin isn’t the most reliable witness, but such a sale was very likely, whether or not it was the particular painting Gauguin claims he saw. Van Gogh was a prolific artist, and given his obscurity during his lifetime, many works probably fell through the cracks of history.
• Theo wrote to London art dealers Sully & Lori on 3 Oct 1888, referring to the earlier sale of a self-portrait by “V Van Gogh”.

While we’re on the subject of obscure Van Goghery, not many Brits are aware that Vincent lived for years in England. His first visit was in 1873, when he worked at an art dealership in London for a year, lodging at 87 Hackford Road, Vauxhall (the house is still there). Typically, he fell in love with the landlady’s daughter, and equally typically, it didn’t end well. After a spell in Paris, Van Gogh returned to England in 1876, this time working first as a French, German and arithmetic teacher in Ramsgate, Kent, then in Isleworth, Middlesex as a preacher.

Sunday, 4 May 2008

WRONG: Television was invented by John Logie Baird

“In 1923 a Scotch inventor projected televised shadows on a screen…” So begins a typical guide to television from 1941. Despite what you may have heard, however, John Logie Baird didn’t invent the television. He’s certainly prominent among those who can take credit for innovations in the field – his electro-mechanical “televisor” device, which involved shining light through spinning discs, transmitted an image of a ventriloquist’s dummy called “Stooky Bill” before anyone else had any success. But this was not electronic television.

Baird was also a pioneer of colour television, but his other inventions, including pneumatic shoes and glass razorblades, failed to catch the public’s imagination.

While Vladimir Zworykin and Kalman Tihanyi also deserve credit, the most interesting of TV’s pioneers was Philo Farnsworth. He made his preliminary designs for electron tubes as a 14-year-old living on his parents’ farm in Idaho and by 1929 – at the age of 23 – he had almost single-handedly developed the first true electronic television system. He came to dislike the uses to which TV was put, but, according to his widow, watched the moon landing and declared, “This has made it all worthwhile”. He died in 1971, so his thoughts on Hollyoaks will never be known.

Friday, 25 April 2008

WRONG: You are legally obliged to pay all your debts

A peculiarity of contracts is that they can be valid – ie both parties agree to the terms – but still unenforceable. In the UK, gambling debts are the most famous example. Hard to believe though it sounds, you are not legally obliged to pay any losses incurred through gambling. This is why bookmakers laugh like hyenas and reach for the hammer at the mention of credit.

Prostitution is another instance. If a client refuses to pay for services rendered, the prostitute is out of luck, legally as well as financially, even though the transaction itself is not strictly illegal. The flip side is that if she takes her payment in advance, she can refuse to get down to business and be equally unassailable in law.

In the US, government spies’ employment contracts are officially unenforceable. In the case of Tenet v. Doe in 2005, the Supreme Court ruled that “John Doe”, a Russian diplomat who spied for the CIA, couldn’t sue to claim benefits owed him by the Agency because such a lawsuit would violate the condition of secrecy that the original contract demanded. All of which makes you wonder why they bothered with a contract in the first place.

Outside of these specific cases, however, you have to pay up, or file for bankruptcy, which means you won’t be able to sit in Parliament. Another life’s ambition thwarted.

Wednesday, 23 April 2008

WRONG: Britain hasn't been invaded since 1066

History teachers – especially patriotic ones – like to say that the battle of Hastings marked the last time Britain was invaded. As with so many things, however, it depends on your perspective.

For example, the Channel Islands were invaded during WWII. Strictly, they're not part of the United Kingdom, being Crown Dependencies, with the Queen as their sovereign. Churchill decided in 1940 that they would be of more value to the UK as a drain on German resources than as a tactical location, and so abandoned them to five years of occupation, the last of them under siege and in near-starvation conditions. Even under a force of one armed German (as opposed to a one-armed German) for every five citizens, the locals still managed a cheeky resistance. When commissioned to design postage stamps for local use, artist Edmund Blampied put two artful curlicues in plain view on the threepenny stamp that to the locals – but not the Germans – clearly spelled “GR” (for George VI).

Despite the German surrender, the occupation is still the most successful attempt to invade Britain since the Norman conquest – even though the Channel Islands aren’t in Britain – because the others have ranged from merely short-lived to farcical.

Geographically if not politically, Scotland has always been part of Great Britain and suffered its own invasion (not counting the English ones, nor the Scots’ invasions of England). In 1263 King Haakon IV of Norway sailed to Scotland with the intention of recapturing the Hebrides, but following a battle at Largs on the Firth of Clyde, the Scots deviously prolonged diplomatic talks in order to take advantage of the worsening weather, which played havoc with the invaders’ fleet. Norway renounced its claim to the Hebrides three years later.

In 1797 the French made a hilarious attempt to invade Wales. A 1400-strong force landed near Fishguard, but the largely conscripted troops lived up to the national stereotype and surrendered almost immediately when faced with the wrath of the local womenfolk. (The local menfolk may have observed the invaders with some sympathy.) One Jemima Nicholas, a 41-year-old cobbler, singlehandedly captured twelve using only a pitchfork. The traditional local dress of tall black hats and red cloaks, it would seem, led the invaders to mistake the women for Grenadiers. A huge store of looted Portuguese wine may have contributed to their confusion.

Finally, William of Orange gets an honorary mention for invading England, though with the famed good manners of the Dutch he waited till he was invited. The Protestant grandson of Charles I arrived at Brixham, Devon in 1688 with some 15,000 soldiers, and a month later the Catholic King James II had fled the country, leaving it to the Protestant William and his wife, Mary. The Dutch briefly renamed New York “New Orange” in their honour.

Monday, 21 April 2008

WRONG: There's more violent crime around the full moon

Before we get down to the statistics, let’s just skip right past werewolves. We’ll also ignore the moon’s spurious connection to menstruation. (The human menstrual cycle is 28 days. The lunar cycle is 29.5 days – there is no more connection between the two than there is between the moon and a chimp’s 37-day menstrual cycle.)

The belief in a connection between the moon and human behaviour is amazingly pervasive. Even the Sussex Constabulary believes that violent behaviour increases around the full moon. In 2007, Inspector Andy Parr announced that extra police would be deployed on full moon nights, and that, “From my experience over 19 years of being a police officer, undoubtedly on full moons we do seem to get people with, sort of, stranger behaviour – more fractious, argumentative.”

Even medical staff are susceptible: Dr David Mandell canvassed the surgical nurses in his hospital in Pittsburgh in 2005 and found that 69 percent of them believed that a full moon resulted in greater social disorder and a higher number of admissions.

So what’s the cause of this apparent misbehaviour? Some would argue that it’s the “tidal pull” of the moon – if it can make the seas rise, then surely it can have an effect on beings that are predominantly composed of water, can’t it?

No, it can’t.

Everything that has mass has gravity, and gravity’s strength falls off with the square of distance – while tidal force, which depends on the difference in pull from one side of a body to the other, falls off with the cube. As the diligent moon-studiers Ivan Kelly, James Rotton and Roger Culver noted in their thorough demolition of full-moon myths, “A mother holding her child will exert 12 million times as much tidal force on her child as the moon”. And yet we don’t put a toddler’s tantrums down to mum’s gravitational influence.
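You can check the scale of that claim on the back of an envelope: tidal pull goes as mass over distance cubed, so the ratio of two tidal pulls needs no physical constants at all. The mother’s 60 kg and the 15 cm centre-to-centre distance below are my assumptions, and they land in the same ten-million ballpark as Kelly and co’s figure.

```python
# Tidal acceleration across a body scales as M / d^3, so comparing two
# tidal pulls needs only masses and distances (G cancels in the ratio).
# The mother's mass and distance are illustrative guesses.

def tidal_factor(mass_kg, distance_m):
    """Relative tidal pull: mass divided by distance cubed."""
    return mass_kg / distance_m ** 3

moon   = tidal_factor(7.35e22, 3.84e8)  # moon's mass; mean earth-moon distance
mother = tidal_factor(60, 0.15)         # assumed 60 kg mother, ~15 cm away

print(f"mother/moon tidal ratio: {mother / moon:.1e}")
```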

Kelly, Rotton and Culver compared over 100 studies on the effect of the moon on behaviour and found that there was no statistically significant difference during any phase of the moon in the rates of homicide, traffic accidents, assaults, psychiatric admissions or dozens of other areas (including aggression among professional hockey players – I said they were thorough). As for the effect on mental health, they found that phases of the moon accounted for just 0.03 per cent of the variance in “lunatic” behaviours. Daffodils have a more statistically significant effect on mental health.

(That was a guess. I don't have any stats on daffodils.)

While there are many studies that appear to show a connection between the moon and crime, Kelly, Rotton and Culver’s meta-analysis demonstrates that there are just as many proving the opposite. Dr Cathy Owen of Sydney University, for example, undertook an exhaustive study of aggressive behaviour among inmates of five psychiatric clinics in Sydney over six months, from “low-level threats” to assault and suicide. She and her team concluded that there was “No significant relationship between level of violence and aggression and any phase of the moon.”

Dr David Lester looked at all the violent deaths in America over a two-year period, and while he noticed that suicide was more common on Mondays and that murders tended to happen more on Saturdays, Sundays and on national holidays, there was no detectable lunar influence whatsoever.

The perception of increased bad behaviour at the full moon is most likely down to two things. First, beliefs can linger and be reinforced by communities, even police forces and hospitals, simply because they are plausible, appealing and difficult to immediately disprove. If, as a rookie policeman, you are told by an experienced officer that the nutters all come out at full moon, you’ll be likely to accept the belief, then subsequently attribute any crazy behaviour at that time to the moon, and ignore or play down identical behaviour at other times of the month.

Second, the full moon may not affect human behaviour but it certainly enables it: common sense indicates that brighter nights will encourage opportunistic criminals. This may explain the Sussex Constabulary’s figures.

While we’re on the subject of moon myths, it’s popularly held that the moon doesn’t rotate in its orbit, because if it did, we’d be able to see the “dark” side. In fact, the moon’s rotation is precisely why we only see the man in the moon – to a greater or lesser degree – every night: the moon spins exactly once for every orbit it makes of the earth. If you’re having trouble visualising this, draw a face on a grapefruit (or tennis ball, or sea urchin – amazingly it makes no difference) and “orbit” it around your finger while keeping the face looking at the finger. You’ll notice that you have to revolve it to do so. If you didn’t, your finger would see the dark side of the grapefruit.
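If the grapefruit is beyond your budget, the same demonstration works in a few lines of illustrative code: track the angle between the painted face and the direction of the earth as the moon goes round.

```python
# The grapefruit demonstration in numbers. With one spin per orbit the
# painted face always points at the orbit's centre; with no spin at all,
# the face drifts and the "dark side" comes into view.

def face_offset_deg(orbit_fraction, spins_per_orbit=1):
    """Angle between the moon's painted face and the earth's direction."""
    orbit_angle = 360 * orbit_fraction                    # position on the orbit
    spin_angle = 360 * orbit_fraction * spins_per_orbit   # how far the moon has turned
    return (spin_angle - orbit_angle) % 360

# Synchronous rotation: the offset stays at zero all the way round.
print([face_offset_deg(f) for f in (0.0, 0.25, 0.5, 0.75)])             # [0.0, 0.0, 0.0, 0.0]
# No rotation: at half an orbit the earth would see the far side (180 degrees).
print([face_offset_deg(f, spins_per_orbit=0) for f in (0.0, 0.25, 0.5)])  # [0.0, 270.0, 180.0]
```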

The dark side of the moon, by the way – or the far side, as astronomers prefer – is only dark during the full moon; it sees as much of the sun as the familiar side, but we never get to look at it ourselves from earth. According to astronaut William Anders, who orbited the moon, “The backside looks like a sand pile my kids have played in for some time. It's all beat up, no definition, just a lot of bumps and holes.”

Monday, 31 March 2008

WRONG: Glass is a liquid

Observing church windows, many people have noticed that ancient panes are thicker at the bottom than at the top, leading them to believe that glass has “flowed” downward with imperceptible slowness. It hasn’t.

Medieval glaziers lacked the technology to make perfectly flat panes. Their methods, which involved spinning flattened, blown spheres, produced sheets of glass that were slightly thicker at one end. The heavier end was placed at the bottom of the frame simply because it’s more stable that way. The best medieval English glass was made from Lyme Regis sand and burned kelp. As was the typical medieval English peasant's lunch.

When glass is heated beyond its melting point, it becomes a liquid (because, well, that’s what melting means). Its molecules slide all over the place in disarray. As it cools, its viscosity increases, making it harder for the molecules to line up in an orderly, crystalline form. Compare this with water, whose viscosity does not increase as much as its temperature falls – at the freezing point, it crystallises into ice. (Water does do some weird things in extreme conditions, though.)

The resulting hodge-podge of glass molecules is an “amorphous solid” – the molecules are bound tightly together, but not in a formal structure. Having said that, amorphous solids can flow – but for glass to slip in the way perceived in old windows would take millions of years and a great deal more heat than the average cathedral boiler provides.