Friday, 27 June 2008

WRONG: Germany surrendered to the Allies on May 7 1945

On 5 May 1945, a delegation of German military bigwigs flew to Brussels, had a spot of lunch, then continued by car to an old red schoolhouse in Rheims, France, that served as General Eisenhower’s headquarters. The intention was to surrender to the Allies.

Unfortunately, the officers, led by Admiral von Friedeburg, had not been given authority to sign by their superiors, so surrender was delayed while they sent a message – via British Army messengers – to the remains of the German government to get further instructions. (German High Command wanted to avoid having to surrender to the Soviets as well as the Western Allies, but Eisenhower demanded unconditional surrender.) General Alfred Jodl arrived the following day, but stalled further before finally signing at 2.41am on 7 May. The ceasefire became effective at 11.01pm the following day.

And that would seem to be that – unless you want to be pedantic, which we do. First, there had been earlier surrenders. Himmler had proposed a conditional surrender in April, but Eisenhower replied that he wasn’t interested. Von Friedeburg had also signed a partial surrender – covering German forces in the Netherlands, north-west Germany and Denmark – on 4 May in front of Field-Marshal Montgomery, though it was later superseded by the unconditional one.

Then there were the Soviets to consider – Germany didn’t officially surrender to them until 9 May.

Yet all of this overlooks a point of order. While Admiral Doenitz, the German President (Hitler appointed him before his suicide, believing the Navy to be the only branch of the military not to have betrayed him), had given his assent, it was only the military who had surrendered. Doenitz’s government, based in Flensburg, had not. So while the armed forces had surrendered, civilian Germany was still, in a sense, at war.

The Allies got around this by simply ignoring the Flensburg government. One term of the 7 May surrender was that “This act of military surrender,” (note the “military”) “…will be superseded by any general instrument of surrender imposed by, or on behalf of the United Nations and applicable to Germany and the German armed forces as a whole.” The Allies arrested the Flensburg government on 23 May, thereby dissolving the immediate problem, and subsequently created the Allied Control Council, a power-sharing body comprising the Soviet Union, France, the UK and the USA. The Council’s Declaration Regarding The Defeat Of Germany And The Assumption Of Supreme Authority By Allied Powers, signed on 5 June, effectively eliminated the German government and cemented the surrender in law.

Thursday, 19 June 2008

WRONG: The Roman Empire fell to the Barbarian hordes

Edward Gibbon’s 18th-century History Of The Decline And Fall Of The Roman Empire is the classic work on, among other things, the history of the decline and fall of the Roman Empire. It ran to six volumes and was published over 13 years, so this blog entry may seem a little compact by comparison. The short version is that Rome wasn’t overrun by a tide of hairy Germans. Rather, if the Empire ended at all, it was handed over to a tide of relatively well-groomed Germans.

In the first century CE, the Empire stretched from Portugal to the Middle East, from England in the north to parts of Africa in the south. Gibbon’s thesis was that over the next few centuries the Romans became lazy, allowing their hired help too much of a free hand in the running of the unwieldy Empire. He singled out Christianity, and in particular the Christian afterlife, as a factor in weakening the Romans’ interest in the here and now. (He also argued that the Roman Catholic church had greatly inflated their accounts of martyrdom, for which various church critics hauled him over the coals. So to speak.) Other historians have blamed economic planning, plague and technological advances for the shift in the balance of power between Rome and its dominions.

While it’s tempting to date the end of the Empire to the fifth century CE, when the last emperor, Romulus Augustus, was deposed by the Germanic general Odoacer, it’s not as if Odoacer turned up with an army of Goths, beat down the gates of Rome and seized the throne. For one thing, Rome had long since been supplanted as the capital city – by Odoacer’s time, the Empire’s business was conducted out of Ravenna.

The diplomat-turned-patrician Flavius Orestes promised a third of the Italian peninsula to the foederati, a division of the Roman army composed of assorted Huns, Goths and Scirians, if they deposed the Emperor Julius Nepos. When they succeeded, Orestes made his son Romulus emperor and reneged on the deal. Odoacer led the foederati in revolt. Then he beat down the gates with an army of Goths and seized the throne.

Even then the Goths only took the Western Empire. The Emperor Diocletian had previously divided the empire into two; Byzantium became the capital of the Eastern Empire, a body that continued in full force for another thousand years. The Western Empire, meanwhile, was resurrected in 800CE by Charlemagne and Pope Leo III as the Holy Roman Empire, and lasted until the time of Napoleon. All of which militates against a “fall” of the Roman Empire; a broader view is that it simply evolved into something else.

Byzantium, by the way, was renamed Constantinople in honour of the Emperor Constantine. After 1453 it was called Istanbul (and sometimes Konstantiniyye) by the Ottoman rulers. It has had many names during its long history: in Iceland, it’s still known by the Old Norse (ie Viking) name Mikligarður, meaning Big City.

Friday, 13 June 2008

WRONG: Right-handers live longer than left-handers

Left-handers have not been treated well by language. The Romans considered right good and left bad, as indicated by their words for the two: dexter (right) led to dextrous, and sinister speaks for itself. In Old English, riht (a cognate of the Latin rectus) denoted anything straight or true, and left was a word meaning weak or worthless. The French gauche still means both “left” and “socially clumsy”. It was long believed that left-handers died younger, and many will still pass along the information as fact, but the death rate is actually more or less the same for both lefties and righties. The belief may have begun because many elderly left-handers were forced to become right-handed in childhood. Though the practice is more or less defunct, it artificially skewed the statistics and made it seem as though there were more right-handers than lefties in the upper age brackets. Those viewing the statistics wrongly inferred that the lefties must have died earlier.

Having said that, it was recently demonstrated that the gene most closely associated with left-handedness also carries with it a slightly increased risk of schizophrenia. On the plus side, another study shows that left-handers are historically more likely to be high achievers, and yet another that lefty men are on average at least 15 per cent richer than right-handed men. The rule doesn’t appear to apply to women, which seems harsh.

Studies of stroke victims have opened a debate on the whole nature of left and right. In The Man Who Mistook His Wife For A Hat, Oliver Sacks reported the case of Mrs S, who had no concept of “left” at all. She ignored the food on the left half of her plate, made up only half her face and, if she lost something, turned round to the right until she found it. It wasn’t simply that she couldn’t see things on her left – she couldn’t even conceive of there being a “left” side to anything. Which raises the question of how she therefore managed to understand “right”.

Wednesday, 4 June 2008

WRONG: The internet was invented by a Brit

It’s widely believed that Sir Tim Berners-Lee invented the internet, but that’s not strictly true. He invented the World Wide Web, which lives on the internet. It’s the difference between inventing books and inventing paper.

So if Sir Tim invented the book, who invented the paper? In 1945, Vannevar Bush, Director of the US Office Of Scientific Research and Development, wrote an article for Atlantic Monthly called As We May Think, in which he proposed an electro-mechanical system called the Memex. This would allow for direct links between microfiched documents and give scholars access to “millions of fine thoughts”. (He presumably didn’t anticipate 2 Girls 1 Cup.) The article describes a hypothetical scholar, seated at his Memex researching the history of the bow and arrow in a way that sounds remarkably like Googling.

Seventeen years later, J.C.R. Licklider – first head of computer research at the Defense Department’s Advanced Research Projects Agency (ARPA, later DARPA) – updated Bush’s idea in a series of memos articulating his vision of a “Galactic Network” of interlinked computers. The concept survived, but sadly for us all, the name did not.

In 1969, the first node of the first computer network – ARPANET – went live at UCLA, where a young Vint Cerf was among the graduate students working on it. A small number of other agencies, including the British Post Office, joined in, creating the International Packet Switching Service in 1978. The movement continued to grow through the 80s into various educational and commercial networks, which were finally linked together thanks to the technical genius of Cerf and Bob Kahn (among others) and their great advance: TCP/IP, or Transmission Control Protocol/Internet Protocol.

So much for the paper. Now the book: those in the know were able to transfer information directly between computers, and even e-mail each other, but the World Wide Web didn’t come into being until 1990 at CERN, the European Organisation for Nuclear Research. The Centre employed some 3000 staff, with research data stored on hundreds of separate computers and networks, much of it inaccessible to other employees. In late 1989, Tim Berners-Lee proposed an information-management system based on hypertext, an interlinking form of text that was a direct descendant of Vannevar Bush’s Memex ideas. The result, one year later, was the first browser, a prototype of applications like Microsoft’s Internet Explorer.

After toying with names like The Information Mine (which he abandoned after noting the initials) and Information Mesh, he decided to emphasize the global and decentralised nature of his project and called it WorldWideWeb. (He later renamed his program Nexus to distinguish it from the networked “large universe of documents” it enabled, ie the World Wide Web.) The first ever web address was http://info.cern.ch/hypertext/WWW/TheProject.html. Sadly the original page no longer exists, but a later copy reveals the prophetic first words to be, “The WorldWideWeb (W3) is a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents.”

At that point there were 600,000 internet users, most of them academics. Thanks in part to Berners-Lee’s astounding generosity in making the web free to all (he never made a penny from it directly, though by any normal commercial standards he would have been entitled to), within five years there were 40 million. According to Internet World Stats, at the time of writing that number is close to a billion and a half.

But wouldn’t we all have been happier surfing the Galactic Network?

For no reason at all, here are some genuine websites from the history of the internet whose creators didn’t think through their URLs:

• expertsexchange.com – wisely changed after a few years to experts-exchange.com
• therapistfinder.com – dedicated to tracking down therapists, not rapists
• whorepresents.com – for those seeking celebrities’ representation, not gifts for prostitutes
• powergenitalia.com – an Italian firm specialising in batteries that would also benefit from a hyphen: powergen-italia
• molestationnursery.com – home of the Mole Station Native Nursery, of course!

E-mail – and its use of the @ symbol – was invented in late 1971 by Ray Tomlinson of Bolt Beranek and Newman (now BBN Technologies). The first e-mail is lost to history, but Tomlinson recalls that, “Most likely the first message was QWERTYUIOP or something similar […] It is equally likely to have been ‘TESTING 1 2 3 4’. Probably the only true statements about that first email are that it was all upper case (shouted) and the content was insignificant and forgettable”.