The Beautiful Forgery

Ever heard of the “Romantic Movement”? It didn’t have anything to do with bringing home roses and chocolates for the missus; indeed it didn’t have anything to do with what we think of as “romance” at all. Wikipedia says that it was “an artistic and intellectual movement that originated in late 18th century Western Europe”. It was partly “a revolt against aristocratic, social, and political norms of the Enlightenment period”, but it was also (and more importantly) “a reaction against the rationalization of nature”.

There were two major scientific advances that led to the birth of the movement:

The first was medical science (and science in general): scientists of the day seemed to be creeping ever closer to discovering the true “essence” of life. Whatever you want to call it – Life Force, Primal Essence, you name it – it seemed as if they were mere inches from figuring out what that thing was, and the Romantics feared that all manner of Bad Things would happen once Pandora’s Box was opened. One of the most famous pieces of Romantic literature – Mary Shelley’s Frankenstein – deals with this question directly: a “mad scientist” type figures out how to harness the power of life and uses his skills to create a monster. You probably know the rest of the story. In any case, Mary Shelley’s fear is hardly unknown to us. In fact, such fear may be even more prevalent today than it was two hundred years ago. It seems that medical science – with its DNA and stem cell research – might again be on the cusp of “harnessing life”. And it scares people now just as much as it did then.

The other scientific advance that kicked off the Romantic Movement was the Industrial Revolution. For centuries, people made things with their hands. But suddenly, factories were popping up all over Europe, factories that could do the work of thousands of people using machines that didn’t require wages or sleep. People had a real fear of technology – much like people in the 1960s and 1970s who feared that computers would take over their jobs. In fact, the fear of technology was so great that a political movement took root in England that went from factory to factory smashing up the machines. The movement made such an impression that to this day “Luddite” is a pejorative term for someone who has a (real or imagined) fear of technology.

Continue reading “The Beautiful Forgery”

How a Comet Destroyed An Entire Industry

The British Empire was the largest, most powerful empire in the history of the world. At its height, the Empire controlled over 14 million square miles of territory and 458 million people – roughly a quarter of the world’s land area and a quarter of its population. The old saying “the sun never sets on the British Empire” was indeed true – so much land was held in so many places that the sun was, in fact, shining on some piece of land held by the British at all times.

It all came to an end after World War II, though. Exhausted and broke after battling Hitler for so long, Britain could no longer afford the luxury of an overseas empire, and most of the territories held by Britain were eventually given their independence.

Although the Empire began falling apart in the late 1940s, Britain still acted as an Imperial power in many industries until the mid-1960s. In shipbuilding, medical research and aeronautical design the Brits still ruled. In fact, in that last category – aeronautical design – the Brits were ahead of both the Americans and the Soviets. For on May 2, 1952, the British Overseas Airways Corporation (BOAC), the forerunner of British Airways, began the world’s first commercial passenger jet service using the de Havilland Comet.

Continue reading “How a Comet Destroyed An Entire Industry”

About “The Exorcist”

1973’s The Exorcist shines as one of the scariest movies ever made. And what makes the film so scary (to me) is what it is not. It’s not based on some silly “campfire legend” like the characters Jason Voorhees, Michael Myers or Freddy Krueger. It’s not based on some highly improbable occurrence, like an alien invasion, nuclear disaster or virus mutation. It’s not based on a gimmick (The Blair Witch Project), nor does it have a “lesson” or “moral” that it hits us over the head with (An American Haunting). It doesn’t involve characters that are either much larger than life or blatant caricatures or stereotypes (The Haunting, Thirteen Ghosts, etc.). How many times have you watched a movie like that and picked out the “order of death” – “OK, there’s the black guy, he’s absolutely gonna die first… then the stoner guy, then the slut, then the jock… which leaves the pretty (but not too pretty) blonde girl as the lone survivor!”

No, The Exorcist is scary because it goes deep within our collective psyche. It’s a primal fear that yes, after all, The Church might have been right. Because after all, how can you believe in the Devil if you don’t believe in God? The family in The Exorcist was mostly just like us. Who chose that poor girl to be the battlefield between Good and Evil? Could we be next? And how can you fight an evil that you can’t even see or touch? Fighting Jason Voorhees is one thing… but how do you save your daughter from The Devil?

With today being Halloween, I thought I’d do a quick “Spooky History Blog” about my favorite horror movie. Enjoy!

Continue reading “About “The Exorcist””

The Biggest Loser in IT History

This story has been told and retold on the Internet so many times that it should be old hat by now. Unfortunately, it’s almost always told incorrectly. I don’t claim to have any special knowledge of the subject, but I have read several books on the birth of personal computers; I have also seen the documentary “Triumph of the Nerds” and the TNT film “The Pirates of Silicon Valley”, so I think that entitles me to weigh in on the subject.

IBM long prided itself on making the best computers in the world. IBM made huge mainframe computers, elegant, power-hungry monsters that were as big as a refrigerator on the small end and as big as a tractor trailer on the large end. IBM made billions making “real” computers like these, so it’s not entirely surprising that they initially looked at the “personal computer” of the late 1970s as a toy. Much to their distress, however, companies like Apple, Commodore and Timex were shipping personal computers by the millions. While IBM scoffed at the notion that a personal computer could be useful for anything more than storing recipes, entire industries were being built up around the Apple II and the Commodore 64. IBM’s absence from the personal computer market thus started to become quite noticeable. After all, IBM was seen by most Americans as the computer company… yet they offered nothing for the home consumer, enthusiast or even smaller businesses that needed a computer but couldn’t afford one of IBM’s mainframes.

A working group was thus created within IBM to bring a personal computer to market as quickly as possible… which presented a problem. IBM had long prided itself on designing and manufacturing every single part of their computers. That was just “the IBM way”. But the working group quickly discovered that designing a brand-new personal computer from scratch would take years. Since IBM management wanted to enter the personal computer market as quickly as possible, designing a brand-new machine was therefore not an option. The new IBM PC was thus designed with off-the-shelf parts to allow the company to start shipping PCs as quickly as possible.

But then another problem surfaced… what operating system would the new IBM PCs use? It can take just as long to create an operating system as the hardware it runs on (usually longer, in fact), and IBM didn’t have the time to create one of their own. So they looked to an outside source. And one of the first places they went was a small Seattle company called Microsoft.

Continue reading “The Biggest Loser in IT History”

All About Bubbles

Back in the mid-1980s, the “Baby Boomer” generation began entering middle age. Many began to look back to their childhoods for comfort or nostalgia, and one of the things that stuck out in their collective memories was baseball cards. These simple pieces of paper brought back powerful memories of lazy summer afternoons trading cards with friends, breathlessly opening new packs hoping for “THE card” and even putting the cards into the spokes of their bicycles to create faux motorcycle sounds as they pedaled down the street.

Unfortunately for the Boomers, most of their mothers had thrown the cards away ages ago, thinking them to be worthless. However, since the Boomers were entering their prime earning years, many had cash to spend on the hitherto “worthless” cards. Almost overnight, a huge market opened up for old baseball cards, and cards that might have sold for a quarter at garage sales now started commanding hundreds or even thousands of dollars.

The baseball card industry took immediate notice. They began producing baseball cards of all kinds: cards based on “classic” 1950s designs, cards branded under the names of 1950s manufacturers (like Bowman) that had long since gone out of business, cards with “low numbers” (which included the previous season’s rosters) and cards with “high numbers” (which incorporated last-minute trades), packs of cards with autographed cards or other pieces of memorabilia, and so on. And where there had been only two companies making baseball cards in the early 1980s (longtime market leader Topps and perennial second-banana Fleer), new companies like Donruss, Score and Upper Deck entered into the market.

Baseball card hysteria took off in the early 1990s. Most every town, regardless of size, seemed to have at least one card shop. And with so many cards available, markets for complementary items like price guides and protective plastic casings helped fill the stores’ shelves as well. People began buying cards left and right, but not for the nostalgia value… they were hoping to pay for their kid’s college education with a few boxes of baseball cards. Card auctions were closely watched, as prices for cards seemed to rise ever higher.

Continue reading “All About Bubbles”

The Cocoanut Grove Fire

Are you reading this article at work? Via a public Wi-Fi hotspot at a coffee shop or airport? Anywhere in public? If so, look around you. You’ll probably see at least one door with one of those standard illuminated EXIT signs above it. You might notice that the exit door has a brightly colored handle, so as to be easy to see in case of a smoky fire. You might also have noticed that just about every door leading to the outside in a public building – be it an office, a church, a shopping mall or a school – pushes out from the inside, rather than pulls in, so that in case of emergency the people inside can exit the building as quickly as possible. You might have even noticed that buildings that have revolving doors always have at least one “normal” door next to them, so that if there’s some emergency people aren’t stuck waiting to walk through the revolving door.

All of these fire/emergency safety precautions might seem like common sense today. But it took one of the greatest tragedies in American history for such changes to become required by law. That tragedy was the Cocoanut Grove Fire of 1942.

The Cocoanut Grove nightclub was the place to be in early WWII-era Boston. The club had a maximum official occupancy of 460, but the club was so popular that it often had two or three times that number inside. And it’s not difficult to see why everyone would want in: the club was a virtual paradise inside, a lush, tropical-themed club lined with artificial palm trees and cloth coverings on the walls and ceilings. The club even had a retractable roof that was opened in the summer months so that the club’s patrons – many of them soldiers and sailors on their last fling before setting off to fight in Europe – could dance under the stars. The club had a main floor which had your basic bar and dance floor setup, a dining room upstairs, and an intimate lounge downstairs. If a GI played his cards right, he could spend an entire evening there: after a nice dinner upstairs, he could go downstairs for drinks and dancing on the main floor, and then go down to one of the dark corners of the lounge for a make-out session if he’d been lucky.

Continue reading “The Cocoanut Grove Fire”

The Death of Superman

When I was a child, WTBS (a.k.a. TBS, the “Superstation”) was known as WTCG. At the time, WTCG was a small, unimportant UHF station in Atlanta that was infamous for running old black & white B-movies and reruns of ancient TV shows like Petticoat Junction, Felix The Cat and Mighty Mouse. Even though America had firmly moved into the color TV era by this point, it sometimes seemed as if the only color you’d ever see on WTCG was in the commercials or the occasional color episode of The Beverly Hillbillies!

One of the shows that WTCG ran religiously was the original Superman TV show – the one from the 1950s starring George Reeves. It was one of my favorite shows as a wee child, and I’d beg my mom to rush me home from kindergarten so I wouldn’t miss a minute of Reeves dishing out truth, justice and the American Way. It’s ironic (and sad), then, that Reeves would be denied all of those things when it came to his own life.

George Reeves was born in 1914 in Woolstock, Iowa. His first film appearance was 1939’s Ride, Cowboy, Ride. Reeves would go on to become a somewhat successful “bit part” actor; he ended up being one of Vivien Leigh’s first suitors in the opening scene of Gone With The Wind. But by the 1950s, Reeves’s star had fallen, and he was reduced to taking the occasional part on television.

Continue reading “The Death of Superman”

The Heart’s Memory

Medical science might be on the verge of a truly amazing discovery. For decades, it’s been thought that the only organ responsible for human memory is the brain. And while it’s true that the lion’s share of memories and preferences are indeed kept in the brain, it turns out that the brain might not be the only organ where such things are kept.

It all began when two American doctors – working separately on either coast – noticed something odd about heart transplant patients. Once the patients had recovered, their personalities started to undergo subtle changes. For example, a woman named Claire Sylvia started drinking beer and eating green peppers and chicken nuggets, even though she’d never enjoyed doing so before. Bill Wohl, a middle-aged man from Phoenix, was a dedicated businessman who rarely – if ever – exercised before his heart transplant; as soon as he was healthy enough to do so, he stopped working so much and took up extreme sports. An Englishman named Jim who had barely graduated from school and led the decidedly non-academic life of a truck driver suddenly began writing poetry. An unnamed woman who hated violence so much that she’d even leave the room when her husband watched football started watching football and swearing like a sailor while doing so. Another unnamed person – this time a 47 year-old male – suddenly developed a fascination with classical music, and could even hum obscure classical pieces he’d never heard before.

Continue reading “The Heart’s Memory”

Spring Heeled Jack

Just about everyone in the English-speaking world – and probably the entire world – is familiar with the story of Jack The Ripper, the mysterious serial killer who haunted London’s Whitechapel district in the second half of 1888. But many have never heard of another Jack who terrified the entire English nation decades before the Ripper. He was, in a way, much more frightening than Jack The Ripper, even if this Jack didn’t kill anyone. This is because hundreds of people saw him and were terrified by what they saw. Ladies and gentlemen… meet Spring Heeled Jack.

He was called Spring Heeled Jack because of his ability to effortlessly leap over walls that were 8, 10 or even 15 feet high. But that’s not what scared people. It was his appearance – like that of a devil – that put the fear of God into people. He was tall and thin, with claws for hands, pointed ears and eyes that glowed red in the night. Some even said that he could breathe white or blue flames. The few that were unlucky enough to actually be touched by Spring Heeled Jack reported that his skin was ice cold.

Reports of Spring Heeled Jack exist from as early as 1817, but he didn’t become a phenomenon until September of 1837, when reports of bizarre happenings hit the London press. A perfectly upstanding businessman reported that Jack had jumped over the tall wall of a cemetery and landed right in his path. Shortly thereafter, a group claimed that a man with similar features had attacked them, and one of the party even had her coat ripped by the unknown assailant. Another of the party – a barmaid named Polly Adams – wasn’t so lucky. She was found bloodied and unconscious in the same spot hours later with her blouse torn and deep scratches in her belly. A few weeks later, a girl named Mary Stevens was assaulted on Clapham Common by someone (or something) meeting Jack’s description, and in much the same fashion as Polly Adams had been. Jack returned the next day, this time by leaping from a wall to block the path of a moving horse carriage, causing it to crash. A few days later, Jack struck yet again… and this time he left physical evidence: police noted two footprints “around three inches deep” in the immediate area. This implies someone jumping from a great height. Upon further examination, one police officer noted “curious imprints” within the footprints which led him to believe that springs or some other gadget might have been involved. Sadly though, the concept of “forensics” hadn’t developed yet, so casts were never made of the prints.

Continue reading “Spring Heeled Jack”

The Bacteriophage

Back in the 1920s and 1930s, millions of dollars were poured into the study of bacteriophages – viruses that kill bacteria but are otherwise harmless to humans. Back then, diseases like cholera and dysentery were running rampant throughout the planet, and millions died from those two diseases alone. But then Alexander Fleming discovered the antibiotic properties of penicillin in 1928, and Western medicine dropped bacteriophage study almost en masse to move into the new and sexy world of antibiotics.

Looking back on it now, that was a pretty boneheaded move. The overuse and misapplication of antibiotics has helped to hasten the day when bacteria become resistant to many (if not most) types of antibiotics. You see, not every single bacterium is affected equally by an antibiotic. Some antibiotics merely weaken a bacterium until the antibiotic ceases to be administered. Other bacteria might be completely immune to an antibiotic. Regardless, the important thing is that the bacteria best able to withstand antibiotics are the ones that survive and multiply. And given the short life of bacteria in general, natural selection can work its magic in months or even days, instead of the centuries and millennia that humans tend to associate natural selection with. Staphylococcus aureus is not only one of the most common causes of hospital infections, it’s one of the hardiest too, having developed resistance to penicillin as early as 1947. MRSA (methicillin-resistant Staphylococcus aureus) is now considered to be “quite common” in British hospitals. And to show you what a problem it’s become, MRSA was the cause of 37% of all fatal cases of blood poisoning in the UK in 1999; less than a decade earlier, only 4% of blood poisoning deaths in the UK were caused by MRSA.
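If you want to see just how fast that selection process can run, here’s a toy sketch in Python. It is not a biological model – the survival probabilities and starting fraction are made-up numbers purely for illustration – but it captures the mechanism described above: each “generation,” bacteria survive an antibiotic dose with a probability set by their resistance, and the survivors repopulate.

```python
def simulate_generations(survival_resistant=0.95, survival_susceptible=0.05,
                         start_resistant=0.01, generations=10):
    """Track the fraction of resistant bacteria across repeated antibiotic
    exposures. Each generation, bacteria survive with a probability set by
    their resistance, and the survivors repopulate to full size."""
    frac = start_resistant
    history = [frac]
    for _ in range(generations):
        resistant = frac * survival_resistant              # resistant survivors
        susceptible = (1 - frac) * survival_susceptible    # susceptible survivors
        frac = resistant / (resistant + susceptible)       # survivors repopulate
        history.append(frac)
    return history

history = simulate_generations()
print(f"resistant fraction: gen 0 = {history[0]:.1%}, gen 10 = {history[-1]:.1%}")
```

Even starting from a 1% resistant minority, the resistant strain dominates the population within a few generations – which, for bacteria that can divide in minutes, can mean days rather than millennia.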

Continue reading “The Bacteriophage”