About “The Exorcist”

1973’s The Exorcist shines as one of the scariest movies ever made. And what makes the film so scary (to me) is what it is not. It’s not based on some silly “campfire legend” like the ones that gave us Jason Voorhees, Michael Myers or Freddy Krueger. It’s not based on some highly improbable occurrence, like an alien invasion, nuclear disaster or virus mutation. It’s not based on a gimmick (The Blair Witch Project), nor does it have a “lesson” or “moral” that it hits us over the head with (An American Haunting). It doesn’t involve characters that are either much larger than life or blatant caricatures or stereotypes (The Haunting, Thirteen Ghosts, etc.). How many times have you watched a movie like that and picked out the “order of death” – “OK, there’s the black guy, he’s absolutely gonna die first… then the stoner guy, then the slut, then the jock… which leaves the pretty (but not too pretty) blonde girl as the lone survivor!”

No, The Exorcist is scary because it goes deep within our collective psyche. It taps a primal fear that yes, after all, The Church might have been right. Because how can you believe in the Devil if you don’t believe in God? The family in The Exorcist was mostly just like us. Who chose that poor girl to be the battlefield between Good and Evil? Could we be next? And how can you fight an evil that you can’t even see or touch? Fighting Jason Voorhees is one thing… but how do you save your daughter from The Devil?

With today being Halloween, I thought I’d do a quick “Spooky History Blog” about my favorite horror movie. Enjoy!


The Biggest Loser in IT History

This story has been told and retold on the Internet so many times that it should be old hat by now. Unfortunately, it’s almost always told incorrectly. I don’t claim to have any special knowledge of the subject, but I have read several books on the birth of personal computers; I have also seen the documentary “Triumph of the Nerds” and the TNT film “Pirates of Silicon Valley”, so I think that entitles me to weigh in on the subject.

IBM long prided itself on making the best computers in the world. IBM made huge mainframe computers: elegant, power-hungry monsters that were as big as a refrigerator on the small end and as big as a tractor trailer on the large end. IBM made billions selling “real” computers like these, so it’s not entirely surprising that the company initially looked at the “personal computer” of the late 1970s as a toy. Much to IBM’s distress, however, companies like Apple, Commodore and Timex were shipping personal computers by the millions. While IBM scoffed at the notion that a personal computer could be useful for anything more than storing recipes, entire industries were being built up around the Apple II and the Commodore 64. IBM’s absence from the personal computer market thus started to become quite noticeable. After all, IBM was seen by most Americans as the computer company… yet it offered nothing for the home consumer, the enthusiast, or the smaller business that needed a computer but couldn’t afford one of IBM’s mainframes.

A working group was thus created within IBM to bring a personal computer to market as quickly as possible… which presented a problem. IBM had long prided itself on designing and manufacturing every single part of its computers. That was just “the IBM way”. But the working group quickly discovered that designing a brand-new personal computer from scratch would take years, and IBM management didn’t want to wait that long. The new IBM PC was therefore designed around off-the-shelf parts so the company could get machines out the door quickly.

But then another problem surfaced… what operating system would the new IBM PCs use? It can take just as long to create an operating system as it does to build the hardware it runs on (usually longer, in fact), and IBM didn’t have the time to create one of its own. So IBM looked to an outside source. And one of the first places it went was a small Seattle-area company called Microsoft.


All About Bubbles

Back in the mid-1980s, the “Baby Boomer” generation began entering middle age. Many began to look back to their childhoods for comfort or nostalgia, and one of the things that stuck out in their collective memories was baseball cards. These simple pieces of paper brought back powerful memories of lazy summer afternoons trading cards with friends, breathlessly opening new packs hoping for “THE card” and even putting the cards into the spokes of their bicycles to create faux motorcycle sounds as they pedaled down the street.

Unfortunately for the Boomers, most of their mothers had thrown the cards away ages ago, thinking them to be worthless. However, since the Boomers were entering their prime earning years, many had cash to spend on the hitherto “worthless” cards. Almost overnight, a huge market opened up for old baseball cards, and cards that might have sold for a quarter at garage sales now started commanding hundreds or even thousands of dollars.

The baseball card industry took immediate notice. They began producing baseball cards of all kinds: cards based on “classic” 1950s designs, cards branded under the names of 1950s manufacturers (like Bowman) that had long since gone out of business, cards with “low numbers” (which included the previous season’s rosters) and cards with “high numbers” (which incorporated last-minute trades), packs of cards with autographed cards or other pieces of memorabilia, and so on. And where there had been only two companies making baseball cards in the early 1980s (longtime market leader Topps and perennial second-banana Fleer), new companies like Donruss, Score and Upper Deck entered the market.

Baseball card hysteria took off in the early 1990s. Most every town, regardless of size, seemed to have at least one card shop. And with so many cards available, markets for complementary items like price guides and protective plastic cases helped fill the stores’ shelves as well. People began buying cards left and right, but not for the nostalgia value… they were hoping to pay for their kids’ college education with a few boxes of baseball cards. Card auctions were closely watched, as prices for cards seemed to rise ever higher.


The Cocoanut Grove Fire

Are you reading this article at work? Via a public Wi-Fi hotspot at a coffee shop or airport? Anywhere in public? If so, look around you. You’ll probably see at least one door with one of those standard illuminated EXIT signs above it. You might notice that the exit door has a brightly colored handle, so as to be easy to see in case of a smoky fire. You might also have noticed that just about every door leading to the outside in a public building – be it an office, a church, a shopping mall or a school – pushes out from the inside, rather than pulls in, so that in case of emergency the people inside can exit the building as quickly as possible. You might have even noticed that buildings with revolving doors always have at least one “normal” door next to them, so that in an emergency people aren’t stuck waiting to squeeze through the revolving door.

All of these fire and emergency safety precautions might seem like common sense today. But it took one of the greatest tragedies in American history for such changes to become required by law. That tragedy was the Cocoanut Grove Fire of 1942.

The Cocoanut Grove nightclub was the place to be in early WWII-era Boston. The club had an official maximum occupancy of 460, but it was so popular that it often had two or three times that number inside. And it’s not difficult to see why everyone would want in: the club was a virtual paradise, a lush, tropical-themed space lined with artificial palm trees and cloth coverings on the walls and ceilings. The club even had a retractable roof that was opened in the summer months so that the club’s patrons – a lot of them soldiers and sailors on their last fling before setting off to fight in Europe – could dance under the stars. The club had a main floor with your basic bar and dance floor setup, a dining room upstairs, and an intimate lounge downstairs. If a GI played his cards right, he could spend an entire evening there: after a nice dinner upstairs, he could head down to the main floor for drinks and dancing, and then go down to one of the dark corners of the lounge for a make-out session if he’d been lucky.


The Death of Superman

When I was a child, WTBS (a.k.a. TBS, the “Superstation”) was known as WTCG. At the time, WTCG was a small, unimportant UHF station in Atlanta that was infamous for running old black & white B-movies and reruns of ancient TV shows like Petticoat Junction, Felix The Cat and Mighty Mouse. Even though America had firmly moved into the color TV era by this point, it sometimes seemed as if the only color you’d ever see on WTCG was in the commercials or the occasional color episode of The Beverly Hillbillies!

One of the shows that WTCG ran religiously was the original Superman TV show – the one from the 1950s starring George Reeves. It was one of my favorite shows as a wee child, and I’d beg my mom to rush me home from kindergarten so I wouldn’t miss a minute of Reeves dishing out truth, justice and the American Way. It’s ironic (and sad), then, that Reeves would be denied all of those things when it came to his own life.

George Reeves was born in 1914 in Woolstock, Iowa. His first film appearance was 1939’s Ride, Cowboy, Ride. Reeves would go on to become a somewhat successful “bit part” actor; he ended up playing one of Vivien Leigh’s suitors in the opening scene of Gone With The Wind. But by the 1950s, Reeves’ star had fallen, and he was reduced to taking the occasional part on television.


The Heart’s Memory

Medical science might be on the verge of a truly amazing discovery. For decades, it’s been thought that the only organ responsible for human memory is the brain. And while it’s true that the lion’s share of memories and preferences is indeed kept in the brain, it turns out that the brain might not be the only organ where such things are stored.

It all began when two American doctors – working independently on opposite coasts – noticed something odd about heart transplant patients. Once the patients had recovered, their personalities started to undergo subtle changes. For example, a woman named Claire Sylvia started drinking beer and eating green peppers and chicken nuggets, even though she’d never enjoyed doing so before. Bill Wohl, a middle-aged man from Phoenix, was a dedicated businessman who rarely – if ever – exercised before his heart transplant; as soon as he was healthy enough to do so, he stopped working so much and took up extreme sports. An Englishman named Jim who had barely graduated from school and led the decidedly non-academic life of a truck driver suddenly began writing poetry. An unnamed woman who hated violence so much that she’d even leave the room when her husband watched football started watching football and swearing like a sailor while doing so. Another unnamed person – this time a 47-year-old man – suddenly developed a fascination with classical music, and could even hum obscure classical pieces he’d never heard before.


Spring Heeled Jack

Just about everyone in the English-speaking world – and probably the entire world – is familiar with the story of Jack The Ripper, the mysterious serial killer who haunted London’s Whitechapel district in the second half of 1888. But many have never heard of another Jack who terrified the entire English nation decades before the Ripper. He was, in a way, much more frightening than Jack The Ripper, even though this Jack didn’t kill anyone – because hundreds of people saw him and were terrified by what they saw. Ladies and gentlemen… meet Spring Heeled Jack.

He was called Spring Heeled Jack because of his ability to effortlessly leap over walls that were 8, 10 or even 15 feet high. But that’s not what scared people. It was his appearance – like that of a devil – that put the fear of God into people. He was tall and thin, with claws for hands, pointed ears and eyes that glowed red in the night. Some even said that he could breathe white or blue flames. The few that were unlucky enough to actually be touched by Spring Heeled Jack reported that his skin was ice cold.

Reports of Spring Heeled Jack exist from as early as 1817, but he didn’t become a phenomenon until September of 1837, when reports of bizarre happenings hit the London press. A perfectly upstanding businessman reported that Jack had jumped over the tall wall of a cemetery and landed right in his path. Shortly thereafter, a group claimed that a man with similar features had attacked them, and one of the party even had her coat ripped by the unknown assailant. Another of the party – a barmaid named Polly Adams – wasn’t so lucky. She was found bloodied and unconscious in the same spot hours later with her blouse torn and deep scratches in her belly. A few weeks later, a girl named Mary Stevens was assaulted on Clapham Common by someone (or something) meeting Jack’s description, and in much the same fashion as Polly Adams had been. Jack returned the next day, this time leaping from a wall to block the path of a moving horse carriage, causing it to crash. A few days later, Jack struck yet again… and this time he left physical evidence: police noted two footprints “around three inches deep” in the immediate area – deep enough to suggest that someone had jumped down from a great height. Upon further examination, one police officer noted “curious imprints” within the footprints, which led him to believe that springs or some other gadget might have been involved. Sadly, though, the concept of “forensics” hadn’t developed yet, so casts were never made of the prints.


The Bacteriophage

Back in the 1920s and 1930s, millions of dollars were poured into the study of bacteriophages – viruses that kill bacteria but are otherwise harmless to humans. Back then, diseases like cholera and dysentery were running rampant across the planet, and millions died from those two diseases alone. But then Alexander Fleming discovered the antibiotic properties of penicillin in 1928, and Western medicine dropped bacteriophage study almost en masse to move into the new and sexy world of antibiotics.

Looking back on it now, that was a pretty boneheaded move. The overuse and misapplication of antibiotics has helped to hasten the day when bacteria become resistant to many (if not most) types of antibiotics. You see, not every single bacterium is affected equally by an antibiotic. Some antibiotics merely weaken a bacterium until the antibiotic ceases to be administered. Other bacteria might be completely immune to an antibiotic. Regardless, the important thing is that the bacteria best able to withstand antibiotics are the ones that survive and multiply. And given the short life of bacteria in general, natural selection can work its magic in months or even days, instead of the centuries and millennia that humans tend to associate natural selection with. Staphylococcus aureus is not only one of the most common causes of hospital infections, it’s one of the hardiest too, having developed resistance to penicillin as early as 1947. MRSA (methicillin-resistant Staphylococcus aureus) is now considered to be “quite common” in British hospitals. And to show you what a problem it’s become, MRSA was the cause of 37% of all fatal cases of blood poisoning in the UK in 1999; less than a decade earlier, only 4% of blood poisoning deaths in the UK were caused by MRSA.


One-Hit Wonders

Like pornography, “one-hit wonders” are hard to define, yet people know them when they see them.

A “one-hit wonder” is technically defined as “a band that has a single hit song in a nation’s official music charts, then fades into obscurity forever”. But, in reality, it’s a bit more complicated than that.

For example, it’s often implied that the “one hit” is huge, like Los del Rio’s “Macarena”, Baha Men’s “Who Let the Dogs Out?” or Chumbawamba’s “Tubthumping”. This is to differentiate it from a long-running, well-respected indie band who just happened to have one song peak at #39 in the mainstream charts. I call this the “Pixies Clause”, because although the Pixies had a long career and several hits on the US Alternative charts, most mainstream music fans only remember them for “Where Is My Mind?” (a song, incidentally, that was never a single).

But even this is open to interpretation. The Norwegian band a-ha landed at #8 on VH1’s “100 Greatest One-Hit Wonders” list of 2002, not to mention countless “One-Hit Wonders of the 80s” lists. But the band actually had two US Top 20 singles from its 1985 debut: “Take On Me” reached #1 while the arguably better “The Sun Always Shines on TV” reached #20. Similarly, Great White are often considered one-hit wonders for their #5 hit “Once Bitten Twice Shy”, even though “The Angel Song” also made it to Billboard’s Top 40.

Geography is also integral to one-hit wonder status. A band can be hugely successful in one country but still be considered a one-hit wonder in another. Sweden’s The Cardigans had ten Top 40 singles in the UK, yet are thought of as “one-hit wonders” in the US for their hit “Lovefool”, which became popular after being featured in the Leonardo DiCaprio version of Romeo + Juliet. Other geographically hindered bands in the US include Nena, Frankie Goes to Hollywood, Dexy’s Midnight Runners and Crash Test Dummies. Flipping it around, Brownsville Station and Alphaville are considered one-hit wonders in the UK, even though both had more than one hit single in the US.


Industry “Standards”

Anyone familiar with the IT industry is surely aware of the hundreds of “industry standards” that have come and gone over the years: USB, FireWire, PictBridge, 802.11g, Bluetooth, Ethernet, PCI, ISA, RS-232… the list goes on and on. Most of these standards are (or were) well thought-out systems created by engineers working with designers and marketing departments. But that’s not always the case. Industry standards are sometimes determined by available components or corporate warfare… or even one man’s random decision! And you can find all three of those reasons in the checkered history of the phonograph record.

As you probably know, the first commercially viable recording and playback system was invented by Thomas Edison in 1877. The system used a needle to cut grooves into a spinning wax cylinder. The only problem with the system was that the cylinder was turned by a hand crank. This meant that you could record something at 30 cranks per second (cps), while your neighbor might record something at 50cps, and the guy down the street might use 60cps. It wasn’t long before Edison’s engineers were asking him to create a “cranks per second” standard so that any recording would play back correctly on any machine. Edison found a machine and played with it for a while before settling on 80cps… “because it sounded right”. No scientific testing, no focus groups, no careful study of the results… just Edison playing around with the machine for 15 minutes.
