The Gay Dolphin Gift Cove is a very cheesy gift store in Myrtle Beach, South Carolina. I made my first visit there in 2009 with some friends – read my review here.
While there, we discovered this ugly-ass pillow:
We went back in 2011, and the pillow was still there:
And the pillow was still there when Lisa and I went in 2012:
My friend William went there earlier this year and reported that the ugly pillow was nowhere to be found. Our hearts sank. How could the ugly pillow be gone? Who would have bought it?
In my heart of hearts, I knew the pillow was still there… William just didn’t look hard enough. So when we went back to Myrtle this week, I searched the store high and low for it, but the ugly pillow was gone. Lisa thinks the store put it in one of their infamous “grab bags”, and she’s probably right. But I want to think that the sad pillow is still there, waiting on some sucker to buy it.
* * *
On a happier note, the Atlanta Braves Nacho Cheese Gift Pack is STILL THERE:
I’m 99% certain this is the exact same gift pack I referred to in my original review from 2009, and pictured in this post. It has the same “best by” date of 08/31/2008, and has the same 12/06 date code on the price tag:
Meaning, of course, that this “gift” pack has been sitting on the shelf at Gay Dolphin for almost 8 YEARS!
With the vote for Scottish independence coming very soon, I thought this little tale was timely.
In 1707, the parliaments of England and Scotland voted to dissolve themselves and create a new parliament made up of members from both countries. “England” and “Scotland” effectively ceased to exist, and a new country was born: the “United Kingdom”.
Which is really odd if you think about it. Hadn’t they been at war with each other off and on for, like, 800 years? Why the sudden change of heart? Why would England and Scotland – two longtime foes – suddenly become friends?
* * *
By the late 1600s, most of Europe’s maritime powers had founded colonies. Spain controlled much of South America. Portugal had Brazil, parts of India, and a number of economically important islands. The Dutch had New Amsterdam in North America (at least until the English seized it in 1664) and most of the Spice Islands. And the English had North America and a few outposts in the Caribbean.
Many in Scotland wondered why they didn’t have a colony of their own. But it wasn’t as simple as just getting in a boat and putting up a flag somewhere. There was little point in having a colony just to have one. The Spanish made millions off South American silver, the Dutch made money off spices and tea, and the English made money from tobacco and sugar. What the Scots needed was a colony that could provide some sort of economic gain.
And gain was sorely needed in Scotland. Economically it was a pipsqueak compared to England, an advantage the English used at every level to keep Scotland subjugated. England’s Navigation Acts kept independent Scottish trade to a minimum, not that it really mattered, since Scotland’s navy was tiny compared to England’s. Most imported goods therefore had to be bought from England, and England required the use of pounds sterling, not Scottish money, which drained the economy even more. A couple of civil wars had squandered a lot of human and financial capital, and several years of crop failures pushed Scotland’s economy to the brink.
The “Company of Scotland Trading to Africa and the Indies” was created by the Scottish parliament in 1695. Capital was raised for the venture in Amsterdam, Hamburg and London. But not really. England’s East India Company complained to the crown that they, not some Scottish upstart, had been given a monopoly on trade to the Indies. And the East India Company was one company you did not mess with. At the apex of its power, the Company ruled much of India – an area much larger than the United Kingdom, with several times as many people. It ruled other places too – the East India Company created modern Singapore, for instance. The Company had an army of 200,000 men, its own church, currency and government, and accounted for over half the trade in the entire world. What the East India Company wanted, the East India Company got.
Immense pressure from King William III and East India investors caused Company of Scotland investors in London, then Amsterdam, and finally Hamburg to abandon their pledges. The Company of Scotland tried looking elsewhere for money, but Europe’s other banks and investment centers got the hint from London, and no one stepped up to the plate.
So the plan to trade with Africa and India was abandoned, and a new idea was formed: a Scottish banker named William Paterson wanted to create a “gateway” between the Atlantic and Pacific oceans. He noticed that the strip of land in what is now Panama was very narrow. Just as people later got the idea to build the Panama Canal, Paterson’s plan was to build a seaport on the Atlantic coast, another on the Pacific coast, and a road connecting the two. Goods could then be sent safely by land instead of ships having to navigate the treacherous waters around Cape Horn or the Strait of Magellan. This would shave weeks off shipping times, and for this merchants would pay a small fee, which would earn money for Scotland. Easy, right?
In his day, Johann Pachelbel (1653-1706) was a respected and popular composer of “Southern German” baroque music. He left a large body of secular and sacred work, such as this pretty Chaconne in F Minor:
Sadly, though, Pachelbel’s work was almost completely forgotten after his death. Oh sure, some of his music would be played from time to time, especially his organ works. But for a couple hundred years, his name was lost in the sea of Bachs, Händels, Telemanns and Scarlattis. Few classical music scholars knew much about him or his work, to say nothing of the general public.
All that changed in 1970, when French conductor Jean-François Paillard recorded a slow, majestic version of Pachelbel’s Canon in D:
Just for fun, contrast Paillard’s overwrought, saccharine version with what many music scholars think the piece actually sounded like in Pachelbel’s day:
In any case, the piece became popular with classical music fans almost overnight, and went mainstream when it was prominently featured in the 1980 film Ordinary People. Since then, the work has become a staple of weddings and of the 100 Most Beautiful Pieces of Music box sets you see at stores like Bed Bath & Beyond.
Pachelbel married twice. His first marriage ended when his wife and first son died in a plague outbreak in 1683. Pachelbel remarried a year later, and had two daughters and five sons with his new wife. Two of those sons – Wilhelm Hieronymus and Karl Theodor – became composers like their dad. But history remembers the second son as “Charles Theodore Pachelbel”, not Karl Theodor. And that’s because Charles became one of the first European composers – certainly the first European composer with name recognition – to move to the American colonies.
Exactly why Charles made the move is a complete mystery. We know for sure that he moved to Gotha when he was two, and to Nuremberg when he was five. After his father died in 1706, the historical record falls almost silent, except that Charles probably lived in England for a time: his name appears on a list of subscribers to a volume of harpsichord music published in London. And how weird is it that customs and parish records from the time have been lost, but a list of sheet music subscribers has survived?
We know that Charles Pachelbel was living in Boston by 1733, because that year he was asked to consult on the installation of a new pipe organ at Trinity Church in Newport, Rhode Island (the oldest Episcopal church in the state, by the way). Pachelbel ended up living in Newport for approximately two years, having been hired as the church’s organist. In 1736, he performed two concerts in New York City.
He moved to Charleston some time between March 9, 1736 (the date of the second New York City concert) and February 16, 1737, when he married a woman named Hanna Poitevin at St. Philip’s Church, the oldest Anglican church in South Carolina. This was probably Pachelbel’s second marriage, as there are records which indicate that he already had a daughter. But what happened to her (or to a possible first wife) is unknown.
Charles Pachelbel lived in “Charles Towne”, as it was known, for the rest of his life. He held what is thought to be the very first public concert in the city on November 22, 1737. He became organist at St. Philip’s in 1740, and opened a singing school, probably the first music school in South Carolina, a year before his death. In 1750 he contracted a disease – recorded as a “lameness of the hands” – and died shortly thereafter. His wife lived on for 19 years, dying on September 6, 1769. He had at least one son – Charles, born on September 10, 1739 – but absolutely nothing is known about him or any of his descendants.
Very little of Charles Pachelbel’s music survives. One of the few pieces is this beautiful Magnificat:
Still, it’s amazing to think that Pachelbel’s son lived just a few hours away from me. I know full well that Johann Pachelbel existed at the same time the American colonies existed… but I’ve just never put 2 and 2 together on this one.
I sent an email to the good people at St. Philip’s in Charleston asking for any additional information they may have about Pachelbel, and will update this article if they reply with anything interesting. I specifically asked if they knew where he was buried, because the current St. Philip’s isn’t the one Pachelbel knew. The first building was built in 1680 but was destroyed by a hurricane in 1710. A new building – the one Pachelbel knew – was built by 1723, but burned to the ground in 1835. The current building was completed in 1836.
* * *
There is (or was) a music group from New York City called “Anonymous 4”. I always assumed that the group got its name because they specialized in medieval and early Renaissance music written by unknown authors… and there were four of them. Hence, Anonymous 4:
By the way, that chant is in 15th century ENGLISH:
Edi beo thu, hevene quene,
Folkes froure and engles blis,
Moder unwemmed and maiden clene,
Swich in world non other nis.
On thee hit is wel eth sene,
Of all wimmen thu havest thet pris;
Mi swete levedi, her mi bene
And reu of me yif thi wille is.
Come to find out, however, Anonymous IV was a real person, and a very important one, too.
Anonymous IV wrote a treatise about the Notre Dame School of Polyphony, at the time the epicenter of European music:
As the name suggests, no one knows who Anonymous IV was. He was almost certainly male, and almost certainly a student at Notre Dame in Paris. He was very likely English, because his works were discovered at Bury St Edmunds in England. Because of historical references in his work, they can be dated to the 1270s or 1280s.
It’s through Anonymous IV that we know Léonin and Pérotin, the two earliest European composers known by name. Anonymous even helpfully named specific works by them, greatly helping music scholars assign authorship to previously anonymous works. Although Léonin and Pérotin had both been dead for decades by the time Anonymous IV wrote about them, his description seems to indicate that they were still popular at the time, much as Elvis is today.
But there’s more than that. Anonymous IV mentions early music theorist Franco of Cologne, and describes several musical forms in detail, like organum and discant. He talks about the rules of music – why things were written the way they were – as well as how notation worked, and various genres that were popular in his day.
It’s all breathtakingly interesting stuff, and you can read a copy of his work (or download it in PDF, EPUB, Kindle and other formats) for free here.
When I was a small boy, I was in awe of my mother and grandmothers, particularly because they seemed to know every detail of those Old School social rules. For instance, every Mother’s Day our church offered carnations to the ladies of the congregation, and my mom knew to take a pink one (because her mother was alive), while my grandma knew to take a white one (because her mother had passed on). Both instinctively knew to wear them on their left side, just as they instinctively knew when to send thank you cards, how long they had to send them, and how much writing to actually put on the card itself… just a quick thanks? A long paragraph or two of sincere gratitude for a gift or thoughtful action? They always just… knew, somehow.
Many of these rules have fallen by the wayside, but there’s one rule they absolutely, positively keep: no white after Labor Day. I can imagine my grandmother now: “Son, I’m 94 years old. I’ve come to accept women preachers and gay marriages… but I’ll be damned if I’m going to wear white shoes in October. There are some rules you just don’t break.”
Society doesn’t seem to take the “no white after Labor Day” rule very seriously any more. I bet millennials don’t even know it is a rule. That’s the beauty of Generation X: we knew about rules like “no white after Labor Day”, but we broke ’em anyway. But some people used to take the rule seriously. So seriously, in fact, that it caused a riot.
* * *
Straw hats were popular with men in the early 20th century. In Europe and Asia, the tradition of wearing hats made of straw or reeds – but only in summer! – dates back to the Middle Ages. And why not? They keep the sun out of your face, and unlike felt or wool hats, they’re breathable, keeping you cool in the summer. Although hilariously unfashionable now, they were kind of dapper:
In New York City, the custom was to wear straw hats until September 1st, but no later. At some point in the early 1900s, for reasons unknown, the cut-off date shifted to September 15th. It also became something of a popular prank – not just in New York, but throughout the country – for teen boys to sneak up on people wearing such hats after the cut-off date and knock the hats to the ground and stomp on them. Don’t ask me why: these things just happen.
In 1922, some teens decided to jump the cut-off date by two days. On September 13, a pack of teen boys in the Mulberry Bend area – one of the worst parts of Five Points, arguably the worst neighborhood in Manhattan – went around knocking off people’s hats and stomping on them, as was the custom. That is, until they tried knocking the hats off a bunch of dock workers. These guys fought back, and soon a full-scale brawl was underway between the two groups. The fight was so big that it shut down traffic on the Manhattan Bridge until police could come and break it up.
But police couldn’t arrest everyone involved. Groups of teen boys would scatter from police and start the hat stomping anew in other neighborhoods. The next day, the riots intensified. Some teens even roamed the streets carrying big sticks with nails sticking out the business end. Up to 1,000 teens caused trouble on Amsterdam Avenue, beating some victims so badly that they needed medical attention. Cops didn’t take the matter seriously, partly because of “boys will be boys”, but also because if they broke up the rioters in one area, they would just splinter off into other areas. Rioters, emboldened, even snatched the hat off a police sergeant, who hilariously fell face-first into a mud puddle while chasing the lawbreakers.
The riots kind of died down by themselves by September 16th. Although “hat violence” continued for several years – one man was murdered in a hat stomping incident in 1924, and several “hat hooligans” were arrested in 1925 – the full-scale riot of 1922 was unique: never again did groups of youthful social enforcers take to the streets. Within a few years, straw hats themselves went out of fashion altogether, taking with them the odd custom of teens knocking them off people’s heads.
I have a big ol’ pile of sciencey-type stuff on my desk I’ve been meaning to post… so let’s do this thing:
You probably don’t think about it much, but people who create international websites have a constant stream of pains-in-the-ass to deal with on a daily basis. We all know that British English spells some things differently than American English (“colour” vs. “color”), and that British people write their dates differently than Americans do (“10 January 2014” vs. “January 10, 2014”). But have you ever thought about the grammar quirks of the hundreds of other languages out there? This video from the guys at Computerphile breaks it all down, and makes you appreciate the developers at Facebook so much more:
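To get a feel for the kind of quirk the video covers, here’s a minimal sketch of how plural rules alone diverge between languages. The rules below are simplified approximations of my own (real sites lean on libraries like gettext or ICU, which handle far more cases):

```python
# Toy illustration of language-dependent plural rules.
# These are simplified approximations of the CLDR plural categories.

def english_plural(n):
    return "one" if n == 1 else "other"

def polish_plural(n):
    # Polish distinguishes 1, 2-4 (except 12-14), and everything else.
    if n == 1:
        return "one"
    if n % 10 in (2, 3, 4) and n % 100 not in (12, 13, 14):
        return "few"
    return "many"

MESSAGES = {
    "en": {"one": "{n} file", "other": "{n} files"},
    "pl": {"one": "{n} plik", "few": "{n} pliki", "many": "{n} plików"},
}

def localize(lang, n):
    rule = english_plural if lang == "en" else polish_plural
    return MESSAGES[lang][rule(n)].format(n=n)

for n in (1, 2, 5, 22):
    print(localize("en", n), "|", localize("pl", n))
```

English needs two message strings here; Polish already needs three, and that’s before you get to gender, case, or right-to-left scripts. Multiply that by every language Facebook supports and you see the problem.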
Interested in cryptography but don’t know how it works? Check out this video, also from Computerphile, about how public key crypto works. It’s amazing how clever people can be sometimes:
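If you’d rather poke at the idea directly, here’s a toy RSA sketch with laughably small numbers. This is purely illustrative and not from the video – real systems use vetted libraries and keys thousands of bits long:

```python
# Toy RSA with tiny primes - illustrative only, never use for real security.
# Requires Python 3.8+ for the modular-inverse form of pow().

p, q = 61, 53                 # two (secret) primes
n = p * q                     # 3233, the public modulus
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e (2753)

def encrypt(m):               # anyone with the public pair (n, e) can encrypt
    return pow(m, e, n)

def decrypt(c):               # only the holder of d can decrypt
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
print(message, "->", ciphertext, "->", decrypt(ciphertext))
```

The clever part is the asymmetry: you can hand out (n, e) to the whole world, and without knowing the primes behind n, nobody can feasibly work out d.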
You may not know this, but American soldiers often use a dozen or more radios while out on patrol. You might think this is due to parochialism between the services: the Army has their radios, while the Navy, Air Force and Marines have their own systems ’cos they think theirs is “better”. And yes, there is a bit of that. But it’s mostly due to physics: the type of radio you need to briefly communicate with warplanes hundreds of miles away isn’t the same kind of radio you need for frequent communications with headquarters a dozen miles away, which isn’t the kind of radio you need for near-constant communication with fellow soldiers only yards away. The US Army spent $6 billion working on a “soft radio” that could quickly be reconfigured to meet any possible need. And, as Ars Technica notes, it was a total disaster. Read this article, not just for the tech behind it all, but also for how changing the scope of a project mid-stream can lead to disaster.
Having said that, while we always think of the Pentagon when it comes to massive-scale government waste, things are not always what they seem. Back in the 1980s, there was a “scandal” in which the Pentagon allegedly paid $600 for a hammer. Come to find out, the real story was a bit more complicated: the Pentagon needed to make some repair kits for a specialized device. The company contracted to make the kits had to do some R&D for the project: they were repairing a custom-made product, after all. When it came time to bill the government, the folks at the Pentagon for some reason insisted that the contractor spread the R&D costs equally across the project. So instead of paying $5 million in R&D and $15 each for hammers, the R&D line showed $0 and each hammer appeared to cost $435 (which the media later inflated to $600). It’s an accounting thing. If you were to take your car to a mechanic, and he put four $2 spark plugs in your car and charged you $75 for the labor, most people would understand the difference between parts and labor. Journalists back in the 80s apparently did not, and bemoaned the Pentagon spending almost $21 each for $2 spark plugs.
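Just to make the arithmetic explicit, here’s the spark plug analogy as a quick sketch (the $2 parts, $75 labor, and four plugs are all from the example above):

```python
# Spreading a lump-sum labor (or R&D) charge evenly across cheap parts
# makes each part look absurdly expensive.

parts_cost = 2.00        # one spark plug
labor_cost = 75.00       # one flat labor charge for the whole job
num_parts = 4            # four plugs installed

honest_bill = {"parts": num_parts * parts_cost, "labor": labor_cost}
per_item_bill = (num_parts * parts_cost + labor_cost) / num_parts

print(honest_bill)               # {'parts': 8.0, 'labor': 75.0}
print(round(per_item_bill, 2))   # 20.75 -> "almost $21 per $2 spark plug"
```

Same total either way; the only thing that changes is which line of the invoice the labor gets smeared across.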
I’m bringing all this up because of the recent stories about how the Army “wasted” $5 billion on a camouflage pattern that didn’t work. Again, this is more about lazy journalism than wasted tax dollars: the Army did try a new camouflage pattern, and no, it didn’t work. But they actually spent well under $100 million on the design. That’s hardly chump change. But the other $4.9 billion was spent on new uniforms and other gear with the new camo pattern on it. The Army actually used most of that equipment… so how much did it actually waste? It’s like when a company decides to change its logo. You often hear that “[company’s] new logo cost $100 million”. Of course, the actual logo probably cost around $100,000 to design, with another million or two thrown in for research (to make sure the new logo doesn’t have an offensive or negative meaning in dozens of countries) and legal fees (to make sure it doesn’t look too much like some other company’s logo). The other $95 million is for new uniforms, new paint jobs on trucks or planes, new stationery, updating the company’s website, etc. But even if a company decides to go back to its previous logo a few years later, how much is actually wasted? Certainly the cost of the new logo, the research and the legal fees. But employees need uniforms, trucks need to be repainted, and the website was probably going to need an update anyway.
There is, however, an amusing aspect to this: when the Army decided to change their camo, they had a complicated, bureaucratic process for doing so. When the Marines decided that they needed new camo, they went to their sniper school in Virginia and asked a couple of guys to come up with a better color scheme. So the snipers went to a nearby Home Depot and found some paint samples. Done and done.
The idea of blood transfusions has been around for a remarkably long time. It wasn’t tested scientifically until the 1600s, however, and those tests were a disaster, mostly because doctors of the time were trying to transfer blood between species – putting goat blood into a human, for instance. In 1817 a British physician named James Blundell finally hit upon the idea of human to human transfusion. This seemed to work a tiny bit better: in some cases it worked, but in most cases it did not. Why that was remained a huge mystery for medicine, and it wasn’t until 1900 that blood types were discovered. And despite humanity knowing about blood types for over a century, we know very little else about them. Why do European populations have different blood type ratios than Africans and Asians? Why do we even have blood types at all? If you don’t have an answer to that question, rest easy: science doesn’t either. This amazing story from Ars Technica talks about how much we know – and really, how much we don’t know – about the blood in our veins.
In medieval times, secular courts often transferred “undesirable” cases to church courts, where the accused would be subjected to “trial by ordeal”. You probably learned about these in history class, but if not you’ve probably seen a movie or TV show where someone accused of a crime was made to do something horrific, like stick their arm into boiling oil to retrieve an object at the bottom of the pot. The theory was that an innocent person would be protected by God, while a guilty person would not. So, after a few days, the person’s wounds would be checked, and their guilt or innocence determined by how well they’d healed. It sounds barbaric, but economist Peter Leeson argues (PDF) that such trials were actually a clever bit of game theory devised by priests, probably based on the Judgment of Solomon. If you’re not up on your Old Testament (specifically 1 Kings 3:16-28), two new mothers come to King Solomon for help. One claimed that the other had accidentally smothered her own baby in her sleep, and had secretly swapped her dead son for the living one. Solomon thought about it for a while, and called for a sword. He intended to cut the baby in two, so that each mother could have half. One mother said this was fine with her, while the other cried out for Solomon to stop, saying the other woman could have the baby. Solomon reckoned the latter woman to be the baby’s mother, since a baby’s true mother would rather give the baby away than see it hurt. It’s one of the earliest written examples of game theory, and Leeson thinks those “trials by ordeal” were just a variation on it. Leeson has done statistical analysis of such trials, and his theory appears to hold up. Incidentally, Leeson thinks that priests knew who was guilty before the “ordeal” even happened (which is the whole point), and that for innocent people the “boiling” oil was actually merely uncomfortably warm.
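Leeson’s argument is essentially a separating equilibrium, and it’s easier to see as a toy sketch. This is just my own illustration of the logic as described above, not Leeson’s model or data:

```python
# Toy sketch of the "trial by ordeal" separating equilibrium.
# Assumption: defendants believe the ordeal truly reveals guilt, so the
# guilty settle or confess rather than face it, while the innocent submit.
# The priest, seeing who was willing to submit, quietly rigs the outcome.

def defendant_choice(is_guilty, believes_in_ordeal=True):
    if believes_in_ordeal and is_guilty:
        return "confess"      # expects God won't save him, so he settles
    return "undergo ordeal"   # the innocent expect to be vindicated

def priest_runs_ordeal(choice):
    if choice == "confess":
        return "guilty (by confession)"
    # Only those who believed themselves innocent get this far,
    # so the "boiling" oil is merely uncomfortably warm.
    return "innocent (wounds heal suspiciously well)"

for guilty in (True, False):
    c = defendant_choice(guilty)
    print(f"guilty={guilty}: {c} -> {priest_runs_ordeal(c)}")
```

The defendants’ own beliefs do the sorting; the ordeal itself just has to look scary enough that nobody guilty volunteers for it.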
For decades, historians and anthropologists have tried to paint Native Americans as eco-aware pacifists who didn’t know of war and strife before those eeeeviiillll Europeans showed up. Washington State University archaeologist Tim Kohler disagrees, and in a recent paper he argues that the most violent period in Native American history happened some time in the 1200s. Kohler even argues that, as far as “genocides” go, what happened to the Mesa Verde people of modern day Colorado was worse, percentage-wise, than what happened in the 20th century in China or the Soviet Union. It’s an interesting read, to be sure.
Kids are funny: they do stupid stuff over and over again until they’re hurt or humiliated, and it’s only then that they learn not to do it again. All of us probably remember being repeatedly told “not to play on the railing, ‘cos you’ll fall off and get hurt” or “don’t run with scissors in your hand” and totally ignoring that advice… until you fell off the railing and broke your arm, or fell with the scissors and cut yourself.
This isn’t quite the same thing, but I had a similar thing with Limburger cheese.
Originally from the Duchy of Limburg, an interesting corner of the Holy Roman Empire where modern day Germany, Belgium and the Netherlands meet, Limburger cheese is one of the foulest-smelling foods ever invented. When fresh, it’s a harmless hard cheese, not unlike feta. But then bacteria are added that break it down into a creamy cheese… one that positively reeks of ammonia. It smells… well, I can’t even describe it. Imagine if a soldier or homeless person wore the same boots for 6 months without taking them off once. Now, imagine the soldier or homeless person taking the boots off and sticking them into a giant pile of monkey diarrhea… while getting a perm… in a slaughterhouse. It’s about that bad, really.
German and Belgian immigrants brought Limburger with them to the US in the early 1880s… and Americans started making fun of it immediately. Seriously: it’s possible that the very first Limburger cheese joke was made on Ellis Island. It was called “the cheese you can find in the dark”. Vaudeville acts of German or Yiddish immigrants – even a young Groucho Marx – were said to speak “Limburger English”. Mark Twain used Limburger in a short piece called “The Invalid’s Story”, in which a man wants to take a dead friend home by train, but is mistakenly given a box full of guns. The box is placed next to a shipment of Limburger, which begins to stink… so the protagonist thinks it’s his dead friend stinking up the rail car.
In real life, an Irish woman in New York City tried to commit suicide in 1895 because her German husband ate Limburger all the time and tried to “get amorous” with her with it on his breath. That same year, a strike broke out at a dairy in Newark when a Swedish worker smeared Limburger all over some equipment as a prank; the resulting anti-Swedish sentiment boiled over, and the Swedish workers walked off the job.
Speaking of pranks, for decades comedies and cartoons reached for Limburger whenever something foul-smelling was needed, especially in Warner Brothers cartoons. Penelope Pussycat tried to escape from Pepé Le Pew by hiding in a Limburger factory to throw off her scent. A cartoon dog had Limburger dumped on him while reading the “a rose by any other name” line from Shakespeare in 1949’s A Ham in a Role, the “last cartoon of the Golden Age of American Animation”. And, of course, Tom and Jerry had Limburger in damn near every episode.
For some reason, this cheese was available everywhere when I was a kid. No joke: you could go to a Piggly Wiggly on Route 207 in East Bumble, Alabama, and they’d have it by the lunch meat (next to the Oscar Mayer braunschweiger, which I actually like, but never see anyone buy, either). And every single time I saw it, I just had to smell it.
“It can’t be as bad as I remember it,” I’d think. But it always was. Worse even.
“Mr. Head awakened to discover that the room was full of moonlight. He sat up and stared at the floor boards – the color of silver – and then at the ticking on his pillow, which might have been brocade, and after a second, he saw half of the moon five feet away in his shaving mirror, paused as if it were waiting for his permission to enter. It rolled forward and cast a dignifying light on everything. The straight chair against the wall looked stiff and attentive as if it were awaiting an order and Mr. Head’s trousers, hanging to the back of it, had an almost noble air, like the garment some great man had just flung to his servant; but the face on the moon was a grave one. It gazed across the room and out the window where it floated over the horse stall and appeared to contemplate itself with the look of a young man who sees his old age before him.”