Michelangelo’s David was censored by Queen Victoria.

Michelangelo’s David. Photo credit: Andrea De Santis/ Unsplash
Author Bennett Kleinman

August 30, 2023

The statue of David is among Michelangelo’s greatest masterpieces, but the sculpture isn’t without detractors. One such critic was none other than Queen Victoria, who reigned over England from 1837 until 1901, centuries after the original David was sculpted in 1504. In 1857, a plaster replica of the David was shipped to Great Britain as a gift to Victoria from Leopold II, the grand duke of Tuscany. While the queen accepted the diplomatic gesture with grace, she was, according to some anecdotal reports, left aghast by the statue’s blatant nudity. The work was sent to be displayed at London’s Victoria and Albert Museum, where curators crafted a plaster fig leaf to obscure the reproductive organ that some found offensive. This leaf was attached with the help of “strategically placed hooks” that allowed it to be lowered into place in anticipation of any visits from female members of the royal family; the statue otherwise remained uncensored. The leaf — a replica of which can still be found in the museum’s collection — was last used during the first half of the 20th century, leaving the replica of David fully nude, as Michelangelo initially intended.

However, even the original David statue endured years of unceremonious censorship. Upon its unveiling in Florence in 1504, the sculpture’s nakedness was concealed by a garland of 28 copper leaves, which remained around David’s waist until at least the mid-16th century. Concurrently, around the year 1541, the Vatican implemented a “Fig Leaf Campaign” to censor nudity in art that it deemed offensive. At first, the Catholic Church sought to cover up the naked figures in Michelangelo’s painting “The Last Judgment,” as well as similar works. Thankfully, many of these once-censored works of art have since been restored to their original condition.

During WWI, Americans called sauerkraut “liberty cabbage.”

Harvest of cabbage, WWI. Photo credit: piemags/NSC/ Alamy Stock Photo
Author Kevin McCaffrey

August 15, 2023

Americans have a long tradition of rebranding any foods that bear the name of a rival nation during times of conflict. When France refused to support the United States’ war in Iraq in 2003, for example, the cafeteria menus in three congressional office buildings in Washington, D.C., changed the name of French fries — which, by some accounts, were actually invented in Belgium — to “freedom fries,” and French toast became “freedom toast.”  

The U.S. pulled a similar move while at war with Germany during World War I: Sauerkraut’s German origins led Americans to rename the condiment “liberty cabbage.” Other foods that we think of as classically American yet bear the names of German cities were also affected. The word “hamburger” comes from Hamburg, Germany, so during the Great War it was rechristened “liberty steak.” The seemingly all-American hot dog, meanwhile, was called a “frankfurter” at the time, and as the connection to Frankfurt, Germany, couldn’t stand, it was rebranded “liberty sausage.” (The term “hot dog” is also sneakily of German origin, as frankfurters were once nicknamed “dachshund sausages” after the long-bodied German breed.) And speaking of dogs, in 1917, the American Kennel Club changed the official name of German shepherds to “shepherd dog,” and in England the breed was renamed “Alsatian.”

The U.S. government poisoned alcohol during Prohibition.

Bootleg liquor circa 1930s. Photo credit: GL Archive/ Alamy Stock Photo
Author Nicole Villeneuve

August 17, 2023

It’s been dubbed the “noble experiment,” a name that came from then-President Herbert Hoover calling Prohibition “a great social and economic experiment, noble in motive and far-reaching in purpose.” In January 1920, the United States banned the manufacture, sale, and transportation of “intoxicating liquors,” a move made in the wake of temperance movements that sought moral and social reform throughout the 1800s, despite a number of failed attempts at similar regional bans along the way.

Despite the ban, the demand for alcohol remained high. Bootlegging — the illegal production, smuggling, and selling of liquor — thrived. Illegitimate drinking spots known as speakeasies flourished, with approximately 30,000 of them in New York City alone by decade’s end. As the defiance against liquor laws intensified, authorities managed to mitigate smuggling from other countries; the bootleggers, however, responded by stealing massive quantities of industrial alcohol — used in automotive fluids, paints, and medical supplies — and refining it to make it drinkable (it became the country’s top source of liquor). Then, the U.S. government tried to beat the rum-runners at their own game. In 1926, President Calvin Coolidge’s administration mandated that manufacturers add even more dangerous chemicals to their industrial products — substances such as gasoline, formaldehyde, and the highly lethal methanol — to dissuade the underground industries and their customers.

The consequences were immediate — and in many cases, fatal. Consumption of alcohol continued despite fear of the additives, and some estimates claim the chemical additions caused around 10,000 deaths in the U.S. Public health officials, including New York medical examiner Charles Norris, lambasted the move. The program officially ended with the repeal of the 18th Amendment in December 1933, which brought Prohibition to a close.

The Founding Fathers actually declared independence on July 2.

Independence Hall. Photo credit: Keith Lance/ iStock/ Getty Images Plus via Getty Images
Author Kevin McCaffrey

June 26, 2023

Independence Day is celebrated in an explosion of fireworks in the U.S. every year on the Fourth of July, but America’s Founding Fathers actually voted to break free from Great Britain two days earlier, on July 2, 1776. The process began nearly a month before that, when the Second Continental Congress introduced a motion calling for independence on June 7, 1776. After weeks of debate, a five-person committee set out to write a statement in favor of breaking from Great Britain. The star-studded committee included future Presidents and longtime rivals Thomas Jefferson and John Adams, as well as Benjamin Franklin of Pennsylvania, Roger Sherman of Connecticut, and Robert R. Livingston of New York. On July 2, 1776, the Second Continental Congress officially voted to declare independence. In fact, Adams was adamant that the ensuing celebrations should be held on the day of the vote: He wrote that July 2 would be “the most memorable Epocha, in the History of America.” 

What actually happened on July 4, 1776, was that the Second Continental Congress formally adopted the Declaration of Independence, which was signed later by 56 leaders from across the colonies. The two days in between July 2 and July 4 were spent working on edits to the document. Almost a quarter of Jefferson’s original words were cut, much to his chagrin — he referred to the revisions as “mutilations.” On this point even Adams agreed with him, observing that “they obliterated some of the best of it.” 

New England had a series of “vampire panics” in the 19th century.

Banning a vampire. Photo credit: Sunny Celeste/ Alamy Stock Photo
Author Nicole Villeneuve

June 2, 2023

Vampires haunted our imaginations long before Bram Stoker’s classic 1897 novel Dracula popularized the vision of the blood-sucking monster we know today. They’ve existed as fearsome supernatural forces throughout centuries of folktales and superstitions in just about every culture across the globe — including the United States. In fact, in the late 18th century and throughout the 19th century, many residents in rural Rhode Island, Connecticut, Massachusetts, Maine, and Vermont believed vampires were preying on their communities. The panic was fueled not only by superstitions, but also by a lack of understanding of medical science and infectious disease. 

In the early 18th century, what we now know to be tuberculosis began ravaging rural New England at epidemic rates, killing an alarming number of people. Victims became gaunt and pale, and often coughed blood, and some residents became convinced that vampires were at fault (though plenty of others remained skeptical). The bacterial lung infection is highly transmissible and at the time often consumed entire families, further fueling the belief that vampiric forces were returning to claim more victims. The vampires weren’t necessarily seen as corporeal beings rising from the dead under the cover of night; according to author and folklorist Michael E. Bell, the belief was that vampires maintained a sort of spiritual connection with the families, and exploited it to drain their life force.

In an attempt to stop the deadly spread, a vampire hunt took place throughout the region. Graves were exhumed and bodies were sometimes turned over or parts rearranged; other times, vital organs were removed and even burned, a practice believed to prevent the undead from rising from the grave and causing further harm. Roughly 80 cases of exhumation were recorded between the late 1700s and 1892, though many more are presumed to have taken place. As outlandish as it may seem now, the New England vampire panic remains a relevant reminder that the monsters we create are often a reflection of our own fears, unknowns, and anxieties.

King Louis XIV reportedly owned 413 beds.

Louis XIV at Versailles. Photo credit: pictore/ iStock
Author Bennett Kleinman

June 2, 2023

With a reign totaling 72 years and 110 days, King Louis XIV held the throne longer than any monarch in the history of France — or the world, for that matter. The Sun King, as he was known, embraced opulence to a historic degree throughout his reign (from 1643 to 1715). Case in point: His Majesty reportedly owned no fewer than 413 beds, which were considered status symbols at the time. Few people had the wealth, let alone space, to afford and display such a vast collection of luxurious furniture, but the French monarch was a uniquely ostentatious individual. The beds — 155 of which were characterized as boasting greater importance than the others — were dispersed throughout France’s various royal palaces for the personal use of Louis and his family. Mind you, these beds weren’t mere cots, but rather ornately adorned furnishings with features such as fabrics from the far reaches of Persia (modern Iran) and China, as well as gold plating, high pillars, and intricate embroidery.

In 17th-century France, royal bedrooms were often treated like reception areas rather than private quarters. Thus, Louis XIV invited his many guests and dignitaries to stand behind a special railing in his bedroom while he held court. Lavish beds such as the one at the Palace of Versailles were frequently used by Louis for official business, and he would sprawl out atop the gilded linens during these stately meetings. It was considered a true honor for those guests to gaze upon Louis XIV as he dozed, and it was deemed particularly thrilling to watch the king fall asleep or wake up from his deep slumbers.

There’s an ‘old’ Zealand.

Province of Zeeland. Photo credit: JeroenVerhulst/ Getty Images
Author Michael Nordine

June 2, 2023

Some “new” places have become so much better known than their namesakes that most of us don’t even think to wonder what those namesakes are. New York is probably the most famous of these, but it’s hardly the only one. Case in point: There’s an “old” Zealand, and it’s nowhere near the “new” one. Dutch navigator Abel Janszoon Tasman first sighted New Zealand’s South Island (the larger of the country’s two main islands, also officially named Te Waipounamu) in 1642, and cartographers gave it the Latin name Nova Zeelandia (“Nieuw Zeeland” in Dutch) in honor of the maritime province of Zeeland in the Netherlands. (The spelling was eventually anglicized to “New Zealand.”)

Zeeland, the least populous of the Netherlands’ 12 provinces, is a group of islands and peninsulas located northwest of Antwerp, Belgium, and features a lion emerging from water on its coat of arms. It could hardly be farther away from the country named after it: The two are 11,000 miles apart, and the maximum distance between any two points on Earth is about 12,450 miles. New Zealand, meanwhile, continues to rank among the best countries in the world by several metrics — it’s No. 2 on both the Global Peace Index and Corruption Perceptions Index, among a number of other impressive rankings. It’s also in good company among other “new” places across the world, many of which mark a symbolic passage from the Old World to the new one.
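The distance claim above can be sanity-checked with the standard haversine (great-circle) formula. Below is a minimal Python sketch; the coordinates for Middelburg (Zeeland’s capital) and Christchurch (on New Zealand’s South Island) are rough illustrative assumptions, not figures from the article.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in statute miles."""
    r = 3958.8  # mean Earth radius in miles
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Approximate coordinates (assumed for illustration):
# Middelburg, Zeeland, Netherlands: ~(51.5 N, 3.6 E)
# Christchurch, New Zealand: ~(43.5 S, 172.6 E)
d = haversine_miles(51.5, 3.6, -43.5, 172.6)
# d comes out in the same ballpark as the article's ~11,000-mile figure,
# and necessarily below the antipodal maximum of ~12,450 miles.
```

The result lands between the article’s two stated numbers, which is what we’d expect: no two points on a sphere can be farther apart than half its circumference.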

John F. Kennedy’s waffle recipe is stored in the National Archives.

Waffle making ingredients. Photo credit: MarieKazPhoto/ iStock
Author Bennett Kleinman

August 23, 2023

John F. Kennedy’s political legacy is still felt today, but the 35th U.S. president also made a lasting culinary impact. Though he delegated the actual cooking to others, JFK’s favorite waffle recipe was widely requested and shared during his time in office, and is now stored in the National Archives. The dish includes standard ingredients such as butter, sugar, and eggs, but mixes sifted cake flour with whipped egg whites to produce a light and fluffy texture that Kennedy enjoyed. The president, who was also fond of orange juice, poached eggs, and broiled bacon at breakfast, topped the waffles with hot maple syrup and melted butter to finish off the decadent treat. 

JFK’s waffle recipe is likely the most famous of any president’s, but Thomas Jefferson may have played a key role in waffles being popularized stateside. Culinary lore suggests it was Jefferson who helped ignite the waffle craze in the 1790s upon his return from France, when he arrived with four waffle irons in his luggage. Jefferson even reportedly served waffles to explorer Meriwether Lewis at the White House prior to the Lewis and Clark expedition, though some historians at Jefferson’s Monticello estate claim the founding father’s impact on colonial waffle culture was minimal. President William Howard Taft also openly expressed a love for waffles during his presidency, though he was known to detest eggs and enjoyed a steak for breakfast almost every morning.

The man who discovered oxygen also invented seltzer.

Fresh seltzer water. Photo credit: Irina Petrakova_6767/ Shutterstock
Author Bennett Kleinman

August 24, 2023

Chemist Joseph Priestley may not be a household name, but his discoveries impact our everyday lives. Born near Leeds, England, in 1733, Priestley not only invented carbonated water (plus the pencil eraser) during his career, but he also independently discovered the element oxygen. Of these accomplishments, his creation of seltzer came first, in 1767, when he lived near a brewery and was fascinated by the gaseous vapors it produced. Priestley mixed sulfuric acid and chalk to form carbon dioxide (though he didn’t know what it was at the time) and used the gas to add bubbles to still water. Shortly thereafter, he earned the prestigious Copley Medal for his publication “Directions for Impregnating Water With Fixed Air.” The beverage was later named “seltzer” in honor of the natural springs found in the German town of Selters.

When it comes to oxygen, Swedish chemist Carl Wilhelm Scheele actually studied the element in 1772, predating Priestley. However, Scheele’s findings weren’t published until 1777, allowing Priestley to conduct groundbreaking studies in the interim. On August 1, 1774, Priestley experimented by heating red mercuric oxide, producing a then-mysterious colorless gas in which a candle burned more vigorously and which was capable of supporting life. Two months later, Priestley presented his findings to French chemist Antoine Lavoisier, who conducted tests of his own, which proved to be more thorough and scientifically accurate. Priestley pushed back on Lavoisier’s subsequent findings, instead embracing archaic scientific theories such as the existence of a fire-like element called phlogiston. Lavoisier persisted, however, and named the new gaseous element “oxygen,” after the Greek “oxy genes,” meaning “acid-forming.”

The U.S. President never had a “red phone” during the Cold War.

Phone in the Oval Office. Photo credit: The Color Archives/ Alamy Stock Photo
Author Kevin McCaffrey

August 2, 2023

The ominous “red phone” on the desk of the U.S. President has been portrayed in movies and political commercials, and even makes an appearance in Jimmy Carter’s presidential museum. There’s just one issue: It never existed. Not only was the Cold War-era emergency hotline between the U.S. and Soviet Union not a red phone, but it wasn’t a telephone at all. After a 1963 meeting in Geneva, a communication system was created that linked teletype machines to send written messages via transatlantic cable. The catalyst for the Washington-Moscow “nuclear hotline” was the 1962 Cuban Missile Crisis, which led both the United States and the Soviet Union to fear a scenario in which they could not communicate quickly enough during an emergency, leading to nuclear war.

What’s more, the U.S. side of the hotline wasn’t even in the White House; it was installed at the Pentagon in Virginia, on the other side of the Potomac River. In 1986, the communication system was upgraded to high-speed fax, and in 2008 both sides shifted into the 21st century with email. In 2013, President Barack Obama’s administration added a new channel to the hotline to be used for communication specifically about cybersecurity. Technicians still test the hotline on an hourly basis, sending messages back and forth. As to where the red phone myth began, most evidence points to Hollywood. In 1964, two movies, Dr. Strangelove and Fail Safe, referenced a “red phone” in the context of the Soviet government and a nuclear emergency.