WWII soldiers carried emergency chocolate that tasted intentionally bad.

WWII Ration D bar
Credit: Craig Brown/ Alamy Stock Photo
Author Timothy Ott

December 18, 2025

In 1937, Captain Paul Logan of the U.S. Army Quartermaster General’s office visited the Hershey Chocolate Corporation with the mission of creating a chocolate bar for use as an emergency ration for soldiers. Among the requirements, this bar needed to supply a hearty dose of energy and have a higher melting point than normal chocolate, so as to remain solid in sweltering conditions. The bar was also meant to taste “a little better than a boiled potato,” in Logan’s words, meaning it had to be unappetizing enough to prevent soldiers from scarfing down the ration when no emergency was afoot.

Following those instructions wasn’t as simple as it sounds. Hershey chemist Sam Hinkle came up with a suitable formula of chocolate liquor, skim milk powder, cocoa butter, oat flour, vanillin, and barely enough sugar, but the resulting paste-like mix was too thick to be poured into molds and challenging to extract after setting. However, Hershey was eventually able to churn out 100,000 of what became known as “D ration” bars per day by 1939, and by the end of 1945, production for these and other military-designated chocolates was up to around 24 million units per week.

As for the order to ratchet up the unappetizing quotient, it’s safe to designate this mission accomplished. The dense ration bars were difficult to bite into with their high melting point, and the lower sugar level meant taste buds were overwhelmed by the bitterness of dark chocolate. As a result, these unappealing bricks were typically saved for emergencies as intended, but there turned out to be an unforeseen use for them as well. With communities across war-torn Europe sometimes desperate for any kind of food, U.S. regiments found they could win over suspicious locals by handing out the chocolate rations.

The ancient Roman calendar started the year in March.

Romulus, founder of Rome
Credit: Chronicle/ Alamy Stock Photo
Author Darren Orf

December 19, 2023

Today’s calendar comes with a lot of quirks, not least of which are the names of the last four months. Although September, October, November, and December begin with Latin number prefixes, the numbers don’t match the months’ places in the calendar. October, for example, is not the eighth month of the year but the 10th. So what’s going on here? Well, blame the Romans.

Ancient Rome’s original calendar, which according to myth was created by the city’s legendary founder Romulus in the eighth century BCE, contained only 10 months. The year lasted 304 days, beginning in March (named for the Roman god Mars) and ending in December, the 10th month (marking the annual harvest). But because this calendar woefully underestimated the true length of a year, it was replaced with a new calendar by the Roman king Numa Pompilius around 713 BCE. The new calendar put the year at 355 days long, divided into 12 months, based on the cycles of the moon. It added the month of January to the start of the year and tacked on February at the end, though the latter eventually moved to its current position between January and March. This change threw the numerically named months (which at that point also included Quintilis and Sextilis) out of whack. Strangely, no one seemed to mind, and this quirk of the calendar has been with us ever since.

George Washington was passionate about mules.

Saddled mule
Credit: Historic Collection/ Alamy Stock Photo
Author Michael Nordine

December 18, 2025

Among the many things George Washington couldn’t tell a lie about was his love of mules, which come from breeding a female horse (mare) with a male donkey (jack). Considered the best of both worlds, the humble mule can do as much work as a horse while requiring less food and water; it’s also less stubborn and more intelligent than a donkey. Washington believed mules were the future of agriculture and set out to breed them on a large scale — something that hadn’t been done in the fledgling United States.

But there was a problem with Washington’s plan of revolutionizing America’s beasts of burden: The best donkeys came from Spain, which wouldn’t export the prized creatures without a royal exemption from King Charles III himself. As the hero of the American Revolution and soon-to-be president, however, Washington was fairly well connected.

According to an account from Thomas Jefferson, Charles III eventually learned of Washington’s interest in Spanish donkeys and was happy to find “two of the very best to be procured & sent you as a mark of his respect.” Only one of them survived the long journey despite Washington’s detailed transport instructions (such as “let the Jacks be put separate & with no other Creatures, lest they should get kicked, & hurt themselves or hurt others”), but Royal Gift, as the mule-loving military hero named him, eventually arrived at Mount Vernon safe and sound. Within 15 years, Washington had nearly 60 mules working the land, earning him the title of “Father of the American Mule.”

Razor blades used to be disposed of in holes in bathroom walls.

Razor blade collection
Credit: Cutting30/ Shutterstock
Author Michael Nordine

June 24, 2024

Should you find yourself renovating your bathroom anytime soon, be on the lookout for something strange: razor blades, which used to be tossed into holes in the bathroom wall. Depending on when your home was built, you might even still have one of these (vaguely terrifying) slots either in your medicine cabinet or in the wall above your sink. They were fairly common in the early 20th century, when disposable blades first became widely available. Disposable razors made it safer and easier to shave at home, but problems arose when it came to actually disposing of the blades: Razor blades are sharp, not to mention potential biohazards, and simply tossing them in the trash ran the risk of accidentally nicking yourself with a used blade.

Enter the bathroom wall solution, which essentially delayed the problem by years or decades and passed the burden to someone else. The question of razor blade disposal became less of an issue once better, more durable razors were introduced to the market, especially the kind you throw away entirely, handle and all. Thanks to these innovations, homes built in the 1970s or later are unlikely to include a disposal slot — which, unless you’d like to be greeted by hundreds of rusty blades when you knock down your bathroom wall, can only be a good thing.

What became the Statue of Liberty began as a monument for Egypt.

The Statue of Liberty
Credit: ClassicStock/ Alamy Stock Photo
Author Michael Nordine

December 18, 2025

One of the greatest gifts America ever received was originally envisioned for another nation. Before creating “Liberty Enlightening the World,” as the Statue of Liberty is officially known, French sculptor Frédéric Auguste Bartholdi set to work on “Egypt Carrying the Light to Asia,” which was meant to be placed at the entryway of the Suez Canal in Port Said, Egypt. It would have looked fairly similar to Lady Liberty, depicting a robed woman of Upper Egypt (Sa’id Misr) holding a torch. Bartholdi was inspired by a trip to Abu Simbel, the site of two iconic temples devoted to Ramesses II, and planned the sculpture to stand 86 feet high on a 48-foot pedestal.

However, the statue was deemed too costly to produce, and the Port Said Lighthouse was erected instead. Bartholdi then repurposed his design after turning his attention to America due to a proposal by Édouard de Laboulaye, a French historian and abolitionist who wanted to honor the century-old alliance between the U.S. and France, as well as America’s successful effort to abolish slavery. The monument, renamed the Statue of Liberty, was constructed in France and presented to Levi Morton, then the U.S. ambassador to France and later vice president under Benjamin Harrison, in a ceremony in Paris on July 4, 1884. The following year, it was disassembled and shipped to New York City, where it still stands today.

Christmas Day used to be a popular day to get married.

Victorian wedding reception
Credit: duncan1890/ DigitalVision Vectors via Getty Images
Author Michael Nordine

December 11, 2025

Nowadays, December is the most common month for engagements worldwide, but not for weddings — that would be October, at least in the U.S., when roughly 17% of betrothed couples tie the knot. The final month of the year is one of the least popular times to get married — December ceremonies account for only about 5% of U.S. weddings total — but that wasn’t always the case. In Victorian Britain, for instance, Christmas Day was a particularly popular occasion for weddings.

People didn’t get hitched during the holidays for the most romantic of reasons. Rather, it was often because Christmas and Boxing Day (December 26) were the only two consecutive days of the year that young, working-class couples were certain to be off work in the 18th and 19th centuries. Most people worked six days a week and couldn’t afford the kind of grand nuptials that have since become common, so group weddings were regularly performed. Christmas ceremonies saw a resurgence during World War II, when soldiers on leave for the holidays took the opportunity to tie the knot before returning to active duty. 

Tinsel used to be made out of real metal.

Christmas tree decorated with tinsel
Credit: ullstein bild Dtl./ ullstein bild via Getty Images
Author Sarah Anne Lloyd

December 10, 2025

Once upon a time, no Christmas tree was complete without a healthy coating of tinsel hanging from the boughs. Those shimmering threads are less common now, though far from gone — especially in households that like an extra touch of sparkly holiday magic. But today’s tinsel, which is usually made of PVC plastic, has a different look and feel than vintage varieties, which were made of real metal. Before you go scouring antique malls looking for a little Christmas cheer, however, know that older tinsel isn’t always better, because sometimes, that metal was lead.

Tinsel made of tin-laminated brass or silver-plated copper began gracing Christmas trees in wealthy American homes in the late 19th century. At the turn of the 20th century, mass production drove the price down and made the decoration accessible to more households. Lyon, France, was a tinsel manufacturing powerhouse, but factories struggled to keep up with U.S. demand due to metal rationing during World War I.

Lead became a popular material for tinsel after a German company received a patent for lead tinsel in 1904, and American companies followed suit. (At the time, the United States was the world’s largest producer and consumer of refined lead.) It remained popular for decades, so much so that tinsel was sometimes referred to as “lead icicles.” Scientists already knew that lead could be toxic, but activism in the 1970s started drawing attention to the hazards the substance posed to children, and the FDA pulled lead tinsel from the market in 1972. Luckily, a Dow Chemical engineer had patented an iridescent plastic tinsel in 1969, so a replacement was already waiting in the wings.

In 1915, a three-minute long-distance phone call cost the equivalent of more than $600.

Man using telephone, circa 1915
Credit: imageBROKER.com/ Alamy Stock Photo
Author Sarah Anne Lloyd

December 10, 2025

Most people don’t think twice about making long-distance phone calls today; now that 98% of Americans use mobile phones, people don’t even change their area code when they move. But cross-country conversations used to carry a hefty price tag, and they were at their priciest when the technology first emerged. 

In January 1915, about six months after the first transcontinental telephone line was installed, a three-minute call from New York City to San Francisco cost a whopping $20.70, the equivalent of more than $600 today. The shorter the distance, the smaller the price: If you were on the East Coast and wanted to talk to someone in Denver, for instance, you’d have to justify spending $11.25 for the call, around $360 today. Phones were still a luxury item in the United States in 1915, with around one phone for every 10 people. The telegraph was a far more cost-effective way to deliver an urgent message; it cost about a dollar to send a telegram from NYC to San Francisco, the equivalent of around $30 today.

Over time, long-distance rates came down. By 1960, there were around four phones for every 10 people, and a cross-country call was $2.25, or around $25 today. It’s still hard to imagine spending that much just to dial a phone with a different area code, but at least by the mid-20th century the average person could call their grandma once in a while.
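The conversions above boil down to simple multiplication by an inflation factor. Here is a minimal sketch of that arithmetic in Python; note that the multipliers below are rough approximations back-solved from the article's own figures, not official CPI data:

```python
# Approximate "dollars today per dollar then" multipliers, inferred from
# the article's conversions (e.g., $20.70 in 1915 ~ $600+ today).
CPI_MULTIPLIER = {1915: 29.5, 1960: 11.1}

def in_todays_dollars(amount: float, year: int) -> float:
    """Convert a historical dollar amount to a rough present-day value."""
    return amount * CPI_MULTIPLIER[year]

# A three-minute NYC-to-San Francisco call in 1915:
print(round(in_todays_dollars(20.70, 1915)))  # prints 611

# The same call by 1960, after rates fell to $2.25:
print(round(in_todays_dollars(2.25, 1960)))   # prints 25
```

The same one-line multiplication underlies all of the article's "equivalent today" figures; only the multiplier changes with the year.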

Humans ate popcorn at least 6,700 years ago.

Corn kernels
Credit: Yuriy T/ Alamy Stock Photo
Author Sarah Anne Lloyd

December 10, 2025

Corn, or maize, is among the world’s oldest domestic crops, first cultivated from a wild grass called teosinte around 9,000 years ago in what’s now southern Mexico. A few thousand years later, corn arrived in South America, where at least 6,700 years ago, popcorn was born.

Archaeologists discovered evidence of the oldest known popcorn, along with several other varieties of corn, near Peru’s northern coast in 2012. Because South America was outside the area where teosinte naturally grows, ancient farmers could breed new types of corn far more efficiently without worrying about cross-pollination from wild plants. 

In order for corn to pop, it needs a relatively tough outer shell that can trap heat and moisture inside long enough to build pressure. Today, we typically remove the kernels from the cob and dry them before cooking them, often in the microwave. Popcorn predates even basic pottery in the region, however, so the first people to snack on it probably just held the cob directly over the fire.

Some presidents didn’t take their presidential oath on a Bible.

Inauguration of Lyndon B. Johnson
Credit: World History Archive/ Alamy Stock Photo
Author Michael Nordine

December 4, 2025

Presidents of the United States are typically sworn in with their hand on a Bible, but American heads of state aren’t always typical. John Quincy Adams, for instance, placed his hand on a book of law, and according to some reports, Thomas Jefferson used his own handcrafted book, The Philosophy of Jesus of Nazareth. Theodore Roosevelt was sworn in after his predecessor William McKinley was assassinated and thus didn’t have time to use any book at all, and at his first inauguration, Calvin Coolidge had a Bible on the table but didn’t place his hand on it, as doing so would have gone against the puritan belief that physical objects shouldn’t be accorded the same respect as God. Lastly, Lyndon B. Johnson, who hastily took the oath aboard Air Force One in the immediate aftermath of John F. Kennedy’s assassination, thought he was using a Bible but was actually using a Catholic missal.

The presidential oath of office is simple and to the point: “I do solemnly swear (or affirm) that I will faithfully execute the Office of President of the United States, and will to the best of my ability, preserve, protect, and defend the Constitution of the United States.” Like many of America’s governmental traditions, being sworn in on a Bible can be traced back to George Washington. The founding father went so far as to kiss the book, as did several of his successors.