More than 60 nations signed a pact to outlaw war after WWI. 

Kellogg-Briand Pact, 1928
Credit: Sueddeutsche Zeitung Photo/ Alamy Stock Photo
Author Rachel Gresh

January 24, 2024


International relations changed dramatically after World War I. Public opinion swayed heavily against war, and peacekeeping measures were at the top of political agendas. One of the most ambitious attempts at deterring future conflict was signed on August 27, 1928: the Kellogg-Briand Pact. It outlawed war between the nations that signed, including France, the United States, the United Kingdom, Germany, Japan, Italy, Belgium, Poland, Czechoslovakia, Ireland, Canada, Australia, New Zealand, South Africa, and India. 

The pact was named after U.S. Secretary of State Frank B. Kellogg and French Minister of Foreign Affairs Aristide Briand, who spearheaded the agreement. Briand had proposed a bilateral pact between France and the U.S. to deter aggression (especially from Germany), but other countries were eventually invited to join. The response was overwhelming. The world was eager to maintain peace: 15 nations signed initially, followed by 47 more in the years that followed. Signatories of the “Paris Pact,” as it was nicknamed, agreed to renounce war as an instrument of national policy and to settle disputes by peaceful means. While acts of aggression were outlawed, self-defense was permitted, a loophole that led to the pact’s eventual downfall. What’s more, there were no real legal consequences for violations, and nations soon began justifying acts of aggression as self-defense, beginning with the Japanese invasion of Manchuria, China, in 1931. Although the pact failed to prevent World War II, it remains in effect today, and its goals are still relevant. In 1945, following the end of the war, key ideas such as renouncing war and promoting peaceful solutions were incorporated into the United Nations Charter, which was signed by many of the same nations that had joined the Kellogg-Briand Pact.

Hans Christian Andersen ruined his friendship with Charles Dickens by overstaying his welcome.

Hans Christian Andersen
Credit: stocksnapper/ iStock
Author Sarah Anne Lloyd

January 22, 2024


Famed British novelist Charles Dickens and Danish fairy-tale writer Hans Christian Andersen (known for The Ugly Duckling and The Little Mermaid) could have been lifelong friends. They met in 1847 at a swanky party; Andersen told Dickens he was “the greatest writer of our time,” and Dickens, in turn, sent Andersen several books he signed as “his friend and admirer.” The pair were close pen pals for the next decade, but their relationship quickly went south in 1857, when Andersen visited Dickens for what was supposed to be two weeks, but stretched on for five.

Dickens later wrote in a letter to his friend William Jerdan — one of multiple letters to multiple people about the visit — that “whenever [Andersen] got to London, he got into wild entanglements of cabs and sherry, and never seemed to get out of them again until he came back here, and cut out paper into all sorts of patterns and gathered the strangest little nosegays in the woods.”

While Andersen, who was notoriously difficult to be around, remembered the visit fondly, for the Dickens family it was both peculiar and exhausting. Andersen was moody, anxious, and sensitive to rejection. Dickens’ daughter Katey called Andersen a “bony bore,” and one morning, Dickens’ wife Catherine found the visitor face down on the lawn, crying and clutching a bad review of his most recent book. After Andersen’s departure, Dickens left a note on the mirror in the guest room: “Hans Andersen slept in this room for five weeks — which seemed to the family AGES.” In the following years, Andersen continued to write letters to Dickens, but Dickens stopped responding.

Ohio’s abbreviation used to be just “O.”

Ohio welcome sign
Credit: fotoguy22/ iStock via Getty Images Plus
Author Timothy Ott

May 22, 2025


With the 1831 publication of Table of the Post Offices in the United States, the Post Office Department acknowledged the population’s reluctance to completely spell out state names on mailing envelopes (such as the always-vexing “Massachusetts”) and for the first time unveiled a list of approved abbreviations. The list largely followed consistent guidelines: States with one-word names were abbreviated with their first and last letters (e.g., Connecticut became “Ct.”); states with two-word names were marked by their initials (New York became “N. Y.”); and territories were shortened to two letters with a “T” (Arkansas Territory became “Ar. T.”). But there were a few anomalies among the group: Alabama and Illinois were denoted by their first two letters (“Al.” and “Il.”); Michigan Territory retained its first three (“Mic. T.”); Missouri used a middle letter (“Mo.”) to avoid confusion with Mississippi (“Mi.”); and Ohio simply became “O.”
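For readers who like to see the pattern, the 1831 guidelines and their exceptions described above can be sketched as a tiny rule set. The function and its name are purely illustrative, assumptions for the sketch rather than anything from the Post Office itself:

```python
# A sketch of the 1831 Post Office abbreviation guidelines described above.
# The exceptions follow the article; "abbreviate_1831" is an invented helper.

EXCEPTIONS = {
    "Alabama": "Al.",      # first two letters
    "Illinois": "Il.",     # first two letters
    "Missouri": "Mo.",     # middle letter, to avoid clashing with Mississippi
    "Mississippi": "Mi.",
    "Ohio": "O.",          # simply "O."
}

def abbreviate_1831(name: str, territory: bool = False) -> str:
    """Apply the general 1831 guidelines to a state or territory name."""
    if territory and name == "Michigan":
        return "Mic. T."   # anomaly: Michigan Territory kept three letters
    if name in EXCEPTIONS:
        return EXCEPTIONS[name]
    words = name.split()
    if territory:
        # Territories: two letters plus "T." (Arkansas Territory -> "Ar. T.")
        return f"{words[0][:2]}. T."
    if len(words) == 1:
        # One-word states: first and last letters (Connecticut -> "Ct.")
        return f"{name[0]}{name[-1]}."
    # Two-word states: initials (New York -> "N. Y.")
    return " ".join(f"{w[0]}." for w in words)

print(abbreviate_1831("Connecticut"))               # Ct.
print(abbreviate_1831("New York"))                  # N. Y.
print(abbreviate_1831("Arkansas", territory=True))  # Ar. T.
print(abbreviate_1831("Ohio"))                      # O.
```

The rule set is mostly regular; as the article notes, it's the handful of collisions (Missouri vs. Mississippi) and oddballs (Ohio) that required special handling.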

When the U.S. Post Office released updated lists in 1874 and 1943 to account for the country’s expansion, it did away with abbreviations for Ohio and other states with short names. Then, in mid-1963, the department established new two- to four-letter abbreviations for all states — with the exception of Ohio, Iowa, and Utah — to fit the templates of new mail processing equipment. When the four-letter designations were still found to be too long, the Post Office shortened them one more time in October 1963 to a consistent two letters, and Ohio, Iowa, and Utah became “OH,” “IA,” and “UT.” And that’s where things stand today. The only state abbreviation that has changed since 1963 is Nebraska’s; originally given the two-letter abbreviation of “NB,” the Cornhusker State was rebranded as “NE” in 1969 to avoid confusion with the Canadian province of New Brunswick.

The creator of Fabergé eggs made a Fabergé potato.

Mosaic Fabergé egg
Credit: PA Images/ Alamy Stock Photo
Author Sarah Anne Lloyd

January 22, 2024


Even if you don’t know the name of Russian jeweler Peter Carl Fabergé, you’re probably familiar with his Fabergé eggs, most of which were made for the Romanov family in their last few decades as Russian rulers. Like most things associated with the Romanovs, these Easter eggs were over-the-top opulent, finely crafted with intricate diamond patterns, hidden treasures, and elaborate bases. A single egg can be worth around $33 million today.

Fabergé didn’t just make eggs, however. His other designs, including fruits and flowers in tiny crystal vases, also fetch a high price. But among all his dainty, ornate work, one unexpected subject stands out: a potato, just under 4 inches long, in a realistic, irregular shape. Crafted around 1890, the Fabergé potato is a polished box carved from pink agate, decorated with a gold-mounted lid and a fleur-de-lis clasp. It’s one of his lesser-known works, but it bears the mark of Michael Perkhin, the master goldsmith who crafted Fabergé’s eggs at the time. Anything from Fabergé’s workshop is extremely valuable, whether it’s an enameled egg with a tiny, perfect gold replica of a palace inside, or a humble spud. Indeed, the potato box was sold by the auction house Christie’s for $93,750 in 2016.

Memorial Day was originally called “Decoration Day.”

Decoration Day in Philadelphia, 1876
Credit: Heritage Images/ Hulton Archive via Getty Images
Author Nicole Villeneuve

May 15, 2024


As the American Civil War came to an end in 1865, communities across the U.S. honored fallen soldiers through local ceremonies at burial sites. On May 30, 1868, the first national ceremony of this kind took place on a day that would come to be known as Memorial Day — though at the time, it was called “Decoration Day.”

A few weeks before the ceremony, John Logan, head of the Grand Army of the Republic, a Union veterans organization, issued a proclamation urging Americans to decorate Civil War soldiers’ graves with springtime’s “choicest” blooms. Logan stated that the May 30 commemoration would be “designated for the purpose of strewing with flowers or otherwise decorating the graves of comrades who died in defense of their country.” About 5,000 people gathered at Arlington National Cemetery for the first official Decoration Day observance. Along with flowers, each grave was adorned with a small American flag.

By the end of the 19th century, Decoration Day ceremonies were taking place on May 30 throughout the country. The name had started to evolve by this time, too; people began using the term “Memorial Day” instead. That moniker, however, didn’t become common until after World War II, and Congress didn’t make the name change official until 1967. A year later, Congress passed the Uniform Monday Holiday Act, declaring that certain federal holidays would be observed on Mondays, including Memorial Day, which was to be commemorated annually on the last Monday in May. Today, the holiday honors all Americans who have died in military service.

The first rockets date back to 1232.

Incendiary arrows in China
Credit: ullstein bild Dtl./ ullstein bild via Getty Images
Author Sarah Anne Lloyd

May 15, 2025


In the Middle Ages, China was a technology leader in explosives. Gunpowder was invented in China as early as the ninth century CE, and the Chinese were also the first to use it to propel weapons — the earliest known rockets. 

During the Battle of Kai-Keng (also called the Siege of Kaifeng) in 1232, Jin dynasty soldiers deployed simple, early rockets — fire arrows propelled with gunpowder — to defend themselves against Mongol invaders. To launch the arrows, soldiers used a tube capped on one end and filled with gunpowder, then attached to a stick for stability — basically a large bottle rocket. Once the tube was ignited, the fire, smoke, and gas that escaped out the open end propelled the tube and the stick toward a target.

Soon after finding themselves on the receiving end of fire arrows, the Mongols started developing rockets of their own. The idea soon spread to Europe, and with a growing number of inventors experimenting with rocketry, the technology continued to develop.

Though the Chinese fire arrows are often cited as the first example of rocketry, some earlier inventions used the same basic principle. In ancient Greece, one inventor launched a wooden pigeon using steam as a propellant. Throughout the next several centuries, similar experiments followed, including in China, where chemists developed firecrackers and other fireworks. 

The term “brunch” first appeared in 1895.

Family enjoying brunch, circa 1800s
Credit: Universal History Archive/ Universal Images Group via Getty Images
Author Anne T. Donahue

January 24, 2024


When British author Guy Beringer coined the word “brunch” in 1895, weekend dining was changed forever. In an essay for Hunter’s Weekly titled “Brunch: A Plea,” Beringer introduced the concept of a breakfast-lunch hybrid, suggesting that readers forfeit their heavy Sunday meals in favor of something lighter, served earlier in the day. The United States soon followed suit; in 1896, The New Oxford, a Pennsylvania newspaper, described the latest dining “fad” in which guests ate after 11 a.m. Initially considered an upper-class experience, brunch was largely reserved for households that had the time and resources to host guests for a leisurely midday meal.

Hollywood helped bring brunch to the mainstream in the 1930s. Movie stars taking transcontinental train trips frequently stopped in Chicago for a late-Sunday-morning bite, and hotels were happy to accommodate. Restaurants soon followed, and by 1939, The New York Times declared Sunday a “two-meal day.” The American public gladly obliged, and not only for their chance to socialize outside a church setting. While Beringer had originally advised diners to substitute tea and coffee for whiskey and beer with the meal, by the middle of the 20th century the brunch crowd was sipping on signature cocktails such as bloody marys and mimosas.

Abraham Lincoln and Harry Truman both served as postmasters.

Abraham Lincoln as postmaster
Credit: Harriet Putnam/ Alamy Stock Photo
Author Timothy Ott

May 15, 2025


Abraham Lincoln and Harry S. Truman are presidents of different eras and renown, but they nevertheless share the distinction of being the only two commanders in chief to have served as a town postmaster.

For the 24-year-old Lincoln, the 1833 appointment as postmaster of New Salem, Illinois, supplied steady pay amid uncertain times, as well as perks that included a free daily newspaper. Lincoln had already failed in a bid for a seat in the state legislature, and the postmaster job gave him an opportunity to build connections around town through personal delivery service, with the mail usually carried in his hat.

The job also led to an event that burnished his reputation as a man of unflagging integrity. After the New Salem post office closed in 1836, Lincoln was left holding some $16 to $18 in leftover funds. When a post office agent dropped by a few months later to collect the balance, Honest Abe produced the exact coins from a sock, having refused to touch the money since it came under his watch.

Unlike Lincoln, Truman never actually undertook the responsibilities of postmaster, although his connection to the role also reflected favorably on his character. Appointed to the position for Grandview, Missouri, in December 1914, Truman immediately passed along the day-to-day tasks — and the paycheck — to a widow who needed the money to support her family. Although he owned a farm and held another job with the town, the not-yet-wealthy future president certainly missed the postmaster salary of approximately $50 per month, noting in his autobiography that it “would have paid two farmhands.”

The world’s oldest computer is more than 2,000 years old. 

The Antikythera mechanism
Credit: Hercules Milas/ Alamy Stock Photo
Author Bess Lovejoy

May 15, 2025


When it was unearthed in 1900 among the ruins of an ancient Greco-Roman shipwreck off the Greek island of Antikythera, the instrument now known as the Antikythera mechanism was more or less ignored. The sponge divers who brought the corroded bronze lump to the surface had no idea they had just recovered an ancient marvel now considered the world’s oldest analog computer, dating back to around 100 BCE. Indeed, they were more interested in the bronze and marble sculptures, coins, and other treasures scattered around the shipwreck. The Antikythera mechanism even broke apart (perhaps because of exposure to the air) a few months later at the National Archaeological Museum in Athens, but it still took more than half a century before its remnants received sustained scholarly attention. 

In 1976, Derek J. de Solla Price, a Yale professor of history of science, published “Gears From the Greeks,” an academic text on the Antikythera mechanism based on more than 20 years of research. Price argued that the fragments of corroded bronze had once been a “calendar computer,” and explained how their complex system of gears and dials had been used to calculate the positions of the sun, moon, and planets on any given date, past or future. The mechanism was also able to track the Egyptian and Corinthian calendars and the Olympiad cycle, calculate eclipses, and more, although some of its functions remain mysterious.

Of course, this wasn’t a computer in the digital, electronic sense; the gears were cranked by hand, and the whole thing was housed in a wooden case about the size of a mantel clock. However, the Antikythera mechanism is often referred to as an analog computer, or an extraordinarily sophisticated calculator. It was called the “most complex scientific instrument from the ancient world” by historian Alexander Jones, and nothing like it appears in the historical record for another thousand years. While scholars still debate some of its intricacies, its existence shows that ancient technology can be leaps and bounds beyond what we once imagined. 

The Eiffel Tower used to be painted yellow.

Repainting the Eiffel Tower, 1932
Credit: Smith Archive/ Alamy Stock Photo
Author Michael Nordine

May 15, 2025


If you’ve ever wondered whether the color of the Eiffel Tower has its own name, the answer is oui: “Eiffel Tower brown,” which consists of three shades and was chosen for the way it blends into the Paris cityscape. But the iconic landmark has undergone several hue changes since the late 19th century, and was even yellow for a time. 

That makeover took place in 1899, when la tour Eiffel, as it’s called in French, received a coat of five colors that spanned from yellow-orange at its base to light yellow at the top. The structure is repainted once every seven years or so, and by 1907 it was entirely yellow-brown. Most recently, between 2019 and 2022, in anticipation of the 2024 Paris Olympics, the tower was repainted with the same gold hue it had in 1907.

Other colors the Eiffel Tower has sported over the years include its original hue of Venetian red, which was applied in the workshop before the tower was actually assembled, as well as ochre-brown and brownish-red. Though most of those don’t sound as spectacular as a monument of the tower’s stature deserves, recreations of the colors suggest they were aesthetically pleasing enough for the landmark to somehow appear in the background of every cafe and hotel room in the city.