It took the Oxford English Dictionary editors five years just to reach the word ‘ant.’ 

Well-worn Oxford Dictionary
Credit: jozef mikietyn/ Alamy Stock Photo
Author Sarah Anne Lloyd

October 31, 2024

The Oxford English Dictionary, also known simply as the OED, is a massive reference book containing not just words — including obsolete ones — and their definitions, but also detailed information on the words’ history and usage. The project was born in 1857, when the Philological Society of London, a group dedicated to the history of literature and words, established a committee to collect words that didn’t appear in existing dictionaries at the time. Their goal was to document the English language going back to Anglo-Saxon times, which ended with the Norman Conquest of 1066 — so quite a lot of territory to cover.

Work started in earnest in 1879, after Oxford University Press signed on to finance and publish the dictionary, at the time called the New English Dictionary (NED). The staff buckled down and got to work reading and researching; editor James Murray estimated the dictionary would take about 10 years to compile. In 1884, five years into the work, the first fascicle (a section of a book published in installments) came out. It covered only the words “a” through “ant.”

The project was clearly more ambitious than the Philological Society had originally imagined. Murray started working on the dictionary full time, and over the next several years he was joined by another editor and two co-editors. The last fascicle was published on April 19, 1928, nearly 50 years after work began. The original plan was for the dictionary to contain 6,400 pages over four volumes; ultimately, the first edition of the NED contained 12 volumes with 15,487 pages, covering a whopping 414,825 words. Today, the OED continues to document the growing English language, and includes more than 600,000 entries, with new words and meanings added regularly.

Men have been shaving since the Stone Age.

Razor-tweezer, circa 1550–1295 BCE
Credit: Penta Springs Limited/ Alamy Stock Photo
Author Michael Nordine

September 25, 2024

Long before the age of electric and five-blade razors, men still felt compelled to shave. The practice is so old, in fact, that it dates back to the Stone Age, roughly 100,000 years ago. The planet was quite cold at the time; somewhat counterintuitively, this made facial hair a liability rather than a way of keeping warm. A long, damp beard doesn’t exactly feel good in the rain or snow, and in extreme enough conditions it could trap enough moisture to freeze against the skin and cause frostbite.

This necessitated the first instances of shaving, though our ancestors’ methods were unsurprisingly primitive. Lacking more advanced tools, the first people brave enough to maintain their facial hair did so with seashells that essentially functioned as tweezers — they simply pulled their hair out. They later moved on to using obsidian shards, which sounds less painful and more effective. Copper razors didn’t appear until ancient Egypt’s Old Kingdom, between 2686 BCE and 2181 BCE, with some featuring decorative flourishes such as goose heads and hippo-shaped handles; in other words, those more primitive methods of shaving lasted far longer than most of us would like to imagine.

People used to send their children through the U.S. mail.

Letter carrier with child in mailbag
Credit: Bygone Collection/ Alamy Stock Photo
Author Sarah Anne Lloyd

September 11, 2024

On January 1, 1913, the United States Post Office began offering parcel service. While private freight companies had already existed for quite some time, the program allowed many more people, including folks in rural communities, to get goods shipped to their front doors. Immediately, Americans started shipping pretty much anything they could think of. One of the first packages sent using the service was a brindle bulldog. College kids started mailing their laundry home. More than one Flushing, Queens, resident received an opossum. But the most brazen early customers trusted the Post Office with the most precious cargo of all: human children.

The first recorded baby delivered via parcel post was James Beagle, an 8-month-old resident of Glen Este, Ohio. His journey wasn’t long: A carrier picked up the “well wrapped” infant from his parents on January 25 and, per the address on an attached card, delivered him to his grandmother just a few miles away. The postage cost 15 cents, and his parents insured him for $50.

This practice was never officially authorized, and in February 1914, the second assistant postmaster general announced that babies could not be transported by mail. But this didn’t stop postal employees, particularly rural ones, from occasionally breaking the rules. Just a month later, a 14-pound baby was shipped 12 miles from her grandmother in Clear Spring, Maryland, to her mother in Indian Springs. On February 19, 1914, 5-year-old May Pierstorff was mailed about 75 miles from her home in Grangeville, Idaho, to her grandparents’ place, which cost 53 cents in postage and was, apparently, cheaper than a train ticket. (In that case, she was chaperoned by a cousin who worked for the mail service.) In 1915, 6-year-old Edna Neff was mailed a whopping 720 miles from Pensacola, Florida, to her father’s home in Christiansburg, Virginia.

That same year, on August 31, 1915, 3-year-old Maude Smith — with a shipping label sewn to her dress, appropriate postage affixed, and snacks in hand — was placed by her mail carrier on a train from Caney to Jackson, Kentucky, to visit her sick mother. When she arrived at her destination, she had a note from a postal clerk to a local postmaster pinned to her dress: “I doubt the legality of the sending, but it was put on the train and I must deliver and report.” The U.S. Post Office actually investigated that case, and although it’s unclear what the outcome was, Smith was one of the last children ever to be mailed.

Yellow pencils used to be sold as a luxury item.

Credit: History Facts
Author Sarah Anne Lloyd

September 19, 2024

Today, yellow pencils are just normal, generic pencils, but when they were popularized in the late 19th century, they were considered the height of luxury. Before then, lacquered pencils were often a sign of low-quality wood that needed to be covered up, and they were usually finished in darker colors such as black or maroon. A decent pencil, meanwhile, would be either plain or varnished wood. But that all changed with the introduction of luxury pencils made with the finest, purest graphite, which came from a mine on the border of China and Siberia. 

The German pencil manufacturer Faber (now Faber-Castell) was the first company to get its hands on graphite from the region, and it allowed for extremely fine-tuned pencil formulas, with 16 different degrees of hardness and softness. This was a big deal in the pencil world, and Faber boasted in its catalogs that “Siberian graphite” was “a household word amongst artists, engineers, designers and draftsmen generally.” Around the same time, pencil-maker Franz von Hardtmuth decided to develop an expensive luxury pencil to compete with Faber’s Siberian graphite. He created a pencil with 17 grades of hardness — one more than Faber’s — and started dressing it up to bring to market. The new pencil got 14 coats of yellow lacquer and tips sprayed in gold paint, and was named the Koh-I-Noor 1500, after the famed large diamond.

Yellow was an auspicious color: It was known as the Chinese color of health and good fortune, so it winked at the sought-after Asian graphite, although it’s unclear where Hardtmuth’s graphite actually came from. Combined with the pencil’s black tip, it also displayed the colors of the Austro-Hungarian flag. The Koh-I-Noor 1500 pencil hit the market in 1888, and, even with the higher price tag, it was a smash hit. Other pencil companies, particularly those eager to associate themselves with Asian graphite, also started painting their pencils yellow; by 1895, even Faber had a “Yellow Siberian” pencil. The American-made Dixon Ticonderoga No. 2 pencil — the yellow one ubiquitous in classrooms today — debuted in 1913.

Isaac Newton stuck a needle in his own eye as part of an experiment. 

Isaac Newton analyzing a ray of light
Credit: © mikroman6—Moment/Getty Images
Author Sarah Anne Lloyd

March 11, 2026

If you’re brave enough, there is no more convenient scientific subject than your own body: You’re intimately familiar with it, you get near-instant feedback, and you always have it with you (though, to be clear, we are not suggesting you do this at home). Researchers who have used themselves as guinea pigs include polio vaccine developer Jonas Salk, LSD creator Albert Hofmann, and one of history’s most famous scientists, Sir Isaac Newton. (And no, this is not about the apple that probably didn’t fall on his head.)

Newton had a keen interest in the study of light and vision, especially the way humans perceive color. He wondered whether colors were created from inside the eye or by forces outside of it, and, if it was the former, whether directly manipulating one’s eye could create the sensation of color.

To find out, Newton tried an experiment with his own sight. He took a bodkin (a kind of large sewing needle), inserted it into his eye socket between his eye and the bone, and pressed it against the back of his eyeball. Then he started to record what he saw, writing, “There appeared severall [sic] white darke & coloured circles.”

It was a neat effect, but not a scientific breakthrough. Newton gained more insight and acclaim through far less invasive experiments with prisms; by demonstrating that a prism could split a beam of white light into its component colors, and that those colors could be recombined into white light, he found that colors are innate properties of light, not tricks of the eye.

Cars used to have foot-operated headlight switches.

Vintage car with headlights on
Credit: © soleg—iStock/Getty Images Plus
Author Sarah Anne Lloyd

March 4, 2026

If you bought a classic car today, you might be surprised to find a small metal button on the floor next to the gas and brake pedals. This is the headlight dimmer switch, and it’s wired directly into the headlights to switch between the high and low beams. It’s an oddity now, but for much of the 20th century, this foot dimmer was the standard in American automobiles.

Floor buttons were common in the 1920s because they were cheap to make and easy to install. (They corroded pretty easily since they were exposed to anything that got tracked into the car, but they were also pretty simple to fix, even for nonmechanics.) While some cars had beam controls on the steering column as early as the 1950s, stick-shift drivers needed both hands for the wheel and the gearshift, so foot controls were preferred for anything that required adjusting while driving.

The design began to change during the oil crisis of the 1970s, when gas prices spiked and drivers became concerned with fuel economy. This contributed to the rise of front-wheel drive and more compact cars, which consolidated controls at the front of the vehicle. European cars already had high-beam controls on the steering column, so it was a logical spot to relocate the dimmer. Ford’s Econoline van and F-150 truck were the last two vehicles to use foot-operated dimmer switches, with the feature lasting into the 1990s.

Julius Caesar never said, ‘Et tu, Brute?’

Death scene of Caesar
Credit: Adam Eastland Art + Architecture/ Alamy Stock Photo
Author Michael Nordine

September 19, 2024

A lot of history’s famous quotes are either misattributed or were never spoken in the first place. In addition to the fact that Gandhi never said, “Be the change you wish to see in the world,” and no one aboard Apollo 13 ever uttered the phrase, “Houston, we have a problem,” Julius Caesar didn’t say, “Et tu, Brute?” (“You too, Brutus?”) as he was stabbed to death by a group of Roman senators that included his supposed bestie. The line comes from Shakespeare’s play The Tragedy of Julius Caesar and is followed by the title character’s last words, “Then fall, Caesar” — as though the betrayal made him lose his will to live more than the stab wounds.

Caesar’s actual last words — or whether he even had the breath to speak any — are unknown. Most ancient accounts, including those of the historians Plutarch and Cassius Dio, hold that he said nothing at all, but they mention that other sources claim he spoke in Greek: “καὶ σύ, τέκνον,” roughly translating to “You too, my child?” In any case, March 15 — the date of Caesar’s assassination in 44 BCE, better known as the Ides of March — has since become associated with doom and foreboding.

Doctors dressed in black until the late 1800s.

Doctor examines patient's health
Credit: maodesign/ iStock
Author Sarah Anne Lloyd

September 11, 2024

Visually, there’s nothing that screams “doctor” more than a white coat. Even as some physicians eschew the garment for attire that’s a little more personal, the symbolism remains; many medical schools even have “white coat ceremonies” for their incoming students. Before the late 19th century, however, doctors wore solemn, formal black attire, both to communicate the seriousness of their work and, possibly, because they were so often working with the dying. Being a doctor didn’t hold the same association with science as it does today, either: A medical degree was pretty easy to come by, and patent medicine scams were rampant.

In the middle of the 19th century, the idea that germs caused disease — something taken for granted now — started to take hold, which ushered in a new era in medicine. White lab coats emerged right along with it, a symbol of cleanliness, scientific rigor, and saving lives. Surgeons were the first to wear them starting in the late 1800s, and then other doctors in hospitals followed suit. Eventually, the attire reached private practice. It became an easy way to distinguish doctors practicing science-based medicine from those just selling snake oil, and white coats were the standard uniform by 1915, not too long after American medical schools underwent major reforms and the federal government started regulating drug claims. Ironically, the spread of germs is now leading people to ditch the white coat, since the cuffs of long-sleeved jackets can accumulate pathogens — so don’t be surprised if the next time you see your doctor, they’re in scrubs and a vest instead.

Victorian wallpaper was toxic.

Paris green paint
Credit: © Chris Goulet
Author Bess Lovejoy

March 4, 2026

In the Victorian era, wallpaper could — and sometimes did — kill. The culprit was in the colors — specifically, a set of wildly popular pigments known as arsenical greens. Beginning in the late 18th century, chemists discovered how to make vivid green dyes from copper arsenite. The first, Scheele’s Green, was invented in 1775 by German-Swedish chemist Carl Wilhelm Scheele. It was followed in 1814 by a brighter, more durable version often called emerald green (also known as Schweinfurt green, Paris green, or Vienna green), invented by German industrialist Wilhelm Sattler. Both contained arsenic — and both became fashionable favorites.

By the mid-19th century, Britain was producing tens of millions of rolls of wallpaper each year. Thanks to new printing technologies (and gas lighting that replaced candles), brightly colored papers could decorate even modest homes. Lush, leafy patterns in brilliant green became especially trendy, embraced by designers associated with the Pre-Raphaelite and Arts and Crafts movements. Not all wallpaper was toxic, but papers printed with arsenical pigments posed real risks.

Arsenic was hardly a secret poison. Victorians used it to kill rats and were well aware it could be deadly if swallowed. Its toxicity has been well known since ancient times, in fact. But many manufacturers insisted that arsenic bound up in wallpaper pigment was harmless unless someone literally licked the walls. Still, some critics suspected otherwise. Doctors in Britain and the United States began reporting mysterious illnesses linked to rooms papered in green; women and children seemed especially susceptible. In moist conditions (read: most of England), the pigments were particularly harmful because damp paper could give off toxic arsenical fumes. Pigment also flaked off the wallpaper and settled into surrounding dust.

Wallpaper wasn’t the only problem. Arsenical greens colored dresses, artificial flowers, children’s toys, candles, sweets, and even food wrappers. One 19th-century household guide, Beeton’s Housewife’s Treasury of Domestic Information, warned against “brilliant green” for its “pernicious influence on the health.”

Public anxiety grew, and although some designers dismissed the “arsenic scare” as hysteria, consumer pressure gradually forced manufacturers to abandon the pigments. Victorian wallpaper didn’t always kill — but when it shimmered an irresistible emerald, it sometimes carried a hidden, deadly cost.

Maryland was the only state to refuse to enforce Prohibition.

A bar in Baltimore, Maryland
Credit: De Luan/ Alamy Stock Photo
Author Kerry Hinton

September 5, 2024

On January 16, 1919, the 18th Amendment was ratified, starting a one-year countdown to the end of the manufacture, transportation, and sale of alcohol in the United States. Although many Americans opposed Prohibition, 46 of the 48 U.S. states at the time ratified the amendment (only Connecticut and Rhode Island rejected it). Prohibition officially began on January 17, 1920, enforced under the Volstead Act. Drinking alcohol was still technically legal, however, and the public’s willingness to flout the law without any real risk opened the door to the illicit activity and violent excess of the Prohibition era.

Although Maryland was the sixth state to ratify the 18th Amendment, its state government did not pass legislation to enforce the federal law. The 18th Amendment stipulated that “Congress and the several States shall have concurrent power to enforce this article by appropriate legislation.” But Maryland was the only state that never passed an enforcement law, refusing to commit any resources to policing the ban. Throughout the 1920s, Maryland earned a reputation as a “wet” state, and the Chesapeake Bay was a hot spot for bootleggers importing illegal liquor from overseas. 

Maryland’s anti-Prohibition stance was led by local politicians such as Governor Albert Ritchie, who called the 18th Amendment an “unwarranted invasion by the Federal Government of the liberties of the Maryland People.” But the state was not alone in disliking the ban on booze, which was widely unpopular by the end of the 1920s. Prohibition inadvertently created a host of new societal problems, including a sharp rise in crime and a severe loss of federal tax revenue. The ratification of the 21st Amendment ended the failed experiment in 1933, and while most Americans celebrated Repeal Day with gusto, the beer may have tasted slightly sweeter at bars and taverns across the Free State.