William Shakespeare invented the name Jessica.

Engraving of Jessica in “The Merchant of Venice”
Credit: Kirn Vintage Stock/ Alamy Stock Photo
Author Michael Nordine

October 16, 2025


William Shakespeare wrote dozens of plays — at least 36, by most counts — and coined nearly as many phrases that are still in use. We wouldn’t call things we don’t understand “Greek to me” were it not for Julius Caesar, wouldn’t refer to jealousy as a “green-eyed monster” without Othello, and wouldn’t find ourselves in a “brave new world” without The Tempest, among other examples. Nor would we have the name Jessica, which the Bard invented around 1597 while writing The Merchant of Venice.

Thought by some to have been derived from Iscah, the name of Abraham’s niece in the Bible, the name Jessica first appeared as that of the daughter of Shakespeare’s villainous moneylender Shylock. Jessica is no fan of her father, however: She absconds with a chest of his gold while eloping with Lorenzo against his wishes. This betrayal fuels Shylock’s vengeful insistence on exacting a pound of flesh from a Venetian merchant in lieu of the money owed to him.

Jessica was among the most popular names for baby girls throughout the 1970s and ’80s in the U.S., though its popularity in America has waned in the decades since. In England and Wales, however, Jessica was the most popular name for baby girls as recently as 2005.

Chinese checkers was invented in Germany.

Kids playing Chinese checkers
Credit: Bettmann Archive via Getty Images
Author Sarah Anne Lloyd

May 21, 2024


Chinese checkers is a classic board game, featuring several marbles that move along a series of holes grouped into a six-pointed star shape. As in traditional checkers, the marbles can move to empty spaces or jump over adjacent pieces, with the goal of getting all the pieces to one side of the board. Contrary to its name, however, Chinese checkers has nothing to do with China — it’s a variation of a game called Halma (meaning “jump” in Greek). Halma features a square board, and a star-shaped version was invented in Germany around 1880. Originally called Stern-Halma, the star-shaped game was published by the German game and puzzle company Ravensburger in 1892. 

The game arrived in America in the late 1920s under the name Hop Ching checkers, and, later, Chinese star checkers. It was advertised as “a game from the Orient for all ages,” but this backstory was invented entirely for marketing, to give the product an air of mysticism. While six-pointed stars such as the one on the Chinese checkers game board have a long history in many cultures, including some Asian spiritual traditions, the board’s star shape actually originated in Germany, decades before the game’s fictional association with China. Nevertheless, Chinese checkers is the name that stuck in the American lexicon. 

Birthdays weren’t always celebrated because few people even knew their birth date.

Family celebrating child’s birthday, 1950s
Credit: H. Armstrong Roberts/ Retrofile RF via Getty Images
Author Nicole Villeneuve

October 16, 2025


Birthdays are often a big deal in the modern world, marking milestones such as being old enough to drive or vote, or acknowledging the start of a new decade of life. But for most of human history, a birthday was just another day, and many people didn’t even know when theirs was.

Ancient societies sometimes recorded births within noble or wealthy families for lineage or inheritance purposes, but systematic recordkeeping was rare. It wasn’t until the 1530s in England that churches were mandated under King Henry VIII to document baptisms. Similar practices appeared in colonial America, though coverage varied widely by region and denomination. In Massachusetts, for example, Puritans kept birth and baptismal records as early as the 1600s, while in many Southern states, records remained patchy into the 20th century.

For centuries, tracking your age was inconsistent, as was celebrating it. Ancient feasts marked special occasions for elite members of society (sometimes even with cake), but it wasn’t until the mid-19th century that birthdays were commonly commemorated with parties, especially for children. With the rise of industrialization, strict work schedules, pay periods, and age-based job requirements meant tracking calendars closely, making people more aware of their age and birthday. In the early 1900s, state-level birth registration became more widespread in America: By 1933, all U.S. states were participating in official birth recordkeeping, and by the mid-1940s, most Americans had birth certificates.

The oldest recording of a president is the voice of Benjamin Harrison.

Edison wax cylinder phonograph
Credit: Science History Images/ Alamy Stock Photo
Author Bennett Kleinman

October 16, 2025


Some presidential voices are instantly recognizable, such as John F. Kennedy’s distinctive New England accent or Ronald Reagan’s folksy tone. But we’ll never definitively know what presidents such as George Washington or Abraham Lincoln sounded like, since there are no audio recordings of their voices. The oldest existing recording of a U.S. president is the voice of Benjamin Harrison, the 23rd commander in chief. Harrison served from 1889 to 1893, and the audio recording dates to around his first year in office. 

Harrison’s voice was captured on a wax cylinder phonograph, a recording device developed by Thomas Edison in the late 1880s. It captures him recounting the first Pan-American Congress, a diplomatic event attended by leaders from several countries in the Americas. The recording features Harrison saying, “As president of the United States, I was present at the first Pan-American Congress in Washington, D.C. I believe that with God’s help, our two countries shall continue to live side-by-side in peace and prosperity. Benjamin Harrison.” Since Harrison, the voice of every U.S. president has been recorded, with the exception of his direct successor, Grover Cleveland.

It’s worth noting that while Harrison’s voice is the oldest surviving audio recording of a U.S. president, it likely wasn’t the first. On April 18, 1878, Thomas Edison visited President Rutherford B. Hayes at the White House and brought along his phonograph to demonstrate how the device worked. It’s believed that Edison recorded Hayes’ voice in the process, though any evidence of this recording has since been lost.

Medieval people sometimes slept in two shifts.

Medieval painting of a dormitory
Credit: Photo Josse/Leemage/ Corbis Historical via Getty Images
Author Michael Nordine

October 16, 2025


Most adults need about seven hours of sleep per night, but there’s nothing in the rule book about getting all seven hours at once. For the thousand or so years that encompassed the Middle Ages, in fact, people in Western Europe sometimes slept in two shifts: once for a few hours, usually beginning between 9 p.m. and 11 p.m., and again from roughly 1 a.m. until dawn. The hours in between were a surprisingly productive time known as “the watch.” People would complete tasks and chores, relieve themselves (sometimes directly into the fire keeping them warm), check on any farm animals they were responsible for, pray, socialize, and be intimate.

Sleep was also communal at the time, with entire families often sharing a single bed; sometimes visitors or even strangers passing through the area bunked together. People often luxuriated in bed and conversed with one another during the waking hours at night, apparently in a more relaxed, informal manner than they would during the day — being half-asleep on the same pillow does lend itself to a certain closeness, after all. They would then fall back asleep until morning. 

Biphasic sleep fell out of practice in Europe with the Industrial Revolution, as the rise in artificial lighting altered people’s circadian rhythms by allowing them to stay up later. Because folks still had to wake up at the same early hour, however, they began getting all their rest in one shift rather than two.

Walking was a competitive sport in the 19th century.

Pedestrianism competition
Credit: INTERFOTO/ Alamy Stock Photo
Author Sarah Anne Lloyd

May 21, 2024


At 1 a.m. on March 10, 1879, the arena at Gilmore’s Garden in New York City (later renamed Madison Square Garden) was absolutely packed with screaming fans of America’s latest sports craze: pedestrianism. That’s right, competitive walking. Outside the venue, fans tried to shove their way in, breaking windows and scaling the roof. It was no less chaotic inside, where ticketholders scrambled on top of tables, chairs, and each other’s shoulders to get a better view. That day marked the start of the Astley Belt race, essentially the Super Bowl of walking. Contestants had to circle the 1/8-mile track for six days straight and cover at least 450 miles, and whoever traveled farthest was declared the winner. Athletes were not permitted to leave the track; instead, they had trackside tents or cottages where they could get a little rest or medical attention.

Americans’ fascination with pedestrianism can be traced back to one man: Edward Payson Weston, a New York Herald employee with a penchant for long-distance walking. In 1860, he made a bet with a friend on that year’s presidential race, in which the loser had to walk all the way from Boston to Washington, D.C., for the inauguration. Because Weston bet against Abraham Lincoln, he found himself on a 10-day trek through ice and snow that made him a media darling. He went on to organize endurance walks against other competitors, and those contests grew into the sport of pedestrianism.

The sport reached the peak of its popularity in the 1870s and 1880s, at which time it was far more than a novelty. Pedestrianism spawned America’s first celebrity athletes, complete with trading cards and brand endorsement deals. Weston was the first; he was so famous that scientists published studies on his urine. Many later superstars were immigrants and people of color: One of the last great pedestrian celebrities was Frank Hart, a Haitian immigrant with a record-breaking career that included a 565-mile, six-day walk. Plenty of women participated in the sport, too — as the March 1879 Astley Belt race marched on in midtown Manhattan, five women were competing in their own six-day walk up in Harlem.

Kodak accidentally discovered the U.S. was testing the atom bomb.

Gadget atomic bomb
Credit: Everett Collection Inc / Alamy Stock Photo
Author Sarah Anne Lloyd

May 15, 2024


When the United States government detonated the first atomic bomb, nicknamed Gadget, near Los Alamos, New Mexico, on July 16, 1945, it did so in secret — or as secretly as you can test something that creates an explosion reaching 40,000 feet into the air. It was known as the Trinity Test, but as far as the public knew, an Air Force weapons stash had accidentally exploded. Soon after, Kodak started getting complaints that its X-ray film was unusable due to mysterious exposed black spots, called “fogging,” and one research scientist’s quest to find the source of the problem led him to make a startling discovery.

Kodak had already taken great pains to protect its highly sensitive X-ray film from radioactivity. During the 1940s, packaging cardboard was often sourced from wartime plants that also handled radium. To avoid that, Kodak had its packaging produced in mills where it had full control of the raw materials. So after carefully testing the fogged film, Kodak researcher Julian H. Webb was surprised to find that the packaging was to blame. He traced the contamination to a strawboard mill along the Wabash River in Indiana and isolated the issue to a batch produced on August 6, 1945, concluding that the damage wasn’t from radium, but from an unknown radioactive material. He soon got word that another mill in Tama, Iowa, about 350 miles away, had similar contamination. After ruling out the straw used to make the packaging, he concluded that the wind had blown in contaminated precipitation from somewhere else.

By the time Webb realized this, the U.S. had already dropped two nuclear bombs on Hiroshima and Nagasaki, Japan. But after carefully analyzing the material, Webb determined that the radiation must have come from a detonation within the United States. It’s not clear when exactly he concluded that it was a direct result of the Trinity Test, but he was communicating with the lab at Los Alamos by 1947. (The general public learned of the Trinity Test only after the bombings of Hiroshima and Nagasaki in August 1945.) When Webb published his report in 1949, he made the connection clear: “The most likely explanation of the source of this radioactive contaminant appears to be that it consisted of wind-borne radioactive fission products derived from the atom-bomb detonation in New Mexico on July 16, 1945.”

Boats used to be powered by horses.

Toronto’s first horse-powered ferry
Credit: History and Art Collection/ Alamy Stock Photo
Author Michael Nordine

October 9, 2025


Horses tend to be known for their land-based achievements, but that doesn’t mean equines aren’t sometimes aquatic. In fact, for more than a century, beginning in the 1810s, horse-powered ferries were a common form of transportation in the U.S. Also known as team boats, they were most often used on lakes and rivers — even a team of Clydesdales can’t cross the Atlantic — and worked by having a small group of horses (usually between three and five) walk in a circle on the deck while attached to a wheel that turned the boat’s gears.

Another form of horse boat, invented in 1819, had the horses stand in place atop a turntable that rotated beneath them to drive the wheel. This design was both easier on the animals and freed up more space for passengers on the deck. Horse ferries were especially popular in the Northeast in general and New York in particular, though they became less common by the end of the 19th century. The last known team boat remained in service on the Tennessee River until the late 1920s and was propelled by a single blind horse.

The Supreme Court hasn’t always had nine justices.

U.S. Supreme Court
Credit: Douglas Rissing/ iStock
Author Michael Nordine

May 15, 2024


The U.S. Supreme Court was established by the Constitution, but it was the Judiciary Act of 1789 that organized the third coequal branch of government and set up a high court to which all others are inferior. Notably missing from that federal statute: any stipulation that the highest court in the land should have nine justices. In fact, the bench has ranged from as few as five to as many as 10 justices over the last two and a half centuries. As you might expect, the process by which that number has changed hasn’t exactly been apolitical. The Supreme Court had six justices when George Washington signed the act into law on September 24, 1789; its membership was almost reduced to five in 1801 before being increased to seven in 1807 — two changes that were motivated by legislators hoping to either weaken or strengthen the president’s power to appoint judges.

After that, SCOTUS was composed of seven justices for three decades before being expanded in 1837, allowing President Andrew Jackson to add two more people to the bench. Its membership then increased by one during the Civil War to ensure that majority decisions would be written by justices who were pro-Union and anti-slavery, before being restored to the current number, nine, after Ulysses S. Grant was elected president in 1868. Court expansion has been a hot topic at several points since, most notably when Franklin D. Roosevelt tried and failed to expand the Supreme Court as part of the New Deal.

There was a ‘wheelbarrow craze’ in the 19th century.

Woman pushing wheelbarrow, c. 1900
Credit: M&N/ Alamy Stock Photo
Author Sarah Anne Lloyd

October 9, 2025


The first thing you need to know in order to understand Victorian England’s “wheelbarrow craze” is that pedestrianism, or competitive walking, was all the rage in the U.S. and U.K. in the late 19th century. The sport produced some of the first celebrity athletes, complete with collectible cards and brand endorsements.

Amid this trend, a Scottish former circus performer named Bob Carlisle saw the potential to make a name for himself after watching an American walking celebrity who was touring Britain. In 1879, Carlisle announced he’d be walking from Land’s End in Cornwall to John O’Groats in northern Scotland and back, a journey of more than 1,600 miles. But the former tiger tamer wasn’t content to win on walking alone, and decided to push a wheelbarrow the entire way — a stunt that drew large crowds and plenty of press attention along the route.

Copycats started to appear during his journey, including an 8-year-old and an 11-year-old who pushed little wheelbarrows about 13 miles from Newcastle to Sunderland in England. The papers started calling it a “wheelbarrow craze” in the autumn of 1886, when two competing wheelbarrowists named James Gordon and Sawdust Jack started racing to London. Gordon had the longer journey from Dundee, Scotland, but Sawdust Jack departed from Newcastle with a heavier wheelbarrow and a much later start. Only Gordon actually finished, but the crowds went so wild upon his return to Dundee that they smashed his wheelbarrow and nearly killed him. (Carlisle, who was at sea at the time, missed the whole thing.)

The race for long-distance wheelbarrowing fame and fortune drew all kinds of people, including a singer hoping to pick up gigs along the way, a one-armed man with an adapted wheelbarrow, a miner with a 336-pound load, and a woman pushing not a wheelbarrow, but her 8-month-old infant in a perambulator. Alas, as with most fads, the public eventually lost interest, and crowds started dwindling during the summer of 1887.