Since the establishment of the office in 1789, 45 people have served in 47 presidencies. Each president has brought their own brand of political discourse to the role, and historians tend to remember these leaders primarily for their major historical achievements and policy decisions.
But behind the presidents’ political legacies lie plenty of lesser-known details we don’t hear about as often, whether it’s a past life as an executioner, a penchant for skinny-dipping, or a fierce dislike of broccoli. Here is a surprising fact about every U.S. president.
George Washington
George Washington’s second inaugural address, delivered on March 4, 1793, was the shortest ever at less than two minutes long and only 135 words. (The average length of an inaugural address is 2,350 words.)
John Adams
John Adams was the first U.S. president to inhabit the White House. He moved into the unfinished and largely unfurnished residence in November 1800, with only six rooms completed.
Thomas Jefferson
Thomas Jefferson helped popularize ice cream, French fries, and macaroni and cheese in America. He brought recipes from Europe that his staff prepared for guests at the White House.
James Madison
James Madison was the shortest president. He measured 5 feet, 4 inches (163 centimeters) tall and is estimated to have weighed about 100 pounds.
James Monroe
Monrovia, the capital of Liberia, was named after James Monroe, making it the only capital city outside the U.S. named for an American president.
John Quincy Adams
John Quincy Adams had a daily routine that involved early morning nude swims in the Potomac River. He maintained his habit of swimming in the Potomac into his late 70s.
Andrew Jackson
Andrew Jackson was involved in at least a dozen duels during his lifetime (some put the number as high as 100), most of which were resolved without bloodshed. But Jackson did kill one opponent and was shot twice himself.
Martin Van Buren
Martin Van Buren was the only U.S. president who spoke English as a second language. He grew up in a Dutch community in upstate New York, and his first language was Dutch.
William Henry Harrison
After only one month in the White House, William Henry Harrison became the first president to die in office, serving the shortest tenure in U.S. presidential history.
John Tyler
John Tyler fathered 15 children with two wives, more than any other president. His first child was born in 1815, when Tyler was 25, and his last in 1860, when the former president was 70.
James K. Polk
Some of James K. Polk’s contemporaries considered him to be a rather boring man. This likely wasn’t helped by his wife, Sarah, who banned hard liquor and dancing at the White House.
Zachary Taylor
In 1848, the Whig National Convention nominated Zachary Taylor for president of the United States. Letters were sent to Taylor requesting his acceptance, but he didn’t respond — he had instructed the local post office not to deliver his mail to avoid postage fees, and didn’t get official word of his nomination for weeks.
Millard Fillmore
Millard Fillmore came from a large, poor family and received little formal education until he was 18. When he became president, he and his wife, Abigail, established the first permanent library in the White House.
Franklin Pierce
Franklin Pierce was the first U.S. president to recite his inaugural speech entirely from memory — all 3,336 words of it.
Andrew Johnson
Andrew Johnson was supposed to be assassinated on the same night as Abraham Lincoln, as he was serving as Lincoln’s vice president at the time. But his would-be assassin, George Atzerodt, lost his nerve and got drunk instead.
Ulysses S. Grant
Ulysses S. Grant’s real name was Hiram Ulysses Grant. He wound up with the moniker we know today due to a mistake by his benefactor on his application form to West Point, which he was never able to correct.
Rutherford B. Hayes
As part of his “Great Western Tour” in 1880, Rutherford B. Hayes became the first president to visit the West Coast while in office.
James A. Garfield
When President James A. Garfield was shot by an assassin on July 2, 1881, he was attended to by Charles B. Purvis — the first Black physician to treat a sitting president.
Chester A. Arthur
Chester A. Arthur appreciated the finer things in life. His administration spent $30,000 (around $2 million today) lavishly refurbishing the White House, and his love of clothing — he reputedly owned 80 pairs of trousers — earned him the nickname “the Gentleman Boss.”
Grover Cleveland
Grover Cleveland was the only president to officially serve as an executioner. As the sheriff of Erie County, New York, he personally carried out two hangings.
Benjamin Harrison
Benjamin Harrison was the first president to have electric lighting installed in the White House, but he and his wife, Caroline, were afraid to touch the light switches for fear of electrocution.
William McKinley
William McKinley’s favorite pet was a double yellow-headed Mexican parrot that he named Washington Post and taught to whistle “Yankee Doodle Dandy.”
Theodore Roosevelt
Teddy Roosevelt was disappointed with his official presidential portrait by artist Théobald Chartran, which his family jokingly said made him look like a “mewing cat.” Just before the end of his presidency, Roosevelt had the painting removed from the White House and burned.
William Howard Taft
In 1910, William Howard Taft became the first U.S. president to throw the ceremonial first pitch of a baseball season.
Calvin Coolidge
Considering it to be good for his health, Calvin Coolidge had his head rubbed with petroleum jelly every morning while he ate his breakfast in bed.
Herbert Hoover
While living in China, Herbert Hoover and his wife were caught up in the Boxer Rebellion, an uprising against foreigners that occurred in 1900. While they were trapped during the Siege of Tientsin, Hoover took a leading role in the defense, building barricades around residential areas.
Franklin D. Roosevelt
FDR suffered from a mild case of triskaidekaphobia, a fear of the number 13. He didn’t like Friday the 13th, wouldn’t travel on the 13th day of any month, and avoided hosting 13 guests at a dinner party.
Harry S. Truman
The “S” in Harry S. Truman’s name doesn’t stand for anything — it’s simply the letter “S” and not an initial for a middle name. According to Truman, the letter was chosen as a compromise between the names of his grandfathers, Anderson Shipp Truman and Solomon Young.
Dwight D. Eisenhower
In 1954, Dwight D. Eisenhower had a putting green installed outside the Oval Office — but squirrels kept digging holes in it. Growing increasingly irate, he had his groundskeepers live-trap the squirrels, which were then released in nearby Rock Creek Park.
John F. Kennedy
When JFK and his crew were stranded on an island during World War II, they carved a rough rescue message onto a coconut shell. Kennedy later had the coconut turned into a paperweight, which sat on his desk throughout his presidency.
Lyndon B. Johnson
Lyndon B. Johnson owned an amphibious car, which he used to prank people. Exclaiming to all on board that the brakes had failed, he’d head straight into a lake — at which point the terrified passengers would learn that the car could actually float.
Richard Nixon
Richard Nixon was a talented musician. He played five musical instruments — saxophone, violin, clarinet, accordion, and piano — and even composed a piano concerto.
Gerald Ford
While still in his late 20s, Gerald Ford worked as a fashion model. He even made it onto the cover of Cosmopolitan magazine.
Jimmy Carter
During a fishing trip in 1979, President Jimmy Carter was “attacked” by a swamp rabbit. He was in a small boat when the agitated rabbit swam toward him, possibly trying to escape a predator. The press had a field day with the story, dubbing it the “killer rabbit” or “banzai bunny” incident.
Ronald Reagan
Over the course of seven summers, the young Ronald Reagan was a lifeguard at Lowell Park on the Rock River. During that time, he saved 77 lives — a number he kept track of by cutting a notch in a log each time he pulled someone from the water.
George H.W. Bush
George H.W. Bush had a passionate dislike for broccoli. While reports that he banned it from the White House and Air Force One may have been exaggerated, Bush is on record as having said, “I’m president of the United States and I’m not going to eat any more broccoli.”
Barack Obama
As a child, Barack Obama spent four years living in Indonesia. His family pets included a large turtle and an ape named Tata — and his stepfather secretly bred crocodiles in the house.
Donald Trump
Donald Trump’s Secret Service code name is Mogul, while Melania Trump is Muse. Both code names were selected during Trump’s first term as president.
Joe Biden
While he was a student at the University of Delaware, Joe Biden was put on academic probation for a prank in which he sprayed the dorm director with a fire extinguisher.
Most people know Albert Einstein as the face of genius, but there was much more to this famous figure than his groundbreaking work in physics. The German-born, Nobel Prize-winning scientist was curious, compassionate, and principled, and he thought deeply about what it means to live a purposeful, ethical life. These weren’t just abstract ideas — they were guiding principles that informed much of what he did and spoke about. Here, distilled from the many nuggets of wisdom Einstein shared over his 76 years, are five life lessons we can all take to heart.
Imagination Is More Important Than Knowledge
When asked about the process behind his scientific discoveries in a 1929 interview, Einstein credited a perhaps surprising trait: imagination. “Imagination is more important than knowledge,” he told The Saturday Evening Post. “For knowledge is limited, whereas imagination encircles the entire world.”
Einstein’s famous thought experiments — including picturing himself racing alongside a beam of light when he was 16 years old — showed that visualizing the impossible could help unlock new scientific truths. Other pursuits also gave his imagination room to meander. Playing violin often helped him work through complex problems, and sailing, which he loved but did not exactly excel at, gave him quiet time to let his mind wander.
For Einstein, imagination wasn’t an escape from science but a key component of his breakthroughs. Facts played an important role, certainly, but it was all that daydreaming that first led him into the unknown.
Stay Passionately Curious
For Einstein, curiosity wasn’t just about asking questions (although he did plenty of that), but also about remaining open to change and new perspectives throughout life. “I have no special talents. I am only passionately curious,” he famously said. That openness helped him question long-held assumptions about the universe, and in turn he upended ideas of absolute space and time with his theory of relativity; reimagined gravity; and engaged critically with emerging quantum theory, investigating its limits while refining his own views when needed.
Einstein’s curiosity extended beyond science, too. He drew insights from philosophy, literature, music, and social issues, showing that learning could happen anywhere. He believed that education could — and should — continue indefinitely, as long as we keep asking questions and embracing the unexpected.
Become a Person of Value
In a 1955 interview, Einstein cautioned against measuring life by status or titles. “Try to become not a man of success, but try rather to become a man of value,” he said to Life magazine. “He is considered successful in our day who gets more out of life than he puts in. But a man of value will give more than he receives.”
Einstein lived this principle by using his influence to advance causes beyond physics, such as speaking out against racism in the United States, advocating for peace, and guiding generations of younger scientists through his teaching. Even his scientific contributions followed this ethic: He was driven less by personal acclaim than by a desire to expand humanity’s understanding of the universe.
“Nothing truly valuable arises from ambition or from a mere sense of duty,” Einstein wrote in a 1947 letter to an admirer. “It stems rather from love and devotion towards men and towards objective things.” Indeed, the famous scientist struggled with his fame, referring to it in a 1922 letter to a friend as “buffoonery.”
Trust Your Intuition
Einstein often trusted his instincts even before he had the research to back them up. “I believe in intuitions and inspirations. I sometimes feel that I am right. I do not know that I am,” he told The Saturday Evening Post. Those leaps of faith, tested rigorously later, were what prompted his most famous ideas, such as the theory of relativity.
Intuition also guided his personal choices: When a traditional academic path was difficult to navigate at first, he instead worked at the Swiss Patent Office, carving out quiet hours to get lost in his thoughts. Einstein didn’t worry about intuition’s lack of scientific proof. To him, it was an intrinsic part of the process — the spark lighting the way into unexplored territory.
Have the Courage To Question Authority
Einstein’s most radical discoveries began with doubts, a hallmark of his rebellious streak. Why should time be fixed? Why must space be absolute? Even at a young age, his independent mind clashed with the rigid expectations of school. Later, his refusal to accept conventional wisdom drove breakthroughs that overturned physics as we knew it. “Blind respect for authority is the greatest enemy of truth,” he famously said.
This same spirit shaped Einstein’s public life. He vocally condemned militarism and nationalism, often at a personal cost — he was watched closely by the FBI for years. To him, questioning authority was not about defiance for its own sake, but about choosing truth over tradition. Authority demands obedience, but progress, a hallmark of Einstein’s life, happens when someone dares to ask why.
Few traditions feel as universal as gathering around a frosted cake, lighting candles, and singing “Happy Birthday.” While the ritual seems timeless, the story of why we eat cake on our birthdays stretches back thousands of years — winding through ancient temples, Roman banquets, German children’s parties, and American kitchens.
The word “cake” comes from the Old Norse kaka, but cakes in the ancient world looked quite different from today’s airy, sugar-laden desserts. Early cakes were dense, breadlike creations sweetened with honey, enriched with eggs or cheese, and flavored with nuts, seeds, or dried fruits such as raisins or figs. Archaeological and textual evidence shows that cakes were baked in Mesopotamia more than 4,000 years ago, and the Roman writer Cato described cakes wrapped in leaves and served at weddings and fertility rites.
But cakes weren’t just food — they were often sacred offerings. The Greeks presented honey cakes and cheesecakes to their gods, sometimes decorated with candles. One common offering to Artemis, goddess of the moon and the hunt, was the amphiphon, a round cheesecake topped with glowing candles meant to mimic the moon. Romans, too, baked cakes for religious purposes, including the libum, a mixture of cheese, flour, and egg baked on bay leaves as an offering to household gods. In these early forms, cakes linked the human and divine, symbolizing gratitude, fertility, or cosmic cycles.
Though cakes were plentiful, birthdays were only inconsistently celebrated in antiquity. While there’s some evidence for these annual celebrations in ancient Sumer, they weren’t common in ancient Greece. The Romans were the first to mark personal birthdays, though only for men — women’s birthdays weren’t celebrated until the Middle Ages. Roman citizens honored relatives and friends with feasts, and men turning 50 received a special cake made with wheat flour, nuts, honey, and yeast.
Still, birthday cake traditions were limited and inconsistent during ancient times. For centuries, cakes remained associated with weddings, festivals, and offerings to the gods rather than private anniversaries.
The modern birthday cake owes its clearest debt to 18th-century Germany. There, children’s birthdays were marked with Kinderfeste — festive gatherings that included a sweet cake crowned with candles. Each candle represented a year of life, plus one extra for the year to come, a tradition that still survives. The candles were lit in the morning, replaced throughout the day, and finally blown out in the evening after dinner. Much like today, children were encouraged to make a wish as the smoke carried their hopes skyward.
This ritual blended ancient practices, such as Greek candlelit offerings and Roman birthday cakes, into something recognizably modern: a sweet centerpiece, flames to mark the passage of time, and the magical moment of blowing out candles.
Even with Kinderfeste in Germany, birthday cakes weren’t for everyone, as the food was considered a luxury treat. For most of history, cakes required expensive ingredients such as refined sugar, fine flour, and butter. They were labor-intensive to make and decorated with painstaking artistry. In the early 19th century, American cookbook author Catharine Beecher suggested apple bread pudding as a child’s birthday treat — simple, hearty, and inexpensive. However, children’s birthday parties weren’t common celebrations in the U.S. until after the Civil War.
The Industrial Revolution played a decisive role in democratizing cake. As ingredients became cheaper and more widely available, bakeries could mass-produce cakes and sell them at affordable prices. And for home cooks, birthday cakes reached new heights of extravagance by the early 20th century. In 1912, the popular Fannie Farmer Cookbook included a full-page photo of a “Birthday Cake for a Three-Year-Old,” complete with angel cake, white icing, candied rose petals, sliced citron leaves, and sugar-paste cups for candles. These elaborate confections reflected both rising prosperity and a more ambitious form of cake decorating.
Today’s birthday cakes are often simple — such as a frosted sheet cake with a name piped across the top (a tradition that began around the 1940s) — but the symbolism has a long and rich history. Cakes mark abundance, indulgence, and festivity. Candles represent the passage of time, each flame a year lived, each puff a wish cast into the unknown. The act of gathering around the cake, singing together, and celebrating life’s milestones connects us across centuries to ancient rites of gratitude and hope.
From moonlit cheesecakes for Artemis to a 3-year-old’s angel cake in Fannie Farmer’s kitchen, birthday cakes tell a story about how humans celebrate time, community, and the sweetness of life. Every slice is part of a lineage that is both sacred and ordinary — proof that sometimes the simplest rituals carry the deepest history.
World War I lasted four long years, and the unprecedented scale of the conflict demanded rapid innovation and resourcefulness. The brutal war of attrition, characterized by trench warfare, created many problems to be solved, from the desperate need to treat wounded soldiers to the challenge of feeding armies and maintaining communications across vast distances.
The pressures of wartime necessity sparked a wave of creativity that led to the development of numerous technologies and products — some of which went on to become staples in our everyday lives. Here are seven products that came out of World War I that we largely take for granted today.
Wristwatches
Before the First World War, wristwatches were worn almost exclusively by women as fashion accessories. Most men used pocket watches, which had been around since the 16th century, but these were impractical for trench warfare. During World War I, wristwatches grew in popularity, initially among the officer classes. New watch designs emerged that were larger, stronger, and often featured luminous dials for ease of reading in low-light conditions — vital for coordinating attacks and artillery barrages.
Rank-and-file soldiers from Britain saw their officers wearing wristwatches, and soon started buying their own. By the time the United States entered the war in 1917, troops were being issued wristwatches as part of their gear. These new accessories were not only practical, but also became a symbol of courage and bravery, helping establish wristwatches as a mainstream product after the war.
Pilates
After the outbreak of World War I, Joseph Hubertus Pilates, a German physical trainer and inventor, was interned as an enemy alien on the Isle of Man. During his three-plus years at the internment camp, Pilates developed a regimen of muscle strengthening through slow and precise stretching and physical movements, using minimal equipment.
To allow those who were confined to their beds to exercise, Pilates used springs and straps from the beds as resistance training, greatly aiding their rehabilitation. He later opened a fitness studio in New York City in 1925, offering the exercise system he developed during the war to the general public. He went on to patent 26 exercise apparatuses, and his eponymous Pilates regime gained worldwide popularity.
Trench Coats
Today, trench coats are associated with an array of colorful fictional characters such as Dick Tracy, Columbo, Silent Bob, and Hellboy. But these waterproof, heavy-duty, and typically double-breasted coats were initially developed for British army officers.
Similar coats existed before World War I, but manufacturers — most notably Burberry and Aquascutum (both of which claim to have invented the trench coat) — modernized and modified the design to keep officers warm and dry in the trenches (hence the name). Shoulder straps were included for the attachment of epaulettes or other rank insignia; D-rings were added for attaching map cases, swords, or other equipment to the belt; and the addition of a gun flap buttoned at the chest offered extra protection in combat. And so the modern trench coat was born — a practical piece of attire that remains in fashion today.
Plastic Surgery
Modern plastic surgery originated in Britain, a nation that saw some 735,487 troops discharged following major injuries in World War I. Many of these were facial injuries, which were initially covered using masks and patches. Then came Harold Gillies, a young ENT surgeon from New Zealand, who developed techniques to rebuild faces using plastic surgery. In 1916, he was tasked with setting up Britain’s first plastic surgery unit, where he began treating disfigurements using tissue from elsewhere on the patient’s body.
The pioneering work of Gillies and his team laid the foundations of modern plastic surgery, which today continues as a life-altering medical treatment and a sought-after cosmetic product.
Zippers
Ever since the mid-19th century, various inventors had been working on combinations of hooks, clasps, and eyes to find a smooth, convenient fastening device. None of their attempts quite worked, due to design flaws and other issues, until Swedish American engineer Gideon Sundback invented the “hookless fastener” during World War I.
Improving upon previous designs, the inventor came up with a product very similar to the modern zipper. His hookless fasteners were soon put on money belts, which became an instant success among U.S. sailors, whose uniforms didn’t have pockets. Then, in 1918, the U.S. Navy ordered fasteners for 10,000 flying suits. The zippers caught on, quickly becoming a standard part of all kinds of bags and apparel.
Sanitary Pads
In 1914, researchers at the consumer goods company Kimberly-Clark were touring pulp and paper plants in Europe when they came across a material that was five times more absorbent than cotton and cost half as much to produce. The company took the material back to the U.S. and trademarked it as Cellucotton.
When America entered World War I in 1917, Kimberly-Clark used the material to produce wadding for surgical dressing. Red Cross nurses soon discovered that the new material made an excellent sanitary pad — superior to the flannels and other products they were previously using. When the war ended and demand for surgical dressing dried up, Kimberly-Clark heard about the Red Cross nurses and repurposed its product for commercial use. In 1920, Kotex hit shelves, forever changing the market.
Kleenex
Surgical dressing wasn’t the only use of Cellucotton during World War I. By flattening out the material, Kimberly-Clark developed a thin, highly absorbent crepe paper used as a filter in gas masks — a vital piece of equipment in trench warfare. When the war ended, the company once again found itself looking to repurpose a product. It cleverly remarketed its Cellucotton sheets as makeup and cold cream removers, and launched the sheets in 1924 with the name Kleenex.
It wasn’t long before people began recommending the product as an ideal tissue for blowing noses during colds. So, in another astute marketing move, Kimberly-Clark started promoting Kleenex as “the handkerchiefs you can throw away” — creating the brand that’s now synonymous with disposable tissues.
Seventy years ago, Elvis was shaking up the airwaves, Lucille Ball had Americans laughing in their living rooms, and Presidents Harry Truman and Dwight D. Eisenhower were charting very different visions for a postwar United States. The space race was just beginning to heat up and prosperity fueled a culture of optimism for many. It was a decade of technological innovation and imagination, with a flood of brand-new gadgets, fashions, and conveniences promising to make everyday life sleeker, faster, and more modern than ever before.
Of course, not everything from the 1950s stood the test of time. Many of the common items that once defined the era have quietly slipped into obscurity, nudged aside by modern technology, shifting tastes, or changing lifestyles. Let’s take a trip down memory lane and revisit some of the most iconic items of the 1950s — things that once felt essential or cutting-edge, but today are charming relics of a very different time.
Rotary Phone
The rotary dial telephone became widely used in homes starting in the 1920s, but it wasn’t until the 1950s that it truly became a fixture of everyday American life. Postwar prosperity and suburban growth meant that more and more families could afford a phone, and by the mid-1950s, two-thirds of U.S. households had at least one telephone.
Using a rotary phone was a slow process: You placed your finger in the hole corresponding to the number you were dialing and rotated the dial clockwise until it hit the metal stop, released it to let it return to its original position, and repeated the move for each digit in the phone number. The Bell System introduced touch-tone dialing in 1963, which eventually made rotary phones obsolete. Though they’ve vanished from daily use, the distinct clicking and spinning of a rotary dial remains one of the most iconic sounds of midcentury life.
Poodle Skirt
Few garments symbolize the 1950s quite like the poodle skirt. Introduced in 1947 by designer Juli Lynne Charlot, the embellished circle skirt quickly became a staple of teenage fashion in the 1950s. These wide, swingy garments were typically made of felt and featured appliqué designs, with the poodle becoming the most iconic motif. The breed, often depicted with a leash or in playful poses, was a nod to the era’s fascination with French culture and sophistication. Its whimsical charm captured the youthful exuberance of the time.
While the poodle was the most popular design, variations soon emerged, including frogs, racehorses, and pink elephants. These personalized touches allowed wearers to express their individuality and style. In the early 1960s, fashion trends shifted toward sleeker, more streamlined silhouettes, and the poodle skirt gradually faded from everyday wear.
3D Comic Books
In the early 1950s, comic books embraced the 3D trend, offering readers a thrilling new way to experience their favorite characters. Using red-and-blue anaglyph glasses, fans could watch superheroes, monsters, and space adventures “leap” from the page. The first 3D comic was Three Dimension Comics #1, released in September 1953 by St. John Publications. Featuring Mighty Mouse, it sold more than 1.25 million copies in its first printing, an amazing accomplishment at the time.
Despite the initial popularity, the 3D comic book trend was short-lived. By the mid-1950s, the novelty had worn off and the format faded from mainstream use. However, the legacy of 3D comics endures, occasionally resurfacing in novelty reprints and special editions for collectors and enthusiasts.
Metal Ice Cube Trays
Before automatic ice makers or flexible plastic trays, households relied on metal ice cube trays with a lever-operated insert. First patented in 1949, these aluminum trays replaced flexible trays that were twisted to release scored ice cubes. By the 1950s, lever-style metal trays had become a standard feature in another relatively new appliance: the home refrigerator with a freezer compartment.
Marketed as “Magic Touch” and “Honeycomb” trays, the lever-operated gadgets were considered a convenient and necessary household tool for making ice. Pulling the handle cracked the ice into neat blocks — a noisy but satisfying process that reflected the era’s fascination with functional, modern appliances. But by the 1970s, plastic trays and built-in ice dispensers had largely replaced the aluminum trays.
Slide Rule
In 1960, Sam Cooke sang, “Don’t know much about algebra / Don’t know what a slide rule is for,” capturing the feelings of generations of students who struggled with math and the classic tool designed to help them calculate equations. But before the pocket calculator revolutionized mathematics, the slide rule was an essential tool for students.
Invented in the 17th century, slide rules were made of wood, plastic, or metal and used logarithmic scales to perform multiplication, division, and even trigonometry. Essentially an analog calculator, they required the user to manually slide the scales to line up numbers. High school and college students carried them much like modern students carry graphing calculators, and mastering the slide rule was considered a rite of passage. The invention of the first electronic calculators in the 1960s, followed by pocket-sized versions in the 1970s, marked the end of this ubiquitous tool that had been in use for nearly 350 years.
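The principle at work is the logarithm identity log(a × b) = log a + log b: adding two lengths on logarithmic scales multiplies the numbers those lengths represent. As a quick illustration of the idea (a toy sketch in Python, not a model of any particular slide rule), consider:

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: add the numbers'
    logarithmic 'distances', then read the product back off the scale."""
    distance = math.log10(a) + math.log10(b)  # sliding one scale along the other adds lengths
    return 10 ** distance                     # the value at the aligned mark

print(slide_rule_multiply(2, 3))  # ~6.0, to within floating-point rounding
```

A real slide rule does the exponentiation for free, since its markings are already spaced logarithmically; the user simply reads the answer where the scales line up.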
Metal Lunch Box
Metal lunch boxes were the ultimate school accessory of the 1950s. With sturdy construction and colorful designs featuring cowboys, astronauts, or TV heroes, they made carrying a sandwich and thermos something to show off. Aladdin Industries popularized the trend in 1950 with a Hopalong Cassidy lunch box, producing 600,000 units that year at $2.39 each. The item featured a metal snap for a hinged lid and a collapsible metal handle, and was the first lunch box to bear a licensed image, setting a precedent for future designs.
Soon, nearly every popular television show, movie, and character had its own lunch box version, and the clang of a metal lunch box was as familiar as the school bell. Plastic eventually replaced metal due to safety concerns, though vintage metal versions remain highly collectible today.
Transistor Radio
The 1950s introduced the revolutionary idea of having music you could take anywhere. Before the transistor, radios were large, stationary consoles, and families had to gather around the living room set to hear the latest hits. The invention of the transistor in 1947 made it possible to build smaller, lighter, and more portable radios, paving the way for personal, on-the-go listening.
The first commercially sold transistor radio, the Regency TR-1 in 1954, was a luxury item at $49.95 (about $400 today). But its novelty captured the interest of a generation hungry for new ways to be entertained. As prices dropped later in the decade, transistor radios became more accessible, giving teenagers the freedom to listen to what they wanted, when they wanted, without parental supervision. Transistor radios quickly became a staple for music, sports, and news, but were eventually replaced by the cassette player and boombox.
Spanning from roughly the 1920s to the early ’60s, Hollywood’s golden age begat some of film’s most glamorous stars, classic quotes, and enduring scenes. Indeed, this era of moviemaking from the greater Los Angeles area was a time when the very idea of “Hollywood” as an idealized alternate reality was crystallized in the public consciousness.
While we now know that life wasn’t always so rosy behind the red carpet photo shoots and studio gates, that concept of old, glamorous Hollywood retains a strong emotional pull as a place where dreams were realized. Here’s a look at six photos that capture some of the defining faces, events, and achievements of the era.
Promotional posters have been around since the inception of the motion picture industry, with full-color prints showcasing the leading men, women, and, in this case, creatures featured in the film being promoted. King Kong (1933), of course, was a landmark film of its era, thanks in part to production techniques that included the pioneering use of stop-motion animation and the creation of an original score.
While the movie’s promotional material effectively showcased the terrible power of its titular monster without being particularly groundbreaking, American film posters soon exhibited more innovative styles of collage and minimalism. And unlike the largely anonymous artists of the early 20th century, illustrators such as Bill Gold and Saul Bass enjoyed acclaim for their work in this particular field during the golden age of Hollywood.
Audiences have long delighted in the sparks-generating tension of romantic comedies, and no on-screen couple pulled it off better than the golden age duo of Katharine Hepburn and Spencer Tracy. From their initial pairing starring as co-workers in Woman of the Year (1942) to their final go-round portraying parents in Guess Who’s Coming to Dinner (1967), Hepburn and Tracy brought out the best in each other across their nine movies together. Their chemistry was rumored to have fueled a real-life romance that lasted from their first movie until Tracy’s death in 1967, with the married actor supposedly refusing to divorce his wife due to his strict Catholic beliefs.
For better or worse, the films and pageantry of Hollywood have served as a reflection of the ambitions, fears, and prejudices of society throughout the years. This is perhaps best exemplified by the 1940 Academy Awards, which saw Hattie McDaniel become the first Black actor to nab a win, in the Best Supporting Actress category for her performance in Gone With the Wind (1939).
Yet McDaniel was only allowed into that year’s ceremony in the segregated Cocoanut Grove nightclub by way of a special favor from ownership, and even then she was forced to sit in the back of the room, away from the table with her white co-stars Vivien Leigh and Clark Gable. It took two dozen years before Sidney Poitier became the second Black Oscar winner, for his leading role in Lilies of the Field (1963), and an additional 19 years for Louis Gossett Jr. to become the third, for his supporting effort in An Officer and a Gentleman (1982).
As with the Academy Awards, the imprinting of hands and/or feet in wet cement at Hollywood’s TCL Chinese Theatre serves as a marker of top-tier success in the movie industry. The tradition is said to have sprung from an accident: Silent film star Norma Talmadge supposedly stepped in new cement before the opening of what was then Grauman’s Chinese Theatre in 1927, inspiring owner Sid Grauman to formalize the practice with the handprints and footprints of Mary Pickford and Douglas Fairbanks that April.
Over the years, a few stars have added a distinct touch to the proceedings: Betty Grable left the imprint of her leg in 1943; George Burns supplied one of his cigars in 1979; and jokester Mel Brooks added a prosthetic sixth finger for a handprint in 2014. Marilyn Monroe even dotted the “i” in her name with an earring during her 1953 dual ceremony with Jane Russell to celebrate the release of Gentlemen Prefer Blondes, although it was carved out by an opportunistic thief a few days afterward.
From George Cukor and John Ford to Howard Hawks and Alfred Hitchcock, Hollywood’s golden age was marked by boldface-name directors who packed the star power to rival that of the era’s A-listers. The same standing briefly applied to Orson Welles, a former stage actor who rocketed to fame with his panic-inducing radio adaptation of War of the Worlds in 1938, before directing one of the all-time cinematic masterpieces with Citizen Kane (1941), as seen in the photo above.
The mercurial Welles soon grew disillusioned with Hollywood, however, and he flitted in and out of the industry’s embrace while continuing to direct and star in signature films including The Lady from Shanghai (1947) and Touch of Evil (1958). Although he never again reached the towering heights of Citizen Kane, Welles left behind a record as one of the greats of his era thanks to innovative techniques such as extended takes and “deep focus” filming.
The golden age of Hollywood was ruled by the all-powerful studios where the movie magic was made. From 1930 to 1948, the combination of Hollywood’s “big five” studios — Metro-Goldwyn-Mayer (MGM), Paramount Pictures, Warner Bros., 20th Century Fox, and RKO Pictures — and “little three” — Universal Pictures, United Artists, and Columbia — accounted for a whopping 95% of films exhibited in the United States.
MGM, seen above in a photo from around 1945, was perhaps the most prominent of the bunch, and a rundown of its facilities showcases just how much effort went into sustaining the empire. Along with offices for its executives, directors, writers, and various casting, camera, and wardrobe departments, the Culver City-based studio had lots dedicated to indoor and outdoor sets, including a 63 million-gallon artificial lake, as well as vehicle storage, a zoo, and a nursery. However, Hollywood’s bosses lost their iron grip on the industry with the 1948 Supreme Court ruling that forced studios to divest from their movie theater businesses, eventually paving the way for more independent filmmakers and a new era of cinema.
Constructed between 1792 and 1800, the White House has been home to every U.S. president except the first: Though George Washington oversaw construction of the building, he never actually lived in it. But the White House’s 132 rooms and 18 acres are more than just a residence — the Pennsylvania Avenue mansion is a symbol of power that occupies a singular place in American history and popular culture. Here’s a closer look at six White House rooms where America’s presidents have lived, worked, and even played.
Oval Office
Despite being the most famous room in the White House, the Oval Office was not part of the original building. In fact, the White House lacked a dedicated presidential office until Theodore Roosevelt constructed a “temporary” executive office building in 1902, known today as the West Wing. It contained the first presidential office — a rectangular room now known as the Roosevelt Room.
It was Roosevelt’s successor, William Howard Taft, who made the West Wing a permanent feature of the White House, holding a competition to select an architect to oversee its renovation and enlargement. The winning architect, Nathan C. Wyeth, doubled the West Wing’s size and constructed the Oval Office in its center. Wyeth’s vision for an office fit for the president took inspiration from another famous room in the White House: the Blue Room, which is also oval thanks to George Washington’s aesthetic preferences. The Oval Office’s last major renovation took place in 1934 under Franklin D. Roosevelt, who moved the room to its current location in the southeast corner of the West Wing, overlooking the Rose Garden.
Situation Room
The Situation Room — sometimes referred to as the “nerve center” of intelligence operations at the White House — is technically a complex of connected rooms rather than a single space. The concept of a “War Room” containing telegraph systems and maps dates back to William McKinley’s presidency during the 1898 Spanish-American War. However, the Situation Room as we know it today was built in 1961 by John F. Kennedy in the aftermath of the failed Bay of Pigs invasion.
Frustrated by an inability to receive real-time military information, Kennedy decided that the White House needed a dedicated intelligence and communications center. Since then, the Situation Room has been critical to presidents’ handling of pivotal moments in American history, including the Cuban Missile Crisis, the aftermath of the 9/11 attacks, and the raid on Osama bin Laden’s compound in 2011.
Lincoln Bedroom
Despite its name, the Lincoln Bedroom was never actually used by Abraham Lincoln as a bedroom. Rather, it was an office and meeting area. The room has been called the Lincoln Bedroom since 1945, when Harry S. Truman had it decorated with Lincoln-era furniture, including the original desk upon which the 16th president drafted the Emancipation Proclamation.
Today, the room remains a bedroom and contains a copy of the Gettysburg Address, handwritten and signed by Lincoln, as well as a painting depicting enslaved people awaiting news of the Emancipation Proclamation. Reputed to be one of the most haunted rooms in the White House, the Lincoln Bedroom has been the site of reported encounters with Lincoln’s ghost, notably by Queen Wilhelmina of the Netherlands, First Lady Grace Coolidge, and First Lady Eleanor Roosevelt.
Map Room
A precursor to the Situation Room, the Map Room was used by Franklin D. Roosevelt during World War II to monitor battle operations around the world. After the outbreak of the war, National Geographic donated special wall-mounted “map cabinets” to Roosevelt, which contained maps organized by hemisphere, region, and theater of operation. These documents were regularly updated by cartographers, enabling the president to have the most up-to-date information. The room was staffed 24/7, and was used to communicate with world leaders including Winston Churchill, Joseph Stalin, and Chiang Kai-shek. Today, the Map Room functions as a sitting room that has been used to host guests, including the Dalai Lama.
Family Theater
Even presidents need a place to unwind. The White House Family Theater was created in 1942 when Franklin D. Roosevelt had a cloakroom known as the “Hat Box” converted into a 42-seat screening room. Since then, presidents have used the theater to enjoy films with family and friends, and as a space to practice speeches. Ronald Reagan, himself an actor, watched an impressive 363 films while in office, surpassed only by Jimmy Carter, who watched 480. Richard Nixon was such a fan of the war movie Patton that he had to refute rumors that the film inspired his decision to invade Cambodia in 1970. And Bill Clinton once said, “The best perk in the White House is not Air Force One or Camp David or anything else. It’s the wonderful movie theater I get here.”
Bowling Alley
The first White House bowling alley was constructed in the basement of the building in 1947 as a birthday gift to Harry S. Truman when he turned 63. Truman’s first roll knocked down seven of the 10 pins — one of which is now on display at the Smithsonian. Truman later had the lanes moved from the White House to the Eisenhower Executive Office Building next door, where White House employees formed a bowling league in 1950 — Secret Service agents, household staff, groundskeepers, and more competed in tournaments across the country. Bowling returned to the White House under Nixon, who had a one-lane alley constructed below the North Portico in 1973. An avid bowler, Nixon reportedly played every week and had a high score of 232.
Even if you favor entertainment over education when it comes to cinema, you probably care at least a little about historical accuracy when watching historical dramas. A little creative license never hurt anyone, but attention to detail adds an authenticity that makes the viewing experience feel richer. That’s especially true of war movies, which attract two overlapping sets of people sometimes known for being persnickety: history buffs and cinephiles. If you belong to either camp, you’ll want to watch these five war movies known for their historical accuracy.
Though most war movies are praised or criticized for how accurately they portray the battlefield, Fred Zinnemann’s Best Picture winner has been noted for capturing another aspect of military life: the U.S. armed forces’ “obsession” with sports, which reached its zenith during World War II.
From Here to Eternity is mainly about the months leading up to the attack on Pearl Harbor, but one of its most significant subplots involves a regimental boxing team; such teams were common during World War II. (Its most famous scene, however, has nothing to do with either — and you’ve almost certainly seen it even if you’ve never seen the actual movie.) Soldiers were encouraged to pursue athletics to raise morale, with baseball and football also proving popular — not least for the way they contrasted with the preferred sports of the Axis.
The threat of nuclear war was more prevalent when Threads was made in the 1980s than it is today, but watching this uniquely unsettling television movie will have your pulse pounding all the same.
Hailed as “a film which comes closest to representing the full horror of nuclear war and its aftermath” in Toni A. Perrine’s Film and the Nuclear Age: Representing Cultural Anxiety, the documentary-style drama takes place in Sheffield, England, over the course of more than a decade and pulls no punches in its portrayal of nuclear winter and the utterly devastating effects it would have. There’s little hope to be found in Threads, but the film is said to have had a sobering effect on everyone from Ronald Reagan to British Labour Party leader Neil Kinnock.
There’s a very good chance that, even if you agree with the consensus that Come and See is among the greatest movies ever made, you won’t want to watch it more than once. It’s simply that disturbing, but director Elem Klimov’s insistence that his movie had to be “realistic to the maximum” is evident in every scene. That’s particularly true of its depiction of war crimes carried out by Germany during its occupation of Belarus, which has been singled out as the most shocking — and, tragically, historically accurate — aspect of the film.
It took $12 million, four weeks, and 1,500 background actors to film Saving Private Ryan’s landmark opening sequence, which shows the D-Day landing on Omaha Beach in excruciating detail. The result was clearly worth it, as Steven Spielberg’s World War II epic is regarded as one of the greatest movies ever made, earning $482 million at the box office on the way to winning five of the 11 Oscars for which it was nominated (though not, in a stunning upset, Best Picture).
Spielberg interviewed several World War II veterans during pre-production, and that same community was among the most laudatory when the movie was released. There were reports of veterans weeping during and after Saving Private Ryan, with one remarking that “the only thing that was missing was the smell.”
Despite how memorable it is, Saving Private Ryan wasn’t even the best World War II movie released in 1998. That would be Terrence Malick’s The Thin Red Line, which, despite ultimately being about so much more than war in general or World War II in particular, faithfully captures both the Battle of Guadalcanal and the actual experience of being a soldier.
Malick’s philosophical, even poetic approach to the “overlooked” conflict at the film’s center didn’t get in the way of attention to detail, with historian John McManus praising the film for getting right everything from how soldiers move across the field of battle to the actual blades of grass on that field, and ultimately grading its accuracy a nine out of 10.
Well before there was a need to enter individual units of an apartment building or secure clothes inside a gym locker, people sought to keep thieving hands away from their stores of grain or precious jewels. As a result, locks and keys have been around for a large chunk of recorded history.
While both have undergone numerous alterations in accordance with ever-updating technologies, the story of keys is perhaps the more personal one, as keys are the portable component that has accompanied us on our journeys over the years. Here’s a look at how these pronged keepsakes have changed since they first surfaced in the ancient world.
According to Eric Monk’s Keys: Their History and Collection, the oldest known lock is a wooden specimen unearthed from the ruins of the Palace of Sargon in Dur-Sharrukin (modern-day Khorsabad), near the Assyrian capital of Nineveh. As a similar version appears in frescoes at the Karnak temple complex in Egypt, this style of “Egyptian lock” is believed to be in the neighborhood of 4,000 years old.
This early form of security entailed sliding a wooden board through a slot across a door, with movable pins above the slot dropping through corresponding holes in the board to keep it bolted in place. The key for this type of lock was another long piece of wood, sometimes measuring more than 2 feet, with pegs on the end to push the pins back through the holes and allow the board to be released from the bolting position. While these locks were originally fitted to the outside of a door, a hole cut next to the lock enabled a person to reach through and operate the lock from the inside.
The ancient Greeks, meanwhile, developed a different system in which the lock was mounted on the inside of the door. Unlocking from the outside again involved reaching through a hole, with a sickle-shaped key used to turn the bolt. As with the Egyptian models, these keys were large and were often carried by being slung over the shoulder.
A noticeable shift in methodology arrived with the Romans, who began developing metal locks and bronze keys. The Romans employed “wards,” obstacles built within the locks that could only be bypassed by keys of specific shapes. This led to the development of teeth and cut-out shapes within the “bit,” the end of a key that engages the bolt.
The Romans also popularized the use of small keys that fastened to a ring and could easily be carried in the palm of the hand. These keys were typically used for boxes that stored valuables within a home, and as such they signified the wearer’s affluence.
Key ornamentation progressed to a remarkable degree through the long period of the Middle Ages, which eventually saw bronze keys phased out in favor of iron. While some medieval keys were small, others surpassed 9 inches in length. Like the Roman ring keys, these oversized keys projected the wealth and prestige of their owners, with the added benefit that they were less likely to be lost than tinier versions.
Early medieval bits featured a cutout cross or another simple cleft to bypass a built-in ward, but were eventually fashioned with teeth and other irregular shapes. The handle, or “bow,” also underwent a series of stylistic changes, from the loops and ovals of the eighth century to the kidney shape, often intricately designed, of the 15th century. By this time, many keys were also embedded with a “collar” to stop them from being pressed too far into holes.
By the 17th century, bows were sometimes designed with animal motifs, initials, or a coat of arms. Ceremonial keys also soon came into vogue, including the gilded “chamberlain’s keys” that once served a functional purpose in European royal courts but evolved into ornaments for special occasions.
As mentioned in Keys: Their History and Collection, the attention to craftsmanship failed to solve an enduring problem: Most locks could be picked by anyone with the dexterity and patience to make the effort. This sparked a focus on stronger security measures in the second half of the 18th century, as well as a revival of ancient Egyptian methodology with English locksmith Robert Barron’s introduction of a twin-tumbler lock in 1778.
Six years later, English inventor Joseph Bramah unveiled an even stronger lock that featured spring-loaded sliders spaced around a pin. A small, cylindrical key embedded with notches corresponding to the height and depth of each slider engaged the pin, enabling the key to turn.
While inventors on both sides of the Atlantic competed to produce the strongest locks over the first half of the 19th century, their creations were too expensive for the general public to afford. That meant old-fashioned (and easily pickable) warded locks remained commonplace around homes, to be locked and unlocked by hefty iron keys.
However, the industry changed for good after the invention of the Yale lock. Patented by Linus Yale in 1844 and perfected by Linus Yale Jr. over the next two decades, this lock required the insertion of a flat, serrated key that raised a series of spring-loaded tumblers to varying heights.
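In modern terms, the pin-tumbler principle Yale perfected boils down to an exact-match test: every cut on the key must raise its pin precisely to the shear line, and a single mismatch keeps the cylinder from turning. Here is a minimal Python sketch of that idea, using hypothetical pin heights rather than any real keying specification:

```python
def key_opens_lock(key_cuts: list[int], pin_heights: list[int]) -> bool:
    """Toy pin-tumbler model: the cylinder turns only if every cut
    on the key lifts its pin exactly to the shear line, i.e., each
    cut depth matches its pin height."""
    if len(key_cuts) != len(pin_heights):
        return False  # wrong blank: the key doesn't even fit this lock
    return all(cut == pin for cut, pin in zip(key_cuts, pin_heights))

lock_pins = [3, 1, 4, 1, 5]  # hypothetical pin heights, not a real keying spec
print(key_opens_lock([3, 1, 4, 1, 5], lock_pins))  # True: every pin sits at the shear line
print(key_opens_lock([3, 1, 4, 2, 5], lock_pins))  # False: one misaligned pin blocks rotation
```

The same all-or-nothing matching is why adding pins multiplies the number of possible keys, and why a single worn or miscut notch is enough to keep a key from working.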
The Yale products became prototypes of the types of keys that remain widely in use today, and with the invention of specialized machinery by the end of the 19th century, the manufacture of locks and keys moved into an era of mass production that made these items more affordable.
Nowadays, keys for homes and valuables are typically made from materials such as nickel silver, brass, and steel. But the development of key cards in the 1970s marked a turn away from metal, and innovations in biometrics have hinted at a future in which fingerprints and facial recognition technology become the norm for entry.
Ultimately, physical keys may wind up being a relic of the past, much the same way as those clunky old door openers from Egypt and Greece are to us today.
As Gene Kelly noted in the classic 1952 musical Singin’ in the Rain, sometimes people just gotta dance. Whether it’s an impromptu shaking of the hips or a complex routine dashed off by a professional like Kelly, dancing is a gift available to people of any skill level who feel inspired to move their limbs to a beat. And although many of us prefer to display our rhythmic limitations in private, dancing is unquestionably a social activity, and has been from time immemorial.
While most traditional dances seem old-fashioned to us nowadays, many well-known forms, including the polka, foxtrot, and waltz, are relatively modern creations from the past one to two centuries. Other dances, however, are far older, with roots that hearken back to the world’s formative civilizations.
Determining which dance forms are the oldest is an inexact science, since many morphed and were incorporated into other styles as they shimmied across cultures and eras. Nevertheless, here’s a look at some of the oldest dances in human history.
Grass Dance and Other Indigenous Rituals
Some Indigenous cultures today proudly display moves learned from their ancestors to celebrate deities and mark the cycles of life. Although their traditions date back thousands of years, potentially making them the oldest dances known to humans, the preponderance of oral storytelling over written records in these cultures renders it impossible to determine just how old these dances are.
Native Americans, who first set foot in North America as many as 23,000 years ago, are known in part for the array of dances performed in communal powwow gatherings. One of the older examples is the grass dance, said to have originated with the Omaha people in the Northern Plains, in which performers stomp their feet and twirl to the thumping beat of a drum.
The Aboriginal people of Australia, with a history that dates back some 75,000 years, also have a communal gathering known as a corroboree, a ceremonial combination of dancing, singing, and narration to relate the origin stories of the Dreamtime (Aboriginal mythology). Their dances include the Warren Jarra, which entails the continuous motion of pivoting bent legs inward and outward, and the Ngukum, with participants waving leaves to simulate fending off mosquitoes.
Indian Classical Dances
In Central India, the discovery of what appear to be dancing figures amid the Mesolithic-era cave paintings of the Bhimbetka rock shelters underscores the long history of dance on the subcontinent. While it’s unknown what particular moves the artist was documenting, the survival of the Naṭyashastra, a Sanskrit treatise on art that may date back to at least the third century BCE, offers better insight into early Indian dances.
Spawned in Southern India, Bharatanatyam was originally a Hindu temple dance performed by solo women. Broken down into the divisions of nritta (technical dance), nritya (expressive dance), and natya (storytelling), Bharatanatyam is characterized by squatting legs and counter-rhythm footwork, its performances lasting up to two hours.
From the eastern part of the country, Odissi also originated as a temple dance performed by women. Compared to Bharatanatyam, its movements are generally slower and subtler, and feature more of the elaborate hand gestures known as mudras.
Kathak developed among nomadic storytellers of North India, so this dance is also strongly built around narrative principles. Not limited to either gender, performers showcase balance and grace with an upright posture and techniques that include chakkars (spins), tatkar (foot strikes), and paltas (patterns). Unlike the country’s other native dances, which date back more than 2,000 years, Kathak came into its modern form by way of combined Hindu and Muslim influences.
Hula and Siva Samoa
In Hawaii, residents have been performing the hula for centuries. This world-famous dance, which involves the swaying of hips and corresponding wavelike arm movements, comes with an array of backstories that tie its roots to differing regions and mythological figures. While such folklore makes hula’s origins nearly impossible to pinpoint, The Encyclopedia of World Folk Dance estimates the dance dates back to around 300 CE.
The encyclopedia also points to a similar dance from a neighboring geographical region with even earlier beginnings. The Siva Samoa, introduced by the Lapita people around 800 BCE, is perhaps the oldest dance of the Polynesian islands. Mainly performed by women at social gatherings, the Siva Samoa incorporates side-to-side swaying, graceful arm movements, and facial expressions to relay the emotions of its performer.
Dragon Dance
From China, the dragon dance has endured as a tradition of Chinese New Year since the Han dynasty in the third century BCE. Initially performed as a means to worship ancestors and pray for rain, this good luck dance requires multiple performers to coordinate the swirling and sweeping motions of a multipart dragon puppet to the rhythms of cymbals, gongs, and a drum.
Dabke
Considering Mesopotamia in the modern-day Middle East is known as the “cradle of civilization,” it follows that some of the world’s oldest dances were nurtured in this region. One such dance is the dabke, or dabkeh, which, according to The Encyclopedia of World Folk Dance, can be traced back to the Canaanites or Phoenicians around 2000 BCE.
Performed by a group of people in a line or circle, the dabke is an energetic dance marked by stomping, kicking, and twirling items such as handkerchiefs. While the dabke has increasingly assumed the status of a protest dance in recent years, it remains a means of celebration at festivals and weddings across Palestine, Jordan, Syria, and Lebanon.
Belly Dance
Like the hula, the belly dance is a widely recognized performative art with no consensus origin story. One starting point can be found in the Indus Valley in 3000 BCE, where midwives helped prepare expectant mothers with abdominal exercises. Migrating to the Middle East and North Africa, belly dancing took on differing names and variations of the torso-focused thrusting and shaking that distinguished it from other dances. Belly dancing has endured as a highly developed form of expression, and is often credited as the oldest known dance in the world.