Author Tony Dunnell
January 27, 2026
Every four years, the world watches as elite athletes push themselves to ever greater heights in search of Olympic gold — and, perhaps, even a new Olympic record. Some records, however, stand so far above the rest that they seem destined to endure forever. An “unbreakable” record, of course, is a little hard to prove, but some feats — such as the five below — are so exceptional that it seems unlikely they will be bested anytime soon.
Usain Bolt’s 100 Meters
During the 2012 London Games, Jamaican sprinter Usain Bolt set a new Olympic record time of 9.63 seconds in the 100-meter dash. The record has yet to be beaten at the Olympics and would have represented the absolute pinnacle of human speed if it weren’t for Bolt’s world record of 9.58 seconds, set at the 2009 IAAF World Championships in Berlin. (Bolt reached an astonishing 27.8 mph when he hit full stride.)
Bolt is a towering figure, quite literally, among his rivals. The fastest sprinters tend to be short and compact, but Bolt’s 6-foot-5 frame allowed him to complete a 100-meter race in around 41 steps — three to four fewer than his competitors. His perfect technique, peak competition form, and biomechanical uniqueness may never be seen again, making it unlikely that his record will be beaten in the foreseeable future.
Bob Beamon’s Long Jump
When Bob Beamon jumped 8.90 meters at the 1968 Mexico City Games, he didn’t just beat the Olympic record — he obliterated it. Prior to his jump, the record tended to be surpassed by just a few centimeters each time. But Beamon beat the existing world record by a staggering 55 centimeters, in what was soon dubbed the “leap of the century.”
Not even the judges were ready for such a feat. The electronic measuring equipment went up to only around 8.5 meters, so an old-fashioned steel tape had to be found. American athlete Mike Powell eventually broke the world record in 1991 with a jump of 8.95 meters, but Beamon’s leap is still the Olympic record more than half a century later. It remains to be seen whether it will ever be surpassed on the Olympic stage.
Jackie Joyner-Kersee’s Heptathlon Haul
At the 1988 Olympic Games in Seoul, South Korea, Jackie Joyner-Kersee proved that while she may not have been the best in the world in any one of the heptathlon’s events, she was the best in the world at doing all seven. Finishing with 7,291 points, she beat her own previous world record, which she had set four times since 1986. At the time, no other woman had scored more than 7,000 points — but Joyner-Kersee had done it five times. Almost four decades later, her Olympic record still stands.
In 2005, reigning heptathlon Olympic gold medalist Carolina Kluft said that Joyner-Kersee’s record was “probably unbeatable.” That may well be true, but American athlete Anna Hall is intent on trying. In 2025, Hall scored 7,032 points at a competition in Austria — tied for the second-highest score ever, but still 259 points behind the Olympic and world record. She’s certainly receiving good advice in her quest for the record, as her mentor is none other than Jackie Joyner-Kersee herself.
Nadia Comăneci’s Perfect 10
In 1976, Romanian gymnast Nadia Comăneci became the first gymnast to earn a perfect score of 10 at the Olympic Games. She was only 14 years old when she competed at the Montreal Olympics and received a perfect score on the uneven bars. It was so unexpected that even the scoreboards weren’t prepared: Unable to display the four digits required for 10.00, they instead showed each judge’s score as 1.00.
Incredibly, Comăneci went on to record a perfect 10 six more times, while also becoming the youngest all-around Olympic gold medalist. Though other Olympic gymnasts have scored a perfect 10 since then, there will never be another first.
Michael Phelps’ 28 Medals
American swimmer Michael Phelps won an amazing 28 Olympic medals during his career, a feat that may well be impossible to surpass. A staggering 23 of those medals were gold — more than double the count of his nearest rivals. The previous record holder for the most Olympic medals — and still the closest challenger — was Soviet gymnast Larisa Latynina, who won 18 medals (nine of them gold) from 1956 to 1964.
After a long and glorious career, Phelps hung up his Speedos and retired after the 2016 Games, fairly secure in the knowledge that his reign of dominance — and his medal haul — is unlikely to be surpassed by any future Olympian.
Who Is Sadie Hawkins — And Why Is a Dance Named After Her?
Author Kristina Wright
January 22, 2026
The Sadie Hawkins dance is a familiar tradition to most Americans, best known for the custom of girls asking boys to the dance instead of the other way around. In a world where women run businesses, lead governments, and head nearly half of U.S. households, setting aside one special night for girls to take the lead can feel unnecessary and outdated. Still, the story behind Sadie Hawkins herself offers a fascinating window into Depression-era America and the surprising ways popular culture can shape real-life traditions for generations.
Mind you, Sadie Hawkins wasn’t a real person. She was a comic-strip creation dreamed up in the late 1930s by cartoonist Al Capp for his wildly popular comic Li’l Abner. At its peak, Li’l Abner ran in 900 newspapers in the U.S., and it remained in print until 1977. Set in the rural town of Dogpatch, Kentucky, the strip was filled with broad satire and a large cast of quirky, unforgettable characters. Among them were the handsome and gullible Li’l Abner Yokum, his eternally patient sweetheart Daisy Mae Scragg, the perpetually unlucky Joe Btfsplk, and the scheming industrialist General Bullmoose. But Sadie Hawkins proved to be the character whose antics took her from the comics page into real life.
Sadie Hawkins first appeared in Li’l Abner on November 15, 1937, initially as a secondary character. She was introduced as the “homeliest gal in all them hills,” an intentionally exaggerated description that played on the intense social pressure for women to marry young. Her father, Mayor Hekzebiah Hawkins, was distressed that Sadie had reached the age of 35 without a husband, a situation he viewed as both humiliating and urgently in need of correction.
To solve the problem, he invented Sadie Hawkins Day, a new local holiday with a peculiar edict. All eligible bachelors were required to run through Dogpatch while Sadie — who was an excellent runner — chased them. According to the rules, any man Sadie managed to catch before sundown was obligated to marry her.
The sight of panicked men sprinting to avoid matrimony while Sadie pursued them delighted readers. Sadie caught — and married — John Jonston, but what began as a one-off gag quickly became one of Li’l Abner’s most memorable storylines, with Capp featuring a new Sadie Hawkins Day race every year until 1952, when Daisy Mae finally caught her beau, Li’l Abner.
The popularity of Sadie Hawkins Day quickly spread beyond the comic strip. Readers embraced the idea so enthusiastically that they began organizing real-life versions of the event. The first Sadie Hawkins-themed celebrations were hosted in 1938 and included mock chases inspired directly by the strip, with students dressing up as Li’l Abner characters. Prizes were awarded for the best costumes, and men who were “caught” were expected to accompany the women who caught them to the Sadie Hawkins dance. In 1939, Life magazine reported that 201 colleges and clubs in 188 cities had hosted Sadie Hawkins Day events, and that number grew to more than 500 by 1941.
Over time, the dance became the centerpiece of these celebrations. The events typically featured a reversal of traditional dating norms, with women expected — or at least permitted — to ask men to attend the dance with them. The idea spread so rapidly that many participants assumed Sadie Hawkins was drawn from folklore or history rather than a newspaper cartoon.
A Social Experiment
Sadie Hawkins Day resonated because it tapped into real anxieties about courtship and marriage in early 20th-century America. Marriage was widely viewed as the primary measure of a woman’s success, and unmarried women beyond their 20s were often stigmatized and labeled as spinsters. Al Capp’s exaggerated portrayal of these pressures allowed readers to laugh while still recognizing the social truth beneath the joke.
The Sadie Hawkins storyline inspired a real-life social experiment that revealed how deeply entrenched traditional gender expectations were. By imagining a world where women could pursue men — even if only for a single day — it introduced a socially acceptable framework for a temporary role reversal. For many young women, a Sadie Hawkins dance — sometimes called a “turnabout,” as in the expression “turnabout is fair play” — offered a rare opportunity to take the initiative without defying social norms.
Sadie Hawkins Today
As Sadie Hawkins Day settled into American school culture, it evolved into the dance event most people remember today. Some schools leaned into rustic or comic-inspired themes as a nod to Dogpatch, while others treated it simply as a fun reversal of traditional gender roles. Over time, the direct connection to Li’l Abner faded, but the name endured.
By the 1940s and ’50s, Sadie Hawkins dances had become a widespread social phenomenon that moved beyond college campuses to high schools and community groups across the country. Thousands of schools were hosting Sadie Hawkins or “turnabout” dances each year, making it one of the most common nonholiday social events on American school calendars.
Today, Sadie Hawkins dances are less ubiquitous than they once were, and dating roles have loosened considerably. Many schools have even retired the name, replacing it with events such as MORP dances — “prom” spelled backward — that keep the spirit of role reversal while dropping the gendered framing altogether. It’s no longer taboo for a woman to ask a man to a dance, and being unmarried doesn’t carry the same stigma it once did. Yet when it comes to proposing marriage, tradition still holds firm. According to a 2022 survey, only 2% of heterosexual women in the U.S. propose to their partners. In that sense, Sadie Hawkins’ legacy still lingers, reminding us that while some social rules are easy to rewrite, others take much longer to change.
Author Tony Dunnell
January 22, 2026
The American dance floor in the 19th century was a very different place than it is today — a far cry from TikTok dance challenges, flash mobs, and K-Pop-inspired choreography. It was a world in which European waltzes scandalized conservative society, Bohemian polkas spread like wildfire, and African American dances transformed the cultural landscape.
The 1800s were a century of great cultural exchange, both internationally and domestically. Dances traveled across oceans and crossed social boundaries, becoming more than just entertainment (or elaborate courtship rituals). Rather, they were social phenomena that reflected America’s cultural evolution. Here’s a fleet-footed look at five of the most popular dance movements that shaped America in the 1800s.
The Waltz
The waltz arrived in the U.S. from Europe, bringing with it a wave of controversy. Prior to the waltz, most Americans — at least in high society — danced around each other without any real contact. The waltz was the first “closed” dance in which partners actually held each other — arm in arm, face to face, with their bodies close together.
In both Europe and the U.S., critics were aghast when the waltz first came on the scene, and they were swift to warn about what they saw as the dance’s sinful nature. The Gentleman and Lady’s Book of Politeness, published in Massachusetts in 1833, advised, “The waltz is a dance of quite too loose a character, and unmarried ladies should refrain from it in public and private.”
The criticism was too little, too late, however. The waltz quickly revolutionized partner dancing by allowing couples to spin continuously around the dance floor in three-quarter time. Like many scandalous fads that were once thought to threaten the soul of the nation, it eventually became entirely acceptable. By the mid-19th century, the waltz was one of the most popular dances in America, helping to make physical closeness between dancing partners not just permitted but — as shocking as it may seem — even expected on the dance floor.
The Polka
Just as Americans were resigning themselves to the spread of the waltz, a new scourge arrived: the polka. The Bohemian folk dance exploded across Europe in the 1840s and quickly found its way to the U.S., leaving a trail of exhausted dancers, bewildered onlookers, and disapproving critics in its wake.
With its lively hopping steps and energetic two-four beat, the polka was a courtship dance with a democratic nature that most aristocratic dances didn’t possess, being relatively easy to learn and requiring more enthusiasm than refinement. The frowns of the upper class were not enough to discourage young members of the elite from polkaing, however, and the dance quickly spread through all levels of society, from the ballrooms of the upper crust to the dance halls of the working class.
The Quadrille
While the waltz and polka surged in popularity, some people still preferred their dancing a little more restrained. The quadrille offered all this and more — a perfect social mixer for those who enjoyed memorizing complex dance instructions while wearing uncomfortable shoes.
The dance originated in elite Parisian ballrooms and was picked up by English aristocrats in the early 1800s. It then found its way to America, becoming one of the most popular dances during the latter decades of the century. Typically performed by four couples arranged in a square, the quadrille consisted of five or six different figures, or dance steps, performed in succession, each with its own specific music. The dance greatly influenced the development of American square dancing, which still has moves — allemande, promenade, do-si-do — that reflect the original French influence.
The Cakewalk
The cakewalk is arguably the most significant American dance of the 19th century — and certainly one of the most culturally complex. It originated on Southern plantations when enslaved people created an elaborate parody of the ballroom dances of their enslavers. They dressed in their finest clothes and performed mocking imitations of European formal dances (particularly a couple dance called the Grand March), adding exaggerated high steps, theatrical bows, and backward tilts of the head — all in a highly satirical manner.
Plantation owners sometimes came to watch these dances and even served as judges for contests. They would present cakes as prizes for the best dancers, often blissfully unaware that the performers were mocking them. After the Civil War, the cakewalk surged in popularity, and by the 1870s it had gone mainstream, marking the first time an African American dance had achieved national exposure. Yet the cakewalk also became a staple of minstrel shows, where white performers — often in blackface — left out the satire and used the dance to perpetuate racist stereotypes, complicating the legacy of this influential dance.
The Two-Step
By the end of the 1890s, America had created its own ballroom dance sensation: the two-step. Its exact origins are unclear, but it was likely influenced by earlier dances such as the waltz, polka, and galop. The evolution of these dances into a two-step is often attributed to the music of the famous composer John Philip Sousa. In 1889, one of his most popular compositions, “Washington Post March,” took the country by storm — but none of the existing dances matched its rhythm. So, eager dancers developed the two-step, a relatively simple and accessible dance (especially compared to the likes of the quadrille) that went perfectly with the march. It marked a shift in American dance culture, especially among a new generation who increasingly viewed the old European dances as relics of a bygone era.
Anyone who’s frequented the movies or spent time in front of a TV over the past 70 years probably has a deeply embedded idea of what the inside of a typical Wild West saloon looked like. After all, it’s a setting where countless swaggering sheriffs and perhaps an antihero with no name have breezed through swinging doors to encounter shifty-looking cowboys playing cards and a piano player banging out a jaunty tune from the corner before the place inevitably gets upended by a shootout or furniture-smashing brawl.
Of course, Hollywood is notorious for playing up dramatic elements over an adherence to historical accuracy, as well as for rehashing popular ideas to the point where certain characters and outcomes become tropes. So how valid is this media-driven conception of the Old West watering hole?
Early Saloons Were Bare-Bones Establishments
As detailed in Richard Erdoes’ Saloons of the Old West, American saloons first came into existence when pioneers began pushing westward in greater numbers in the early 19th century, and they weren’t even commonly known by the term “saloon” until the 1840s.
Like the dwellings in many of the early U.S. settlements, the first saloons provided the barest of essentials for those who were temporarily looking to forget the hardships of their rugged lives on the frontier. Many were simply a tent or a lean-to erected over a barrel of whiskey, with perhaps boards laid across empty barrels to serve as tables.
As a settlement became more established, its saloons went from temporary setups to more permanent structures built from local materials. Many were made of wood, while those in areas with scarce timber were built from sod or stone.
By the 1850s, the saloon had progressed from its primitive beginnings and was beginning to resemble the locale that persists in popular imagination.
Its entrance was guarded by the familiar batwing doors that pushed open from the middle. Although there was space to peer above and below the swinging doors, they were generally kept shut to help protect the privacy of customers. Most saloons had an additional set of doors with which to lock the premises, although some establishments stayed open at all hours of the day.
Once inside, a customer would typically spot the bar positioned on the left. This was fronted by a wooden counter, which could reach several dozen feet in length and, in a more reputable establishment, be made from sturdy oak, mahogany, or walnut. Behind it stood one or more bartenders — often rough characters capable of handling customers who grew more unruly with each successive drink, though in high-end venues they were known to dress snazzily, with brocaded vests and diamond pins.
Below the counter hung a series of towels with which to wipe one's mouth or facial hair, while a brass rail typically ran along the base of the bar. Close by the rail, stationed at strategic points along the sawdust-covered floor, were spittoons for the tobacco-chewing patrons to dispose of their mouthfuls.
Behind the counter rose the backbar, or "altar," lined with various bottles of rye, bourbon, and potent mixtures with such names as Taos Lightning and Red Dynamite. The backbar was typically built around one large mirror or multiple mirrors, which let drinkers keep an eye on the people and activities behind them.
As saloon customers were overwhelmingly men, the walls typically featured at least one poster of a nude but generally tastefully posed woman. Other paintings were renderings of classical themes such as "Aphrodite Emerging From the Bath," while some talented artists found the local saloon to be a gallery for their original works. Interspersed with these creations were various signs, often displaying a humorous message such as "In God we trust — all others pay cash."
Other points of decor depended on the vocations of the regular guests. A saloon frequented by cowhands would perhaps have mounted longhorns along with other tools of the trade such as saddles, spurs, and branding irons. On the other hand, an establishment in a mining town might showcase displays of rock crystals, gold scales, and prospectors' lamps.
Most saloons featured iron stoves with which to warm visitors, while virtually all offered some sort of sustenance from salty snacks to meals of varying quality cooked from regional game. Guests who needed lodging for the night could retire to a partitioned-off area or backroom with blankets, or simply stretch out on a portion of sawdust that hadn't already been claimed by a customer who'd had too many drinks to find their way home for the night.
Specialty Saloons
What else was found inside these venues depended on the type of saloon it was. It was common to see people gathered around a table engaged in games of poker, faro, or keno in just about any watering hole. Gambling saloons had more areas devoted to these games of chance, as well as more expensive accoutrements such as a roulette table and a full orchestra.
For entertainment, some saloons had a piano, often of the self-operating "player piano" variety, while some put on spectacles of men or animals fighting each other. Variety and concert saloons were roomier venues with stages for performers to sing, dance, and act out plays such as Richard III or Uncle Tom's Cabin.
Then there were the pretty-waiter saloons, also known as "hurdy-gurdy houses," in which guests sought out female companionship to drink and dance with for an evening. These establishments centered around a dance floor, to the side of which sat a piano and its player along with rows of chairs for the women to rest between dances.
All in all, saloons provided a smorgasbord of experiences that could be as varied as their clientele of gamblers, prospectors, miners, cowhands, railroaders, traveling salesmen, and ladies of the night. Some included a barber's chair; others offered mailboxes, or perhaps a rack of newspapers. Even without displaying any particularly noticeable feature, many served a civic function by doubling as a voting booth or a courtroom.
In other words, Hollywood got only part of the story correct. Yes, the Old West saloon certainly had its mustachioed villains and reckless cowboys cheating at cards and throwing each other across tables, but the real history of the saloon, both as a social and civic hub and simply a place where a person could feel the ameliorative effects of a stiff drink, is far more complex.
While most Olympic hopefuls spend years training in relative anonymity, the atmosphere changes considerably once the Games begin and the globe’s attention turns to this multinational sporting extravaganza. And with plenty of media around to document the proceedings for numerous publications and a massive TV audience, it’s inevitable that cameras capture the participants during moments of triumph, anguish, and everything in between.
Here are seven of the most memorable images to emerge from more than a century of world-class athletes giving it their all at the modern Olympic Games.
Tommie Smith and John Carlos Salute Black Power (1968)
As much as the Olympics are meant to be a time to set aside political discontent in the spirit of international competition, such allowances are not always on the agenda for participants. They certainly weren’t for U.S. sprinters Tommie Smith and John Carlos, who celebrated their respective 200-meter gold and bronze medals at the 1968 Summer Olympics in Mexico City by raising their fists on the podium in a gesture of Black Power.
The display wasn’t well received by the International Olympic Committee, and the two athletes were subsequently suspended from the U.S. team. Smith in particular suffered major career repercussions, as he was denied the chance to participate and defend his gold medal at the 1972 Olympics. Nevertheless, both stood by the salute that showcased their solidarity with fellow Black Americans and provided one of the most indelible moments in Olympic history.
Nadia Comăneci Scores the First Perfect 10 (1976)
Despite her successes at the 1975 European Championships, Romanian gymnast Nadia Comăneci was so thoroughly dominant at the 1976 Montreal Games that even the scoreboard operators were caught unprepared. On the very first day of the team competition, the 14-year-old delivered a flawless routine that resulted in the first perfect 10 awarded in Olympic gymnastics competition — only, the scoreboards weren’t capable of showing four digits (10.00), so they instead flashed the comically low score of 1.00.
Both the scorers and other gymnasts quickly realized they needed to up their game, as the graceful Comăneci went on to earn six more perfect scores for her performances en route to winning three gold medals in Montreal.
The U.S. Men’s Hockey Team Celebrates the “Miracle on Ice” (1980)
Few believed that the U.S. men’s hockey team, composed largely of college players, stood a chance against the mighty USSR juggernaut in the 1980 Winter Olympics. The Soviets, after all, were returning seven players from the 1976 gold-medal-winning squad, and had recently annihilated the U.S. by a score of 10-3 in an exhibition game.
But the scrappy Americans pulled ahead in their medal-round match with two quick goals in the final period, before holding on for the 4-3 win to set off pandemonium among the decidedly partisan crowd at Lake Placid, New York. What’s often forgotten is that this wasn’t the final game; the Americans did take the gold a few days later with a 4-2 win over Finland, but the real prize for the players and fans came with the ultimate underdog win immortalized as the “Miracle on Ice.”
Greg Louganis Gets Too Close to the Diving Board (1988)
Competing in the preliminary round of the 3-meter springboard event in the 1988 Summer Olympics in Seoul, South Korea, U.S. athlete Greg Louganis encountered every diver’s worst nightmare when a botched somersault led to him slamming his head against the board.
Despite sustaining a concussion and a nasty gash, a stitched-up Louganis returned to the competition 35 minutes later to ace another dive and advance from the preliminaries. Louganis then shrugged off the scary incident to earn a gold medal in the event the following day, and he later claimed another gold in the 10-meter platform to close out his Olympic career in style.
Kerri Strug Sticks the Landing on an Injured Ankle (1996)
After the U.S. women’s gymnastics team frittered away a once-commanding lead at the 1996 Summer Olympics in Atlanta, Georgia, things took a turn for the worse when Kerri Strug flubbed a vault landing and, she later recalled, heard something snap in her left ankle. Thinking she still needed to complete a solid turn to carry the U.S. to gold, Strug managed to charge down the runway, flip over the vault, and stick the landing on one good foot.
It turned out the Americans would have won the event even without Strug’s heroics, and coach Béla Károlyi later came under heat for urging his charge to continue on what turned out to be a badly sprained ankle. Nevertheless, the picture of Károlyi carrying the tiny, injured gymnast to the podium has stood as a defining image of the 1996 Games, and an athlete’s willingness to lay her body on the line for Olympic glory.
Usain Bolt Sneaks a Peek at the Competition (2016)
As much as sporting fans love to watch David upset Goliath, it’s also fun to watch generational talents demonstrate their transcendent abilities by repeatedly overwhelming the competition. Such was the case with Jamaican sprinter extraordinaire Usain Bolt, who blazed to eight gold medals and set a slew of world records over the course of his unparalleled career. Bolt himself couldn’t resist sneaking a peek at his challengers during the 100-meter semifinal of the 2016 Summer Games in Rio de Janeiro, Brazil, before adding to his accolades as the first man to win sprinting’s marquee event in three successive Olympics.
Jesse Owens and Luz Long Walk Arm in Arm (1936)
During the 1936 Berlin Olympics, a spectacle that Adolf Hitler had hoped would showcase the “supremacy” of Germany’s Aryan athletes, the Black U.S. track-and-field star Jesse Owens and his German competitor Luz Long formed an unlikely friendship in the face of racial prejudice. While the Berlin Games are largely remembered for Owens upstaging Hitler by winning four gold medals, they’re also remembered for the genuine bond shown between would-be adversaries Owens and Long. When Owens bested Long to win gold in the long jump, his German rival joined him to celebrate. The pair walked arm in arm around the Olympic stadium smiling, displaying sportsmanship and mutual respect in front of thousands of spectators in Nazi Germany.
What Did Victorian Ladies Actually Carry in Their Purses?
Victorian-era England was a time of rapid industrial change but also strict social rules and customs — especially for women. Middle- and upper-class women were expected to appear modest and composed at all times, and even the smallest details of their appearance were carefully considered.
In an era when women had limited legal and financial rights, a purse represented a form of independence. In 1853, American suffragist Susan B. Anthony wrote in her diary, “Woman must have a purse of her own.” Also known as a reticule, this small handbag was one of the few personal belongings a Victorian woman kept close and carried herself. Though small, it held the necessities of daily life — items that allowed her to move through public spaces with confidence, propriety, and a degree of self-reliance.
So, what exactly would you find inside these important pouches? Let’s take a peek.
Victorian women’s fashion was famously elaborate and layered, built around corsets, petticoats, crinolines, and later, bustles. These dramatic silhouettes left little room for practicality. Pockets were rare or difficult to reach, and tie-on pockets, which were popular in the 17th and 18th centuries, disappeared as styles shifted. Yet as women’s lives increasingly extended beyond the home — encompassing shopping, visiting, traveling, and attending lectures or concerts — ladies needed a way to carry personal items.
By the early 19th century, the reticule had become the solution. Usually a small, soft bag gathered with a drawstring, it was carried on the wrist or held delicately in the hand. Though satirized in magazines as the “ridicule” because of its tiny size, the purse endured. It fit Victorian ideals of femininity, but more importantly, it provided a private space in a society that closely regulated women’s behavior — earning it the more positive nickname of “indispensable.”
The classic pouchlike reticule was often made of silk, satin, velvet, or fine cotton, embellished with embroidery, beadwork, or tassels. Many women made their own, turning each purse into a showcase of needlework. Later, metal mesh and intricately beaded purses became fashionable, especially for evening wear. These were prized for their shimmer and craftsmanship rather than their capacity. Some beaded purses featured astonishing density, with hundreds of beads per square inch.
In general, Victorian purses were far smaller than modern handbags. Their size was deliberate, signaling refinement and suggesting that the woman carrying one was not burdened by physical labor.
A Victorian woman’s purse was carefully curated to support the social rituals that governed daily life. Calling cards were among the most important items. These small printed cards, bearing a woman’s name, were exchanged during visits or left behind when making a formal call to a home, and they were collected to keep track of one’s social calendar. Indeed, forgetting one could be a major social misstep.
Money was another necessity, though usually in modest amounts. Coins were needed for small purchases, carriage or streetcar fares, tips, and charitable donations. To keep them secure — and prevent awkward rummaging — many women carried a tiny coin purse inside their reticule.
Writing materials were also common. A short pencil, sometimes paired with a small notebook or loose slips of paper, allowed women to jot down appointments, write brief notes, or leave messages. Even these practical tools were often decorative, reflecting the importance of maintaining an elegant appearance.
Victorian life, especially in crowded cities, came with sensory challenges. Handkerchiefs were indispensable and nearly universal. Often embroidered and lightly scented, they served hygienic purposes while also functioning as an accessory. A lady might carry more than one.
Vinaigrettes — small silver or metal containers that held aromatics known as smelling salts — were another frequent inclusion. Crowded rooms, poor ventilation, and restrictive clothing made faintness a frequent concern at the time. Fragrant substances offered quick relief and helped minimize unpleasant odors encountered on the street or in public spaces.
Folding fans were also commonly tucked into a purse. These delicate items served multiple purposes: creating a breeze in overheated rooms, offering a discreet way to shield the face, and even aiding in the unspoken language of flirtation and social signaling. Personal grooming items, meanwhile, reflected the era’s emphasis on self-control and presentation. A mirror, small comb, lip salve, and face powder might be carried for discreet touch-ups. These items were meant to be used privately and sparingly, reinforcing the idea that a respectable woman appeared effortlessly composed.
Some purse contents addressed the literal constraints of Victorian fashion. For instance, gloves were considered essential attire for public outings, but they were often removed indoors, so women needed to be able to carry them neatly. Button-hooks were also especially useful later in the 19th century, when gloves and boots fastened tightly at the wrist or ankle. These slender tools made dressing and undressing possible without assistance, giving women greater autonomy in public spaces.
Sewing items were also carried. A needle, some thread, and a small pair of scissors allowed women to mend a loose button or torn hem on the spot. While household managers sometimes wore chatelaines — decorative belts with dangling tools — many women tucked these necessities into their purses instead, keeping them close at hand.
Not everything in a Victorian purse was strictly functional. Many women carried small sentimental keepsakes such as pressed flowers, miniature photographs, or love notes. These private items offered emotional comfort while remaining hidden from public view.
Prayer cards or religious tokens provided reassurance during travel or illness, reflecting the era’s deep spirituality. And perfume bottles and scented sachets also served both practical and emotional purposes, masking odors while offering a familiar, comforting scent.
Today, surviving Victorian purses — some preserved in museum collections with their original contents intact — offer intimate glimpses into the daily lives of women in centuries past. Though modest in size, these “indispensables” held the tools a lady needed to meet social expectations, care for her body, manage practical challenges, and preserve a sense of self in a tightly regulated world.
For more than two centuries, the Farmers’ Almanac was a familiar presence in American homes — tucked beside seed catalogs, wedged between cookbooks, pinned on barn walls, or kept in workshop drawers. Its pages offered long-range weather predictions, planting calendars, sunrise and moon-phase charts, home remedies, and practical advice. For generations, it influenced how people prepared for the seasons and understood the cycles of the natural world.
The Farmers’ Almanac regularly included weather lore, folk sayings, planting guidelines, and proverbs — a blend of traditions that resonated with a mostly rural, agrarian readership. Many of the proverbs are sayings Americans still recognize today, such as “A stitch in time saves nine.” Others have much older European or colonial origins, and the almanac played a role in keeping them alive and circulating.
But the long tradition of the Farmers’ Almanac will end with the publication’s 2026 edition. The publishers announced the closure in late 2025, citing rising costs, dwindling print readership, and the reality that digital tools now offer immediate forecasts and guidance once found only in annual books. As the Farmers’ Almanac closes its doors, let’s take a look at the rise and fall of this former household staple.
A Tradition Older Than America
Almanacs are far older than the United States. Their roots go back to ancient societies that tracked celestial events to guide planting and seasonal work. With the invention of the printing press, the earliest known printed almanacs emerged in Europe — the first appeared in 1457 — and by the late 15th century, such publications commonly included calendars, astronomical data, tide tables, and practical seasonal guidance.
In colonial North America, the tradition of almanac-making began in the 17th century. The first U.S. almanac was printed by William Pierce in 1639, offering calendars, weather guidance, and seasonal advice for the New England region. By the 18th century, dozens of almanacs circulated across the colonies. Perhaps the most famous was Poor Richard’s Almanack, first published by Benjamin Franklin in 1732.
Despite the many almanacs that sprang up, only a handful endured into the modern era. At the center of that legacy are two long‑running American institutions: The Old Farmer’s Almanac and the Farmers’ Almanac, which carried forward the tradition of annual almanacs centuries after their 15th‑century predecessors across the pond.
The Old Farmer’s Almanac came first. Founded in 1792 by Massachusetts teacher and bookseller Robert B. Thomas during George Washington’s first presidential term, it quickly became a trusted reference for rural households. Thomas edited the guide for more than 50 years, establishing the traditional tone that remains its hallmark.
The Farmers’ Almanac followed in 1818, founded by poet-astronomer David Young and publisher Jacob Mann. Its style was more practical and conversational, and it gradually adopted a modern tone, in contrast to The Old Farmer’s Almanac, which retained a traditional, nostalgic voice.
The publications’ friendly rivalry — especially over long-range weather predictions — became part of their folklore. Accuracy varied, of course. Weather forecasting is notoriously difficult, and independent analyses typically placed both almanacs in the 50% to 70% accuracy range. The Farmers’ Almanac famously credited its forecasts to a secret equation devised by Young. And despite the lack of precision, readers valued the framework the almanacs offered for thinking ahead as the seasons changed.
Why Did The Old Farmer’s Almanac Survive?
The Old Farmer’s Almanac adapted steadily as literacy rose, printing improved, and suburban gardening gained popularity. By the late 20th century, it had become the more nationally recognized brand, expanding into calendars, cookbooks, and companion guides while maintaining its distinctive yellow cover and familiar design.
The Farmers’ Almanac remained smaller but carried a loyal readership, especially among gardeners and hobby farmers. Both almanacs survived changes that swept away most comparable publications, and their longevity became part of their mystique.
But even loyalty has limits. As print costs rose and digital tools replaced nearly every function of a printed almanac, the Farmers’ Almanac struggled to maintain the business model that had sustained it since James Monroe was president. Though the publication reported a North American circulation of 2.1 million in 2017, younger readers weren’t replacing earlier generations of buyers, and tradition alone couldn’t sustain the business.
The Old Farmer’s Almanac managed to adapt more effectively. Strong brand recognition, diversified products, and a robust digital presence helped it remain viable into the 21st century. As of 2026, it continues in print — with a circulation of more than 2.5 million in North America — making it one of the last general-interest American almanacs still published annually.
Looking back at the Farmers’ Almanac’s long run, what stands out is not only the information it provided but also the mindset it represented: a seasonal way of organizing the year by the familiar signals of the natural world. For many readers, it offered a reliable yearly rhythm that modern digital tools, for all their immediacy, can’t replicate.
Though half of this long-standing pair is disappearing, the almanac tradition itself endures. The Old Farmer’s Almanac still arrives each year, offering weather forecasts, astronomical tables, gardening guidance, and the kind of practical advice that defined the genre. A handful of smaller regional or specialized almanacs persist, mostly in niche print runs or online, but none matches the national presence of the two historic giants.
For readers who will miss the Farmers’ Almanac, its closing isn’t the end of the story. The tradition lives on in popular folklore and homespun wisdom that will continue to be passed down from generation to generation.
The modern calendar can seem confusingly arbitrary, with uneven months, leap years, and even missing days in history. But despite its strange inconsistencies, the calendar we use today is the result of a long quest to design the perfect time measurement system. Here’s a look at how we ended up here.
The Roman Calendar: Uneven Months and a Winter Gap
Our uneven months — ranging from 28 to 31 days long — have their roots in the Roman calendar, which changed several times over the Roman Republic’s existence from 509 BCE to 27 BCE. Based on the lunar cycles, the early Roman calendar originally had 10 months instead of 12 — six 30-day months, and four 31-day months, for a total of 304 days annually. The year began in March, ended in December, and was followed by an unnamed and uncounted gap during the winter months before the solar year would start again in spring.
According to Roman tradition, in an attempt to eliminate this unaccounted-for winter gap and sync the calendar with the lunar year, the legendary King Numa Pompilius added January and February to the calendar around 713 BCE, bringing the number of months to 12. Since the Romans believed odd numbers were auspicious and even numbers were unlucky, Numa wanted years and months to have an odd number of days. (For some reason, an even number of months was fine.) To achieve this, he deducted one day from each of the 30-day months, so they had 29 days.
However, because the newly established year consisted of 355 days (based on 12 lunar cycles), it was mathematically inescapable that one month would have an even number of days. It was thus decided that February, the month dedicated to the infernal gods, would be the “unlucky” month with 28 days.
Though Numa’s reforms brought the Roman calendar closer in line with the lunar year, it was still approximately 10.25 days short of the solar year, causing it to fall out of sync with the seasons over time. To address this, the Romans inserted an extra month called Mercedonius every two or three years. However, Mercedonius was applied inconsistently, resulting in seasonal confusion, and it was subject to manipulation, as politicians would extend or shorten the month in order to prolong or cut short political terms.
The Julian Calendar: Leap Years and a New Start of the Year
Due to this imperfect system, the Roman calendar had drifted about three months ahead of the solar calendar by the first century BCE. To further standardize the calendar and sync it with the solar year, Julius Caesar implemented the Julian calendar in 46 BCE. Designed by the Alexandrian astronomer and mathematician Sosigenes, the timekeeping system was based on the Egyptian solar calendar, which was the first known calendar to use 365 days.
The Julian calendar largely resembles the calendar we still use today — it had 365 days and 12 months, all of which had 30 or 31 days except for February, which had 28 days. Because the Julian calendar took the solar year as having 365 1/4 days, it also instituted leap years — which, unlike the month of Mercedonius, were regular and predictable — to prevent the calendar and solar years from diverging over time.
The Gregorian Calendar: Easter Reset and 10 Missing Days
Though the Julian calendar brought the calendar year much more in line with the solar year, it still wasn’t perfect. Due to a difference of about 11 minutes between the Julian year (which averaged 365.25 days, counting leap days) and the true time it takes for Earth to orbit the sun (365 days, 5 hours, 48 minutes, 45.25 seconds), the Julian calendar fell out of sync with the seasons by about one day per century.
This was addressed in 1582, when Pope Gregory XIII instituted the Gregorian calendar, developed by the Italian scientist Aloysius Lilius — the calendar most of the world still uses today. Lilius realized that leap years every four years made the calendar slightly too long, and so devised a system that adds leap days in years divisible by four, unless the year is also divisible by 100 (on years also divisible by 400, a leap day is added regardless). Though this system works admirably well, it is still off by 26 seconds — so by the year 4909, the Gregorian calendar will be one day ahead of the solar year.
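That divisibility rule is compact enough to express in a few lines of code. Below is a minimal sketch in Python (the function name is our own, purely for illustration):

```python
# A minimal sketch of the Gregorian leap-year rule described above.
# The average Gregorian year works out to 365 + 1/4 - 1/100 + 1/400
# = 365.2425 days, a touch longer than the solar year, which is why
# the calendar still drifts by roughly 26 seconds annually.

def is_leap_year(year: int) -> bool:
    """Return True if `year` gets a leap day under the Gregorian rule."""
    if year % 400 == 0:   # years divisible by 400 always get a leap day
        return True
    if year % 100 == 0:   # other century years skip the leap day
        return False
    return year % 4 == 0  # remaining years: leap if divisible by 4

# For example, 2000 and 2024 are leap years; 1900 and 2100 are not.
assert is_leap_year(2000) and is_leap_year(2024)
assert not is_leap_year(1900) and not is_leap_year(2100)
```

The century exceptions are the heart of Lilius’ fix: dropping three leap days every 400 years is what shaves the Julian calendar’s extra 11 minutes per year down to the 26-second residual mentioned above.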
Pope Gregory’s calendar also had a religious motivation: to reset the date of Easter, which over the centuries had drifted from the spring equinox when it was supposed to take place. To remedy this and “reset” the date of Easter, the Gregorian calendar skipped 10 days in 1582, when October 4 was followed the next day by October 15.
The modern calendar may seem like a strange system nowadays, but many of its quirks are the result of continual refinement and improvement throughout history. And though some have proposed alternative systems for keeping track of time, none has stuck. Looks like we’ll be reminding ourselves “30 days has September, April, June, and November” for the foreseeable future.
Author Tony Dunnell
December 4, 2025
The oral tradition of nursery rhymes goes back to at least the 13th century. But the golden age came in the 18th century, when many of the most famous verses emerged and became established in the colorful (and sometimes creepy) canon of classics we still hear today. While many of these rhymes seem, at first glance, like innocent childhood entertainment — simple, silly verses passed down through generations to delight young ears — they often have surprisingly complex backstories.
Despite being aimed at children, many classic nursery rhymes are far darker, and in some cases more subversive, than they may appear, touching on everything from medieval taxes to religious persecution. Here’s a look at the hidden origins of five famous nursery rhymes, revealing how even the most innocent-sounding verses can offer a fascinating window into the past.
“Baa, Baa, Black Sheep”
The earliest printed version of “Baa, Baa, Black Sheep” dates back to 1744, but the rhyme is likely much older than that. The words, which have barely changed over the centuries, appear to tell a simple story of wool being delivered to three different people: the master, the dame, and a little boy. Historians believe, however, that the nursery rhyme actually alludes to a medieval wool tax that existed in England from 1275 up to the 1500s. The tax demanded that wool producers deliver a third of their product to the king (the master), and a third to the church (the dame), leaving only a third for the farmer — a tax seen as entirely unfair at the time. The specific mention of a black sheep possibly adds another layer, as black wool was less valuable than white because it couldn’t be dyed.
“Here We Go Round the Mulberry Bush”
As is the case with many nursery rhymes, the origins of this playground favorite are disputed. But according to historian R.S. Duncan, the song was invented by female prisoners at England’s Wakefield Prison around two centuries ago. Duncan — a former governor of Wakefield Prison — explained how the inmates used to walk their visiting children around a mulberry tree planted in the prison’s courtyard, where the women were allowed to exercise. They invented the rhyme to help pass the time and keep their children occupied. Adding credence to this origin theory is the fact that a mulberry tree has stood at the prison since at least the 19th century.
“Ring Around the Rosie”
“Ring Around the Rosie” (or “Ring a Ring o’ Roses”) has perhaps the most widely repeated and infamous origin story of any nursery rhyme. According to popular belief, the rhyme refers to the Great Plague of London in 1665 or possibly earlier outbreaks of the bubonic plague in England, with “rosie” representing the rash, “posies” being the flowers carried to mask the smell of death, and “all fall down” symbolizing the widespread fatalities.
This origin story, however, is actually a modern invention and one now largely debunked. The rhyme’s association with plagues didn’t emerge until the mid-1900s, and there’s little to no evidence, beyond simple speculation, to connect it with the plague — which makes folklorists and scholars highly skeptical of the theory. The rhyme most likely originated as a simple children’s party game, in which children hold hands in a circle and follow fun actions (“We all fall down!”). According to folklorist Philip Hiscock, games such as these may have been invented to skirt Protestant bans on dancing in both Britain and North America in the 19th century.
“Goosey Goosey Gander”
Whichever way you look at it, “Goosey Goosey Gander” is a strange rhyme, one that somewhat cryptically involves a goose and an old man getting thrown down some stairs. The earliest recorded version of the rhyme dates back to 1784, and as with other nursery rhymes, its origin is open to debate.
One of the most compelling theories involves the religious persecution of Roman Catholics during the reign of England’s King Henry VIII, when wealthy Catholic families often had priest holes in which to hide members of the clergy. If a priest was discovered, he would be forcibly taken from the house (“thrown down the stairs”) and possibly put to death. An alternative theory links the rhyme to the closing of brothels in London during Henry VIII’s reign. Prostitutes were often known as “geese” at the time, and when the brothels were shut down, they were forced to work elsewhere, possibly explaining the significance of “Whither shall I wander … in my lady’s chamber.”
“Mary Had a Little Lamb”
“Mary Had a Little Lamb” is a bit of an outlier in the world of nursery rhymes. Not only are its origins well documented, but it’s also a refreshingly innocent rhyme that’s actually appropriate for children, with no disturbing backstory. The poem was written by American writer Sarah Josepha Hale — who, incidentally, was the force behind the creation of the national Thanksgiving holiday — and published in 1830. It is now generally accepted that the nursery rhyme was based on a real incident at a small school in Newport, New Hampshire.
One day, a student brought her pet lamb to the school at her brother’s urging. The lamb proved too distracting in class and had to wait outside, but it stayed nearby until school was dismissed, then ran to Mary for attention. This heartwarming event inspired the poem, which later became one of the most popular nursery rhymes in the English-speaking world. It gained even greater fame when Thomas Edison recorded it in 1877 as the first song ever captured on his phonograph.
From the downright shocking to the utterly bizarre, some facts about history are particularly fascinating. Did you know the U.S. had a president before George Washington, or that Americans used to live inside giant tree stumps? If you missed these facts the first time, don’t worry — we’ve got you covered. Read on for the 25 most popular facts we sent on History Facts this year.
When Congress declared war on Japan on December 8, 1941, more Americans than ever before heard the call of duty. Some 16.1 million U.S. citizens served in the military by the time World War II ended in 1945, representing 12% of the total population of 132 million at the time.
Before the logging industry, the trees in old-growth forests were hundreds of feet tall, with gnarled bases and trunks that could measure more than 20 feet across. To fell these forest giants, loggers would build platforms 10 to 12 feet off the ground, where the tree’s shape was smoother. The massive remaining stumps had soft wood interiors and sometimes even hollow areas, so it was relatively easy to carve out the center of a stump and turn it into a building, such as a barn, post office, or even the occasional home.
In 1800, tea was the most popular drink among Brits — something of a problem for the British Empire, as all tea was produced in China at the time. And so the English did something at once sinister and cunning: They sent a botanist to steal tea seeds and bring them to India, a British colony at the time. One historian called it the “greatest single act of corporate espionage in history.”
In 1919, Dwight D. Eisenhower, then a lieutenant colonel in the Tank Corps, learned of the U.S. Army’s plan to test the capabilities of its transport vehicles by moving 80 military vehicles across the country. After joining the expedition, he dutifully submitted a report analyzing the quality of the roads encountered along the way. Decades later, Ike made the development of America’s highways a centerpiece of his domestic agenda upon being elected U.S. president in 1952. His vision became the Interstate Highway System that crisscrosses the nation today.
For most of human history, you were either a child or an adult. The word “teenager” first entered the lexicon in 1913, appropriately enough, but it wasn’t until decades later that it took on its current significance. Three developments in the mid-20th century had a major influence on the creation of the modern teenager: the move toward compulsory education, which got adolescents out of farms and factories and into high school; the economic boom that followed World War II; and the widespread adoption of cars among American families.
The ancient Romans are known for many innovations that were ahead of their time, and some that seem ahead of even our time. Case in point: Concrete used in some ancient Roman construction is much stronger than most modern concrete, surviving for millennia and getting stronger, not weaker, over time. The secret ingredient? The sea. Builders mixed this ancient mortar with a combination of volcanic ash, lime, and seawater, creating a material that essentially reinforced itself over time, especially in marine environments.
Though George Washington is indisputably the first president of the United States, he technically wasn’t the first person in the federal government with the title of “president.” Washington was elected under the government formed by the ratification of the U.S. Constitution in 1788, but the Constitution wasn’t the only government-forming document in the nation’s history. Ratified in 1781, the Articles of Confederation — the United States’ first constitution — formed what’s known as the Confederation Congress. This early governing body was led by a president who held a one-year term, the first of whom was Samuel Huntington of Connecticut.
The well-coiffed men of the Victorian era often had some truly impressive beards. The look was partially driven by the desire to appear manly and rugged, but beards were also seen as a way to ward off disease. At the time, many doctors endorsed the miasma theory of disease, which (incorrectly) held that illnesses such as cholera were caused by bad air. Facial hair, it was believed, could provide a natural filter against breathing in so-called “miasms.”
Credit: Heritage Images/ Hulton Archive via Getty Images
When grocery owner Sylvan N. Goldman rolled out the first shopping carts in 1937, he expected a runaway hit. But the reaction wasn’t exactly enthusiastic. Women, already used to pushing strollers, weren’t eager to push another one at the store. Men, on the other hand, preferred not to push something stroller-like at all. Goldman even hired store greeters to hand shoppers a cart, and paid actors to walk around shopping with them until the idea finally caught on.
The earliest loaf of bread ever discovered is a whopping 8,600 years old, unearthed at Çatalhöyük, a Neolithic settlement in what is now southern Turkey. While excavating the site, archaeologists found the remains of a large oven and, nearby, a round, organic, spongy residue among some barley, wheat, and pea seeds. When biologists scanned the substance with an electron microscope, the analysis revealed that it was a very small loaf of uncooked bread.
When the Mayflower passengers finally reached the shores of the New World, they spent a few weeks scouring the region for a spot to hunker down for the winter. As one passenger wrote in their journal, “[W]e could not now take time for further search or consideration, our victuals being much spent, especially our beer.” The Pilgrims promptly began building what became Plymouth Colony, with a brew house unsurprisingly among the first structures to be raised.
It’s easy to assume that Baby Ruth candy bars were named for the famed baseball player George Herman “Babe” Ruth Jr. Indeed, even the Great Bambino assumed as much at the time. After all, the nougaty confection debuted in 1921, after the ballplayer became a household name. But according to the official, legal explanation of the moniker, Baby Ruth bars were named after a different Ruth altogether: “Baby” Ruth Cleveland, the daughter of former U.S. President Grover Cleveland.
Credit: Photo Josse/Leemage/ Corbis Historical via Getty Images
For the thousand or so years that encompassed the Middle Ages, people in Western Europe sometimes slept in two shifts: once for a few hours, usually beginning between 9 p.m. and 11 p.m., and again from roughly 1 a.m. until dawn. The hours in between were a surprisingly productive time known as “the watch.” People would complete tasks and chores, check on any farm animals they were responsible for, and take time to socialize.
Credit: H. Armstrong Roberts/ Retrofile RF via Getty Images
For most of human history, a birthday was just another day, and many people didn’t even know when theirs was. Ancient societies sometimes recorded births within noble or wealthy families, but systematic recordkeeping was rare. It wasn’t until the 1530s in England that churches were mandated to document baptisms. Similar practices appeared in colonial America, but birth registration didn’t become widespread until the early 1900s.
The Dust Bowl wasn’t entirely confined to the Great Plains. Some of the resulting dust storms were so extreme that their clouds reached cities more than 1,500 miles away on the East Coast. Boston, Massachusetts, even saw red snow when red clay soil suspended in the atmosphere mixed with falling snow.
As you may expect, “sept” is a prefix with Latin roots that means “seven.” And indeed, September was originally the seventh month of the earliest Roman calendar. January and February were added later, initially as the first and last months of the year, respectively, bumping September to ninth place. That expanded calendar was used in ancient Rome for hundreds of years before the debut of the Julian calendar in 46 BCE, but nobody ever changed September’s name.
We’ll never definitively know what presidents such as George Washington or Abraham Lincoln sounded like, since there are no audio recordings of their voices. The oldest existing recording of a U.S. president is the voice of Benjamin Harrison, the 23rd commander in chief, giving remarks at a diplomatic event. Harrison served from 1889 to 1893, and the audio recording dates to around his first year in office. His voice was captured on a wax cylinder phonograph, a recording device developed by Thomas Edison in the late 1880s.
When you think of synchronized swimming, you may picture the glittering “aquamusicals” of the 1940s and ’50s. But the idea of choreographed aquatic performance actually dates back nearly two millennia — to the watery amphitheaters of ancient Rome. Roman rulers were obsessed with turning water into spectacle, even flooding the Colosseum to do so. The Roman poet Martial described a performance in which women portraying Nereids, or sea nymphs, dove and swam in formation across the Colosseum’s waters.
If you belonged to a well-to-do family in colonial America, you may have covered your floor in richly painted oilcloth, or even carpets imported across the Atlantic. Most people, however, settled for simpler floor coverings, such as straw matting or sand. The latter came with a bonus feature: If you were feeling creative, you could draw fun designs into the sand as a temporary decoration.
Edinburgh Castle sits atop an imposing rock outcropping called Castle Rock. Ancient people started using the outcropping in the Bronze Age, and flattened its top around 900 BCE. But what they didn’t know was that hundreds of millions of years earlier, that rock had been the inside of a volcano. The volcano went dormant (and eventually extinct) roughly 340 million years ago, and the magma inside solidified, creating a rock formation that’s exceptionally sturdy and erosion-resistant — the perfect location for a stronghold.
Credit: The Picture Art Collection/ Alamy Stock Photo
When the San José first set sail in 1698, it probably wasn’t expecting to be making headlines three centuries later. The Spanish navy ship met its watery end off the coast of Cartagena, Colombia, in 1708, with some 200 tons of treasure, including gold and emeralds, aboard. Now known as the “holy grail” of shipwrecks, it’s presumed to be worth as much as $18 billion, which explains why several different entities have laid claim to the wreck since its discovery in the 1980s.
Though it’s often seen as a quintessentially American custom today, tipping has its roots in the feudal societies of medieval Europe. In the Middle Ages, wealthy landowners occasionally gave small sums of money to their servants or laborers for extra effort or good service. The gesture later evolved into a more formal custom: By England’s Tudor era, guests at aristocratic households were expected to offer “vails” to the household staff at the end of their stay.
The terra-cotta army in China is a collection of more than 7,000 life-size clay soldiers created in the third century BCE, each rendered with remarkably individualized detail. But there was once yet another layer of detail: The figures were originally painted in a variety of vibrant colors. After the statues were sculpted, fired, and assembled, artisans applied lacquer (derived from a lacquer tree), followed by layers of paint made from cinnabar, malachite, azurite, bone, and other materials mixed with egg.
Among the European aristocracy of the 16th and 17th centuries, especially in France and England, handkerchiefs were meant for display, whether tucked in a pocket, carried in hand, or flourished as part of an elaborate social ritual. These were no ordinary hankies; they were made with intricate lacework and fine embroidery. Wealthy Europeans posed for portraits with their hankies, bequeathed them in wills, and included them in dowries.
In 1815, Mount Tambora in Indonesia erupted with extraordinary force. The fallout dimmed the sun worldwide, lowering temperatures and devastating harvests. As a result, food prices soared, and horses were slaughtered for meat or starved for lack of feed. This sudden scarcity of transport led to an innovation. In 1817, German inventor Baron Karl von Drais unveiled his Laufmaschine, or “running machine” — a simple two-wheeled wooden frame that riders straddled and propelled by pushing their feet along the ground. Like the modern bicycle, it allowed riders to travel far faster than they could on foot, even on muddy post-rain roads.