Call coincidence what you’d like: luck, karma, fate, or just random happenstance. In any case, when similar events occur, it’s fascinating and, sometimes, downright eerie. Some coincidences have so many layers that they take on a second role in the form of conspiracy theory or prophecy. Coincidences, by nature, require zero planning; all we need to do is wait for them to happen. Here’s a look back at five strange coincidences throughout history.
John Adams and Thomas Jefferson Died on the Exact Same Day
It’s unlikely enough that two of America’s Founding Fathers would die on the very same day, but this story gets even stranger. First, these two political rivals died within hours of each other. Even weirder? The date of their passing was July 4, 1826 — 50 years to the day after the Declaration of Independence was adopted. John Adams and Thomas Jefferson weren’t the same age when they died, though — Adams was 90 and Jefferson was 83. There are multiple theories on why this happened, and sheer coincidence is certainly one. To add another eerie layer, Founding Father James Monroe also died on the Fourth of July, five years later.
Halley's Comet Marked Mark Twain’s Birth and Death
Some coincidences have a celestial twist. American author and humorist Mark Twain was born on November 30, 1835, the same year Halley’s comet passed within sight of the Earth during its 76-year journey around the sun. Twain, who was a unique figure in his own right, was aware of this fact and found it amusing to hitch his wagon to the rare cosmic occurrence. As he wrote in 1909, "I came in with Halley's comet in 1835. It is coming again next year, and I expect to go out with it. It will be the greatest disappointment of my life if I don't go out with Halley's comet. The Almighty said, no doubt: 'Now here are these two unaccountable freaks; they came in together, they must go out together.'" Twain’s wish came to fruition: He passed away on April 21, 1910, as Halley’s comet was passing Earth.
A Man Was Hit by a Baby Falling Out of a Building — Twice
In 1937, Joseph Figlock, a public works street sweeper, was cleaning up an alley in midtown Detroit when an infant plummeted from a fourth-story window and landed on Figlock’s head. Miraculously, the baby survived. While working less than a year later, Figlock was struck from above by another baby that fell from another fourth-floor window. Once more, Figlock’s head saved the life of the infant in peril. Figlock himself was uninjured in both incidents.
One Man Hosted the Beginning and End of the Civil War
Behold the peculiar story of Wilmer McLean, who has the distinction of witnessing both the first land battle of the Civil War in 1861 and General Robert E. Lee’s surrender to General Ulysses S. Grant in 1865, which effectively ended the conflict. At the start of the war, McLean’s home in Manassas, Virginia, was commandeered as the headquarters of Confederate General P.G.T. Beauregard. The home took damage from Union shelling, and McLean moved his family 100 miles south to avoid the violence of war. Ironically, the family’s new home was in the small town of Appomattox Court House in Virginia, the site of Lee’s surrender. Grant and Lee spoke for about a half-hour at the McLean home before Lee signed the surrender document. McLean is said to have later remarked, “The war began in my front yard and ended in my front parlor.”
The Lincoln and Kennedy Assassinations Share Many Strange Similarities
Abraham Lincoln and John F. Kennedy are widely considered two of the greatest Presidents in U.S. history, and the coincidences surrounding their assassinations further link the two leaders. In fact, books have been written about them, and members of Congress have even discussed the topic. Both Presidents were killed on a Friday with their wives by their sides. Both were succeeded by men whose last name was Johnson (Andrew Johnson and Lyndon B. Johnson). There are also some parallels in the assassins: John Wilkes Booth shot Lincoln in a theater and was cornered in a tobacco barn; Lee Harvey Oswald shot Kennedy from a window in a book warehouse and was apprehended in a movie theater. What’s more, both Booth and Oswald were themselves killed before they could face justice. Whether these similarities carry deeper meaning or no meaning at all remains a matter of opinion, but the sheer number of parallels is striking all the same.
Credit: Graphic House/ Archive Photos via Getty Images
Author Kerry Hinton
February 22, 2024
The Brill Building isn’t just an art deco structure in midtown Manhattan — it’s also the name of a musical genre. Throughout the early and mid-1960s, the “Brill Building sound” became synonymous with groundbreaking pop music. The heyday of the Brill Building era was short-lived, but in one six-year span, the songwriters, arrangers, musicians, and producers behind this sound contributed to hundreds of Billboard Hot 100 hits, including “Stand By Me” (Ben E. King, 1961), “One Fine Day” (the Chiffons, 1963), and “Be My Baby” (the Ronettes, 1963).
Located at 1619 Broadway in New York City, the Brill Building was a hub of songwriters, record labels, and recording studios, all under one roof. It built on the tradition of the “Tin Pan Alley” district before it — a concentration of music publishers and studios in a strip of Manhattan that dominated the music industry in the big-band era. But while their downtown predecessors were mainly concerned with the profits produced by pumping out sheet music for radio hits, the writers and producers at the Brill Building were also on a mission of artistic idealism. Their compositions drew inspiration from classical music, Latin music, traditional Black gospel, and rhythm and blues to create songs that appealed to an audience already hungry for the new sound of rock ’n’ roll. The assembled talent was a once-in-a-generation roster of songwriters, including Burt Bacharach and Hal David, Gerry Goffin and Carole King, and Neil Diamond. Together, they produced sophisticated songs that were directly aimed at a new, youthful generation and a powerful rising subculture: teenagers.
By the mid-’60s, an increasing number of artists — such as the Beatles and Bob Dylan — began composing and playing their own material, making the songwriter-for-hire less of a necessity. As Dylan wrote in 1985, “Tin Pan Alley is gone. I put an end to it. People can record their own songs now.” This may be true, but the creators behind the Brill Building sound helped make the ascent of these singer-songwriters possible. Here are five ways the Brill Building shaped popular music in the 20th century.
The Brill Building employed a model of vertical integration that supervised every phase of a song’s life cycle, from production to distribution, all under one roof. The 11 floors of 1619 Broadway and a few surrounding buildings became a one-stop shop where a songwriter could pen a would-be hit, sell it to a publisher, find a band, and cut a demo. Songs could even be played for radio promoters in the building to garner airplay. This new type of streamlined hitmaking — often called “assembly line pop” — gave publishers and producers a huge pool of material to choose from and encouraged creative collaboration, merging art and commerce in a new way.
At the Brill Building, songcraft mattered. Some of the most interesting and popular songs of the era were written at Aldon Music, one of the music publishing companies in the building. Its founders, Al Nevins and Don Kirshner (“Al” and “Don”), had a plan: to take the spirit of classic Tin Pan Alley songwriting (catchy melodies with commercial appeal) and create well-crafted songs aimed at young people, an increasingly lucrative market. Kirshner had already enjoyed some success writing jingles with his high-school friend Bobby Darin, and after acquiring the talents of the veteran songwriting team of Jerry Leiber and Mike Stoller (the writers behind “Hound Dog” and “Jailhouse Rock”), he convinced the more experienced Nevins to partner with him.
The music may have been aimed at the youth market, but Aldon’s songwriters employed lyrics that addressed bleak social conditions (“We Gotta Get Out of This Place”) and tragedy (“Leader of the Pack”). The arrangements and production were innovative as well: Songs such as the echo-drenched, Phil Spector-produced “River Deep, Mountain High” (performed by Ike & Tina Turner) showcased new directions in studio technique.
The Brill Building sound was created for young people, by young people. In 1962, the oldest of Aldon’s songwriters was just 26 years old. Many of the Brill Building songwriters were only slightly older than their songs’ subjects, making their perspective especially accessible to young audiences. At just 18, Carole King wrote the No. 1 hit “Will You Love Me Tomorrow” — recorded and released by the Shirelles in 1960 — with her husband Gerry Goffin. Little Eva, their babysitter, sang the smash hit “The Loco-Motion” (which King and Goffin also wrote) when she was just 17.
Before the Brill Building era, popular songwriting was basically a boys’ club. This changed with the arrival of female songwriters such as Carole King, Ellie Greenwich (“Then He Kissed Me,” 1963), and Cynthia Weil (“You’ve Lost That Lovin’ Feelin’,” 1964), although their husbands were named first in the songwriting credits. In addition to writing dozens of hits, these women proved that they were equally capable in the recording studio as arrangers and producers.
The Brill Building songwriters made rock ’n’ roll popular with mainstream teenage America. Although the majority of writers were white, they all had been influenced by the melting pot of musical styles they heard on the fire escapes and in the clubs of New York City. The result was a hybrid sound that blended genres and often had crossover appeal, finding success on both pop and R&B charts. Songwriters often specifically wrote for Black female artists such as Dionne Warwick, the Ronettes, and the Crystals, using arrangements that gave their music mainstream appeal. This unique musical style united listeners from different backgrounds and opened people’s eyes to the possibility of a biracial popular culture.
Historical Figures You Didn’t Realize Were Friends
Credit: Consolidated News Pictures/ Archive Photos via Getty Images
Author Tony Dunnell
February 22, 2024
Muhammad Ali once said, “Friendship is the hardest thing in the world to explain. It’s not something you learn in school. But if you haven’t learned the meaning of friendship, you really haven’t learned anything.” Like Ali and his own best pal, photographer Howard Bingham, some friendships in history have been formed by figures with wildly different backgrounds and career choices. The following friendships are as surprising as they were genuine — though they were not all long-lasting. From Mark Twain and Nikola Tesla to Hunter S. Thompson and Pat Buchanan, here are five unlikely bonds between notable figures you never knew were friends.
A friendship between the famed writer Mark Twain and inventor Nikola Tesla might, on the surface, seem unlikely. And yet, before the two met, they already shared some significant interests. Tesla had once been bedridden for nine months with a severe bout of cholera, during which time he read some of Twain’s earlier works. He later described them as “unlike anything I had ever read before and so captivating as to make me utterly forget my hopeless state.” Twain, meanwhile, was fascinated by technological innovations and, in particular, electricity. When the two men eventually met in the 1890s, they became friends and spent a lot of time together in Tesla’s lab and elsewhere. One famous account tells of Twain’s participation in an experiment involving an electromechanical oscillator, which Tesla believed might be therapeutic. But when Twain sat on the vibrating plate, it served as something of a laxative, forcing the acclaimed author to run for the bathroom.
The story of Arthur Conan Doyle and Harry Houdini is one of both friendship and rivalry. The celebrated mystery author (and creator of Sherlock Holmes) and the famous illusionist first met in 1920, during the magician’s tour of England. Their shared interest in spiritualism — and the use of séances to contact the dead — initially brought them together. But the two men took fundamentally different paths thereafter. Conan Doyle (in stark contrast to his highly rational and analytical Holmes) became increasingly obsessed with and convinced by the existence of spiritual powers. Houdini, on the other hand, began exposing fraudulent mediums and decried spiritualism as “nothing more or less than mental intoxication.” After Conan Doyle’s wife, Jean, supposedly contacted Houdini’s mother during a séance in 1922, Houdini publicly announced that he did not believe the message came from beyond the grave. This is partly because the message consisted of 15 pages of handwritten, English-language notes, and his mother didn’t speak English. Not long after the séance, the friendship began to break apart.
Ella Fitzgerald and Marilyn Monroe struck up a friendship in the 1950s. Hollywood’s favorite blond bombshell was a huge fan of Fitzgerald’s music, and went to see her perform in Los Angeles in 1954. The two met in person, and soon formed a tight bond. Monroe famously helped Fitzgerald land a gig at L.A.’s renowned Mocambo nightclub, which had previously rejected Fitzgerald due to what they saw as her lack of sex appeal (not because of her race, as the story often goes). Monroe ensured that numerous A-list celebrities attended each performance, drawing plenty of media attention. The gigs were a turning point in the career of the “First Lady of Song.” When asked about their relationship, Monroe said, “Well, my very favorite person, and I love her as a person as well as a singer, I think she’s the greatest, and that’s Ella Fitzgerald.” The two women remained friends until Monroe’s death at the age of 36.
It’s not known precisely when John F. Kennedy and Frank Sinatra first met, but by the end of the 1950s they were well acquainted. The politician and crooner notoriously shared a love of nightlife and women, so it’s perhaps no surprise they got along. But beyond their mutual admiration, their relationship was beneficial to both men. Sinatra gained access to the corridors of power, while Kennedy counted on Sinatra’s star power to win him votes in the 1960 presidential election. When Kennedy won, the President-elect publicly thanked Sinatra for his efforts in the campaign. But the friendship collapsed after JFK entered the White House, partly because First Lady Jackie Kennedy reportedly disliked the singer, and also due to Sinatra’s alleged connections with organized crime.
The “gonzo” journalist Hunter S. Thompson made no secret of his intense hatred of Richard Nixon, who he said represented “that dark, venal, and incurably violent side of the American character.” It’s something of a surprise, then, that Thompson ended up being good friends with the conservative politician Pat Buchanan — who was once one of the most trusted advisers to none other than President Nixon. Thompson, most famous for his counterculture magnum opus Fear and Loathing in Las Vegas (1972), met Buchanan during Nixon’s 1972 presidential campaign. He later said of their friendship, “We're still friends. Patrick is a libertarian, or at least in that direction. I think of politics as a circle, not a spectrum of one line, not just right and left. Patrick and I are often pretty close. Patrick's an honest person.” The two remained friends until Thompson’s death in 2005.
Each year on February 14, romantic partners exchange affectionate cards and sugary-sweet chocolates, all in the name of St. Valentine — and all while the iconic image of Cupid takes center stage. But who are these figures, and how did they converge for this sentimental holiday? From Cupid’s roots in Greek mythology to St. Valentine’s Christian symbolism, here’s how these two figures became the unlikely faces of love and Valentine’s Day.
The exact origin of the saintly namesake of Valentine’s Day is murky. According to one belief, St. Valentine was a third-century Roman priest who defied the Roman Empire’s stance against men marrying at a young age (it was thought that they should instead serve as soldiers). Valentine continued to perform marriages in secret, leading to his execution on February 14. Another belief portrays St. Valentine as a compassionate man who helped free persecuted Christians in ancient Rome. According to legend, he healed the local jailer’s blind daughter and, before his death, sent her a note signed, “from your Valentine.” Whether these were two separate figures or just one isn’t entirely clear, nor is whether they were actually historical characters and events or just myths. In early medieval records, for instance, there is no connection between St. Valentine and love or marriage. But regardless of how the figure became linked with romance, the association between St. Valentine and love has remained strong.
Today, we think of Cupid as a surreal cherubic figure, adorned with wings and armed with a bow and arrows. This iconic imagery is rooted in depictions of Eros, the Greek god of love. Initially depicted as a handsome youth, Eros underwent a transformation during the Hellenistic period (around 323 BCE to 31 BCE), evolving into the cherubic, winged child we recognize today. When the Romans adopted the deity, he became Cupid, a name derived from the Latin word for “desire.” The new likeness remained, as did Eros’ mischievous use of his arrows to arouse love or extreme passion in whomever happened to be struck by one.
Photo credit: Heritage Images/ Hulton Archive via Getty Images
How Did Valentine’s Day Start?
While there is no single backstory for our modern celebration of Valentine’s Day, the holiday is often linked to the ancient Roman fertility festival of Lupercalia, which took place on February 15 and dates back to the sixth century BCE. In the fifth century CE, Pope Gelasius I abolished the pagan observance of Lupercalia and instead declared February 14 as a commemorative day for the martyrdom of St. Valentine — with no explicit mention of love, however. In fact, it wasn’t until several centuries later that Valentine’s Day’s romantic connotations emerged, sometime in the late 1300s, when English poet Geoffrey Chaucer wrote about the mating rituals of birds in his poem "Parlement of Foules." He wrote of “Seynt Valentynes day” as the day “whan every foul cometh ther to chese his make” — or when birds choose their mates.
Photo credit: Kean Collection/ Archive Photos via Getty Images
How Did Cupid Become the Face of a Day for St. Valentine?
Chaucer’s verse is believed to contain the first mention of Valentine’s Day as a romantic holiday; from there, the association gained more traction. In the 1470s, an English woman named Margery Brews wrote to her fiancé John Paston and referred to him as “My right well-beloved Valentine” — a letter believed to be the oldest surviving English-language valentine. The concept of Valentine's as a day for love was helped along not only by Chaucer, but by William Shakespeare, whose use of both Valentine’s Day and Cupid as romantic symbols further bolstered the idea in Britain. By the 16th century, Valentine’s Day and Cupid were established cultural symbols of love, and they eventually coalesced on greeting cards, which grew enormously popular across Europe, Valentine’s cards chief among them. By the mid-1800s, many Valentine’s Day cards featured imagery not far off from Chaucer’s whimsical vision of the day — birds and flowers in springtime — as well as frequent portrayals of the familiar winged, curly-haired Cupid.
A Day in the Life of a Settler in Colonial America
There was no one typical day in colonial America — the experiences of colonial families differed based on their location, economic status, and individual circumstances. The colonial era not only spanned a large period of time — from the early 17th century to the late 18th century, before the United States became an independent nation — but it also covered a large and varied landscape. The 13 original American colonies stretched from Massachusetts to Georgia, and were populated by settlers from different parts of Europe whose beliefs, traditions, and lifestyles varied greatly.
Colonial settlements ranged from the growing urban centers of the Northeast to the rural agrarian communities of the Southern colonies, and the daily routines of families were impacted by their environment, which included the influence of Indigenous populations. Colonists often adopted or adapted aspects of Native American culture, including agricultural practices and culinary techniques, in order to survive. Weather conditions, seasons, and the availability of resources also played significant roles in shaping daily life in the colonies. But whether you were a farmer, a merchant, a tradesman, or a wife and mother, day-to-day life in the colonies consisted of long days, hard work, and community connections.
Photo credit: ClassicStock/ Archive Photos via Getty Images
The Workday Started Before Dawn
The workday in colonial America typically began before dawn and lasted until the sun went down, and throughout the day, families dedicated themselves to tasks essential for their survival. The morning started with a modest breakfast of bread and milk, porridge, or cornmeal mush with cider or beer before the work began. This sustenance was much-needed: Depending on the season and the weather, the typical workday could be up to 12 hours long, six days a week.
Men were expected to provide for their family, and while farming was the primary occupation throughout the colonies, there were a variety of other jobs to be filled, particularly as towns started to grow. Men worked as blacksmiths, carpenters, silversmiths, and in other skilled trades and crafts that contributed to the local economy. Women typically worked in the home, managing all aspects of the household and childcare, including meal preparation and basic education of the children. In addition, married women in non-farming families often worked alongside their husbands in shops or trades, and unmarried or widowed women took jobs as seamstresses, midwives, or tavern keepers. Children often assisted their parents in their work, learning valuable skills for their future roles in the community.
In the colonial era, the midday meal was called “dinner,” and it was the biggest, heartiest meal of the day. Those who worked in town in trades and shops could go to a common hall for a meal of stew made from pork or poultry, and seasonal vegetables such as corn and cabbage. The midday break provided a short period of rest from work, allowing families to connect with neighbors and strengthen social bonds.
For many colonial settlements, the local economy also relied heavily on indentured and enslaved labor. Indentured servants brought from Europe and enslaved individuals, predominantly of African descent, played essential roles in supporting colonial households. Their contributions, though often overlooked, were integral to the success of farms and trades and the day-to-day functioning of many households. Farm laborers, including enslaved workers, would have their dinner brought to them where they worked in the fields.
The Seasons Had a Big Impact on Daily Life
During peak agricultural seasons, such as planting in springtime and harvesting in the fall, the workday for farming families was extended, as all members of the family dedicated themselves to essential tasks on the farm. The demanding nature of these seasons required longer hours to ensure a successful harvest and sustained livelihood. On the other hand, in the winter months or during inclement weather, when outdoor activities were limited, the workday might be shorter. Families would then focus on indoor tasks and household chores, such as preparing food for storage, or weaving and mending clothing.
Clothes were generally made from locally sourced materials, particularly sheep’s wool and cultivated cotton and flax, and families produced their own textiles through spinning and weaving. Everyday clothing was modest and simple, reflecting the practical needs of daily life. Men typically wore shirts, waistcoats, and breeches, while women donned dresses with full skirts covered by long aprons. Both men and women also wore leather shoes or boots, as well as hats made of felt or animal pelts.
Photo credit: MPI/ Archive Photos via Getty Images
Leisure Time Meant Music, Games, and Social Events
Not unlike today, evenings were a time for more leisurely activities at home or in town. Large families were common in colonial America, and homes were generally modest in size, with a central hearth for cooking and heating that also provided a place for families to spend time together at the beginning and end of the day. After a light supper of leftovers, the family would often gather around the hearth for warmth and engage in storytelling, handiwork, or quiet chores by candlelight.
Homemade entertainment, such as playing musical instruments, singing songs, or playing simple games, was also a popular way for families to spend what little free time they had. Outside of the home, attending church services, gathering in common halls and taverns, and participating in organized social gatherings like fairs and dances offered a chance for neighbors to interact, share meals, and reinforce their sense of community.
At the end of the evening, families would retire to their beds — woven sacks known as “ticks,” stuffed with straw or chaff and sometimes layered with a softer feather-stuffed tick and homemade quilts for warmth. Because there were few rooms in the average home, there were rarely any designated sleeping quarters; rather, people slept in living areas with multiple family members sharing a room. Older children slept in one bed and infants and very young children slept next to their mother. With the natural rhythm of daylight and limited artificial lighting, bedtime was early by modern standards — and a necessity for preparing for the long days of physical labor.
The fad is perhaps the piece of cultural ephemera that most defies explanation. Fashion trends often have clear motivating factors: perhaps a celebrity sporting a certain style, or a TV character wearing a certain haircut that sparks imitation. Souvenirs and collectibles usually directly follow their origin: There’s no mystery where baseball cards came from, or vintage records, and so on. But a true fad — a popular behavior or interest practiced with enthusiasm that’s as strong as it is temporary — exists at the fleeting intersection of a cultural time, mood, and impulse, and some of these short-lived trends seem to outright defy logic.
While fads frequently do have a clear beginning moment (and sometimes even a person who can be named as their initiator), a precise ending moment is never as apparent; we can only know that a fad has ended retroactively, and estimate the point of its demise. And the further away we are from the time of a certain fad, the more inexplicable and strange it can seem. Let’s try to wrap our heads around some of the more bizarre fads of the past.
Photo credit: George Rinhart/ Corbis Historical via Getty Images
Flagpole Sitting
Flagpole sitting was one of the most logistically confusing fads of all time, as it involved remaining upon a flagpole for a marathon duration. The first instance of flagpole sitting was in January 1924, when former sailor and fledgling stunt performer Alvin “Shipwreck” Kelly was hired to perch atop a pole outside a Hollywood movie theater for as long as he could, in order to publicize an upcoming film. Kelly stayed aloft for 13 hours and 13 minutes.
The stunt attracted an impressed crowd and media attention, and Kelly was hired by other businesses to repeat his feat. As word spread, copycats emerged and sought to outdo each other in endurance. Kelly increased his time to eight days in 1927, but Los Angeles woman Bobbie Mack bested him when she spent 21 days atop a flagpole. Kelly then recaptured the record by enduring 49 days while being spurred on by a total of 20,000 onlookers, only for Bill Penfield of Iowa to break it again with a 51-day bout. By this point, flagpole sitters were fashioning some degree of shelter atop the pole where they could eat, sleep, and use the bathroom, in order to extend their stay.
The craze fizzled in the 1930s, but it never went away entirely. Later flagpole sitters such as Richard “Dixie” Blandy, Mauri Rose Kirby, and Peggy Townsend set new records in the ’50s and ’60s, and Blandy kept at the practice into the ’70s. The current record for flagpole sitting was set by H. David Werder in 1984, for a mind-boggling 439 days, 11 hours, and 6 minutes.
Photo credit: Camerique/ Archive Photos via Getty Images
Phone Booth Stuffing
Speaking of logistically confusing fads, the phone booth stuffing craze of the 1950s had teams of participants vying to see who could cram the most people into a single phone booth. Though there are some conflicting reports as to where and when the fad began, it was popularized in Durban, South Africa, in 1959, when a group of 25 people managed to contort themselves into a booth. By March of that year, students on college campuses throughout the United States and Canada were attempting to break the record.
Students at St. Mary’s College of California achieved a count of 22 in a manner that looks legitimate, but it wasn’t long before a lack of standard criteria led to participants achieving greater numbers by dubious means. Some attempts counted people who were mostly (or entirely) outside the confines of the actual phone booth; some altered the booth itself to make it more accommodating; and others stretched the very definition of a phone booth by instead using an indoor phone room. This increasing entropy raised some existential questions about the activity. By the end of the year, the fad had seemingly died out, but anniversary reenactments still occasionally commemorate it.
On March 3, 1939, Harvard University student Lothrop Withington Jr. swallowed a 3-inch goldfish, spurred by a $10 bet with his classmates. There was apparently enough buildup to the stunt that it had spectators and press coverage, and the publicity caused “goldfish gulping” to quickly spread to college campuses across the United States. The stunt almost immediately changed in nature, too — while Withington’s initial bet was about whether or not he could swallow a whole goldfish, goldfish gulping became about how many whole goldfish someone could swallow in a single session.
The fad was competitive. Just shy of a month later, the goldfish-swallowing record was 25. Then a record of 36 stood for a day, until MIT student Albert E. Hayes Jr. set a new record of 42. By the time Clark University’s Joseph Deliberato set an unthinkable record of 89, concern was mounting among adults. College administrators considered the trend a breach of proper student conduct, animal rescue organizations thought it cruel, the U.S. Public Health Service warned against health risks, and a Massachusetts state senator even wanted to outlaw it, drafting a bill to do just that. This broad opposition hastened the fad’s demise; by the end of the year, the trend was history.
Wigs were seen as status symbols among Western aristocracies starting with the reign of Louis XIII of France, who popularized long dark wigs in the 17th century. But by the 18th century, increasingly elaborate and more vertically oriented hairpieces came into fashion. This culminated in a tall and ostentatiously decorated style called a “pouf,” where layers of hair and additional hairpieces were amassed and pinned atop pads, and adorned with unusual decorative elements such as feathers, figurines, and more.
The pouf was created by Marie Antoinette’s dressmaker Marie-Jeanne “Rose” Bertin and hairdresser Léonard Autié, though it was first worn by the Duchess of Chartres in 1774. Her wig was made from 14 yards of gauze and included a tower with plumes, multiple figurines, and a parrot. Not to be outdone, Marie Antoinette commissioned a series of poufs herself, sparking imitation throughout the French aristocracy. The style’s flamboyance increased throughout the decade and prompted criticism from both nobility and the populace. The peak of extravagance may have been what Marie Antoinette wore in celebration of a French naval victory over the British: a wig topped by a full replica of the victorious warship Belle Poule, blurring the lines between reality and caricature.
The Pet Rock seems, on its surface, like the most frivolous fad on record. This simple Mexican beach stone was sold in a box (with air holes!) that included a satirical-sounding manual with instructions on what to do if the rock “appears to be excited.” Created by California advertising professional Gary Dahl in August 1975, the rock was an instant hit as a fuss-free pet.
In reality, the product wasn’t meant to be taken literally, but more as a multimedia gag. After all, Dahl paid great attention to detail in cheekily fashioning the box as a pet carrier, and filled the “instruction manual” with tongue-in-cheek tricks to teach the rock. The entire phenomenon might make much more sense to us today if the manual had appeared on the New York Times bestseller list under “humor.” As Dahl later explained it, “At the time, the Vietnam War was winding down; Watergate had just started up. There was a whole lot of bad news going on… It wasn't a real good time for the national psyche. I think the Pet Rock was just a good giggle. Everybody needed a good laugh and the media ate it up."
The media attention was the driving factor in the rock’s popularity. As Dahl wrote, “During its five month retailing life span, the Pet Rock was referenced in nearly every daily newspaper in the country, most major magazines, all national network news programs, The Tonight Show and other late-night talk shows, most radio talk shows, and international media, such as the BBC. I was personally interviewed hundreds of times.” The relentless publicity led to the sale of more than a million Pet Rocks by Christmas of 1975. Still, jokes don’t have a long shelf life under repetition: The fad was essentially over by 1976.
Throughout the 1920s, New York City’s Harlem neighborhood served as the vibrant headquarters of a transformative period in African American art, literature, music, and social justice leadership. This movement, known as the Harlem Renaissance, was a catalyst for celebrating African American culture and heritage, giving the Black community newfound ownership of their experiences and pride in how their stories were told. It also sought to challenge racial stereotypes and forge social and political equality, planting ideas that would be meaningful for years to come. Here, told in six facts about the movement, is the story of the Harlem Renaissance.
From the 1910s until the 1970s, approximately 6 million Black Americans made their way from the Southern U.S. to Northern, Midwestern, and Western states, fleeing racial discrimination and economic hardships, and seeking better work and education opportunities. Known as the Great Migration, this mass movement transformed the country’s demographic landscape and was a major impetus for the Harlem Renaissance. By the 1920s, some 200,000 newcomers had made the New York City neighborhood of Harlem home; at just 3 square miles in size, the neighborhood had the largest concentration of Black people in the world, with people from all backgrounds, including artists, laborers, scholars, and writers. By the early ’20s, a vibrant cultural community was blossoming in this small corner of Upper Manhattan.
Photo credit: Stock Montage/ Archive Photos via Getty Images
Magazines Were Crucial to the Movement
In March 1924, a dinner party at Harlem’s Civic Club brought together a group of emerging and established writers and publishers. That gathering is now widely regarded as kicking off the Harlem Renaissance. The movement encompassed a wide variety of creative arts, but at its core was the literary scene. Two publications in particular emerged as crucial platforms for burgeoning African American writers at the time. One was The Crisis, founded in 1910 as the official magazine of the National Association for the Advancement of Colored People (NAACP) by renowned civil rights activist and sociologist W.E.B. Du Bois. It initially focused on social and political issues but expanded its content to include a wider representation of African American life and ideas. The magazine provided space for then-unknown writers such as Langston Hughes, Claude McKay, and Jean Toomer to share their literary works, and by the start of the 1920s, The Crisis was distributing 100,000 copies a month. Another magazine, Opportunity, emerged in the early 1920s, and also amplified creative Black voices, including that of Zora Neale Hurston.
Some of the 20th century’s most important African American creatives and activists emerged during the Harlem Renaissance. In literature, leaders such as Zora Neale Hurston and Langston Hughes, as well as Countee Cullen, among others, sought equality while celebrating Black identity and heritage. Meanwhile, jazz and blues maestros such as Louis Armstrong, Duke Ellington, and Bessie Smith became powerful musical voices reflecting the social climate of the time. The Harlem Renaissance was also a battleground for social justice, with activists such as W.E.B. Du Bois and Marcus Garvey advocating for equal rights and laying crucial groundwork for the civil rights movement. Today, the era’s vibrancy is immortalized in celebrated visual art from painters Beauford Delaney and Archibald Motley, and sculptor Augusta Savage. The collective impact of these icons continues to reverberate today.
The Harlem Renaissance Music Scene Was Boosted by Prohibition
While jazz music predates the Harlem Renaissance, its popularity soared during the 1920s “Jazz Age” — and one of the genre’s unlikely benefactors was Prohibition. In January 1920, the United States banned the manufacture and sale of “intoxicating liquors,” and the law quickly spurred the creation of underground bars known as speakeasies. By the mid-1920s, thousands of New York speakeasies were competing for business, and to attract more of the predominantly wealthy, white crowds, entertainment was brought in regularly — jazz music legends such as Louis Armstrong and Duke Ellington were among the most popular performers at the time. Speakeasy culture led to ample and well-paying opportunities for Black musicians, and it introduced jazz music to a broader audience beyond Harlem.
Despite its name, the Harlem Renaissance wasn't confined solely to Upper Manhattan. While Harlem was the symbolic capital, this revival of Black culture resonated as a nationwide movement in urban centers such as Chicago (whose own movement was known as the Chicago Black Renaissance) and Los Angeles, while also finding its way to smaller pockets throughout the country. In Texas, African American poet Bernice Love Wiggins had several works published in local papers, and self-published a poetry collection that has drawn comparisons to some of the era’s literary greats, including Hurston and Hughes. Short story writer and poet Anita Scott Coleman, from New Mexico, was considered one of the American West’s most important Harlem Renaissance contributors. Her work was published in the influential magazine The Crisis, and she eventually published more than 30 short stories in other periodicals. Aaron Douglas, one of the most renowned visual artists of the Harlem Renaissance, began his career in Kansas, but was eventually convinced by Opportunity magazine founder Charles S. Johnson to bring his talents to the movement’s epicenter in Harlem, extending his influence even further.
The 1929 stock market crash and the onset of the Great Depression plunged the United States into economic turmoil, and as a result, most of the Harlem Renaissance’s patronage and financial support — not to mention its artistic energy — waned. The end of Prohibition in 1933 dealt a final blow to the remaining nightlife scene, and by the mid-1930s, many of the onetime pillars of the community had moved on in search of other work to make ends meet. By 1935, the initial optimism and empowerment fostered by the Harlem Renaissance withered in the face of deep-rooted racial prejudices and inequalities that persisted despite the era’s advancements. In March of that same year, the tensions culminated in a defining event, when rumors about the arrest and police treatment of a Latino teenager accused of shoplifting spread throughout Harlem, sparking a deadly riot that has come to be seen as the official end of the Harlem Renaissance. But while the era was over, its influence lived on. African American thinkers, artists, and activists gained recognition and validation like never before during the Harlem Renaissance, and the consciousness that the movement cultivated helped fuel the Civil Rights Movement throughout the 1950s and ’60s.
There have been 91,310 days in the last 250 years, but only a few of them stand out as singularly odd. Unexplained phenomena, surprising coincidences, and, in some cases, a strange quiet, don’t happen every day — especially on a massive scale.
From the day an entire region thought the apocalypse was coming, to the day apparently nothing of note happened at all, some days really stand out. Next time you’re having an eerie day, put it in perspective with these five dates.
May 19, 1780: New England’s Dark Day
This Friday in May started out like any other, with the sun rising and bringing daylight with it. But if you happened to be in the northeastern United States or small parts of southeastern Canada, the sky was yellow by midmorning and completely darkened by noon. This would be disorienting at best even today, but in the 18th century, without the benefit of modern science to explain what happened, it was even more harrowing.
People left work and school and flooded into churches and taverns. Some believed it was the second coming of Christ. Others decided to stay put; one state legislator famously said, in response to his colleagues calling for adjournment, “The day of judgment is either approaching, or it is not. If it is not, there is no cause for an adjournment; if it is, I choose to be found doing my duty. I wish therefore that candles may be brought.”
The moon came out around midnight that night, much to the relief of those who thought it was judgment day. Nobody knew what caused the darkness at the time, but the likely culprit, based on reports from the period and physical evidence on older trees, was wildfire smoke blowing in from Canada.
Photo credit: Hulton Deutsch/ Corbis Historical via Getty Images
April 18, 1930: The Slowest News Day
A day with no news seems next to impossible in today’s 24-hour news cycle, but one evening in 1930, the BBC reported that there was nothing to report — at least nothing that hit the station’s desk. When it came time for the regular 15-minute radio news bulletin at 8:45 p.m., the broadcast was very short: The announcer simply said, “There is no news.” The remainder of the 15 minutes was filled with piano music before the station returned to what was playing before, a live concert of the BBC Symphony Orchestra at the Queen’s Hall in London.
It’s easy now to look back at what actually happened around the world that day, including a typhoon in the Philippines, but communication wasn’t as fast in the 1930s, and journalists relied heavily on wire services and government announcements. Even so, it was exceedingly rare to have no news at all to report for the day.
April 11, 1954: The Most Boring Day, According to AI
Almost every day, major events happen somewhere in the world, and someone famous or noteworthy is born or dies. But not always — and one such dull day was April 11, 1954. This is according to the artificial intelligence project True Knowledge, which indexed hundreds of millions of facts and was later sold to Amazon to help develop the Alexa product. In November 2010, computer scientist William Tunstall-Pedoe created a query to determine the single most boring day in history. The answer was April 11, 1954. The program determined that a Turkish academic (an electrical engineer) was born that day, but reported that nothing else particularly significant happened. The AI clearly wasn’t a fan of water polo — the Hungarian water polo Olympian Attila Sudár was also born on April 11, 1954.
March 12, 1951: Two Dennis the Menaces Debut
If you’re from the United States, you may have a very different idea of the cartoon character Dennis the Menace than someone from the United Kingdom. In America, Dennis is a baby-faced blond boy, a lovable scamp who gets into trouble but is ultimately endearing. The British Dennis the Menace, on the other hand, is a violent bully with a grumpy expression and a hunched posture.
The weirdest part is that neither Dennis came first: They debuted at the same time, with no coordination, on March 12, 1951. The American Dennis was syndicated to 16 newspapers, while the British Dennis was in a weekly comic book magazine called The Beano. It’s a bizarre coincidence, but rarely causes confusion — when the 1993 American film Dennis the Menace was released across the pond, it was just called Dennis.
August 15, 1977: The “Wow!” Signal
On August 15, 1977, the Big Ear radio telescope at Ohio State University was trained on the cosmos, ready to catch a signal from some kind of extraterrestrial intelligence, should it exist. That day, it picked up what astronomers now know as the “Wow!” signal, a strange radio signal at exactly the frequency and strength they’d been looking for. It’s still the most compelling example of a potential extraterrestrial communication — and while the general consensus now is that it was something other than alien life, nobody knows for sure what it was.
In 1977, SETI researchers were wowed by what they’d picked up, and the astronomy community was buzzing. But it was a busy 24 hours, and the world’s eyes were on a different type of star. On August 16, Elvis Presley died, setting off a firestorm of media coverage and national mourning. While there are many conspiracy theories around the death of the King, surprisingly (given the timing), most of them don’t involve aliens.
In 1958, amid growing fears of Cold War nuclear proliferation, thousands of people gathered in London’s Trafalgar Square on Good Friday. The protesters were there to embark on a 50-mile march to the Atomic Weapons Research Establishment in Aldermaston, a small village where England carried out weapons research, production, and testing. The demonstration was the work of the Direct Action Committee Against Nuclear War (DAC) and the newly formed Campaign for Nuclear Disarmament (CND), and it featured the debut of an image that became one of the most recognized protest symbols in the world: the peace symbol.
Photo credit: Steve Eason/ Hulton Archive via Getty Images
The emblem was designed by English artist Gerald Holtom, a pacifist and conscientious objector during World War II. His design was stark yet powerful: a circle, symbolizing Earth, with downward-pointing lines inside representing the flag semaphore signals for the letters “N” and “D,” for “Nuclear Disarmament.” The symbol also held deeper layers of symbolism for Holtom. In a letter to Hugh Brock, editor of the British pacifist magazine Peace News, Holtom revealed that he saw the symbol as somewhat of a self-portrait, capturing an individual in despair, standing with outstretched hands. The artist also named Spanish Romantic painter Francisco Goya’s 1814 painting The Third of May 1808, which portrayed civilian resistance to war, as an influence on his design.
Before long, the CND symbol had made its way across the Atlantic, where it shed its strict nuclear arms association and was used as a sign of peace during civil rights demonstrations. Although the symbol’s exact path to America isn’t known, it seems most likely to have been brought over by Bayard Rustin, a close associate of Martin Luther King Jr. who was at the 1958 London protest. Peace activist Philip Altbach has also claimed he brought CND buttons to his colleagues after a trip to England, introducing Americans to the symbol for the first time.
Photo credit: Shepard Sherbell/ Corbis Historical via Getty Images
For as widely as the peace symbol has been embraced over the years, however, it has also faced misinterpretation and criticism. Christian groups have construed it as a broken, reversed cross. Its antiwar associations led some to condemn it as a communist symbol, and it has also been compared to a runic symbol for death. Nevertheless, the simple visual has endured as a totem of nonviolent activism, and it continues to be used in advocacy for conflict resolution, LGBTQ+ rights, and various other social justice issues. As far away as South Africa, the sign was adopted as a potent symbol of resistance to apartheid, one the government attempted to ban in 1973.
The peace sign’s designer, Holtom, did once express some regret over the final design of his original nuclear disarmament symbol. In an unused design, which was upside-down from the final image, the letter “N” was instead a “U.” Holtom saw this as a way to signify a broader disarmament — a “unilateral” one. Given the peace symbol’s universality, his instincts have proved correct either way, and his creation remains an aspirational symbol for a more peaceful and equitable world.
Photo credit: H. Armstrong Roberts/ClassicStock/ Archive Photos via Getty Images
Author Rachel Gresh
January 4, 2024
Elvis was on the radio, The Ed Sullivan Show was on the TV, and scores of people were hightailing it to the suburbs — this was 1950s America. It was a young nation, with 31% of its 151 million residents under age 18, and it was on the brink of change. Birth rates continued to rise at unprecedented levels, giving way to a new generation of “baby boomers.” The “nuclear family” (describing married couples with kids at home) was ingrained in the culture; roughly two-thirds of all adults (68% of men and 66% of women) were married. By the time the ’60s rolled around, many of these cultural norms would be upended, but this generation left a lasting mark on American society. Here is a snapshot of family life in the 1950s, by the numbers.
Post-World War II America saw a rapid increase in birth rates lasting from 1946 through 1964. It became known as the “baby boom,” and the 1950s were smack dab in the middle of it. During the ’50s, around 4 million babies were born every year in America, a sharp increase from the previous average, around 2.7 million births annually between 1910 and 1945. By the end of the boom, around 77 million babies had been born. This influx of births was due to many positive aspects of the postwar era, including low unemployment rates, a burgeoning economy, low interest rates, and a strengthened middle class.
In alignment with the nuclear family mindset, most ’50s households consisted of a married couple, and typically only one spouse worked (generally the man). In 1950, only 29% of working-age women living in the U.S. held a job — but nearly half of single working-age women (46.4%) worked. The number decreased dramatically among married working-age women; less than a quarter of them (21.6%) held jobs.
By 1960, the number of working women in America increased from 16.5 million to nearly 22.5 million, a 35% increase (despite only a 14% increase in the population of working-age women). The five most popular jobs for women of this era were secretary (stenographer or typist), salesperson (retail), schoolteacher, bookkeeper, and apparel factory worker.
Photo credit: Camerique/ Archive Photos via Getty Images
Mortgage Rates Averaged Around 2.5%
The housing market of the 1950s was booming. An increasing number of Americans were leaving busy urban lifestyles behind in favor of the suburbs. Mortgage rates ranged between 2.1% in 1950 and 2.6% in 1959. For the 16 million World War II veterans alive in 1950, the G.I. Bill lowered mortgage rates even more. For many, it was the perfect time to purchase a home.
One of the most recognizable examples of 1950s suburban neighborhoods is the “Levittowns,” named after real estate developer William Levitt, who built thousands of houses in planned communities around the mid-Atlantic during the late 1940s and early ’50s. The most famous of these communities was on Long Island, New York, where during peak construction, one house was built every 16 minutes.
Around 4.4 million homes had television sets by 1950. This might sound like a lot for the era, but it was only 9% of households. By the end of the decade, the figure spiked to 90% of households, marking a transformational decade for entertainment. Television programming, especially the American sitcom, became a staple of family life. These shows epitomized the stereotypical American family unit, from the Cleavers of Leave It to Beaver to the Andersons of Father Knows Best.
Although the golden age of Hollywood was nearing its end in the 1950s, cinemas were still as popular as ever, and fortunately for moviegoers, this pastime didn’t cost a fortune. In 1950, one theater ticket cost 46 cents, which was less than the price of a dozen eggs (60 cents). A family of four could go to the movies for the price of around two gallons of milk (one gallon cost 83 cents) — a feat unlikely to be matched today.
Families flocked to theaters to see Disney’s Cinderella, the top-grossing film of 1950. Released on February 15, the film grossed more than $52 million that year and sold nearly 99 million tickets. Other top-grossing films of 1950 included King Solomon’s Mines, Father of the Bride, and All About Eve.
More Than Half of All Households Had Children at Home
Due to the ongoing baby boom, most American households had young children at home in the 1950s. Census records show that around 52% of households had children under 18 at home in 1950; in 2019, that number was down to 41%. Families were large during the decade: Around 58% of households had between three and five members, 21% had more than six members, 18% had two members, and only 3% had one member. The average family unit size has steadily declined since its peak during the late 1950s and early ’60s. In 2022, the average American family size was 3.13 people.
With so many Americans moving to the suburbs in the ’50s, more and more families depended on a car to get around. In 1954, most U.S. households (64%) owned one car. Between 1954 and 1960, the number of one-car families rose from 30.1 million households to 32.4 million. Multicar ownership wasn’t popular — a little more than 8% of households owned two cars in 1954, and only 0.9% had three or more cars. (Owning two cars became slightly more common by the end of the decade.) Just how much did a car set you back during the 1950s? Two popular family cars, the Cadillac DeVille and the Oldsmobile 88 Fiesta, cost around $3,523 and $3,541, respectively, which would be around $37,000 today.