Credit: New York Times Co./ Archive Photos via Getty Images
Author Nicole Villeneuve
April 23, 2026
On a late spring evening in 1933, a crowd of moviegoers in New Jersey embarked on an entirely new viewing experience. They parked their cars in a lot in the town of Camden, and settled in to watch the film from their very own automobiles under the nighttime stars. It was the world’s first drive-in theater, dreamed up by a local automotive chemicals salesman named Richard Hollingshead Jr.
Hollingshead had come up with the concept not long before. The 33-year-old entrepreneur tested his idea in his own driveway, setting up a projector on the hood of his car and hanging a white sheet from the trees. On May 16, Hollingshead received a patent for his “park-in” (the term “drive-in” caught on later), and just three weeks later, the first theater opened on June 6 with a showing of the British comedy Wives Beware.
Patrons paid 25 cents per car plus another quarter per person; a car with three or more people was charged a flat rate of $1. Hollingshead was proud to offer a different moviegoing experience, boasting, “The whole family is welcome, regardless of how noisy the children are.”
The novelty of the drive-in caught on slowly at first. But by the end of the 1940s, with World War II in the rearview, Americans and their growing families were ready to indulge in leisure and entertainment. Families didn’t have to dress up, kids could doze off in the back seat, and you could bring your own snacks — or heed the call of the animated intermission ads urging a trip to concessions for hot dogs, popcorn, and candy galore.
By the late 1950s, drive-in theaters were booming. At the trend’s peak, more than 4,000 locations dotted the country, mostly in rural and suburban areas where land was plentiful. The Johnny All-Weather Drive-In in Copiague, New York, was one of the most elaborate. When it opened in 1957, it had room for 2,500 cars, a full-service restaurant, a playground, and even a shuttle train to ferry moviegoers from their cars to the amenities. Other drive-ins, including the 99W in Oregon (opened in 1953) and the Bengies in Maryland (opened in 1956), became beloved local institutions that are still operating today.
Drive-ins weren’t just movie theaters — they became all-out destinations. Some added mini-golf, pony rides, and swing sets right underneath the massive screens, enticing kids to play until showtime at dusk. Of course, these outdoor theaters also played a starring role in the burgeoning teenage and youth culture. For many, the drive-in was where they went on their first date, shared a first kiss, or congregated with friends, the movie sometimes secondary to the memories being made under the stars.
In the 1970s, though, business started slowing down. The oil crisis caused Americans to drive less, while the proliferation of color TV sets and VCRs brought movies into the living room. And importantly, the huge plots of land beneath the drive-in screens eventually became more valuable than the drive-in businesses themselves. A theater needed 10 to 15 acres of open space, often in growing suburbs where developers were eager to build shopping malls or more homes. The screens slowly started going dark.
Even so, the drive-in never fully disappeared. A few hundred theaters held on, many family-run and fiercely beloved. Today, fewer than 300 drive-ins remain in operation across the U.S. While that is only a fraction of the format’s midcentury peak, the institution nonetheless survives thanks to nostalgia, community support, and a shared memory of an earlier time.
For centuries, tarot cards have carried an aura of mystery. To many people, they are tools for reflection, storytelling, or spiritual insight, while to others, they are simply beautiful objects, or perhaps sources of fear and disdain. But few people are aware that tarot’s earliest history has nothing to do with divination. Long before the cards were used to peer into inner lives or imagined futures, they were created for a very different purpose: play.
Credit: DEA / G. CIGOLINI/ De Agostini via Getty Images
A Renaissance Card Game
Tarot emerged in northern Italy in the early 15th century, at the height of the Renaissance, when card games were a fashionable diversion among aristocratic courts. Wealthy families commissioned ornate decks known as carte da trionfi, or “cards of triumph,” to play a game called tarocchi. Though the rules have not survived intact, the game appears to have combined skill, memory, and chance, and may have been a bit like bridge.
The most likely patron of the first tarot deck was Filippo Maria Visconti (1392-1447), the reclusive Duke of Milan, whose court delighted in symbolic display and intellectual play. Roughly 15 decks associated with the Visconti family survive today, most famously the Visconti-Sforza deck, dating from the mid-1400s. These cards were luxury objects, lavishly decorated with gold leaf and fine detail, designed to impress as much as to entertain.
Credit: Marka/ Universal Images Group via Getty Images
What the Original Images Represented
Despite their later reputation, these early tarot cards were not occult devices. Their imagery drew from a shared visual language familiar to Renaissance viewers: Christian virtues, classical allegory, courtly hierarchy, and reminders of mortality. The characters on the cards — emperors, popes, virtues such as Justice and Fortitude, and figures such as Death or the Fool — echoed themes found in sermons, pageants, poetry, and art.
Some details were intensely personal. Fair-haired figures likely reflected the Viscontis themselves, while lions, fountains, laurel wreaths, and fruit referenced specific military victories, marriages, and dynastic power. (For example, the lion depicted in the Fortitude card likely referenced a military victory over Venice.) Some scholars have suggested that the trump sequence — the set of allegorical cards later known as the Major Arcana — functioned as a loose meditation on the rise and fall of worldly power. But if the cards hinted at life’s larger forces, they did so without a sense of instruction. They acknowledged fate, virtue, and fortune, but did not claim to explain them.
Credit: Stefania Pelfini la Waziya/ Moment via Getty Images
New Meanings Inspired by “Egyptomania”
Tarot spread gradually across Europe, reaching France by the 16th century, where the current name took hold (adapted from the Italian tarocchi). Over time, however, the cultural context that once made the symbolic imagery legible began to fade. By the late 18th century, tarot had become visually familiar but symbolically opaque, which also meant that it was ripe for reinterpretation.
That reinterpretation arrived during a period of intense intellectual ferment. France was in the grip of “Egyptomania,” a fascination with Egypt fueled by Napoleon’s campaigns there and a popular belief that ancient Egypt held some kind of key to universal wisdom. At the same time, traditional religious institutions were struggling to adapt to political and social upheaval, creating space for alternative spiritual systems.
Into this atmosphere stepped Antoine Court de Gébelin (1725-1784), a French scholar who declared that tarot was no mere card game but a fragment of ancient Egyptian wisdom long disguised for the sake of its own survival. The theory was entirely made up, but it was nevertheless compelling. Tarot, once a Renaissance amusement, was reborn as an esoteric relic.
Court de Gébelin’s ideas found eager successors, especially in France. French occultist Jean-Baptiste Alliette published one of the first tarot decks designed explicitly for divination in 1789, layering astrology, numerology, and invented Egyptian lore onto the cards. In the 19th century, occultists such as Éliphas Lévi and Gérard Encausse (known as Papus) linked tarot to hermeticism, astrology, and the kabbalah, embedding the cards within elaborate metaphysical belief systems.
Tarot was no longer a game for anyone to pick up. It was treated as powerful, even dangerous — a tool that required initiation and expertise. To misuse it, some warned, risked madness or ruin.
Credit: Keystone/ Hulton Archive via Getty Images
The Secret Society That Shaped Modern Tarot
The next major transformation occurred in England, under the influence of the Hermetic Order of the Golden Dawn. Though the secret society never numbered more than a few hundred members, its impact on modern tarot was enormous. Golden Dawn practitioners mapped tarot’s 22 trump cards onto the kabbalistic Tree of Life, assigning each a specific spiritual pathway. These associations still underpin many contemporary interpretations.
Two Golden Dawn members proved especially influential: English occultists Aleister Crowley and Arthur Edward Waite. Crowley expanded tarot’s network of esoteric correspondences, while Waite made a more practical innovation. In 1909, Waite commissioned artist Pamela Colman Smith to illustrate a new deck in which even the numbered suit cards — previously simple arrangements of symbols — became narrative scenes. The result, now known as the Rider-Waite-Smith deck, transformed tarot reading by making it more story-driven. It remains the most widely used tarot deck in the world.
With the rise of New Age culture in the late 20th century, tarot underwent yet another shift. While divination remained central, the emphasis moved from predicting the future to healing, self-development, and psychological insight. Influenced by Jungian ideas about archetypes, modern tarot users embraced the cards as mirrors of inner experience rather than fixed messages from beyond.
At the same time, tarot became radically accessible. Themed decks proliferated — pagan, astrological, pop-cultural, playful, and profound — and anyone with curiosity and a little time could learn to read them. What had once been a guarded esoteric system became a flexible, personal practice.
Seen across its long history, tarot has changed meanings many times, reshaping itself to fit the hands that hold it. It began as a game, was transformed into an esoteric symbol system, and now lives in countless forms at once. Each era reworked the cards to reflect its own concerns — be they power, fate, mysticism, or self-knowledge.
Rather than containing timeless secrets, tarot reveals something more human: the enduring impulse to find meaning in images, to tell stories, and to see our own lives reflected in symbolic form. In that sense, tarot has long been doing what it does best — inviting us to look, interpret, and imagine.
While they’re rarely seen today, variety shows, with a genial host introducing an eclectic array of singers, comedians, jugglers, and the like, were once among the most popular programs on television — and before that, on radio, and before that, on stage. They’re a remnant of another time, before a remote control or the click of a mouse could point our drifting attention toward a different channel.
Until relatively recently, variety shows were a prominent part of American culture. Here’s a look at how this form of showmanship rose with the times, but failed to keep pace as the entertainment industry evolved.
Credit: Bettmann Archive via Getty Images
From the Stage to Radio
Variety acts have been part of the American theater tradition since at least the 18th century, when they were used to keep audiences amused between sets of the main show. They emerged as independently staged productions by the 1840s, and by the early 1880s, the variety show extravaganza known as vaudeville was en route to becoming the country’s most popular form of entertainment.
With the burgeoning prevalence of radio in the 1920s, performers who made their living on stage began showcasing their skills over the airwaves. The medium’s first mainstream variety show belonged to singer and bandleader Rudy Vallée, who provided music, interacted with guest stars, and unveiled a dramatic sketch as part of The Fleischmann’s Yeast Hour beginning in October 1929.
Vallée was credited with discovering top talents such as Eddie Cantor, who brought in a studio audience to liven up his own radio program. Stars such as Ed Wynn, Fred Allen, and Bing Crosby also enjoyed success as variety show hosts during this era.
Television’s first variety show surfaced on NBC in May 1946, courtesy of Standard Brands, which had been a major sponsor of the format on radio. Hour Glass featured an ever-changing array of guest stars who introduced comedic sketches, musical acts, and commercials. But squabbling between the network and sponsor ultimately led to the show’s cancellation in February 1947.
NBC tried its hand at variety again in June 1948 by bringing the long-running radio program Texaco Star Theatre to television, and soon settled on Milton Berle as its permanent host. A well-traveled if moderately heralded veteran of vaudeville, radio, and movies, Berle found this weekly gig a perfect platform for his brand of outlandish physical comedy. Texaco Star Theatre quickly emerged as the top show on TV, to the point where restaurants and movie theaters reportedly closed while Berle commandeered the attention of would-be customers from 8 p.m. to 9 p.m. on Tuesday nights.
Twelve days after Texaco Star Theatre’s TV debut, Toast of the Town premiered on CBS with a markedly different host. Awkward and notorious for forgetting names, Ed Sullivan hardly looked the part of a man tasked with entertaining the masses. Yet from the very first episode, which featured the Toastettes dance troupe and the comedy duo of Dean Martin and Jerry Lewis, it was clear its host possessed a sharp eye for talent. The show was renamed in his honor in 1955, and Sullivan took care to balance the presentation of children’s fare with up-and-coming rock ’n’ roll acts such as Elvis Presley and the Beatles. The show served as a kingmaker of American tastes until going off the air in 1971.
The successes of Berle and Sullivan spurred an onslaught of TV variety shows, many of which carried their audiences from previous incarnations. Arthur Godfrey brought his competition program Arthur Godfrey’s Talent Scouts from CBS Radio to TV in late 1948, and soon afterward premiered the more traditional variety show Arthur Godfrey and His Friends. The Red Skelton Show became another successful radio-to-TV hit beginning in 1951, with its star’s stable of comic characters such as Junior the Mean Widdle Kid proving just as engaging on camera.
Not every established star was able to make the leap to a weekly TV format, however. Comedian Joe E. Brown, who possessed face-distortion abilities to rival those of Berle, couldn’t keep The Buick Circus Hour afloat for more than a season in the early 1950s. And for all his accomplishments as a singer and actor, Frank Sinatra twice failed to gain significant traction with a variety show that decade.
While old entertainment standbys such as Dean Martin enjoyed success with their programs in the 1960s, variety shows experienced a lag until being jolted by a wave of new talent later in the decade.
Chief among these was The Carol Burnett Show, which boasted the comedic and musical talents of its titular star as well as a strong supporting cast. Known for its recurring characters and spoofs of popular movies, as well as the signature ear tug and Tarzan yell of its host, The Carol Burnett Show claimed a whopping 23 Emmy Awards over its run from 1967 to 1978.
Also debuting in 1967, The Smothers Brothers Comedy Hour featured the sibling-rivalry schtick of folk singers Tom and Dick Smothers. While the show included traditional variety elements such as musical production numbers, the co-hosts and their guests rankled CBS executives with their barely veiled drug references and criticism of the Vietnam War, until the network abruptly canceled the show in 1969. Meanwhile, Rowan & Martin’s Laugh-In, which premiered in January 1968, also poked fun at cultural mores and pushed the envelope with its risqué jokes, although it managed to hang on until 1973.
With the launch of The Flip Wilson Show in 1970, the variety format had its first successful Black host. Wilson excelled at character creation, particularly with the fan-favorite Geraldine Jones, and his show was the country’s second-most-watched program for two of its four seasons.
Musicians came to dominate variety shows in the 1970s, with Cher and Sonny Bono, Donny and Marie Osmond, and Tony Orlando and Dawn enjoying solid ratings as hosts. Other notable entries from the period include The Muppet Show, which featured the brilliant puppet work of Jim Henson and his crew from 1976 to 1981, and Saturday Night Live, which embarked on its endless run in 1975.
However, the format was clearly running on fumes by the time Pink Lady and Jeff, which starred a pair of non-English-speaking Japanese pop stars, went on and off the air in 1980. The long-running Hee Haw, which first introduced a country flavor to the genre back in 1969, continued airing original shows until 1992, while The Statler Brothers also enjoyed success among the country set on the Nashville Network through most of the 1990s. Otherwise, few variety shows managed to make a dent in the public consciousness across the final years of the 20th century.
The format’s demise is often blamed on the rise of cable television and the accompanying fragmentation of audiences: About 23% of American households received basic cable TV service in 1980, a number that jumped to 60% by the end of the decade. However, it’s clear that changing tastes have also dictated viewing preferences. Twenty-first-century audiences have tuned in to traditional networks to watch reality TV competitions such as America’s Got Talent, which features the kind of disparate singing, dancing, and athletic performances that would have been welcomed on The Ed Sullivan Show, albeit repackaged as a competition.
Variety shows continue to find their fans in international markets, and multitalented artists including Maya Rudolph and Neil Patrick Harris have attempted to revive them with limited success in recent years. And while SNL shares some elements with traditional variety shows, it is mostly focused on sketch comedy. Otherwise, variety has largely been relegated to old video clips and other forums of nostalgia, waiting for the day when the curtain rises again and this once-celebrated form of entertainment is called back into the spotlight.
6 Childhood Games Only Baby Boomers Will Recognize
Afternoons and weekends in the 1950s and 1960s looked a little different than they do today, particularly for kids. Streets, schoolyards, and living rooms were alive with the sounds of children playing games — analog, not video — including some that dated back centuries and others newly invented or imported by toy companies. For baby boomers, playtime was about creativity, skill, and sometimes even a touch of danger. It was an era when kids were expected to make their own fun, though family time was valued too — and many games brought the older and younger generations together.
Whether using a pocketknife, a piece of string, or just their imagination, the games baby boomers enjoyed entertained them for hours and created memories that lasted decades. Which of these games do you remember playing?
Credit: Bettmann Archive via Getty Images
Mumblety-Peg
The odd-sounding mumblety-peg got its start in the 17th century in the British Isles. Also known as mumbley peg, mumble-the-peg, and mumbledepeg, the game eventually became a 19th-century American frontier pastime that carried into mid-20th-century childhoods. Played with a pocketknife, it was equal parts challenging and dangerous. The basic goal was to flip or toss the knife in increasingly complex tricks so that the knife ended up stuck in the ground, blade down, as close to one’s foot as possible. There were many variations on the theme — some players created elaborate stunt sequences or dares, and a popular penalty was having to retrieve a “peg” driven into the ground with one’s teeth. Not surprisingly, the game often ended with minor injuries and dirt-covered faces and hands.
The daring game was mentioned in Mark Twain’s Tom Sawyer, Detective (1896), and by the 1950s, mumblety-peg’s thrill made it a favorite pastime among boys and men. Though its outlaw image only heightened the game’s appeal, its popularity waned in the 1970s as schools and summer camps instituted bans and fewer boys carried pocketknives.
Hūsker Dū?
Hūsker Dū? (from a Danish phrase meaning “Do you remember?”) was introduced to U.S. households in the 1950s by the Pressman Toy Corporation. Although there were other memory-matching games already on the market, this one came with a playful marketing hook that promoted it as a game “in which the child can outwit the adult.” Based on classic memory-matching games that had circulated in Europe, it challenged players to find and recall the location of hidden illustrated pairs.
Instead of simple face-down cards, though, the game used a rotating wheel with pictures concealed beneath sliding pegs, so every spin created a new layout. The game’s odd-sounding name stuck in popular memory, too, resurfacing decades later when the Minnesota punk band Hüsker Dü adopted it in 1979, adding umlauts in the style of heavy metal bands such as Motörhead and Blue Öyster Cult. According to lore, the band members couldn’t remember the French lyrics to a Talking Heads song and started throwing out every foreign-language phrase they did know — including the name of a game at least one of them had played as a child.
Cat’s Cradle
Also known as Jack in the Pulpit or fan sheng in China, cat’s cradle wasn’t born in the boomer era, but it did thrive then. Similar string-figure games date back centuries, with versions found across Asia, Africa, and the Pacific Islands, often used to tell stories or pass on traditions. These games were introduced to the United States through books and magazines that taught children how to create the figures. By the mid-20th century, instructions were widely available, and the game’s simple materials made it accessible to everyone.
Using a 4-foot piece of string or yarn, children created and passed intricate designs such as “Jacob’s Ladder,” “Cup and Saucer,” and “Spider’s Web,” challenging each other to try ever-more-complicated patterns. While primarily played in pairs, the game could also be practiced alone, promoting problem-solving, patience, and hand-eye coordination.
Cat’s cradle left its mark on literary pop culture, too — Kurt Vonnegut used it as the title of his 1963 novel satirizing topics including science, technology, and religion in a postmodern world. The name symbolized humanity’s need to learn, improve, and make order out of chaos.
Credit: George Marks/ Hulton Archive via Getty Images
Mille Bornes
Mille Bornes was created in France in 1954 by Edmond Dujardin, a driving instructor who wanted to simulate the excitement and strategy of a road race. The title, meaning “a thousand milestones,” referred to the distance markers on French roads, and the objective was to be the first player to complete the race using a special deck of 110 colorful cards. These cards depicted hazards including flat tires, accidents, and stop signs, as well as remedies, distances, and speed boosts. The game’s clever blend of luck and strategy, including the famous “coup fourré” counterplay, made each hand suspenseful and satisfying.
Recognizing its potential appeal to American audiences, Parker Brothers brought Mille Bornes to the United States in 1962, and it soon became a staple of family game nights. The bilingual cards and whimsical artwork set it apart from typical American-made games, and its mix of strategy and chance won over both children and adults as players raced to reach the finish line at 1,000 miles (1,000 kilometers in the French version).
Credit: George Rinhart/ Corbis Historical via Getty Images
Pinochle
Pinochle, a trick-taking game that evolved from European favorites such as bezique and cinq-cents, arrived in the U.S. in the late 19th century and was popular with German immigrants at the time. The name “pinochle” likely derived from the German word binokel, meaning “eyeglasses” or “two eyes,” a nod to a specific scoring combination. Played with a 48-card deck (two copies of nines through aces for all four suits), the game combined melding — creating point-scoring card combinations — with strategic trick-taking. A single hand could last 30 to 45 minutes, and long sessions often stretched across multiple rounds.
A true multigenerational game, pinochle gained popularity during the Great Depression and hit its peak in the 1940s as children learned from parents and grandparents, who often kept notebooks of scores or tracked long-term “winning streaks.” More than just strategy, it was the shared rituals — debating moves, celebrating clever plays, teasing over mistakes — that made pinochle a bridge between generations, connecting family members across ages and experiences.
Advertisement
Advertisement
Credit: Steve Kagan/ The Chronicle Collection via Getty Images
Clacker Balls
Clacker Balls, marketed as Clackers, Ker-Bangers, or Click Clacks, stormed American playgrounds in the late 1960s. The toys were inspired by older momentum-based designs, including boleadoras, a kind of weapon used by Argentinian gauchos (cowboys) that was made from wood or metal. The toy version featured two hard acrylic balls on a string, swung to collide with a sharp clack-clack-clack. Simple in concept but tricky in execution, the challenge was to keep the rhythm without tangling — or shattering — the balls.
By the early 1970s, hundreds of toymakers had sold millions of Clackers worldwide. However, the acrylic balls could break apart on impact and cause injuries, leading to school bans and removal from U.S. and Canadian markets. Safer versions appeared decades later, marketed under names such as Newton’s Yo-Yo and Klicka, but the original craze remains an iconic example of hands-on, skill-driven play from the baby boomer era.
There’s nothing like a great television series to sweep you into another time and place, bringing history to life with all the intrigue, romance, and drama of the past. And with today’s prestige TV and myriad streaming options, these shows are bigger, bolder, and more engaging than ever.
Whether they’re about royal power struggles, wartime heroics, or the social upheavals that shaped the modern world, historical dramas capture the moments where everything changes — and there’s no shortage of incredible stories to tell. Here are 15 historical TV shows that history buffs are sure to love.
This lavish Netflix series chronicles the reign of Queen Elizabeth II, blending historical events with personal drama. Starring Claire Foy, Olivia Colman, and Imelda Staunton as the queen at different times in her life, The Crown earned multiple Emmys, including Outstanding Drama Series.
Inspired by real events and Peter Morgan’s 2013 play The Audience, the show portrays the political and personal struggles of Britain’s longest-reigning monarch. The meticulous attention to period detail, from costumes to set designs, makes The Crown one of Netflix’s most ambitious and expensive projects, reportedly costing $14.4 million per episode.
This gripping HBO miniseries reconstructs the 1986 Chernobyl nuclear power plant disaster, exploring the scientific, political, and human factors that led to one of the worst nuclear accidents in history. Featuring powerful performances by Jared Harris, Stellan Skarsgård, and Emily Watson, Chernobyl received multiple Emmy and Golden Globe awards for writing, directing, and acting. While dramatized for television, the production was praised for its hauntingly realistic portrayal, with much of it filmed in Lithuania to replicate the abandoned Soviet-era city of Pripyat.
Based on James Clavell’s 1975 bestselling novel, Shōgun immerses viewers in feudal Japan, following the journey of an English sailor who becomes involved in the court of a powerful daimyo (feudal lord). Starring Hiroyuki Sanada as Lord Yoshii Toranaga and Cosmo Jarvis as the stranded Englishman, John Blackthorne, this fresh adaptation revisits the classic story that first inspired the 1980 miniseries of the same name.
Originally intended as a single-season limited series, Shōgun became a runaway success, setting a record as the most-awarded single season in Emmy history with 18 wins. It also made history as the first Japanese-language series to win the Primetime Emmy Award for Outstanding Drama Series. Thanks to its overwhelming popularity, the series has been renewed for two additional seasons, expanding its epic tale beyond its initial scope.
Starring Paul Giamatti and Laura Linney and based on David McCullough’s Pulitzer Prize-winning biography, this HBO miniseries provides an insightful look at the founding of the United States through the political career and personal life of John Adams. Filmed in colonial-era locations including Williamsburg, Virginia, John Adams explores the founding father’s role in drafting the Declaration of Independence, his presidency, and his complex alliance with Thomas Jefferson. The series won 13 Primetime Emmys, including Outstanding Miniseries, and earned critical acclaim for its performances, particularly Giamatti’s portrayal of Adams.
Set in post-World War I Birmingham, England, Peaky Blinders is a stylish and gritty crime drama that follows the rise of the Shelby crime family. Led by the cunning and ambitious Tommy Shelby (Cillian Murphy), the gang expands its influence while navigating political intrigue, rival gangs, and personal betrayals. The series blends fiction with history, incorporating real-life figures including Winston Churchill and Oswald Mosley.
With an atmospheric soundtrack featuring modern artists such as Nick Cave and Queens of the Stone Age, Peaky Blinders became a cultural phenomenon. The show earned multiple BAFTAs, including Best Drama Series, and Murphy received critical acclaim for his portrayal of Tommy. With appearances by Tom Hardy and Anya Taylor-Joy, the series cemented itself as one of the most compelling crime dramas of the decade.
Downton Abbey is a British period drama that chronicles the lives of the aristocratic Crawley family and their servants in early-20th-century England. Created by Julian Fellowes, the series blends historical events with personal drama, capturing societal shifts with elegance and emotional depth. Starring Hugh Bonneville, Michelle Dockery, and Maggie Smith, it received widespread acclaim, winning multiple awards including a Golden Globe for Best Miniseries and a Primetime Emmy for Outstanding Miniseries. The show’s success led to two film sequels: Downton Abbey (2019) and Downton Abbey: A New Era (2022), with a third film forthcoming.
With a third season currently in production, The Gilded Age is another period series created by Julian Fellowes of Downton Abbey fame. Set in 1880s New York City, the series depicts the era’s societal clash between established aristocracy and emerging industrial wealth. Christine Baranski portrays widow Agnes Van Rhijn, a staunch defender of “old money” traditions; Cynthia Nixon plays Agnes’ unmarried sister, Ada Brooke, who is reliant on her sister’s charity; and Carrie Coon plays Bertha Russell, a determined socialite from a “new money” family striving to break into elite society.
While the series centers on the lives of fictional characters, real historical figures such as Caroline Astor, Clara Barton, and T. Thomas Fortune also make appearances in the script, adding depth to the portrayal of the social and political dynamics of Gilded Age America.
Produced by Steven Spielberg and Tom Hanks, these companion World War II miniseries follow soldiers from different fronts of the war. The trilogy began with Band of Brothers, based on Stephen E. Ambrose’s book of the same name; it follows Easy Company, a unit of the 101st Airborne Division, from their training up through D-Day and the war’s end.
The Pacific, inspired by memoirs such as Helmet for My Pillow by Robert Leckie and With the Old Breed by Eugene Sledge, shifts the focus to the brutal island-hopping battles of the Pacific Theater and centers on three Marines. The latest installment, Masters of the Air, adapts Donald L. Miller’s book of the same name, exploring the experiences of the 100th Bomb Group, known as the "Bloody Hundredth," as they endure high-risk aerial missions over Nazi-occupied Europe.
Rome is a gripping historical drama that chronicles the turbulent fall of the Roman Republic. The story is seen through the eyes of two soldiers, Lucius Vorenus and Titus Pullo, portrayed by Kevin McKidd and Ray Stevenson, respectively, whose lives become entangled with key historical figures such as Julius Caesar, Mark Antony, and Cleopatra. Praised for its historical authenticity and rich storytelling, the series is also known for its lavish production, intricate political intrigue, and visceral battle sequences. Despite garnering critical acclaim, Rome lasted only two seasons due to its high production costs — the first season alone cost roughly $100 million.
Mad Men isn’t a traditional historical drama, since its focus is on character-driven storytelling rather than major historical events, but its detailed re-creation of 1960s America captures the advertising world’s glamour and the decade’s shifting social norms. Created by Matthew Weiner, the series follows the enigmatic Don Draper, played by Jon Hamm, and rising copywriter Peggy Olson, played by Elisabeth Moss. The show won 16 Emmys, including four consecutive wins for Outstanding Drama Series, and earned widespread praise for its nuanced portrayal of gender, race, and class in a rapidly evolving society.
Created by Lisa McGee, Derry Girls is an Irish coming-of-age comedy that blends sharp humor with the historical backdrop of the Troubles, the decades-long conflict between nationalists and unionists in Northern Ireland. The series follows Erin, her eccentric friends, and her English cousin as they navigate teenage life in the Northern Ireland town of Derry in the 1990s, balancing school, family, and mischief amid political unrest. While primarily a comedy, it offers a poignant perspective on a turbulent era, capturing both the absurdity and warmth of adolescence in extraordinary times.
Call the Midwife is a long-running BBC drama that follows midwives in London’s East End from the 1950s onward, blending real medical and social history into its compelling storylines. Created by Heidi Thomas and based on the memoirs of Jennifer Worth, the series explores the challenges of childbirth, poverty, and evolving health care in postwar Britain while tackling issues such as immigration, women’s rights, and public health crises. The show has been praised for its compassionate storytelling and strong ensemble cast, including Jenny Agutter and Helen George.
Novelist, screenwriter, and producer Daisy Goodwin is the visionary behind Victoria, a lush three-season period drama that brings to life the early reign of Queen Victoria, chronicling her personal and political struggles. Starring Jenna Coleman in a critically acclaimed performance, the series is based on historical events and the queen’s own diaries, offering an intimate look at the monarch’s rise to the throne, her marriage to Prince Albert, and the challenges of ruling a rapidly changing empire.
Long before Netflix, video games, or podcasts existed, people turned toward other hobbies for their personal amusement — some of which seem quite strange by modern standards. Entertainment-seekers of yesteryear would gather to witness the unwrapping of ancient mummies, or pack arenas to watch people walk in circles for hours on end. These odd historical pastimes offer a fascinating glimpse into how folks in the past enjoyed their free time. Let’s take a look at six truly strange ways people used to have fun.
“Egyptomania” — a fascination with ancient Egyptian culture — swept across Europe in the 19th century, particularly in Victorian England, where people developed an obsession with mummies. It was even popular to attend events known as mummy unrollings, where actual corpses brought over from Egypt were unwrapped in the name of both science and morbid amusement.
In the middle of the 18th century, brothers and anatomists John and William Hunter were among the first to unroll mummies, doing so purely for scientific study. But the practice transitioned into more of a spectacle under enthusiasts such as “the Great Belzoni,” an explorer and showman who specialized in Egyptian antiquities, and Thomas “Mummy” Pettigrew, an English surgeon with a similar fascination with ancient Egypt. Pettigrew hosted private parties where he unwrapped and performed autopsies on mummies, revealing various amulets or bits of preserved hair and skin to the delight of those in attendance.
The trend really took off after the U.K. passed the 1832 Anatomy Act, which legally permitted doctors to dissect bodies for study. These mummy unrollings attracted large crowds, and were held at hospitals, scientific research centers, and private homes. The pastime remained popular for several decades, though it ultimately lost its luster by the time Pettigrew died in 1865. Mummy unrollings continued, albeit on a smaller scale, with the last recorded event occurring in 1908.
While spectators fill up stadiums today to watch their favorite professional hockey or basketball team play, people once flocked to arenas for an entirely different sport: pedestrianism. Yup, competitive walking.
This unusual form of entertainment developed an avid following of folks who would come to see their favorite professional walkers compete. One of the most celebrated competitors of all time was Robert Barclay Allardice, who was nicknamed the “Celebrated Pedestrian” for walking 1,000 miles in 1,000 successive hours. Another famous walker, Edward Payson Weston, was known for his signature wobble, while competitor Daniel O’Leary was famous for pumping his arms while clutching corn cobs to absorb the sweat. Women participated as well, thanks in no small part to competitor Anne Fitzgibbons, who helped popularize the sport with women in the United States. Fitzgibbons became the first woman on record to walk 100 miles in less than 24 hours.
Pedestrianism really took off in the U.S. in the 1870s, in part because people were migrating to big cities and seeking out new activities to enjoy in their free time. Spectators flocked to see pros walk for six days at a time, covering hundreds of miles in total, with the person who covered the most ground over six days being named the winner. Competitions were held in large arenas that offered food vendors, live bands, and even wagering to add to the fun.
Arguably the biggest event in the history of the sport came on September 21, 1879, when 13 of the best and brightest competitive walkers gathered at the original Madison Square Garden in New York City in front of 10,000 raucous onlookers. However, interest in pedestrianism began to wane toward the end of the 19th century, giving way to the rise of more popular sports such as baseball.
Victorian England saw a heightened interest in all things macabre, so much so that it became popular to take “headless” portraits in the late 19th century. These images often depicted a headless body holding its noggin in hand, an impressive feat given the lack of modern photo-editing software.
Photographer Samuel Kay Balbirnie was among the best in the business, advertising his services in the newspaper as: “Headless Photographs - Ladies and Gentlemen taken showing their heads floating in the air or in their laps.” To achieve this illusion, the subject would pose as multiple images were captured depicting the body and head making various gestures and expressions. The photographer would then take those images and cut out a head from one and paste it on another, resulting in one or several headless photos that appeared to be captured in one take.
While pressing and scrapbooking flowers has long been a popular pastime, for a brief moment in the 19th century there was a fascination with using seaweed instead. The trend was particularly popular among women, who often weren’t afforded the opportunity to travel the world collecting specimens of flora and fauna, but could collect seaweed from the local beach. It was also a social activity: People would head to the shore, pick up seaweed, and return home to scrapbook. It was such a popular pastime that it’s believed Queen Victoria herself was an avid seaweed scrapbooker when she was young, though her rumored collection has been lost to time.
Each page of these seaweed scrapbooks featured colorful and unusually shaped seaweed surrounded by ornate borders, resulting in amateur works of art. But these results didn’t come easily: Mounting, drying, and pressing each piece of seaweed required considerable patience and skill. The hobby provided an artistic escape and allowed scrapbookers to freely express themselves. This can be seen in the work of Margaret Gatty, a renowned seaweed scrapbooker who published a two-volume handbook titled British Sea-Weeds in 1863 and 1872.
The ancient Romans found fun in watching unusual and downright dangerous activities. In addition to gladiator battles, they enjoyed a spectacle called naumachia, or mock naval battles. These events drew thousands of spectators throughout the Roman Empire to artificial lakes, as well as to amphitheaters and arenas that were flooded with water in order to stage conflicts at sea. Thousands of competitors participated on each side as both oarsmen and attackers, resulting in an entertaining, destructive, and often fatal spectacle.
The earliest recorded naumachia was held in 46 BCE under Julius Caesar, and it featured 6,000 people competing in an artificial basin in Rome’s Campus Martius — a large open area in ancient Rome used for military exhibitions and public gatherings. Forty-four years later, Emperor Augustus held an even grander spectacle with 1,000 additional combatants. But the audience didn’t come to the shows just to watch the action; the poet Ovid noted that naumachia were popular events for raucous revelry such as drinking alcohol and fornication.
While the ancient Romans stopped holding naumachia by around 250 CE, there was a brief revival in 19th-century England at London’s Sadler’s Wells Theatre, which was flooded in order to recreate these ancient events for modern audiences.
Incubated baby fairs were dual-purpose exhibitions that not only entertained the masses but also played a pivotal role in providing care for premature babies. The concept originated at the Great Industrial Exposition of Berlin in 1896, where Martin Couney — then working as an apprentice under obstetrician Pierre Budin — debuted these innovative machines that could help premature newborns survive infancy. At the time, many people in the medical community viewed premature children as weak and unfit to reach adulthood, but Couney was an early advocate for their well-being. He brought the exhibition to the U.S. in the early 1900s, but needed a way to fund it, and opted to charge admission.
Couney set up an incubated baby fair along the Coney Island boardwalk in Brooklyn, New York, and another in Atlantic City, New Jersey. On display were 5-foot-tall glass-and-steel incubators housing the infants, who were cared for by a team of doctors and nurses. Many parents willingly brought their children to be displayed in these machines, as there weren’t any dedicated premature hospital units at that time.
The exhibition was warmly received in the U.S., as people came to see these unusually small babies in the flesh. Spectators ponied up 25 cents (almost $10 today) to enter the exhibit, which helped cover the daily operating cost of roughly $15. It was essentially a quid pro quo situation — babies received essential care while attendees received a form of entertainment, however strange by today’s standards. That said, some elements of the exhibition were simply for show, such as babies being dressed in oversized outfits cinched at the waist to accentuate their tiny size. Couney continued to oversee these events, which remained popular boardwalk attractions until he shut them down in 1943, as hospitals began to debut dedicated incubator wards of their own.
As far back as the first millennium BCE, ancient Mesopotamians relied on the concept of magic as a guiding principle. They used magic in rituals to ward off evil demons, and as an explanation for perplexing natural phenomena. Over many millennia, as our scientific understanding of the world advanced, “magic” evolved into a form of entertainment, popularized by legendary sleight-of-hand artists and illusionists such as Harry Houdini. Today, your typical magic show leaves the audience feeling awe-inspired and wondering how the tricks are done. Well, wonder no more — this is how five classic magic tricks actually work.
Sawing a Person in Half
It’s impossible to know for certain when a magician first sawed their assistant in half, but one of the earliest recorded instances of the trick was performed by British magician and inventor P.T. Selbit in London in 1920. He debuted the trick to the shock of onlookers, as he seemingly sawed a woman in a box in half and then put her back together, allowing her to leave the box unharmed. In the summer of 1921, Selbit toured the trick throughout the United States, where he encountered pushback from other illusionists claiming to have invented the trick.
While there are several variations of this illusion, depending on the magician’s personal style, one of the most common methods involves two assistants — one that the audience sees and another hidden inside the box. The trick begins by unveiling a long, thin box, with an assistant already hidden inside, tucked away and contorted at the end where the other person’s feet will be. Then the magician opens the box and invites the other assistant to climb inside and lie down. Once they do, they also contort themselves in a way that leaves an empty middle section to cut through. At this point the magician closes the box, and one assistant pops their head out while the hidden assistant pops their feet out.
With the two safely separated, the magician takes a saw and cuts through the middle of the box, seemingly slicing their assistant in half. The magician then separates the halves of the box while the head and feet continue to move for added effect.
To end the trick, the magician wheels the boxes back together, says a few magic words, opens the lid, and allows one assistant to climb out unscathed while the other pulls their feet back in and remains hidden inside the box.
Pulling a Rabbit Out of a Hat
The origins of this iconic trick are also murky. One theory points to a Swiss-born magician of the early 1800s named Louis Apollinaire Christien Emmanuel Comte, known as the “King’s Conjurer.” He performed magic in the courts of three kings of France: Louis XVIII, Charles X, and Louis-Philippe. The rabbit trick was also popularized by Scottish magician John Henry Anderson, who performed the illusion in theatrical spectacles in the 19th century.
The trick is simple in concept, but requires tremendous skill nonetheless, especially when working with an unpredictable live animal. The simplest way to perform the illusion is with a top hat and table, each of which contains a hidden compartment that the magician can reach through. A rabbit is hidden inside a container and placed underneath the table, behind a tablecloth, before the act begins.
To perform the trick, the magician simply places the top hat on the table with the open side facing upward, reaches through both hidden compartments, and “magically” pulls out their hand holding a live rabbit.
Performers who are more talented with sleight of hand can also complete this without a gimmicked (altered) top hat or surface. This allows the audience to inspect the hat beforehand to make sure it’s authentic. Magicians performing the trick this way will secretly hide a rabbit on their person, either in their pocket or up their sleeve. Then in the blink of an eye, they tip the hat toward themselves, slide the rabbit in, and pull it out for all to see.
Linking Rings
In this trick, a magician manages to link multiple metal rings together, despite each individual ring appearing solid and unbroken. Some claim that the trick originated in ancient China, though those claims lack hard evidence. Among the earliest confirmed performers of the trick are the 19th-century French magician known as Philippe, as well as a Chinese magician named Ching Ling Foo who toured America during the late 19th and early 20th centuries.
This trick is usually done with four rings, three of which are gimmicked. One ring is your standard metal ring, one has a small gap that’s imperceptible to the audience, and the remaining two rings are actually pre-interlocked.
Using a series of quick sleight-of-hand motions, the magician takes all four rings in their hand and counts them out in front of the audience, making sure to obscure the fact that any are gimmicked or already linked together. Then they take one of the loose rings and tap it against the interlocked rings several times. In a quick motion, they drop one of the interlocked rings so that it hangs, making it appear as if they just linked those two rings together.
Next, they take the ring with the hidden gap in it and forcefully link it with the two rings that were already interlocked, thus linking three rings together.
Lastly, they take the normal ring and tap it against the ring with a gap in it before locking those rings together in one quick and fluid motion. At that point, all of the rings are locked together.
Cups and Balls
Many formative magicians considered the cups and balls trick to be one of the most basic yet essential components of any magic routine. In his book Modern Magic, Professor Hoffmann (the pseudonym of English author Angelo John Lewis) called the trick “the groundwork of all legerdemain,” a word referring to the act of skillfully using one’s hands.
Essentially, this illusion makes it appear as if balls are “magically” passing through solid cups onto the table. The trick requires three cups and four balls; three balls are visible to the audience and one remains hidden the entire time. It doesn’t matter what kind of cups or balls are used, though sleek metal cups and colorful balls are often preferred by professional magicians for their visual appeal.
The magician begins the trick by placing three balls on the table, equally spaced apart, as well as a stack of all three cups with the open ends facing upward. What the audience doesn’t know is that the secret fourth ball is hidden between the top and middle cups.
The performer then takes the cups and places one behind each ball. They place the first cup behind the leftmost ball, then the middle cup — which contains the hidden ball — behind the middle ball. When doing this, the magician brings the open end toward their torso and keeps it hidden so that the audience doesn’t see that there’s a ball inside. Finally, they place the last cup behind the rightmost ball.
Once the cups are placed, the magician takes one of the visible balls, places it atop the middle cup (which has the secret ball underneath), and stacks the other cups on top. They snap their fingers, lift the stack, and reveal the hidden ball as if it had passed through the cup.
They then repeat this exact process another time to reveal two balls underneath, and a final time to reveal three balls underneath. All the while, a single ball always stays hidden within the stack of cups.
Metamorphosis
Harry Houdini achieved global popularity thanks in no small part to one of the most popular tricks in his routine, Metamorphosis, which he performed with his wife Bess. However, the trick was actually created by English magician John Nevil Maskelyne, who built a special trunk in the mid-1800s to perform it. The trick involves a magician placing their restrained assistant inside the box and then standing on top of the box. Two other assistants then pull up a cloth to obscure the box, and the magician and assistant magically swap places. But how is it done?
First, a solid wooden crate is wheeled out on stage, and the magician bangs on all of the sides to show the audience it’s real and durable. The magician presents their assistant, asks the assistant to climb inside a large bag, and securely ties the bag at the top.
The bagged assistant is then placed inside the crate and the lid is placed on top. The entire box is finally secured with ropes or chains with the assistant inside.
While this happens, the assistant secretly opens a hidden zipper at the bottom end of the bag, allowing them to slip free of the bag while still inside the box. They then pull on the rope through a secret hole in the bottom of the box, which makes the visible rope taut and makes it appear as if the box is securely fastened, even though there’s actually quite a bit of slack on the rope.
As the magician climbs atop the crate, a curtain is raised up by other assistants to shield the entire scene. While the curtain is raised, the magician climbs off the box, the lid is slid open, the freed assistant climbs out, the magician climbs in, and the lid is once again placed on top of the box.
With the two having swapped places, the curtain is dropped as the assistant dances about the stage, distracting the audience. While this happens, the magician zips themselves into the bag in the box, which is then opened by the two other assistants. They untie the bag, revealing the magician inside.
Credit: Aaron Rapoport/ Corbis Historical via Getty Images
Author Michael Nordine
May 16, 2024
The way we watch television is changing, and so is the way we measure viewership: 2023 was the first year in which viewers who no longer pay for traditional TV such as cable service outnumbered those who still do. Cord-cutting is increasingly the norm as people flock to Netflix, Hulu, and other streaming services. The small screen remains a favorite passive pastime all the same, and Nielsen ratings and other metrics show just how enormously popular the following seven shows proved with viewers around the world. Each was a hit throughout its run, with individual episodes (often the finales) setting viewership records.
Before it was a Harrison Ford movie, The Fugitive was a wildly popular TV series. It took all 120 episodes — 90 broadcast in black and white, 30 in color — to reveal what really happened to the wrongly accused Dr. Richard Kimble (portrayed by David Janssen), and America was more than ready by the end. The series finale, “The Judgment,” set a record when 78 million people watched it, but The Fugitive’s place atop the ratings mountain didn’t last long. When the series ended in 1967, the show that eventually dethroned it was just five years from making its own debut on the small screen.
M*A*S*H aired 256 episodes throughout its 11 seasons, none of which drew more viewers than its record-shattering finale. When “Goodbye, Farewell and Amen” aired on February 28, 1983, around 105 million viewers were there to see how the beloved series ended — the most of any television broadcast in American history at that point, a record that stood until Super Bowl XLIV in 2010. Nearly 60% of American households tuned in, making it the most-watched episode of any TV show, a record unlikely to be broken in the streaming era.
Few series have become cultural phenomena to the same extent as Roots, the miniseries about slavery’s history and legacy based on Alex Haley’s novel Roots: The Saga of an American Family. In addition to critical acclaim and a slew of accolades — the show won a Golden Globe, a Peabody Award, and nine of the 37 Emmys for which it was nominated — Roots broke Nielsen ratings records during the eight consecutive nights on which it aired, and every episode still ranks among the 100 most-watched episodes of all time. Roughly 51% of all American households gathered around their television sets for the finale, and an estimated 140 million viewers watched the show overall. It seems to have been all anyone could talk about in January 1977: “Theaters and restaurants emptied out during the show,” wrote TIME magazine’s Frank Rich two years later. “Hundreds of colleges started Roots courses; the National Archives in Washington found itself flooded by citizens’ requests for information about their ancestors.” In addition to a 1979 sequel, Roots also inspired a 2016 remake.
Dallas was well known for its cliffhangers throughout its 13-year run, but none of them riled the country into a frenzy the way its third-season finale did. “A House Divided” premiered on March 21, 1980, and after it aired, everyone was asking the same question: “Who shot J.R.?” When that burning question was answered exactly eight months later, 76% of all television viewers in the U.S. were watching — meaning every other show broadcast combined for just a quarter of the night’s total viewership. That amounted to some 90 million people, a record that stood until the M*A*S*H finale. The cliffhanger’s massive success helped popularize the now-common practice of ending a season with unresolved questions, including the Simpsons spoof “Who Shot Mr. Burns?” It wasn’t just this one episode that drew viewers, however — Dallas was a ratings success throughout its run, with seasons 4 through 8 all ranked either first or second according to Nielsen.
It’s the place where everybody knows your name, and just about everyone in the country tuned in when Cheers aired the last of its 275 episodes. “One for the Road” received a Nielsen rating of 45.5, meaning 45.5% of all American televisions were tuned to the episode, with a total viewership of some 93 million. To this day, M*A*S*H is the only series finale to be seen by more people — even massive hits such as Seinfeld (76 million), Friends (52.5 million), and Game of Thrones (13.6 million) didn’t come close.
Traditional television viewership may be declining in the U.S., but it’s never been more popular in South Korea. Viewership records have been set and broken time and again over the last several years, as K-dramas have proved increasingly popular abroad as well. TV ratings are measured in terms of the percentage of households that tune in to a given episode, and the twisty relationship miniseries The World of the Married holds the current record in its home country. A full 28.37% of Korean homes (more than 14 million people) tuned in to the finale, breaking the previous record of 23.77% set by Sky Castle a year earlier.
South Korea is also responsible for Netflix’s most-watched series of all time: Squid Game, the global sensation that 142 million households pressed play on for a total of 1.65 billion viewing hours within four weeks of its release; the only other Netflix series to crack 1 billion viewing hours in that time frame are Wednesday and Stranger Things 4. Squid Game also won awards across the globe and has been renewed for a second season, which is expected to be wildly popular as well.
One of Hollywood’s most famous figures stands at just 13.5 inches tall, weighs only 8.5 pounds, and goes by just one name: Oscar. The famous golden statuette is awarded annually by the Academy of Motion Picture Arts and Sciences and is one of the highest honors in the film industry. Like a lot of old Hollywood lore, there have been competing stories through the years about how the little gold statuette — officially named the Academy Award of Merit — got its human nickname. Here are some prevailing theories on how this prized statuette came to be known as “Oscar.”
Credit: Santi Visalli/ Archive Photos via Getty Images
The Birth of the Little Gold Man
The first Academy Awards ceremony took place in May 1929 in the Blossom Room of the Hollywood Roosevelt Hotel, and introduced the gold-plated, solid-bronze statuette that has been an iconic Hollywood image ever since. Motion picture art director Cedric Gibbons designed it, and sculptor George Stanley brought to life the knight holding a crusader’s sword, standing on a reel of film. The film reel’s five spokes represent the original five branches of the Academy of Motion Picture Arts and Sciences: actors, directors, producers, technicians, and writers.
Although the Academy Awards ceremony and its traditional statuette have been around since 1929, the Oscar name wasn’t officially adopted by the academy until 1939. The exact origin of the nickname, however, is fairly murky. One widely circulated legend attributes the moniker to the academy’s first librarian and eventual executive director, Margaret Herrick. According to the story, in 1931, Herrick saw one of the statuettes on an executive’s desk and said it reminded her of her Uncle Oscar. A newspaper reporter who happened to be nearby overheard Herrick, and the nickname stuck. (There are some inconsistencies in the story, however; Herrick later claimed the name Oscar came from an inside joke with her husband.) Another popular, though unlikely, theory is that the actress Bette Davis said in a 1936 Academy Awards acceptance speech that the statuette’s backside resembled that of her then-husband, Harmon Oscar Nelson. And yet another oft-repeated legend is that Hollywood gossip columnist Sidney Skolsky coined the term when he used it in a 1934 story about Katharine Hepburn’s first Academy Award for Best Actress.
Credit: Andrew H. Walker/ Getty Images Entertainment via Getty Images
The Award for Naming the Oscar Goes to…
According to Bruce Davis, who spent 20 years as the academy’s executive director, the Oscar nickname began to make its way through the Hollywood community sometime between 1930 and 1933 — predating Bette Davis and Sidney Skolsky’s use of the term. Davis suggests someone else entirely deserves credit for inventing the Oscar alias: a woman named Eleanore Lilleberg. Lilleberg was an academy secretary and office assistant during the award’s early days, and part of her duties included managing the statuettes before the ceremony. Stories have occasionally surfaced that she jokingly called the award “Oscar,” which Davis claims is the true origin of the name. While researching his book The Academy and the Award, he came across an autobiography by Lilleberg’s brother, California artist Einar Lilleberg, at the tiny Lilleberg Museum in Green Valley, California. Einar’s text claimed that his sister referred to the awards as “Oscar” in honor of a Norwegian army veteran she knew in their hometown of Chicago. Einar described the veteran as, like the famous statuette, always “standing straight and tall.”
Credit: Kevin Winter/ Getty Images Entertainment via Getty Images
How the Academy Awards Became “the Oscars”
Since 1939, 10 years after the first Academy Awards, AMPAS has accepted and used “the Oscars” as the official shorthand for its annual ceremony. In 2013, for the 85th annual Academy Awards, the show dropped the lengthy original title and simply went with “the Oscars.” Neil Meron, who co-produced the 2013, 2014, and 2015 ceremonies, said the rebrand was an attempt to modernize the show. “We’re not calling it the 85th annual Academy Awards, which keeps it mired somewhat in a musty way,” he said. “It’ll be like the Grammys. The Grammys don’t get a number, and neither will the Oscars.” The colloquial new name remains in use to this day.
Credit: Graphic House/ Archive Photos via Getty Images
Author Kerry Hinton
February 22, 2024
The Brill Building isn’t just an art deco structure in midtown Manhattan — it’s also the name of a musical genre. Throughout the early and mid-1960s, the “Brill Building sound” became synonymous with groundbreaking pop music. The heyday of the Brill Building era was short-lived, but in one six-year span, the songwriters, arrangers, musicians, and producers behind this sound contributed to hundreds of Billboard Hot 100 hits, including “Stand By Me” (Ben E. King, 1961), “One Fine Day” (the Chiffons, 1963), and “Be My Baby” (the Ronettes, 1963).
Located at 1619 Broadway in New York City, the Brill Building was a hub of songwriters, record labels, and recording studios, all under one roof. It built on the tradition of the “Tin Pan Alley” district before it — a concentration of music publishers and studios in a strip of Manhattan that dominated the music industry in the big-band era. But while their downtown predecessors were mainly concerned with the profits produced by pumping out sheet music for radio hits, the writers and producers at the Brill Building were also on a mission of artistic idealism. Their compositions drew inspiration from classical music, Latin music, traditional Black gospel, and rhythm and blues to create songs that appealed to an audience already hungry for the new sound of rock ’n’ roll. The assembled talent was a once-in-a-generation roster of songwriters, including Burt Bacharach and Hal David, Gerry Goffin and Carole King, and Neil Diamond. Together, they produced sophisticated songs that were directly aimed at a new, youthful generation and a powerful rising subculture: teenagers.
By the mid-’60s, an increasing number of artists — such as the Beatles and Bob Dylan — began composing and playing their own material, making the songwriter-for-hire less of a necessity. As Dylan wrote in 1985, “Tin Pan Alley is gone. I put an end to it. People can record their own songs now.” This may be true, but the creators behind the Brill Building sound helped make the ascent of these singer-songwriters possible. Here are five ways the Brill Building shaped popular music in the 20th century.
The Brill Building employed a model of vertical integration that supervised every phase of a song’s life cycle, from production to distribution. The 11 floors of 1619 Broadway and a few surrounding buildings became a one-stop shop where a songwriter could pen a would-be hit, sell it to a publisher, find a band, and cut a demo. Songs could even be played for radio promoters in the building to garner airplay. This streamlined approach to hitmaking — often called “assembly line pop” — gave publishers and producers a huge pool of material to choose from and encouraged creative collaboration, merging art and commerce in a new way.
At the Brill Building, songcraft mattered. Some of the most interesting and popular songs of the era were written at Aldon Music, one of the music publishing companies in the building. Its founders, Al Nevins and Don Kirshner (“Al” and “Don”), had a plan: to take the spirit of classic Tin Pan Alley songwriting (catchy melodies with commercial appeal) and create well-crafted songs aimed at young people, an increasingly lucrative market. Kirshner had already enjoyed some success writing jingles with his high-school friend Bobby Darin, and after signing the seasoned songwriting team of Jerry Leiber and Mike Stoller (the writers behind “Hound Dog” and “Jailhouse Rock”), he convinced the more experienced Nevins to partner with him.
The music may have been aimed at the youth market, but Aldon’s songwriters wrote lyrics that addressed bleak social conditions (“We Gotta Get Out of This Place”) and tragedy (“Leader of the Pack”). The production was equally innovative: songs such as the echo-drenched, Phil Spector-produced “River Deep, Mountain High” (performed by Tina Turner) showcased new directions in arrangement and recording technique.
The Brill Building sound was created for young people, by young people. In 1962, the oldest of Aldon’s songwriters was just 26 years old. Many of the Brill Building songwriters were only slightly older than their songs’ subjects, making their perspective especially accessible to young audiences. At just 18, Carole King wrote the No. 1 hit “Will You Love Me Tomorrow” — recorded and released by the Shirelles in 1960 — with her husband Gerry Goffin. Little Eva, their babysitter, sang the smash hit “The Loco-Motion” (which King and Goffin also wrote) when she was just 17.
Before the Brill Building era, popular songwriting was basically a boys’ club. This changed with the arrival of female songwriters such as Carole King, Ellie Greenwich (“Then He Kissed Me,” 1963), and Cynthia Weil (“You’ve Lost That Lovin’ Feelin’,” 1964), although their husbands were named first in the songwriting credits. In addition to writing dozens of hits, these women proved that they were equally capable in the recording studio as arrangers and producers.
The Brill Building songwriters made rock ’n’ roll popular with mainstream teenage America. Although the majority of writers were white, they all had been influenced by the melting pot of musical styles they heard on the fire escapes and in the clubs of New York City. The result was a hybrid sound that blended genres and often had crossover appeal, finding success on both pop and R&B charts. Songwriters often specifically wrote for Black female artists such as Dionne Warwick, the Ronettes, and the Crystals, using arrangements that gave their music mainstream appeal. This unique musical style united listeners from different backgrounds and opened people’s eyes to the possibility of a biracial popular culture.