The concept of etiquette dates back to Europe during the medieval era, when rules and social conventions first gained prominence. During the Renaissance, expectations of behavior at royal and noble courts were outlined in courtesy books, or books of manners. In the 19th century, etiquette manuals continued to flourish in Europe and the United States, guiding behavior for ladies and gentlemen in both social and professional settings. By the early 20th century, these guidebooks were increasingly popular with both wealthy and middle-class women in the U.S., and author Emily Post became the definitive expert with the publication of her first book of etiquette in 1922.
Today, the rules of behavior observed by previous generations might seem old-fashioned and strange, and certainly there are some social conventions better left in the past, as they reflect the inequality and biases of bygone eras. But etiquette itself isn’t inherently outdated. While specific customs may evolve, the underlying principles of courtesy, respect, and consideration for others remain as relevant today as they were a century or two ago. With that in mind, here are some of the most unusual and surprising etiquette rules from decades past.
According to Vogue’s Book of Etiquette, published in 1948, wives should defer to their husbands as “head of the house.” By not paying proper respect to their husbands, the thinking went, bad-mannered American wives were placing their husbands in a subordinate position, which was “most unbecoming to a man.” Among the suggestions for being a better wife were to say “we” or “our” instead of “I” or “me,” and to let one’s husband take the lead on deciding when to leave a party. Reflecting the often oppressive gender norms of the era, the guide reminds wives that “a woman can gracefully play second fiddle, but a man who is obviously subordinated to a dominating woman is a pathetic and foolish figure.”
There Should Be One Servant for Every Two Dinner Guests
The anonymous countess who authored the 1870 etiquette book Mixing in Society: A Complete Manual of Manners declares, “It is impossible to over-estimate the importance of dinners.” She goes on to detail the many aspects of planning and hosting a dinner party. In addition to having an equal number of ladies and gentlemen at a dinner (and never 13, out of respect for superstitious guests), the hostess should make sure to have “one servant to every two guests, or, at least, one to every three.”
A Man Was Expected to Choose His Riding Companion’s Horse
Published in 1883, American Etiquette and Rules of Politeness outlines the rules for men going horseback riding with a woman, noting that the gentleman should “be very careful in selecting her horse, and should procure one that she can easily manage.” He is also admonished to “trust nothing to the stable men, without personal examination,” and to “be constantly on the lookout for anything that might frighten the lady’s horse.”
The Most Important Rule for Children Was Obedience
In 1922, Emily Post published her first book of good manners, Etiquette in Society, in Business, in Politics and at Home. It offered more than 600 pages of rules and standards, from how to make introductions to proper behavior when traveling abroad. No one was exempt from learning and practicing proper etiquette, including children. “No young human being, any more than a young dog, has the least claim to attractiveness unless it is trained to manners and obedience,” Post states in the chapter “The Kindergarten of Etiquette.” In addition to learning how to properly use a fork and knife and remaining quiet while adults are speaking, a child should be taken away “the instant it becomes disobedient,” directs Post. By teaching a child that it can’t “‘stay with mother’ unless it is well-behaved,” she writes, “it learns self-control in babyhood.”
Flirting Was a Sign of Ill Breeding
Published in 1892, the guidebook Etiquette: Good Manners for All People; Especially for Those “Who Dwell Within the Broad Zone of the Average,” focused on advice for middle-class Americans, and not just wealthy society. The book describes itself as offering “some of the fundamental laws of good behavior in every-day life” for “people of moderate means and quiet habits of living.” In the chapter on “Gallantry and Coquetry,” readers are reminded that there is nothing wrong with a man enjoying the company of a charming woman, or a woman delighting in the conversation of a brilliant man. However, these acts of mutual appreciation have “nothing in common with the shallow travesty of sentiment that characterizes a pointless flirtation.” Not only is flirting a sign of poor breeding, the guide suggests, but “a married flirt is worse than vulgar.”
A Man Couldn’t Speak to a Woman Unless She Spoke to Him First
During England’s Victorian era in the 19th century, women had “the privilege of recognizing a gentleman” first by acknowledging him with a bow, according to the 1859 British handbook The Habits of Good Society: A Handbook of Etiquette for Ladies and Gentlemen. Men were expected to wait for that acknowledgment before speaking. “No man may stop to speak to a lady until she stops to speak to him,” the book advises. The guidelines go on to say, “The lady, in short, has the right in all cases to be friendly or distant. Women have not many rights; let us gracefully concede the few that they possess.”
The 20th century produced an array of iconic toys that captured the public’s imagination and, in some cases, continue to delight young people worldwide. The Slinky, originating in the 1940s, and the Rubik’s Cube, first sold in the United States in the early 1980s, have remained more or less the same since their invention, evoking a nostalgic simplicity. Other toys, such as LEGO and Barbie, have offered up countless iterations, weathering changing trends to endure in popularity and appeal. The legacy of these toys is in more than just their entertainment value — it’s in the way they reflected or even set cultural trends, interests, and technological advancements. Here are some of the most popular toys throughout the 20th century, many of which are still around today.
In the early 1940s, United States industry was largely focused on producing goods for the war effort, and it was during this time that the Slinky was accidentally invented. Richard James, a mechanical engineer, stumbled on the idea in 1943 while working with tension springs for naval equipment at a Philadelphia shipyard. After accidentally knocking some of his prototypes off a shelf, James couldn’t help but notice the way one of them “walked” down a stack of books on his desk. He worked on this strange spring — which his wife named “Slinky” after seeing the word in the dictionary — over the next two years. By the end of 1945, James got an initial run of 400 Slinkys into a local department store. It wasn’t until he staged a live demonstration, however, that the product’s popularity picked up, and the toy sold out. Within the first 10 years, he sold 100 million. The Slinky has endured for decades, not only as a popular toy on its own, but also through licensing and its iconic jingle — the longest-running jingle in television advertising history.
LEGO is known for its colorful modular plastic bricks, but when the company started in Denmark in 1932, it made wooden toys such as cars and yo-yos. Plastic toys didn’t come along until the late 1940s, when founder Ole Kirk Christiansen developed the forerunner of the buildable bricks we know today, known at the time as Automatic Binding Bricks. In 1958, the modern LEGO brick was patented, with an updated interlocking design that became its signature.
Through a deal with Samsonite, LEGO made its way to Canada and the U.S. in the early 1960s, but the iconic toy didn’t truly find its footing in North America until the early 1970s. The New York Times claimed the toy had been “ineptly marketed” since its stateside arrival, and the then-head of LEGO’s U.S. operations called the deal with Samsonite “a disaster.” In 1973, however, the company took over its own U.S. production and sales and, per the Times, sales “soared.” LEGO grew to be much more than a toy in the ensuing decades — it became an entertainment empire. Throughout it all, the company has stood by its name, which also happens to be its guiding principle: LEGO is an abbreviation of the Danish words “leg godt,” meaning “play well.”
When Mattel released the first Barbie doll on March 9, 1959, it was the first time that most children had seen a three-dimensional, adult-bodied doll — the norm at the time was baby dolls designed to be taken care of. Ruth Handler, the co-founder of Mattel and creator of Barbie, had a different idea. After watching her daughter Barbara, the toy’s namesake, play with paper dolls, Handler envisioned a doll that was a little bit older and could inspire more aspirational play: Young girls could see their future selves in the doll, instead of a child to nurture. Barbie’s initial launch at the New York Toy Fair faced some skepticism from other toy industry executives, but Handler’s instincts were right: Around 300,000 Barbies sold within the first year. As beloved as Barbie was, though, she also courted controversy. Early on, detractors were uncomfortable with the doll’s figure. Barbie was at times criticized for being too conventional; other times, too progressive. But the doll’s popularity endured as the company diversified her looks, skin tones, body types, and, of course, jobs: Throughout her lifetime, Barbie has explored more than 250 different careers. The cultural phenomenon continues to this day: Around 1 billion Barbie dolls have been sold, and in 2023, the first live-action movie based on Barbie became the year’s biggest release.
G.I. Joe
Following Mattel’s major Barbie breakout, rival toy company Hasbro sought a similar success story. Barbie thrived by marketing primarily to young girls, and Hasbro aimed to fill a gap in the market with a toy made for boys. In the early 1960s, toy maker Stan Weston approached Hasbro with an idea for a military toy, but was turned down. One Hasbro executive, Don Levine, saw the toy’s potential, however, and workshopped the idea until the company approved. It wouldn’t be called a doll, though — Hasbro created the term “action figure” to market the new product, and even forbade anyone in the company from referring to it as a doll. Released in 1964, the original G.I. Joe line consisted of four 12-inch figures, one for each of the U.S. military branches: the Army, Navy, Air Force, and Marines. The action figure took off, and within two years, G.I. Joe accounted for almost 66% of Hasbro’s overall profits. The franchise eventually created less military-centric characters, expanded to comic books and animated series, and embraced sci-fi, espionage, and team-based narratives that have carried the toy as a symbol of adventure and heroism across generations.
At first glance, a Rubik’s Cube appears simple, but the mathematically complex puzzle is anything but, and solving it is a problem that has captivated the public ever since the toy’s invention. Created by Hungarian architect and professor Ernő Rubik in 1974, the first "Magic Cube," as he called it, resulted from months of work assembling blocks of wood with rubber bands, glue, and paper. After painting the faces of the squares, Rubik started twisting the blocks around, then spent weeks trying to get the cube back to its original state; after about a month, he finally did. He patented the toy as the “Buvos Kocka,” or “Magic Cube,” and it first appeared in Hungarian toy shops in 1977. Within two years, 300,000 Hungarians had bought the puzzling cube. By 1980, an American toy company was on board, and international sales of the renamed Rubik’s Cube took off — 100 million were sold in three years. As quickly as the craze started, however, it seemed to fade. The New York Times reported in 1982 that it had “become passé,” replaced by “E.T. paraphernalia…[and] electronic video games.” But the toy has nonetheless endured, and to date, an estimated 350 million colorful cubes have been sold, making it one of the bestselling puzzles in history.
Cabbage Patch Kids
Known for their one-of-a-kind features, unique names, and adoption certificates, Cabbage Patch Kids caused a full-on frenzy in the 1980s, leading to long lines at stores — and even riots. Although the dolls are known as the invention of Xavier Roberts, whose signature is on every doll, the origin story reportedly starts with a folk artist named Martha Nelson Thomas. In the late 1970s, Thomas was selling her handmade “doll babies” at craft fairs in Louisville, Kentucky. Roberts reportedly resold the doll babies at his own store for a while, but eventually remade and renamed them Cabbage Patch Kids. (Thomas eventually took Roberts to court over the copyright, but the pair settled in 1985.) In 1982, Roberts licensed his dolls to the Coleco toy company, and the following year, thanks to a robust advertising campaign, demand was much greater than supply, sparking angry mobs of disappointed parents that holiday season. Around 3 million Cabbage Patch Kids had been “adopted” by the end of 1983, and over the next two years, sales topped half a billion dollars. The doll’s popularity faded quickly after that, but Cabbage Patch Kids remain toy store fixtures to this day.
Tamagotchi
In the early 1990s, video game consoles were household staples, and by the end of the decade, tech toys such as Tickle Me Elmo and Furbies caused consumer crazes. But one pocket-sized toy that combined the best of both worlds was a ’90s must-have: the virtual pet. The handheld pixelated companions required regular feeding and playing, instilling in users a sense of responsibility and emotional attachment, and engaging them in a type of continual play that was relatively new at the time.
The most popular virtual pet was the Tamagotchi, created by Japanese toy company Bandai. It was released in the United States on May 1, 1997, six months after it launched in Japan, where it had already sold 5 million units. After the first day of the toy’s U.S. release, some stores were already sold out. Within the year, there were several competing virtual pets: GigaPets and Digimon offered different pet options and more gameplay. The constant attention the virtual pets demanded led to school bans, and as the internet gained traction in the late ’90s and early 2000s, online versions such as Neopets all but replaced the Tamagotchi. Virtual pets had an undeniable influence on future trends in gaming and handheld electronic devices, and while the toy has gone through several iterations and relaunches over the years, the original Tamagotchi largely remains a nostalgic relic of the ’90s.
The 1960s and ’70s are considered a golden age in advertising, though the industry’s creative revolution arguably started in the 1950s, thanks in part to the rise of television unlocking new forms of storytelling. It was an era of bold ideas, increasingly large budgets, and even bigger personalities — a time when advertising was seen as a glamorous, if ethically dubious, profession populated by well-dressed men and women (but mostly men) profiting from the postwar consumer culture.
At the time, many of the nation’s largest ad agencies were located on Madison Avenue in Manhattan, and the street came to be synonymous with American advertising and its unique methodology. Safire’s Political Dictionary, published in 1978, referred to “Madison Avenue techniques” as the “gimmicky, slick use of the communications media to play on emotions.” More recently, the culture surrounding this advertising boom has been portrayed in 2007’s acclaimed AMC series “Mad Men,” centered on the charismatic creative director Don Draper (played by Jon Hamm). Here are five fascinating facts about the golden age of advertising, and the real-life ad men and women of Madison Avenue.
A “Small” Ad Changed the Way Americans Looked at Cars
In the 1960s, advertising underwent a transformation that became known as the Creative Revolution, shifting the industry’s focus from research and science to an approach that was creative and emotionally driven. For better or worse, this era of advertising owes a lot to the Volkswagen Beetle, and the visionary ad man Bill Bernbach. In 1959, at a time when Americans were buying cars out of Detroit and vehicles were getting bigger and flashier, Bernbach’s agency, Doyle Dane Bernbach (DDB), was contracted to promote the German-made Volkswagen Beetle in the United States. The problem was, Volkswagen’s strong link to Nazi Germany made it a tough sell in the U.S. The challenge called for an unconventional approach. Rather than attempting to duplicate the busy, colorful advertising style of American-made cars, the creative team behind Volkswagen’s campaign went in the opposite direction. The first ad, “Think Small,” featured a small black-and-white image of a Volkswagen Beetle against a backdrop of white space. The now-iconic ad encouraged consumers to look at the car in a new light, from being able to “squeeze into a small parking spot” to having small insurance payments and small repair bills.
The 1960s ushered in a new era of creativity in advertising, delivering advertisements that were brash and irreverent but also respectful of the consumer and entertaining to read. Ironically, one of the biggest players in American advertising was British ad man David Ogilvy, founder of the New York City-based advertising giant Ogilvy & Mather and known today as the “father of advertising.” Ogilvy believed in the importance of creating “story appeal” through the use of unique, unexpected elements or “hooks,” such as the eye patch worn by “The Man in the Hathaway Shirt” ads. “Every advertisement is part of the long-term investment in the personality of the brand,” Ogilvy said, and it was a philosophy that Madison Avenue took to heart. Ogilvy’s portfolio included the first national advertising campaign for Sears, the quirky Commander Whitehead ads promoting Schweppes Quinine Water, and the beautiful tourism ads that helped revitalize the image of Puerto Rico. In 1962, Ogilvy’s creative and innovative vision led TIME magazine to call him “The most sought-after wizard in today’s advertising industry.”
The “three-martini lunch” — the typically all-male, leisurely power lunch where ideas were sparked and deals were made over a few rounds of cocktails — is the stuff of legend today, and an iconic image of the culture surrounding Madison Avenue. And indeed, during the heyday of advertising’s golden age, drinking at lunch was not only acceptable, but expected. According to ad exec Jerry Della Femina, “The bartender would be shaking the martinis as we walked in.” It was accepted that drinking fueled the creative process; David Ogilvy’s advice for tapping into the creativity of the unconscious included “going for a long walk, or taking a hot bath, or drinking half a pint of claret.” Regardless of whether the real ad execs of Madison Avenue were regularly imbibing as much alcohol as Don Draper and his pals on “Mad Men,” there’s one indisputable fact about the so-called three-martini lunch: It was a deductible business expense that symbolized success as much as excess.
The Leo Burnett Company was one of the few major advertising agencies not based in Manhattan, but the Chicago agency was responsible for a number of well-known campaigns, including the Pillsbury Doughboy, Tony the Tiger, and one of the most successful campaigns in advertising history, the Marlboro Man. Smokers and non-smokers alike know the iconic cowboy character, who was developed by Burnett in the mid-1950s to rebrand what had been marketed as a “mild,” feminine cigarette. The ads were a hit and, in the mid-1960s, the team at Burnett went even further in promoting the brand by using real cowboys on a Texas ranch. When tobacco advertising was banned from television and radio in the early 1970s, the Marlboro cowboys still found success in print, making Marlboro the top-selling brand worldwide in 1972.
The ’60s Saw the First Female CEO of a Major Ad Agency
Throughout the 1950s and ’60s, college-educated women recruited to work on Madison Avenue were more likely to be found sitting behind a typewriter than in the boardroom. In a booklet published in 1963 by the J. Walter Thompson agency (JWT), young women were encouraged to hone their typing and shorthand skills so they could become the “right hand to a busy executive” or “secretary to one of the senior analysts.” But in 1966, Mary Wells Lawrence, the founding president of Wells Rich Greene, became the first woman to found, own, and run a major advertising agency. Two years later, she became the first woman CEO of a company listed on the New York Stock Exchange. Some of her agency’s most notable campaigns include Alka-Seltzer’s “Plop, Plop, Fizz, Fizz,” Ford’s “Quality Is Job One,” and the “I ❤ New York” tourism campaign.
We all know of the Freemasons and the ever-mysterious Illuminati, but throughout history, plenty of other secret societies have flourished under the radar. The western U.S. is home to a long-running, low-key historical society with a unique and eccentric ethos, while northern Spain’s historic food culture has been kept alive through selective supper clubs for more than a century. Though their stories don’t often get told, these clandestine groups have nonetheless left their own obscure marks. Read on to learn about five little-known secret societies.
Secret societies typically conjure a dark air of mystery, but the Order of the Occult Hand illustrates the fun side of underground organizations. Its origins can be traced to 1965, when Joseph Flanders, a crime reporter for the Charlotte News, wrote an article about the shooting of a local millworker. “It was as if an occult hand had reached down from above and moved the players like pawns upon some giant chessboard,” Flanders wrote. His colleagues, the legend goes, found the flowery description so funny, they formed the Order of the Occult Hand, a secret society dedicated to sneaking “it was as if an occult hand,” or a similar phrase, into their work.
The mission quickly spread among journalism circles in Charlotte and beyond. By the early 1970s, the mischievous media conspiracy was becoming so prevalent that the Boston Herald reportedly banned “occult hand” from the paper. Over the years, the phrase continued to show up in The New York Times, The Washington Post, and the Los Angeles Times. In 2004, writer James Janega published a thorough exposé of the Order in the Chicago Tribune, and in 2006, journalist Paul Greenberg, a longtime member of the society, copped to creating a new secret phrase that went into circulation, even as the “occult hand” keeps going.
In 2011, a team of researchers cracked the code of a centuries-old manuscript belonging to a secret society known as the Oculists. The text, known as the Copiale Cipher and believed to date back to between about 1760 and 1780, was discovered in former East Germany following the Cold War. Once the confusing use of Roman and Greek characters, arrows and shapes, and mathematical symbols was deciphered, a ritual manual for an 18th-century German group with a keen interest in eyesight was revealed.
The cipher detailed the Oculists’ initiation ceremonies, oaths, and “surgeries,” which seemed to consist of plucking hairs from eyebrows with tweezers — not a surgical procedure, of course, but described by the manuscript as a symbolic act. Another passage described a tobacco ceremony in which the hand pointedly touches the eye; another still told of a candidate kneeling in a candlelit room in front of a man wearing an amulet with a blue eye in the center. Research has suggested that the group’s focus on the eye was simply due to the fact that eyes are part of the symbology of secret societies — the Oculists did not appear to be optometrists, and their ultimate purpose remains a mystery.
Northern Spain’s Basque Country is home to a handful of “txokos” — food-centric secret societies that started as a way to save money on food and drink when dining out of the home. These gastronomic societies function as exclusive clubs; members, often chosen after being waitlisted for years, have access to a fully stocked kitchen and pantry, where they cook for themselves or each other, using the honor system to pay for items needed or used. While it sounds similar to a modern dinner party, many of the txokos have been around for decades, and are still going strong.
Kañoyetan, reportedly the oldest society in the region (founded in 1900), counts renowned local chef Martin Berasategui — a 12-time Michelin star recipient — among its members. Until recently, txokos operated as secret societies only for men; the clubs claimed to be places for men to socialize and cook outside of the home, where, according to the BBC, “their wives traditionally called the shots.” Wine and cider are always on hand; these days the dinners can start late in the evening, and have been known to stretch on until the early morning hours.
The mysterious society known as E Clampus Vitus originated in West Virginia sometime around the mid-1840s. By the early 1850s, the “Clampers,” like many people during the gold rush era, made their way west to California. Many of the fraternal club’s rituals were adopted as a reaction to the formalities of other organizations at the time, such as the Odd Fellows and Freemasons. Clampers, who were primarily miners, wore eccentric clothing and accessories, conducted lighthearted rituals, and adopted the slogan “Credo Quia Absurdum,” or, roughly translated, “I believe because it is absurd.”
The Clampers’ clubs waned around the turn of the 20th century, and by the 1920s, the society was all but defunct. But in the 1930s, the Clampers reestablished themselves with a new objective: to chronicle some of the most obscure details of the history of the American West. In California alone, more than 1,400 historical markers have been installed to commemorate moments in the state’s history that might otherwise go overlooked, including the birthplace of the martini, filming locations, and the “world’s largest blossoming plant.”
The name Pythagoras likely brings back memories of high school geometry, but the ancient Greek philosopher and mathematician was also the head of a mysterious society. The Divine Brotherhood of Pythagoras was formed in the sixth century BCE. The community may have been based on the study of mathematics, but it operated more like a secret society — or, as some might say, a cult. It’s believed they lived together communally, surrendered their personal possessions, were vegetarians who purportedly avoided eating beans because beans were believed to have souls, and followed several strict rituals.
The Pythagoreans’ motto was “all is number,” and their aim was to be pure of mind and soul. Their focus on mathematics and science was a way to achieve purity — as was avoiding wearing woolen clothing, and never stirring a fire with a knife, as laid out in Pythagoras’ rules. The group ultimately had many mathematical achievements, but their selective and rigid way of life contributed to a lingering sense of mystery around the community.
As we look back at American history, it’s crucial to take a moment to reflect on and recognize the contributions made by the nation’s Indigenous peoples, who are so often overshadowed by famous figures who came to the United States from other parts of the world. To commemorate this important part of America’s heritage, here’s a look at five notable Indigenous heroes and leaders who shaped the nation through their tireless efforts.
Geronimo (1829-1909)
A medicine man and leader of the Bedonkohe band of the Chiricahua Apache, Geronimo was born on the Gila River in New Mexico, where he was originally given the name Goyahkla, meaning “the one who yawns.” After the United States government forcibly relocated 4,000 Apaches to a reservation in San Carlos, Arizona, Geronimo led dozens of breakouts in an effort to return his community to their nomadic roots. Geronimo’s legacy is vast. His relationship with many American and Mexican civilians was complex, as he fought against colonialism but was made famous after appearing in Buffalo Bill’s “Wild West” sideshow and eventually in Theodore Roosevelt’s inaugural parade. Geronimo’s tireless fight for Apache independence cemented his reputation as a fearless crusader for freedom by the time of his death from pneumonia in 1909.
The son of a warrior, Sitting Bull was born in what is now South Dakota and was nicknamed “Slow” for his lack of fighting ability — that is, until he was branded Tatanka Yotanka (“Sitting Bull”) at age 14 after “counting coup” in a battle against the Crow Tribe. (“Counting coup” is a way to humiliate an enemy by riding close enough to touch them with a stick.) Sitting Bull eventually rose to become chief of the Hunkpapa Sioux, and fought tirelessly against the U.S. military, who sought to seize Indigenous land.
After fleeing to Canada to escape a vengeful army in the wake of the defeat of General George Armstrong Custer (and his 210 troops) in 1876 at the Battle of Little Bighorn, Sitting Bull returned to the U.S. in 1881 and was held prisoner at the Standing Rock Reservation in Dakota Territory. His impact, however, could not be contained: After an Indigenous mystic claimed in 1889 that a ghost dance would eliminate the threat of white settlers on Native land, Sitting Bull allowed his followers to practice the dance — much to the horror of federal officials, who feared another uprising. Sitting Bull was killed by gunfire during his arrest in 1890, and is remembered as a martyr for freedom.
Born near the Black Hills of South Dakota, Lakota Chief Crazy Horse was the son of a warrior with the same name, and at a young age he began showcasing his capacity for battle and bravery. Having helped lead the Sioux resistance against the U.S. military’s attempts to colonize the Great Plains throughout the 1860s and ’70s, Crazy Horse led a band of Lakota warriors against General Custer's 7th Cavalry Regiment during the Battle of Little Bighorn in 1876 (alongside Sitting Bull) before returning to the Northern Plains. Unfortunately, Crazy Horse and his community faced an unwavering enemy; forced to keep moving — and fighting — to evade federal resettlement, the chief and his 1,100 followers ultimately surrendered to the U.S. military at Fort Robinson in May 1877. There, in the wake of his arrest (and under the banner of truce), Crazy Horse was stabbed during a scuffle with U.S. soldiers and died of his injuries. He is remembered for his courage, leadership, and his endless perseverance against the colonizing forces.
Sacagawea (c. 1788-1812 or 1884)
Sacagawea was only around 16 years old when she carved her place in Native American history through her ability to communicate with different peoples. Kidnapped by the Hidatsa (Indigenous people of North Dakota) at age 12, Sacagawea was then claimed by French Canadian trader Toussaint Charbonneau as one of his wives at age 13. Despite this treatment, upon the arrival of explorers Meriwether Lewis and William Clark in Hidatsa territory in 1804, the young woman proved herself invaluable. Chosen by her husband to serve as interpreter as he and the explorers moved west, she rescued records and supplies from the river when the crew’s boat tipped and took on water, helped acquire horses from her brother when the expedition passed through Idaho, and saved her counterparts from starvation as they faced food shortages. Most importantly, her role as translator helped assure safety for both her own team and the Indigenous communities they crossed paths with. Her knowledge and wherewithal earned her immense respect from the 45 white men who relied on her, and ultimately made the expedition a success. Her date of death remains a mystery. Following the expedition, Sacagawea and Charbonneau worked for the Missouri Fur Company in St. Louis in 1810, and it was believed that Sacagawea succumbed to typhus in 1812. However, some Native American oral histories claim that she lived until 1884 on the Shoshone lands where she was born.
Wilma Mankiller (1945-2010)
For 10 years, Wilma Mankiller served as the principal chief of the Cherokee Nation, the first woman to do so. Born on Cherokee Nation land in Oklahoma in 1945, Mankiller and her family were moved to a housing project in California in the 1950s, where they endured culture shock, racism, and the effects of poverty, which shaped the future chief’s ethos. Mankiller returned to Cherokee territory in 1977, where she founded the Community Development Department for the Cherokee Nation, and advocated endlessly for improved education, health care, and housing services.
For these efforts, then-Principal Chief Ross Swimmer asked her to run as his deputy in 1983. Two years later, Swimmer stepped down to lead the Bureau of Indian Affairs, and Mankiller became principal chief, serving until 1995. She was celebrated for lowering infant mortality rates, boosting education, and working to ensure financial and social equality. Mankiller was inducted into the National Women’s Hall of Fame in 1993, received the Presidential Medal of Freedom in 1998, and continued to advocate for women’s rights and Indigenous rights until her death in 2010 at age 64.
Depending on where you lived and when you grew up, it’s possible you might have known more than one person with the same name. Maybe there was a Jennifer A. and a Jennifer L., or maybe you knew four different people named Michael. Year after year, decade after decade, there are trends in baby names that draw on history, religion, and cultural references. Here are the most popular baby names in the United States during each decade of the 20th century.
Between 1900 and 1909, the most popular name for boys in the U.S. was John, and the most popular girls’ name, by a long shot, was Mary. This is according to data from the U.S. Social Security Administration, based on people applying for Social Security cards. There were 84,591 applications under the name John, and 161,504 entries for Mary. These two names popped up time and time again throughout the 20th century. Both names come from the Bible — John is one of Jesus’ disciples, and Mary is the name of both Jesus’ mother and Mary Magdalene. After John, the most popular boys’ names of this decade were William, James, George, and Charles, and the most popular girls’ names after Mary were Helen, Margaret, Anna, and Ruth.
1910s
Between 1910 and 1919, the most popular names were once again John and Mary. In this decade, there were 376,312 registered Johns and 478,637 Marys. Why the sudden jump? For one, the Social Security Administration began collecting data in 1937, so anyone born before that was only counted if they applied for a Social Security card after 1937. (That means the data for the 1900s, 1910s, and 1920s is based on people who listed their birthdays in these decades despite obtaining cards later in life, and doesn’t count anyone born in this period who didn’t apply for a Social Security card.) The U.S. also saw a population spike as infant mortality rates decreased throughout the 20th century, thanks to advances in health care and better access to clean water.
In the 1910s, for the second decade in a row, the second most popular names for boys and girls were William and Helen, respectively, followed by James, Robert, and Joseph for boys, and Dorothy, Margaret, and Ruth for girls. William has long been a popular English name dating back to William the Conqueror, who became the first Norman king of England in the 11th century. Helen, meanwhile, has its origins in Greek mythology: Helen of Troy was a famous beauty, known as the “face that launched a thousand ships.”
Between 1920 and 1929, John finally fell out of the top spot, as the most popular name for boys was Robert, with 576,373 entries. Robert, like William, dates back to English royalty and translates to “bright with fame” or “shining.” Mary stayed strong for girls, with 701,755 registered applications. The 1920s saw continued population increases both in the U.S. and worldwide. This is sometimes credited to a baby boom that occurred after World War I and the Spanish influenza, but is largely due, as in the previous decade, to better health care.
1930s
Between 1930 and 1939, Robert and Mary stayed at the top of the list, with 590,787 Roberts and 572,987 Marys. Though there were more Roberts born this decade than in the previous one, there was a decline in the birth rate overall due to the strain that the Great Depression placed on families. (The overall population was still higher in 1940 than in 1930, at roughly 132 million versus 123 million people.) A few new interesting names entered the runner-up positions in the 1930s. In female names, Betty and Barbara grew in popularity. Betty is a nickname for Elizabeth, a versatile name with Hebrew origins that is also found in English royalty (namely, Queen Elizabeth I). Barbara, like Helen, comes from Greek, and is also the name of St. Barbara, the patron saint of armorers, miners, and artillerymen. For boys’ names, the runners-up after Robert were James, John, William, and Richard.
Between 1940 and 1949, the name Robert fell to the second spot after James, which had 795,753 entries. Mary remained the most popular name for girls at 640,066 entries. The name James derives from Hebrew, and, like John, stems from a number of uses in the Bible. Like many other popular names, James is also found in the English monarchy, as well as the Scottish monarchy. Though it’s fallen out of the top slots in recent years in the United States, James remains one of the most popular baby names in Scotland. The next most popular boys’ names in the 1940s were Robert, John, William, and Richard; for girls, the list included Linda, Barbara, Patricia, and Carol. Interestingly, while Linda was never the most popular name in any given year, it is the most popular American baby name of all time, translating to “beautiful” in Spanish and Portuguese. Patricia, on the other hand, had been popular in England long before its time in the states, as it was the name of Queen Victoria’s granddaughter.
1950s
Between 1950 and 1959, the names James and Mary remained at the top of the list with 843,711 and 625,601 entries, respectively. Not far behind James, however, was a new popular name: Michael. Michael, like James, stems from the Hebrew Bible, and variations of the name exist across a number of languages, such as Miguel in Spanish and Micha in German. After James and Michael, Robert, John, and David topped the list for boys’ names, while Linda, Patricia, Susan, and Deborah followed Mary for the most popular girls’ names.
Between 1960 and 1969, everything changed, as is fitting for this revolutionary decade. Both James and Mary were unseated from the No. 1 slot: Michael became the most popular name for boys at 833,102 entries, and Lisa for girls at 496,975 entries. In fact, there were almost 150,000 more Lisas than Marys in the 1960s. The name is another variation on the popular moniker Elizabeth, and even Elvis Presley picked it for his daughter, Lisa Marie, who was born in 1968. While not much else changed in boys’ names this decade, popular girls’ names saw the addition of Karen and Kimberly alongside the already-popular Susan.
Between 1970 and 1979, Michael remained the most popular name for boys, topping out the decade with 707,458 entries, while Jennifer ended Lisa’s short-lived reign with 581,753 entries. There were more new names that cropped up in the second and third slots, however, including Christopher and Jason for boys. The name Jennifer, meanwhile, grew so popular, it became known as the “standard” name for a baby girl. The initial spike in Jennifers started 50 years prior with the appearance of the name in a George Bernard Shaw play called The Doctor’s Dilemma. After Jennifer, the most popular ’70s girls’ names were Amy, Melissa, Michelle, and Kimberly.
Between 1980 and 1989, Michael retained its title as the most popular name for boys, with 663,827 entries, while Jessica just barely unseated Jennifer as the most popular name for girls — there were 469,518 Jessicas versus 440,896 Jennifers. Jessica stems from the Hebrew Bible, where its original spelling was “Jeska”; the common spelling in English comes from William Shakespeare’s play The Merchant of Venice. The top five boys’ names in the 1980s were Michael, Christopher, Matthew, Joshua, and David, and the top five for girls were Jessica, Jennifer, Amanda, Ashley, and Sarah.
Between 1990 and 1999, Michael and Jessica stayed the most popular names for each gender, with 462,390 Michaels and 303,118 Jessicas. Still, there were fewer entries for both than in the previous decade, in part because a handful of newer, trendy names cropped up as well, such as Matthew, Justin, and Andrew for boys and Ashley and Tiffany for girls. Andrew, like James, is a popular name with links to Scotland, while Matthew goes back to the Bible. Ashley and Tiffany, meanwhile, reflect the trend of girls’ names ending in “y” — names such as Brittany, Courtney, Emily, and Kelsey took off in the beginning of the 21st century.
Some of the most profound moments in history can be encapsulated in a single, memorable quote. These succinct phrases, often pulled from longer speeches or events, distill complex ideas into digestible gems. At their best, they act as verbal snapshots, capturing the essence of historical moments with an emotional urgency that lingers and lets them resonate across generations. Martin Luther King Jr.’s rallying cry of “I have a dream” is easily one of the most famous such lines in history. Similarly, Neil Armstrong’s “That’s one small step for man, one giant leap for mankind” immortalizes a peak moment in human history; the astronaut’s muffled voice as he spoke to the public on Earth from the moon is unforgettable.
These sound bites have become cultural shorthand for momentous events and the ideals they captured, and their historical weight will keep them in the cultural consciousness for years to come.
At the heart of Martin Luther King Jr.’s famous 1963 speech were four simple words: “I have a dream.” On August 28, from the steps of the Lincoln Memorial and against a backdrop of racial segregation and discrimination in the United States, King energized the crowd — and the world — with his dream of a better life for his family and all African Americans. “I have a dream,” King said, “that one day this nation will rise up and live out the true meaning of its creed: We hold these truths to be self-evident, that all men are created equal.” He employed the phrase again, several times, to great effect, throughout the speech. “I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character,” he said. “I have a dream today.” The urgent, eloquent delivery laid bare the need for change; “I have a dream” became a rallying cry for the civil rights movement, and remains not a relic of history, but a living aspiration to this day.
King’s speech was televised by major broadcasters to a large live audience. At the time, he was a nationally known figure, but this was the first time many Americans — including, reportedly, President John F. Kennedy — had ever seen him deliver a full address. Less than a year later, President Lyndon B. Johnson signed the Civil Rights Act of 1964; the following year saw the Voting Rights Act of 1965 come into law. These pieces of legislation were the biggest civil rights advancements since the end of the Civil War.
On July 20, 1969, the first human walked on the moon. As astronaut Neil Armstrong climbed down the ladder of Apollo 11’s lunar module and onto the moon’s surface, he encapsulated the profound moment with these words: “That’s one small step for man, one giant leap for mankind.” He spoke through a muddled transmission to Earth, as some 650 million people watched in awe.
Armstrong later told his biographer James R. Hansen that, while he had thought ahead about what to say, it wasn’t too rehearsed. “What can you say when you step off of something?” he said. “Well, something about a step. It just sort of evolved during the period that I was doing the procedures of the practice takeoff and… all the other activities that were on our flight schedule at that time.” Although the quote has endured, Armstrong himself said it had been misquoted all along, and that he actually said, or at least meant to say, “one small step for a man.” (After many years and multiple attempts to clean up the audio quality, the Smithsonian National Air and Space Museum has concluded that the original quote is accurate.)
President John F. Kennedy assumed office during a tumultuous time in America’s history. But right from his inaugural address, he conveyed a spirit of hope and idealism in a resonant quote that went on to define his presidency. “Ask not what your country can do for you — ask what you can do for your country,” he famously said.
JFK’s inauguration, the first to be broadcast in color, was watched by some 38 million people. The speech, although credited principally to Kennedy, was crafted in collaboration with his longtime aide (and later principal speechwriter) Ted Sorensen. Kennedy wanted a speech that would “set a tone for the era about to begin,” and he got just that. America was on the precipice of great social change, and the inaugural address encapsulated the country’s need for unity and the civic engagement the moment would call for.
“Do You Believe in Miracles?” (1980)
One of the most iconic moments in sports history happened during the 1980 Winter Olympics in Lake Placid, New York. In the last few minutes of the men’s ice hockey medal round match between the United States and the Soviet Union, the U.S. was, improbably, ahead by one goal. The Soviets were seasoned players known for their dominance in international hockey; they had placed in the top three in every world championship and Olympic tournament they had played since 1954. The U.S. team, by comparison, was made up primarily of young college players who averaged 21 years old, making it the youngest American Olympic hockey team in history.
No one expected a U.S. victory. A New York Times columnist even wrote that “unless the ice melts,” the USSR would once again be victorious. As the clock counted down, with just five seconds left and the U.S. still up by one, ABC sportscaster Al Michaels remarked, “Do you believe in miracles?” before letting out an elated “Yes!” as the clock ran out and the U.S. won 4-3. The victory was soon dubbed the “Miracle on Ice.” Two days later, the U.S. went on to clinch the gold medal after defeating Finland. A TV documentary about the road to gold used Michaels’ quote for its title, and in 2016, Sports Illustrated called the victory the “greatest moment in sporting history,” proving that a good underdog story can be better than fiction.
On June 12, 1987, during a ceremony at Berlin’s Brandenburg Gate for the city’s 750th anniversary, U.S. President Ronald Reagan delivered the now-famous line, “Mr. Gorbachev, tear down this wall.” The Berlin Wall, which had divided East and West Berlin since 1961, was more than just an imposing physical barrier; it symbolized the ideological divide between communism and democracy across Europe during the Cold War.
Reagan’s speech became a defining moment in his presidency — eventually. Although reactions were mixed at the time, the address gained favorable traction when the Berlin Wall finally fell two years later, on November 9, 1989. The line now stands as a pivotal moment in history, capturing an era of tense political dynamics — and, of course, solidifying Reagan’s legacy as “the great communicator.” The fall of the Berlin Wall was a historical turning point, signaling victory for democracy and peace. Soviet leader Mikhail Gorbachev even won the Nobel Peace Prize in 1990 for his role in bringing the Cold War to an end.
Over the past century, the typical home kitchen has undergone a significant transformation, reflecting both social changes and new technology. In the 1920s and ’30s, kitchens were primarily utilitarian spaces with a focus on functionality and easy-to-clean surfaces. Appliances were limited, hand mixers had cranks, and gas ovens, which had replaced wood- or coal-burning stoves in most homes, were themselves starting to be replaced by electric ovens.
The post-World War II consumerism of the late 1940s and 1950s brought bigger kitchens for entertaining and more labor-saving appliances, including blenders, mixers, and dishwashers. The kitchen space became more streamlined and functional, and the 1960s and 1970s brought countertop food processors and microwave ovens into the mainstream.
Open-plan kitchens and islands became increasingly popular in home design throughout the 1980s and ’90s, indicative of the kitchen’s role as a hub for family and friends to gather. That trend continued into the 21st century, along with a significant shift toward high-tech kitchens, smart appliances, and a focus on sustainability. Today’s kitchens — reflecting the changing ways we prepare, store, and consume food — look dramatically different than they did a century ago, making many once-popular items obsolete. Here are six things that your grandparents and great-grandparents might have had in their own home kitchens a century ago.
An Icebox
Before the widespread availability of electric refrigerators, iceboxes were used to keep perishable food cool. These wooden or metal boxes had a compartment for ice at the top, and fresh ice was delivered each week by an iceman. The design of the icebox allowed cold air to circulate around the stored items, while a drip pan collected the water as the ice melted. Naturally, iceboxes fell out of fashion as electric fridges went mainstream. In 1927, General Electric introduced the first affordable electric refrigerator, which relied on a refrigerant for cooling rather than ice.
A Butter Churn
Before commercial butter production made it possible to buy butter at the market, churning cream into butter was an activity done at home. The hand-crank butter churn was introduced in the mid-19th century, and it became the most commonly used household butter churn until the 1940s. In the early 20th century, the Dazey Churn & Manufacturing Company began producing glass churns that could make smaller quantities of butter much more quickly than the larger, time-intensive churns. Once the butter was churned, it could then be poured or pressed into decorative molds for serving.
A Hoosier is a freestanding, self-contained kitchen cabinet that was popular in the early 1900s, named after the Hoosier Manufacturing Company that made it. Also known as a “Kitchen Piano” due to its shape, this kitchen necessity offered homemakers ample storage space and an additional work surface. Hoosier cabinets had numerous drawers and shelves for storing cookware and utensils, as well as features such as a flour bin with a built-in sifter, a sugar bin, a spice and condiment rack, a bread bin, a pull-out cutting board, and a cookbook holder. The all-in-one cabinets fell out of favor as kitchen designs began to incorporate built-in cabinets and islands for additional storage and counter space, but they’re still sometimes used for decorative storage.
A Manual Hand Mixer
While the iconic KitchenAid stand mixer was patented more than 100 years ago in 1919, electric hand mixers weren’t commercially available until the 1960s. Before then, beating eggs or mixing other ingredients was done by hand, often with a manual hand mixer (also called a rotary egg beater). First developed in the 1850s, hand mixers had two beaters that rotated when you turned a crank. Though the style and mechanisms evolved over the years, manual hand mixers were still widely used in the 1920s, when only two-thirds of American households had electricity.
Even though ground coffee was available in bags and cans in the 1920s, and instant coffee was gaining popularity, household coffee grinders, such as the wall-mounted coffee grinder (or mill), were still common kitchen appliances. According to a 1918 New-York Tribune article on the art of making perfect coffee, “The real coffee lover will always have a mill in the kitchen.” The wall-mounted, hand-crank style had a glass container that could hold a pound of coffee beans, and a container with tablespoon markings to catch the ground coffee.
There was a time when treasured family recipes were written on 3-by-5-inch index cards and stored in a box on the kitchen counter. Before the 1920s, most recipes were passed on by example — young women would learn how to make their grandmother’s pot roast by helping her in the kitchen. As such, handwritten recipes were generally a list of ingredients, often without quantities, and vague directions. As kitchen science developed, magazines began advertising recipe subscriptions delivered as preprinted, perforated cards. Women also started writing their own recipes on blank cards to collect and exchange, and the recipe box proved to be a more decorative and lasting storage solution than a shoebox. Like many vintage kitchen items, this nostalgic throwback still has novelty appeal, but the recipe box has largely been replaced by digital recipes stored on apps and websites.
Beginning in the 1830s, a combination of poverty, rapid industrialization, and immigration contributed to the rise of notorious street gangs throughout New York City. For the next several decades, these groups ran rampant until being largely replaced by organized crime syndicates toward the end of the 19th century. But during their heyday, gangs such as the Bowery Boys and Dead Rabbits ruled the streets of New York, particularly a neighborhood in southern Manhattan known as the Five Points. This turbulent period in New York City was marked by violence and corruption, events that were brought to the silver screen in Martin Scorsese’s 2002 historical drama Gangs of New York.
While that film is based on realities of the time, it also furthered several misconceptions about this crime-ridden era. We reached out to anthropologist R. Brian Ferguson, a professor at Rutgers University-Newark and author of the 2023 book Chimpanzees, War, and History, to learn more about this volatile period in NYC history. Ferguson has spent decades studying and teaching how conflict permeates throughout society, and was interviewed for the 2002 documentary Uncovering the Real Gangs of New York, a special feature included on DVD copies of the Scorsese film.
(Editor’s note: This interview has been edited for length and clarity.)
HISTORY FACTS: What was life like in New York City’s Five Points neighborhood?
FERGUSON: Well, the Five Points got its name from the intersection of different streets, and it began as a residential neighborhood, but it was built on landfill from filling in a big lake. So it was wet, and it was sinking, which meant that it was full of diseases in the summer. By 1827, it was already disreputable. Mainly poor people who had no choice about where to live were there — it was the bottom for New York society.
For decades it became — not just in New York, but internationally — famous for incredible squalor and crime and drunkenness and prostitution. It became a symbol for all of that. It was also a highly political environment, and the politics of the time were more contentious in New York than what we’re seeing today in our own lives. It was really a tough time politically.
HISTORY FACTS: Speaking of politics, I know Tammany Hall was a big player in New York City. What was Tammany Hall and how did it play a role in local politics?
FERGUSON: Tammany Hall was the Democratic political machine. It won elections, gave out patronage; it was famous for corruption and vote fraud. But besides that, it was the only kind of government that did anything for the poorest of the poor. In the 1840s, it had found its base in immigrants who were pouring into New York, many of whom were Catholic, which Protestant America generally hated.
Tammany Hall was controlled by political ward politicians from the street up, using force. It wasn’t a top-down organization as it once was, but it was really responding to what was happening on the streets, like in the Five Points. The Five Points was its central power base because it was so densely populated. It was known as the “Bloody Ould Sixth Ward,” and the votes from there could control mayors, city government, even tip state and presidential elections.
HISTORY FACTS: Who were the predominant gangs at the time?
FERGUSON: Gangs were always changing; they rarely lasted more than a few years. They came and went by time and place and by politics. The movie by Scorsese is based on a book by Herbert Asbury, both called Gangs of New York, and both of those introduced a lot of inaccuracies. In the movie, the big gangs were the Dead Rabbits and the followers of Bill “the Butcher” Poole. The riot that did occur was between the Dead Rabbits and the Bowery Boys. The Dead Rabbits were a gang; whether the Bowery Boys were a gang or not — they were also kind of a social type — is not as clear.
The movie was inspired by the Bowery Boy-Dead Rabbit riot of 1857. That was a real thing that went on for hours with maybe 11 people dead, and it involved fighting — bricks, up to guns. It was the biggest gang clash that ever occurred in New York City. Not the biggest violence on the street, but the biggest gang clash. And that was the inspiration for Scorsese’s film adaptation.
HISTORY FACTS: You mentioned immigration — how did the gangs reflect the ethnic makeup of New York City at this time?
FERGUSON: The gangs were organized — the nucleus of the power structure was the saloons and the volunteer fire companies, which were omnipresent and very political. Leadership in a gang came by association with one of those, and leadership was based mainly on fists. Fighting in the street was extremely common. All neighborhoods had their ethnic character, but it was never pure; it was always a mix.
So the Five Points was mostly Irish-inhabited at this point, but not exclusively. Gangs were mostly Irish but wouldn’t turn away anybody who lived in the neighborhood who could fight. But they were also shaped from the top down. Politicians built their organizations based on the compositions of neighborhoods. It was both a cause and effect of the political organization that gave life to the gangs. And it wasn’t just mostly Irish, but you could say particular areas of Ireland. A whole building might be from one area.
But [in terms of the city’s general ethnic makeup] German immigration was big; [New York City] also had people who were native born and were seen as “true” Americans. Italians hadn’t come in yet; the Eastern European Jews hadn’t come in yet. But New York always had lots of different people in it; the Syrians, for example, were a big immigrant population.
HISTORY FACTS: How did immigration contribute to the rise of these gangs?
FERGUSON: The immigration was a big part of, to use a contemporary word, the intersectionality of street organizations back then. Most immigrants were also extremely poor. But it wasn’t just the immigrants — this is when industrialism was on the rise, unemployment was exploding for all, and the time around the 1850s was seen as mainly just rich and poor. [There was] little in between. And poverty was mapped onto the ethnic divisions.
Also, politicians would scare the immigrants with the specter of competition from freed slaves, and really conjured up racism to a hot degree. So, there were mixes in terms of how people were organized. The racist and anti-abolitionist groups were mostly poor and could include any of the poor. But nativism, which was anti-immigrant, excluded the Catholics, and the Catholics were a lot of the poor. So there were these different combinations possible, and the local ward politicians worked all of these permutations.
HISTORY FACTS: Is the Irish vs. “native” conflict as depicted in Scorsese’s film accurate?
FERGUSON: The Irish versus the “native” thing, it’s a yes and no. It’s not false, but it’s not really true either. The Bowery Boy-Dead Rabbit riot of 1857 was part of crises all across the United States in the time leading up to the Civil War. In New York state, this played out largely as a conflict between the state government and the city government. The state government in Albany was Protestant, Republican, and anti-immigrant, and the city government by this time was more immigrant- and Catholic-oriented and Democrat. So this was the polarization.
The state of New York then put through, in 1857, a kind of a coup, restructuring the city, which took over many of the city functions — like control of the Port of New York. But most important of all, they disbanded the police force at the time — the Tammany police force known as the Municipal Police — and created a new police force called the Metropolitan Police that were controlled from Albany. Tammany itself, besides the state and city thing, was extremely divided into two warring factions. So there was like a three-way struggle going on.
Nativism was a part of all of that, but who held political power, and who got the benefits of controlling corruption, were at least as big an issue, if not bigger. When I did research on the gangs that fought in 1857, they all had clear local political alignments, and one thing that was left entirely out of the film, and of Asbury’s book, was that the fights that became riots began with attacks on the Metropolitan Police — that’s the state police force. That was clearly one of the biggest issues here.
HISTORY FACTS: What led to the rise of these gangs and their eventual downfall?
FERGUSON: Well, poor neighborhoods like the Five Points — but there were many — provided kind of the raw material: people who could fight and were looking for something to do, and were looking for a leg up. These could be shaped into adult gangs. But as an anthropologist, what I like to look at is the local organizations of the street that organized and raised up gangs into kind of political actors. Back at that time, there were two organizations: There was the saloon, which was the neighborhood center, and the volunteer fire companies, which were all over the city and connected to large political factions. And [all of these groups] were always fighting; they would fight each other all the time. So, the conflicts in 1857 went through these things, like volunteer fire companies and saloons, to raise these local street people up into named gangs and to pit them against each other.
If you look at the gangs around then, they’re very big in the newspapers of the time. After that, they’re not so much. In later years, and I’ll just pick 1885 as an example, there were still street gangs all around town, but they were less important politically. The reason was that the then-boss of Tammany Hall, a guy named Dick Croker, had iron control and didn’t need [the gangs] as much.
Also, that was the Gilded Age of extreme capitalist fortunes, and the capitalists who had great control over the city supported the police, which by that point was the NYPD, to keep control of what they saw as dangerous classes — the people who lived in the slums. Otherwise, cops — if [the cops] kept [the people in the slums] from being a problem — could do whatever they wanted, which led over a couple of decades to police brutality and corruption.
And then there was a big scandal that came along in 1895. It was called the Lexow Investigation and it revealed that the New York Police Department was what they called “organized criminality” in New York City. It wasn’t allowing it, it was it. So, reform and another era of political turmoil in Tammany Hall led to new named gangs coming up. People might recognize the Monk Eastman gang or the Paul Kelly gang. And by about 1900, these were changing from what they used to be and taking over what the police had been pushed out of and had controlled, including gambling and prostitution and rackets and extortion.
That was a new era that led to the gangster era, and the gangsters in their peak generally led to less street crime because they were organized to make money. You didn’t want people to get mugged when they came out of a speakeasy. So, the area got less violent, less uncontrolled, as that developed. And as it went on, New York City went through the whole process of development, which is a much bigger topic about changing industrial structure and job structure and development of a middle class.
HISTORY FACTS: Going back to Scorsese’s movie, what did the film get right and what did it get wrong?
FERGUSON: It’s imaginary, like any movie; I don’t hold that against it. The plot, of course, is fiction. The film was loosely based on Herbert Asbury’s book, and Herbert Asbury really tried, but he had bad information. I’ve tracked down most of his sources in my own research. The movie did get the look right. Many details of the time are very real. They exaggerated certain things, like they made the Dead Rabbits look like they wore a particular kind of uniform, which, not really. No naval ships fired cannons on crowds, although soldiers did. The film left out the stench and the insects and the sewers in the street and all of that stuff. So you don’t get quite that depth of it, but it’s a movie. (Editor’s note: Ferguson recommended a book by Tyler Anbinder called Five Points for those interested in learning more about these details.)
Other big inaccuracies are due to the fact that the filmmakers had to compress time. And so Bill “the Butcher” [Poole] — the guy played by Daniel Day-Lewis — was dead a few years before the big Bowery Boy-Dead Rabbits riot. And Scorsese, consistent with his own film background, made Bill Poole a crime boss, getting a cut of everything. No, that came later. There’s nothing indicating that this was organized crime in that sense. Another thing is that Bill Poole worked for the politicians. He wouldn’t kill one of them, as he does in the film. There was a political hierarchy and he was a step down.
There were, in reality, lots of little turf fights all the time, but there wasn’t anything like what Daniel Day-Lewis’s character describes, a battle to decide once and for all who’s going to be the lords of the Five Points. It wasn’t that kind of territorial control. And one big inaccuracy of the movie is the excessive violence, especially in the opening riot. Now, there was violence all the time, but with fists and bricks and sometimes up to guns. Most people in the poor neighborhoods didn’t own guns; they were too expensive. But there were chimneys all over the place and you could topple a chimney over and you’ve got a supply of bricks, which is what they did.
I think the thing that I have the biggest issue with in the film is that it leaves out how important politics was to everything that was going on, and how important the role of the new state Metropolitan Police was. But I’ll add, on a positive note, I think it was great that Scorsese brought in the Draft Riots [violent citywide protests against the Civil War draft, fueled by racial tension] — although, this was not a gang event, other than gang members participating in rioting mobs, individually. But I teach about the Draft Riots, and what I can tell you is that no one has heard about this incredible event in American national history. The Draft Riots tell you an awful lot about what was becoming America.
HISTORY FACTS: What gang-related sites from this time period are still standing?
FERGUSON: There are a lot of gang locations if you know where to look, walking on 2nd Avenue from 14th Street to Houston Street. There are more than a dozen significant locations, mainly of shootings, on that stretch, although that was mostly the later gangsters, up to the beginning of Prohibition.
From the [Gangs of New York] film era, and for the Five Points, there’s really only one thing that remains. On the northwest corner of Baxter and Worth Street — this is between the courthouse district and Columbus Park — is the only remaining point. I can’t go by that area without standing on that point. I’ve seen lots of illustrations of the Five Points, and I just imagine all those illustrations while I’m there standing on that point. But that’s the only physical remnant that you can see.
As time went on, the Five Points kind of got toned down by mission work and other reform efforts in the Five Points itself. The most squalid and dangerous part of New York moved just one block east to Mulberry Street. When they tore down the block known as Mulberry Bend, they didn’t cart the stuff away; they just tumbled everything into the basements. So when they were redoing Columbus Park, they cleaned away the surface and I could see all of these basements that were the Five Points, that were Mulberry Bend — they’re still there. But they’re underground.
If I can expand the scope a little bit for gangster sites, my favorite is many blocks north on Great Jones Street, which is in the East Village. Right on the south side of Great Jones Street, west of the Bowery, there are two buildings. One has a window on the second floor that has an arch to it. The building behind that window became famous because Andy Warhol bought it some years ago, and the artist [Jean-Michel] Basquiat had a studio there, and in fact died in that room. But that building was the headquarters of Paul Kelly’s gang. Paul Kelly, whose birth name was Paul Vaccarelli, is what my [current] research centers on, and I think he was the most successful gangster in New York City history. For one thing, he died in bed, which most gangsters didn’t.
____
R. Brian Ferguson is a New York City-based anthropologist. To learn more about his work, visit his website. His most recent book, Chimpanzees, War, and History, is also available for purchase here.
Flowers have been collected and shared since ancient times, appreciated for their beauty, scent, and practical uses. The long tradition of giving flowers for special occasions has evolved over the centuries, but it’s still an enduring ritual that spans all cultures. From congratulations on the birth of a baby to condolences on the loss of a loved one, sending flowers continues to be one of the most popular ways to mark the momentous events of life. It’s so popular, in fact, that the worldwide cut flower market was valued at over $36 billion in 2022, and is projected to exceed $45 billion by 2027. Valentine’s Day continues to be the biggest flower-giving day of the year, but it is far from the only special occasion marked by this ancient ritual. Here is a look at the fascinating role flowers have played throughout human history, from the evolution of flowering plants to the booming floral industry.
Around 80% of green plants are flowering plants, and the oldest flowers in the world date back to the Cretaceous Period more than 130 million years ago. Those first flowers didn’t resemble ones we know and love today: They were barely visible to the human eye and almost unrecognizable as flowers even under a microscope. The interaction between flowering plants and insects aided in the coevolution of both, with flowers developing strong fragrances, appealing colors, and larger petals to attract pollinators. It was these same traits that also appealed to the earliest human societies, which began to cultivate and use flowering plants in religious and cultural ceremonies.
The Flowers of Antiquity
Some of today’s most popular flowers for bouquets and floral arrangements were first cultivated thousands of years ago. The cultural significance of flowers has been reflected in the art and literature of ancient China, Egypt, Greece, and Rome. Roses, one of the most popular flowers for gifting, were first grown in gardens 5,000 years ago in China. The ancient Egyptians used flowers in religious ceremonies as offerings to the gods and the dead, decorated their war carts with flowers before going to battle, and painted and carved floral and leaf motifs into their art. The Greeks and Romans used flowers in similar ways, associating specific varieties with their gods and goddesses and using flowering plants in festivals, rituals, and for their own enjoyment.
In more recent history, cherry blossoms (sakura) have been revered in Japan since the Heian period (794–1185) and, because they bloom for only a short time in the spring, are associated with the transient nature of life. Marigolds, which have been a part of Mexican culture since the pre-Columbian era, were imported to India over 350 years ago and have become an integral part of wedding celebrations and Hindu festivals such as Diwali.
In Europe, the symbolic use of flowers developed in the medieval and Renaissance eras, when different flowers and flowering plants were linked to a variety of virtues and emotions. In the Victorian era of the 19th century, floriography, or the language of flowers, emerged as a way of communicating specific sentiments through the type, color, and even the arrangement of specific flowers. This form of flower code was a way of conveying one’s feelings in an era marked by restraint and discretion. Artist and writer Kate Greenaway’s Language of Flowers, published in 1884, was an indispensable floriography dictionary, providing the meanings of different flowers and their significance in bouquets and floral arrangements. For instance, if a gentleman wished to send a bouquet of flowers to his betrothed, he might include blue violets, signifying faithfulness, and white roses, meaning “I am worthy of you.”
Floristry — the cultivation, arrangement, and sale of cut flowers — developed around the mid-19th century. The Society of American Florists was established in Chicago in 1884 to advance floral artists and sales. As the 20th century began, the proliferation of floral shops and flower delivery services further popularized and commercialized the tradition of giving flowers. Mother’s Day, ranking second only to Valentine’s Day in flower-giving, became an official U.S. holiday in 1914, receiving enthusiastic support and promotion from the floral industry.
Though floriography isn’t as popular today as it was in the Victorian era, different flowers continue to carry specific meanings. Giving someone red roses still signifies feelings of love and desire, while white lilies are still considered traditional flowers in many cultures for both weddings (representing purity and new beginnings) and funerals (grief and remembrance). In a nod to the old-fashioned flower code, Kate Middleton’s 2011 wedding bouquet included white Sweet William blossoms, signifying gallantry (and referencing the groom, Prince William).
Today, flowers remain a popular gift for a wide variety of occasions, both to convey specific sentiments as well as to be enjoyed for their aesthetic beauty. With more than 15,000 retail florists in the United States alone, it is easier than ever to let someone know we’re thinking about them. Modern technology has even embraced the tradition of flower-giving with a series of floral emoji, including a rose, hibiscus, cherry blossom, sunflower, tulip, and flower bouquet. Now, “giving” flowers is as simple as dashing off a text message or writing an email, proving that the long history of flower-giving endures.