As we look back at American history, it’s crucial to take a moment to reflect on and recognize the contributions made by the nation’s Indigenous peoples, who are so often overshadowed by famous figures who came to the United States from other parts of the world. To commemorate this important part of America’s heritage, here’s a look at five notable Indigenous heroes and leaders who shaped the nation through their tireless efforts.
Geronimo (1829-1909)
A medicine man and leader of the Bedonkohe band of the Chiricahua Apache, Geronimo was born near the Gila River in present-day New Mexico, where he was originally given the name Goyahkla, meaning “the one who yawns.” After the United States government forcibly relocated 4,000 Apaches to a reservation in San Carlos, Arizona, Geronimo led a series of breakouts in an effort to return his community to their nomadic roots. Geronimo’s legacy is vast. His relationship with many American and Mexican civilians was complex: He fought against colonialism, yet was made famous by his appearances in Wild West shows and, eventually, in Theodore Roosevelt’s 1905 inaugural parade. Geronimo’s tireless fight for Apache independence had cemented him as a fearless crusader for freedom by the time of his death from pneumonia in 1909.
Sitting Bull (c. 1831-1890)
The son of a warrior, Sitting Bull was born in what is now South Dakota and was nicknamed “Slow” as a boy for his deliberate, unhurried manner — that is, until he earned the name Tatanka Yotanka (“Sitting Bull”) at age 14 after “counting coup” in a battle against the Crow Tribe. (“Counting coup” is a display of bravery in which a warrior rides close enough to an enemy to touch them with a hand or a stick.) Sitting Bull eventually rose to become chief of the Hunkpapa Sioux, and fought tirelessly against the U.S. military, which sought to seize Indigenous land.
After fleeing to Canada to escape a vengeful army in the wake of the defeat of General George Armstrong Custer (and his 210 troops) in 1876 at the Battle of Little Bighorn, Sitting Bull returned to the U.S. in 1881 and was held prisoner at the Standing Rock Reservation in the Dakota Territory. His impact, however, could not be contained: After an Indigenous mystic claimed in 1889 that the Ghost Dance would eliminate the threat of white settlers on Native land, Sitting Bull allowed his followers to practice the dance — much to the horror of federal officials, who feared another uprising. Sitting Bull was killed by gunfire during his arrest in 1890, and is remembered as a martyr for freedom.
Crazy Horse (c. 1840-1877)
Born near the Black Hills of South Dakota, Lakota Chief Crazy Horse was the son of a warrior of the same name, and at a young age he began showcasing his capacity for battle and bravery. Having helped lead the Sioux resistance against the U.S. military’s attempts to colonize the Great Plains throughout the 1860s and ’70s, Crazy Horse led a band of Lakota warriors (alongside Sitting Bull) against General Custer’s 7th Cavalry Regiment during the Battle of Little Bighorn in 1876 before returning to the Northern Plains. Crazy Horse and his community, however, faced an unwavering enemy; forced to keep moving — and fighting — to evade federal resettlement, the chief and his 1,100 followers ultimately surrendered to the U.S. military at Fort Robinson in May 1877. There, in the wake of his arrest (and under a flag of truce), Crazy Horse was stabbed during a scuffle with U.S. soldiers and died of his injuries. He is remembered for his courage, his leadership, and his endless perseverance against colonizing forces.
Sacagawea (c. 1788-1812)
Sacagawea was only around 16 years old when she carved her place in Native American history through her ability to communicate with different peoples. Kidnapped by the Hidatsa (Indigenous people of North Dakota) at age 12, Sacagawea was then claimed by French Canadian trader Toussaint Charbonneau as one of his wives at age 13. Despite this treatment, upon the arrival of explorers Meriwether Lewis and William Clark in Hidatsa territory in 1804, the young woman proved herself invaluable. Chosen by her husband to serve as interpreter as he and the explorers moved west, she rescued records and supplies from the river when the crew’s boat tipped and took on water, helped acquire horses from her brother when the expedition passed through Idaho, and saved her counterparts from starvation as they faced food shortages. Most importantly, her role as translator helped assure safety for both her own team and the Indigenous communities they crossed paths with. Her knowledge and resourcefulness earned her immense respect from the 45 white men who relied on her, and ultimately helped make the expedition a success. Her date of death remains a mystery: Following the expedition, Sacagawea and Charbonneau worked for the Missouri Fur Company in St. Louis in 1810, and it was long believed that Sacagawea succumbed to typhus in 1812. However, some Native American oral histories claim that she lived until 1884 on the Shoshone lands where she was born.
Wilma Mankiller (1945-2010)
For 10 years, Wilma Mankiller served as the principal chief of the Cherokee Nation, the first woman to do so. Born in Tahlequah, Oklahoma, in 1945, Mankiller and her family were moved to a housing project in California in the 1950s, where they endured culture shock, racism, and the effects of poverty, experiences that shaped the future chief’s ethos. Mankiller returned to Cherokee territory in 1977, where she founded the Community Development Department for the Cherokee Nation and advocated endlessly for improved education, health care, and housing services.
For these efforts, then-Principal Chief Ross Swimmer asked her to run as his deputy in 1983. Two years later, Swimmer stepped down to lead the Bureau of Indian Affairs, and Mankiller became principal chief, serving until 1995. She was celebrated for lowering infant mortality rates, boosting education, and working to ensure financial and social equality. Mankiller was inducted into the National Women’s Hall of Fame in 1993, received the Presidential Medal of Freedom in 1998, and continued to advocate for women’s rights and Indigenous rights until her death in 2010 at age 64.
Depending on where you lived and when you grew up, it’s possible you might have known more than one person with the same name. Maybe there was a Jennifer A. and a Jennifer L., or maybe you knew four different people named Michael. Year after year, decade after decade, there are trends in baby names that draw on history, religion, and cultural references. Here are the most popular baby names in the United States during each decade of the 20th century.
1900s
Between 1900 and 1909, the most popular name for boys in the U.S. was John, and the most popular girls’ name, by a long shot, was Mary. This is according to data from the U.S. Social Security Administration, based on people applying for Social Security cards. There were 84,591 applications under the name John, and 161,504 entries for Mary. These two names popped up time and time again throughout the 20th century. Both names come from the Bible — John is one of Jesus’ disciples, and Mary is the name of both Jesus’ mother and Mary Magdalene. After John, the most popular boys’ names of this decade were William, James, George, and Charles, and the most popular girls’ names after Mary were Helen, Margaret, Anna, and Ruth.
1910s
Between 1910 and 1919, the most popular names were once again John and Mary. In this decade, there were 376,312 registered Johns and 478,637 Marys. Why the sudden jump? For one, the Social Security Administration began collecting data in 1937, so anyone born before that year was counted only if they applied for a Social Security card after 1937. (That means the data for the 1900s, 1910s, and 1920s is based on people who listed their birthdays in these decades despite obtaining cards later in life, and doesn’t count anyone born in this period who didn’t apply for a Social Security card.) The U.S. also saw a population spike as infant mortality rates decreased throughout the 20th century, thanks to advances in health care and better access to clean water.
In the 1910s, for the second decade in a row, the second most popular names for boys and girls were William and Helen, respectively, followed by James, Robert, and Joseph for boys, and Dorothy, Margaret, and Ruth for girls. William has long been a popular English name dating back to William the Conqueror, who became the first Norman king of England in the 11th century. Helen, meanwhile, has its origins in Greek mythology: Helen of Troy was a famous beauty, known as the “face that launched a thousand ships.”
1920s
Between 1920 and 1929, John finally fell out of the top spot, as the most popular name for boys was Robert, with 576,373 entries. Robert, like William, dates back to English royalty and translates to “bright with fame” or “shining.” Mary stayed strong for girls, with 701,755 registered applications. The 1920s saw continued population increases both in the U.S. and worldwide. This is sometimes credited to a baby boom that occurred after World War I and the Spanish flu pandemic, but it is largely due, as in the previous decade, to better health care.
1930s
Between 1930 and 1939, Robert and Mary stayed at the top of the list, with 590,787 Roberts and 572,987 Marys. Though there were more Roberts born this decade than in the previous one, the birth rate declined overall due to the strain the Great Depression placed on families. (The overall population was still higher in 1940 than in 1930, at roughly 132 million versus 123 million people.) A few interesting new names entered the runner-up positions in the 1930s. Among girls’ names, Betty and Barbara grew in popularity. Betty is a nickname for Elizabeth, a versatile name with Hebrew origins that is also found in English royalty (namely, Queen Elizabeth I). Barbara, like Helen, comes from Greek, and is also the name of St. Barbara, the patron saint of armorers, miners, and artillerymen. For boys’ names, the runners-up after Robert were James, John, William, and Richard.
1940s
Between 1940 and 1949, the name Robert fell to the second spot after James, which had 795,753 entries. Mary remained the most popular name for girls at 640,066 entries. The name James derives from Hebrew and, like John, stems from a number of uses in the Bible. Like many other popular names, James is also found in the English monarchy, as well as the Scottish monarchy. Though it’s fallen out of the top slots in recent years in the United States, James remains one of the most popular baby names in Scotland. The next most popular boys’ names in the 1940s were Robert, John, William, and Richard; for girls, the list included Linda, Barbara, Patricia, and Carol. Interestingly, while Linda was never the most popular name of any full decade, it is the most popular American baby name of all time, translating to “beautiful” in Spanish and Portuguese. Patricia, on the other hand, had been popular in England long before its time in the States, as it was the name of Queen Victoria’s granddaughter.
1950s
Between 1950 and 1959, the names James and Mary remained at the top of the list with 843,711 and 625,601 entries, respectively. Not far behind James, however, was a new popular name: Michael. Michael, like James, stems from the Hebrew Bible, and variations of the name exist across a number of languages, such as Miguel in Spanish and Micha in German. After James and Michael, the most popular boys’ names were Robert, John, and David, while Linda, Patricia, Susan, and Deborah followed Mary as the most popular girls’ names.
1960s
Between 1960 and 1969, everything changed, as is fitting for this revolutionary decade. Both James and Mary were unseated from the No. 1 slot: Michael became the most popular name for boys at 833,102 entries, and Lisa for girls at 496,975 entries. In fact, there were almost 150,000 more Lisas than Marys in the 1960s. The name is another variation on the popular moniker Elizabeth, and even Elvis Presley picked it for his daughter, Lisa Marie, who was born in 1968. While not much else changed in boys’ names this decade, the top girls’ names saw the rise of Karen and Kimberly alongside holdover Susan.
1970s
Between 1970 and 1979, Michael remained the most popular name for boys, topping out the decade with 707,458 entries, while Jennifer ended Lisa’s short-lived reign with 581,753 entries. More new names cropped up in the second and third slots, however, including Christopher and Jason for boys. The name Jennifer, meanwhile, grew so popular that it became known as the “standard” name for a baby girl. The initial spike in Jennifers had started 50 years prior with the appearance of the name in a George Bernard Shaw play called The Doctor’s Dilemma. After Jennifer, the most popular ’70s girls’ names were Amy, Melissa, Michelle, and Kimberly.
1980s
Between 1980 and 1989, Michael retained its title as the most popular name for boys, with 663,827 entries, while Jessica just barely unseated Jennifer as the most popular name for girls — there were 469,518 Jessicas versus 440,896 Jennifers. Jessica stems from the Hebrew Bible, where its original spelling was “Jeska”; the common spelling in English comes from William Shakespeare’s play The Merchant of Venice. The top five boys’ names in the 1980s were Michael, Christopher, Matthew, Joshua, and David, and the top five for girls were Jessica, Jennifer, Amanda, Ashley, and Sarah.
1990s
Between 1990 and 1999, Michael and Jessica remained the most popular names for each gender, with 462,390 Michaels and 303,118 Jessicas. Still, there were fewer entries for both than in the previous decade, in part because a handful of newer, trendy names cropped up as well, such as Matthew, Justin, and Andrew for boys and Ashley and Tiffany for girls. Andrew, like James, is a popular name with links to Scotland, while Matthew goes back to the Bible. Ashley and Tiffany, meanwhile, reflect the trend of girls’ names ending in “y” — names such as Brittany, Courtney, Emily, and Kelsey took off at the beginning of the 21st century.
Some of the most profound moments in history can be encapsulated in a single, memorable quote. These succinct phrases, often pulled from longer speeches or events, distill complex ideas into digestible gems. At their best, they act as verbal snapshots, capturing the essence of historical moments with an emotional urgency that lingers and lets them resonate across generations. Martin Luther King Jr.’s rallying cry of “I have a dream” is easily one of the most famous such lines in history. Similarly, Neil Armstrong’s “That’s one small step for man, one giant leap for mankind” immortalizes a pinnacle of human achievement; the astronaut’s muffled voice as he spoke to the public on Earth from the moon is unforgettable.
These sound bites have become cultural shorthand for momentous events and the ideals they captured, and their historical weight will keep them in the cultural consciousness for years to come.
“I Have a Dream” (1963)
At the heart of Martin Luther King Jr.’s famous 1963 speech were four simple words: “I have a dream.” On August 28, from the steps of the Lincoln Memorial and against a backdrop of racial segregation and discrimination in the United States, King energized the crowd — and the world — with his dream of a better life for his family and all African Americans. “I have a dream,” King said, “that one day this nation will rise up and live out the true meaning of its creed: We hold these truths to be self-evident, that all men are created equal.” He employed the phrase several more times throughout the speech, to great effect. “I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character,” he said. “I have a dream today.” The urgent, eloquent delivery laid bare the need for change; “I have a dream” became a rallying cry for the civil rights movement, and remains not a relic of history but a living aspiration to this day.
King’s speech was televised by major broadcasters to a large live audience. At the time, he was a nationally known figure, but this was the first time many Americans — including, reportedly, President John F. Kennedy — had ever seen him deliver a full address. Less than a year later, President Lyndon B. Johnson signed the Civil Rights Act of 1964; the following year saw the Voting Rights Act of 1965 come into law. These pieces of legislation were the biggest civil rights advancements since the end of the Civil War.
“One Small Step for Man” (1969)
On July 20, 1969, the first human walked on the moon. As astronaut Neil Armstrong climbed down the ladder of Apollo 11’s lunar module and onto the moon’s surface, he encapsulated the profound moment with these words: “That’s one small step for man, one giant leap for mankind.” He spoke through a muddled transmission to Earth, as some 650 million people looked on in awe.
Armstrong later said that, while he had thought ahead about what to say, it wasn’t too rehearsed. “What can you say when you step off of something?” he told biographer James R. Hansen. “Well, something about a step. It just sort of evolved during the period that I was doing the procedures of the practice takeoff and… all the other activities that were on our flight schedule at that time.” Although the quote has endured, Armstrong himself said it had been misquoted all along, and that he actually said, or at least meant to say, “one small step for a man.” (After many years and multiple attempts to clean up the audio quality, the Smithsonian National Air and Space Museum has concluded that the original quote is accurate.)
“Ask Not What Your Country Can Do for You” (1961)
President John F. Kennedy assumed office during a tumultuous time in America’s history. But right from his inaugural address, he conveyed a spirit of hope and idealism in a resonant quote that went on to define his presidency. “Ask not what your country can do for you — ask what you can do for your country,” he famously said.
JFK’s inauguration, the first to be broadcast in color, was watched by some 38 million people. The speech, although credited principally to Kennedy, was written with his longtime aide (and later chief speechwriter) Ted Sorensen. Kennedy wanted a speech that would “set a tone for the era about to begin,” and he got just that. America was on the precipice of great social change, and the inaugural address encapsulated the country’s need for unity and the civic engagement the moment would call for.
“Do You Believe in Miracles?” (1980)
One of the most iconic moments in sports history happened during the 1980 Winter Olympics in Lake Placid, New York. In the last few minutes of the men’s ice hockey medal-round match between the United States and the Soviet Union, the U.S. was, improbably, ahead by one goal. The Soviets were seasoned players known for their dominance in international hockey; they had placed in the top three in every world championship and Olympic tournament they had played since 1954. The U.S. team, by comparison, was made up primarily of young college players who averaged 21 years old, making it the youngest American Olympic hockey team in history.
No one expected a U.S. victory. A New York Times columnist even wrote that “unless the ice melts,” the USSR would once again be victorious. As the clock counted down, with just five seconds left and the U.S. still up by one, ABC sportscaster Al Michaels remarked, “Do you believe in miracles?” before letting out an elated “Yes!” as the clock ran out and the U.S. won 4-3. The victory was soon dubbed the “Miracle on Ice.” Two days later, the U.S. went on to clinch the gold medal after defeating Finland. A TV documentary about the road to gold used Michaels’ quote for its title, and in 2016, Sports Illustrated called the victory the “greatest moment in sporting history,” proving that a good underdog story can be better than fiction.
“Tear Down This Wall” (1987)
On June 12, 1987, during a ceremony at Berlin’s Brandenburg Gate for the city’s 750th anniversary, U.S. President Ronald Reagan delivered the now-famous line, “Mr. Gorbachev, tear down this wall.” The Berlin Wall, which had divided East and West Berlin since 1961, was more than just an imposing physical barrier; it symbolized the ideological divide between communism and democracy across Europe during the Cold War.
Reagan’s speech became a defining moment in his presidency — eventually. Although reactions were mixed at the time, the address gained favorable traction when the Berlin Wall finally fell two years later, on November 9, 1989. The line now stands as a pivotal moment in history, capturing an era of tense political dynamics — and, of course, solidifying Reagan’s legacy as “the great communicator.” The fall of the Berlin Wall was a historical turning point, signaling victory for democracy and peace. Soviet leader Mikhail Gorbachev even won the Nobel Peace Prize in 1990 for his role in bringing the Cold War to an end.
Over the past century, the typical home kitchen has undergone a significant transformation, reflecting both social changes and new technology. In the 1920s and ’30s, kitchens were primarily utilitarian spaces with a focus on functionality and easy-to-clean surfaces. Appliances were limited, hand mixers had cranks, and gas ovens, which had replaced wood- or coal-burning stoves in most homes, were themselves beginning to be replaced by electric models.
The post-World War II consumerism of the late 1940s and 1950s brought bigger kitchens for entertaining and more labor-saving appliances, including blenders, mixers, and dishwashers. The kitchen space became more streamlined and functional, and the 1960s and 1970s brought countertop food processors and microwave ovens into the mainstream.
Open-plan kitchens and islands became increasingly popular in home design throughout the 1980s and ’90s, indicative of the kitchen’s role as a hub for family and friends to gather. That trend continued into the 21st century, along with a significant shift toward high-tech kitchens, smart appliances, and a focus on sustainability. Today’s kitchens — reflecting the changing ways we prepare, store, and consume food — look dramatically different than they did a century ago, making many once-popular items obsolete. Here are six things that your grandparents and great-grandparents might have had in their own home kitchens a century ago.
An Icebox
Before the widespread availability of electric refrigerators, iceboxes were used to keep perishable food cool. These wooden or metal boxes had a compartment for ice at the top, and fresh ice was delivered each week by an iceman. The design of the icebox allowed cold air to circulate around the stored items, while a drip pan collected the water as the ice melted. Naturally, iceboxes fell out of fashion as electric fridges went mainstream. In 1927, General Electric introduced the first widely affordable electric refrigerator, which relied on a refrigerant rather than ice for cooling.
A Butter Churn
Before commercial butter production made it possible to buy butter at the market, churning cream into butter was an activity done at home. The hand-crank butter churn was introduced in the mid-19th century, and it became the most commonly used household butter churn until the 1940s. In the early 20th century, the Dazey Churn & Manufacturing Company began producing glass churns that could make smaller quantities of butter much more quickly than the larger, time-intensive churns. Once the butter was churned, it could then be poured or pressed into decorative molds for serving.
A Hoosier Cabinet
A Hoosier is a freestanding, self-contained kitchen cabinet that was popular in the early 1900s, named after the Hoosier Manufacturing Company that made it. Also known as a “kitchen piano” due to its shape, this kitchen necessity offered homemakers ample storage space and an additional work surface. Hoosier cabinets had numerous drawers and shelves for storing cookware and utensils, as well as features such as a flour bin with a built-in sifter, a sugar bin, a spice and condiment rack, a bread bin, a pull-out cutting board, and a cookbook holder. The all-in-one cabinet fell out of favor as kitchen designs began to incorporate built-in cabinets and islands for additional storage and counter space, though Hoosiers are still sometimes used for decorative storage.
A Manual Hand Mixer
While the iconic KitchenAid stand mixer was patented more than 100 years ago in 1919, electric hand mixers weren’t commercially available until the 1960s. Before then, beating eggs or mixing other ingredients was done by hand, often with a manual hand mixer (also called a rotary egg beater). First developed in the 1850s, hand mixers had two beaters that rotated when you turned a crank. Though the style and mechanisms evolved over the years, manual hand mixers were still widely used in the 1920s, when only two-thirds of American households had electricity.
A Wall-Mounted Coffee Grinder
Even though ground coffee was available in bags and cans in the 1920s, and instant coffee was gaining popularity, household coffee grinders, such as a wall-mounted coffee grinder (or mill), were still a common kitchen appliance. According to a 1918 New-York Tribune article on the art of making perfect coffee, “The real coffee lover will always have a mill in the kitchen.” The wall-mounted, hand-crank style had a glass container that could hold a pound of coffee beans, and a container with tablespoon markings to catch the ground coffee.
A Recipe Box
There was a time when treasured family recipes were written on 3-by-5-inch index cards and stored in a box on the kitchen counter. Before the 1920s, most recipes were passed on by example — young women would learn how to make their grandmother’s pot roast by helping her in the kitchen. As such, handwritten recipes were generally a list of ingredients, often without quantities, paired with vague directions. As kitchen science developed, magazines began advertising recipe subscriptions delivered as preprinted, perforated cards. Women also started writing their own recipes on blank cards to collect and exchange, and the recipe box proved to be a more decorative and lasting storage solution than a shoebox. Like many vintage kitchen items, this nostalgic throwback still has novelty appeal, but the recipe box has largely been replaced by digital recipes stored on apps and websites.
Beginning in the 1830s, a combination of poverty, rapid industrialization, and immigration contributed to the rise of notorious street gangs throughout New York City. For the next several decades, these groups ran rampant until they were largely replaced by organized crime syndicates toward the end of the 19th century. But during their heyday, gangs such as the Bowery Boys and Dead Rabbits ruled the streets of New York, particularly a neighborhood in southern Manhattan known as the Five Points. This turbulent period in New York City was marked by violence and corruption, events that were brought to the silver screen in Martin Scorsese’s 2002 historical drama Gangs of New York.
While that film is based on realities of the time, it also furthered several misconceptions about this crime-ridden era. We reached out to anthropologist R. Brian Ferguson, a professor at Rutgers University-Newark and author of the 2023 book Chimpanzees, War, and History, to learn more about this volatile period in NYC history. Ferguson has spent decades studying and teaching how conflict permeates society, and was interviewed for the 2002 documentary Uncovering the Real Gangs of New York, a special feature included on DVD copies of the Scorsese film.
(Editor’s note: This interview has been edited for length and clarity.)
HISTORY FACTS: What was life like in New York City’s Five Points neighborhood?
FERGUSON: Well, the Five Points took its name from the intersection of different streets, and it began as a residential neighborhood, but it was built on landfill from filling in a big lake. So it was wet, and it was sinking, which meant that it was full of diseases in the summer. By 1827, it was already disreputable. Mainly poor people who had no choice about where to live were there — it was the bottom for New York society.
For decades it became — not just in New York, but internationally — famous for incredible squalor and crime and drunkenness and prostitution. It became a symbol for all of that. It was also a highly political environment, and the politics of the time were more contentious in New York than what we’re seeing today in our own lives. It was really a tough time politically.
HISTORY FACTS: Speaking of politics, I know Tammany Hall was a big player in New York City. What was Tammany Hall and how did it play a role in local politics?
FERGUSON: Tammany Hall was the Democratic political machine. It won elections, gave out patronage; it was famous for corruption and vote fraud. But besides that, it was the only kind of government that did anything for the poorest of the poor. In the 1840s, it had found its base in immigrants who were pouring into New York, many of whom were Catholic, which Protestant America generally hated.
Tammany Hall was controlled by political ward politicians from the street up, using force. It wasn’t a top-down organization as it once was, but it was really responding to what was happening on the streets, like in the Five Points. The Five Points was its central power base because it was so densely populated. It was known as the “Bloody Ould Sixth Ward,” and the votes from there could control mayors, city government, even tip state and presidential elections.
HISTORY FACTS: Who were the predominant gangs at the time?
FERGUSON: Gangs were always changing; they rarely lasted more than a few years. They came and went by time and place and by politics. The movie by Scorsese is based on a book by Herbert Asbury, both called Gangs of New York, and both of those introduced a lot of inaccuracies. In the movie, the big gangs were the Dead Rabbits and the followers of Bill “the Butcher” Poole. The riot that did occur was between the Dead Rabbits and the Bowery Boys. The Dead Rabbits were a gang; whether the Bowery Boys were a gang or not — they were also kind of a social type — is not as clear.
The movie was inspired by the Bowery Boy-Dead Rabbit riot of 1857. That was a real thing that went on for hours, with maybe 11 people dead, and it involved fighting with everything from bricks up to guns. It was the biggest gang clash that ever occurred in New York City. Not the biggest violence on the street, but the biggest gang clash.
HISTORY FACTS: You mentioned immigration — how did the gangs reflect the ethnic makeup of New York City at this time?
FERGUSON: The gangs were organized — the nucleus of the power structure was the saloons and volunteer fire companies, which were omnipresent and very political. Leadership in a gang came by association with one of those, and leadership was based mainly on fists. Fighting in the street was extremely common. All neighborhoods had their ethnic character, but it was never pure; it was always a mix.
So the Five Points was mostly Irish-inhabited at this point, but not exclusively. Gangs were mostly Irish but wouldn’t turn away anybody who lived in the neighborhood who could fight. But they were also shaped from the top down. Politicians built their organizations based on the compositions of neighborhoods. It was both a cause and effect of the political organization that gave life to the gangs. And it wasn’t just mostly Irish, but you could say particular areas of Ireland. A whole building might be from one area.
But [in terms of the city’s general ethnic makeup] German immigration was big; [New York City] also had people who were native born and were seen as “true” Americans. Italians hadn’t come in yet; the Eastern European Jews hadn’t come in yet. But New York always had lots of different people in it, like Syrians were a big immigrant population.
HISTORY FACTS: How did immigration contribute to the rise of these gangs?
FERGUSON: The immigration was a big part of, to use a contemporary word, the intersectionality of street organizations back then. Most immigrants were also extremely poor. But it wasn’t just the immigrants — this is when industrialism was on the rise, unemployment was exploding for all, and the time around the 1850s was seen as mainly just rich and poor. [There was] little in between. And poverty was mapped onto the ethnic divisions.
Also, politicians would scare the immigrants with the specter of competition from freed slaves, and really conjured up racism to a hot degree. So, there were mixes in terms of how people were organized. The racist and anti-abolitionist groups were mostly poor and could include any of the poor. But nativism, which was anti-immigrant, excluded the Catholics, and the Catholics were a lot of the poor. So there were these different combinations possible, and the local ward politicians worked all of these permutations.
HISTORY FACTS: Is the Irish vs. “native” conflict as depicted in Scorsese’s film accurate?
FERGUSON: The Irish versus the “native” thing, it’s a yes and no. It’s not false, but it’s not really true either. The Bowery Boy-Dead Rabbit riot of 1857 was part of crises all across the United States in the time leading up to the Civil War. In New York state, this played out largely as a conflict between the state government and the city government. The state government in Albany was Protestant, Republican, and anti-immigrant, and the city government by this time was more immigrant- and Catholic-oriented and Democrat. So this was the polarization.
The state of New York then put through, in 1857, a kind of a coup, restructuring the city, which took over many of the city functions — like control of the Port of New York. But most important of all, they disbanded the police force at the time — the Tammany police force known as the Municipal Police — and created a new police force called the Metropolitan Police that were controlled from Albany. Tammany itself, besides the state and city thing, was extremely divided into two warring factions. So there was like a three-way struggle going on.
Nativism was a part of all of that, but who had political power, and who got the benefits of controlling corruption, were at least as big an issue, or bigger. When I did research on the gangs that fought in 1857, they all had clear local political alignments. And one thing that was left out of the film entirely, and of Asbury’s book, was that the fights that became riots began with attacks on the Metropolitan Police — that’s the state police force. That was clearly one of the biggest issues here.
HISTORY FACTS: What led to the rise of these gangs and their eventual downfall?
FERGUSON: Well, poor neighborhoods like the Five Points — but there were many — provided kind of the raw material: people who could fight and were looking for something to do, and were looking for a leg up. These could be shaped into adult gangs. But as an anthropologist, what I like to look at is the local organizations of the street that organized and raised up gangs into kind of political actors. Back at that time, there were two organizations: There was the saloon, which was the neighborhood center, and the volunteer fire companies, which were all over the city and connected to large political factions. And [all of these groups] were always fighting; they would fight each other all the time. So, the conflicts in 1857 went through these things, like volunteer fire companies and saloons, to raise these local street people up into named gangs and to pit them against each other.
If you look at the gangs around then, they’re very big in newspapers of the time. After that, they’re not so much. In later years, and I’ll just pick 1885 as an example, there were still street gangs all around the town, but they were less important politically. The reason was that the then-boss of Tammany Hall, a guy named Dick Croker, had iron control and didn’t need [the gangs] as much.
Also, that was the Gilded Age of extreme capitalist fortunes, and the capitalists who had great control over the city supported the police, which by that point was the NYPD, to keep control of what they saw as dangerous classes — the people who lived in the slums. Otherwise, cops — if [the cops] kept [the people in the slums] from being a problem — could do whatever they wanted, which led over a couple of decades to police brutality and corruption.
And then there was a big scandal that came along in 1895. It was called the Lexow Investigation and it revealed that the New York Police Department was what they called “organized criminality” in New York City. It wasn’t allowing it, it was it. So, reform and another era of political turmoil in Tammany Hall led to new named gangs coming up. People might recognize the Monk Eastman gang or the Paul Kelly gang. And by about 1900, these were changing from what they used to be and taking over what the police had been pushed out of and had controlled, including gambling and prostitution and rackets and extortion.
That was a new era that led to the gangster era, and the gangsters in their peak generally led to less street crime because they were organized to make money. You didn’t want people to get mugged when they came out of a speakeasy. So, the area got less violent, less uncontrolled, as that developed. And as it went on, New York City went through the whole process of development, which is a much bigger topic about changing industrial structure and job structure and development of a middle class.
HISTORY FACTS: Going back to Scorsese’s movie, what did the film get right and what did it get wrong?
FERGUSON: It’s imaginary, like any movie; I don’t hold that against it. The plot, of course, is fiction. The film was loosely based on Herbert Asbury’s book, and Herbert Asbury really tried, but he had bad information. I’ve tracked down most of his sources in my own research. The movie did get the look right. Many details of the time are very real. They exaggerated certain things, like they made the Dead Rabbits look like they wore a particular kind of uniform, which, not really. No naval ships fired cannons on crowds, although soldiers did. The film left out the stench and the insects and the sewers in the street and all of that stuff. So you don’t get quite that depth of it, but it’s a movie. (Editor’s note: Ferguson recommended a book by Tyler Anbinder called Five Points for those interested in learning more about these details.)
Other big inaccuracies are due to the fact that the filmmakers had to compress time. And so Bill “the Butcher” [Poole] — the guy played by Daniel Day-Lewis — was dead a few years before the big Bowery Boy-Dead Rabbits riot. And Scorsese, consistent with his own film background, made Bill Poole a crime boss, getting a cut of everything. No, that came later. There’s nothing indicating that this was organized crime in that sense. Another thing is that Bill Poole worked for the politicians. He wouldn’t kill one of them, as he does in the film. There was a political hierarchy and he was a step down.
There were, in reality, lots of little turf fights all the time, but there wasn’t anything like the battle Daniel Day-Lewis’ character calls for, to decide once and for all who’s going to be the lords of the Five Points. It wasn’t that kind of territorial control. And one big inaccuracy of the movie is the excessive violence, especially in the opening riot. Now, there was violence all the time, but with fists and bricks and sometimes up to guns. Most people in the poor neighborhoods didn’t own guns; they were too expensive. But there were chimneys all over the place, and you could topple a chimney over and you’ve got a supply of bricks, which is what they did.
I think the thing that I have the biggest issue with in the film is that it leaves out how important politics was in everything that was going on, and how important the role of the new state Metropolitan Police was. But I’ll add, on a positive note, I think it was great that Scorsese brought in the Draft Riots [violent citywide protests against the Civil War draft, fueled by racial tension] — although this was not a gang event, other than gang members participating in rioting mobs individually. But I teach about the Draft Riots, and what I can tell you is that no one has heard about this incredible event in American national history. The Draft Riots tell you an awful lot about what was becoming America.
HISTORY FACTS: What gang-related sites from this time period are still standing?
FERGUSON: There are a lot of gang locations if you know where to look, walking on 2nd Avenue from 14th Street to Houston Street. And there are more than a dozen significant locations, mainly sites of shootings, on that stretch, although that was mostly the later gangsters, up to the beginning of Prohibition.
From the [Gangs of New York] film era, and for the Five Points, there’s really only one thing that remains. On the northwest corner of Baxter and Worth Street — this is between the courthouse district and Columbus Park — is the only remaining point. I can’t pass that area without stopping and standing on that point. I’ve seen lots of illustrations of the Five Points, and I just imagine all those illustrations while I’m standing there. But that’s the only physical remnant that you can see.
As time went on, the Five Points kind of got toned down by missions and other reform efforts in the Five Points itself. The most squalid and dangerous part of New York moved just one block east, to Mulberry Street. When they tore down the block known as Mulberry Bend, they didn’t cart the stuff away; they just tumbled everything into the basements. So when they were redoing Columbus Park, they cleaned away the surface and I could see all of these basements that were the Five Points, that were Mulberry Bend — they’re still there. But they’re underground.
If I can expand the scope a little bit for gangster sites, my favorite is many blocks north on Great Jones Street, which is in the East Village. Right on the south side of Great Jones Street, west of the Bowery, there are two buildings. One has a window on the second floor that has an arch to it. This window became famous because Andy Warhol bought it some years ago, and the artist [Jean-Michel] Basquiat had a studio there, and in fact died in that room. But that building was the headquarters of Paul Kelly’s gang. Paul Kelly, whose birth name was Paul Vaccarelli, is what my [current] research centers on, and I think he was the most successful gangster in New York City history. For one thing, he died in bed, which most gangsters didn’t.
____
R. Brian Ferguson is a New York City-based anthropologist. To learn more about his work, visit his website. His most recent book, Chimpanzees, War, and History, is also available for purchase.
Flowers have been collected and shared since ancient times, appreciated for their beauty, scent, and practical uses. The long tradition of giving flowers for special occasions has evolved over the centuries, but it’s still an enduring ritual that spans all cultures. From congratulations on the birth of a baby to condolences on the loss of a loved one, sending flowers continues to be one of the most popular ways to mark the momentous events of life. It’s so popular, in fact, that the worldwide cut flower market was valued at over $36 billion in 2022, and is projected to surpass $45 billion by 2027. Valentine’s Day continues to be the biggest flower-giving day of the year, but it is far from the only special occasion marked by this ancient ritual. Here is a look at the fascinating role flowers have played throughout human history, from the evolution of flowering plants to the booming floral industry.
The First Flowers
Around 80% of green plants are flowering plants, and the oldest flowers in the world date back to the Cretaceous Period, more than 130 million years ago. Those first flowers didn’t resemble the ones we know and love today: They were barely visible to the human eye and almost unrecognizable as flowers even under a microscope. The interaction between flowering plants and insects aided in the coevolution of both, with flowers developing strong fragrances, appealing colors, and larger petals to attract pollinators. It was these same traits that also appealed to the earliest human societies, which began to cultivate and use flowering plants in religious and cultural ceremonies.
Some of today’s most popular flowers for bouquets and floral arrangements were first cultivated thousands of years ago. The cultural significance of flowers has been reflected in the art and literature of ancient China, Egypt, Greece, and Rome. Roses, one of the most popular flowers for gifting, were first grown in gardens 5,000 years ago in China. The ancient Egyptians used flowers in religious ceremonies as offerings to the gods and the dead, decorated their war carts with flowers before going to battle, and painted and carved floral and leaf motifs into their art. The Greeks and Romans used flowers in similar ways, associating specific varieties with their gods and goddesses and using flowering plants in festivals, rituals, and for their own enjoyment.
In more recent history, cherry blossoms (sakura) have been revered in Japan since the Heian period (794–1185) and, because they bloom for only a short time in the spring, are associated with the transient nature of life. Marigolds, which have been a part of Mexican culture since the pre-Columbian era, were imported to India over 350 years ago and have become an integral part of wedding celebrations and Hindu festivals such as Diwali.
In Europe, the symbolic use of flowers developed in the medieval and Renaissance eras, when different flowers and flowering plants were linked to a variety of virtues and emotions. In the Victorian era of the 19th century, floriography, or the language of flowers, emerged as a way of communicating specific sentiments through the type, color, and even the arrangement of specific flowers. This form of flower code was a way of conveying one’s feelings in an era marked by restraint and discretion. Artist and writer Kate Greenaway’s Language of Flowers, published in 1884, was an indispensable floriography dictionary, providing the meanings of different flowers and their significance in bouquets and floral arrangements. For instance, if a gentleman wished to send a bouquet of flowers to his betrothed, he might include blue violets, signifying faithfulness, and white roses, meaning “I am worthy of you.”
Floristry — the cultivation, arrangement, and sale of cut flowers — developed around the mid-19th century. The Society of American Florists was established in Chicago in 1884 to promote the interests of florists and the growing floral trade. As the 20th century began, the proliferation of floral shops and flower delivery services further popularized and commercialized the tradition of giving flowers. Mother’s Day, ranking second only to Valentine’s Day in flower-giving, became an official U.S. holiday in 1914, receiving enthusiastic support and promotion from the floral industry.
Though floriography isn’t as popular today as it was in the Victorian era, different flowers continue to carry specific meanings. Giving someone red roses still signifies feelings of love and desire, while white lilies are still considered traditional flowers in many cultures for both weddings (representing purity and new beginnings) and funerals (grief and remembrance). In a nod to the old-fashioned flower code, Kate Middleton’s 2011 wedding bouquet included white Sweet William blossoms, signifying gallantry (and referencing the groom, Prince William).
Today, flowers remain a popular gift for a wide variety of occasions, both to convey specific sentiments as well as to be enjoyed for their aesthetic beauty. With more than 15,000 retail florists in the United States alone, it is easier than ever to let someone know we’re thinking about them. Modern technology has even embraced the tradition of flower-giving with a series of floral emoji, including a rose, hibiscus, cherry blossom, sunflower, tulip, and flower bouquet. Now, “giving” flowers is as simple as dashing off a text message or writing an email, proving that the long history of flower-giving endures.
For all the formulaic sitcoms and talk shows that have run throughout the history of television, there are a number of times when audiences have witnessed true ingenuity. From memorable commercials to shocking plot twists, television events that may seem commonplace today once revolutionized the medium. Ever since the demonstration of the first television in 1926, the small screen has been a reflection of larger shifts in American society. With that in mind, here are five historic firsts in television history.
The First Official TV Commercial
On July 1, 1941, at 2:29 p.m., viewers tuning in to the NBC-owned WNBT television station saw something they had never seen before. Before that day’s broadcast of the Brooklyn Dodgers vs. Philadelphia Phillies baseball game, the first authorized TV commercial hit the airwaves. The inaugural ad was produced by Bulova watches and ran for about 60 seconds, featuring visuals of a clock superimposed over a map of the United States with the accompanying voice-over, “America runs on Bulova time.”
The watchmaker paid just $9 to broadcast the advertisement ($4 for air fees and $5 for station fees), a far cry from the exorbitant advertising prices of today. WNBT was also the only station to advertise that day, though other networks soon followed suit. The Federal Communications Commission had previously implemented an advertising ban that forbade television commercials, though broadcasters still ran ads without authorization. The FCC finally issued 10 commercial licenses on May 2, 1941 — ushering in a new chapter in television history.
The First Laugh Track
Laugh tracks are an indelible part of sitcom television, and it all began in 1950 with a little-known program called The Hank McCune Show. The sitcom debuted on local stations in 1949 and centered on a fictional television variety show host. By the time the series made its network debut on September 9, 1950, it was accompanied by roaring laughter from a laugh track despite the lack of any live studio audience. One review from Variety magazine said, “Although the show is lensed on film without a studio audience, there are chuckles and yucks dubbed in… the practice may have unlimited possibilities.”
The laugh track was invented by mechanical engineer Charles Douglass, who was formerly a radar technician in the Navy. After leaving the military, Douglass created a device that came to be known as the “Laff Box.” A rudimentary version debuted on The Hank McCune Show, though it took Douglass another three years to perfect his invention. Each 3-foot-tall Laff Box was handmade by Douglass and could hold 32 reels of 10 laughs apiece. By the 1960s, Douglass was supplying his much-coveted Laff Box to such iconic television programs as The Munsters and Gilligan’s Island.
The First TV Spinoff
Television spinoffs are standard practice today, and we have The Gene Autry Show to thank for kicking off the concept. The show, which ran from 1950 to 1956, delighted TV audiences by following the exploits of the titular singing cowboy and his trusted horse, Champion. The series was so popular that it inspired television’s first spinoff, The Adventures of Champion, which ran for 26 episodes between 1955 and 1956. While the concept of a spinoff was unusual at the time, it became significantly more popular with the debut of The Andy Griffith Show in 1960. Starring Griffith himself, the hit series actually originated from a single episode of The Danny Thomas Show titled “Danny Meets Andy Griffith,” in which Griffith debuted the character of Sheriff Andy Taylor. The episode was a hit, and the concept earned a television run of its own.
The First Televised Presidential Debate
Television has long played a key role in American politics, and few televised political events have had a stronger impact than presidential debates, the first of which aired in 1956. Though neither President Dwight D. Eisenhower nor his challenger Adlai Stevenson participated in a televised debate themselves that year, both were represented by proxies, with Senator Margaret Chase Smith filling in for Eisenhower and former First Lady Eleanor Roosevelt representing Stevenson. The two women debated the issues live on air on November 4, 1956, with Eisenhower coming out victorious in the election shortly thereafter.
The first televised presidential debate between the actual candidates occurred four years later, with a CBS broadcast on September 26, 1960. This debate pitted Senator John F. Kennedy against Vice President Richard Nixon, and was the first of four televised debates in advance of that year’s election. Kennedy was widely considered the winner of the debate, which many have speculated was due to his charismatic presence on camera compared to Nixon, who declined to wear makeup and appeared visibly sweaty. Though Nixon fared better in future debates, this moment in TV history helped Kennedy gain a valuable early lead in the polls, which he maintained en route to winning the presidency.
The First Scripted Birth
In 1948, the sitcom Mary Kay and Johnny — starring real-life married couple Mary Kay and Johnny Stearns — made history by incorporating the actress’ pregnancy into the show, becoming the first TV show to depict a pregnancy and birth. When the couple’s son, Christopher, was born on December 19, 1948, Mary Kay was notably absent from the live taping. In response, Johnny wrote a 15-minute episode that featured him pacing around a hospital waiting room awaiting his son’s birth. Unfortunately, all but one full episode of Mary Kay and Johnny were lost in the 1970s, and the show’s impact was forgotten over time. A few years later, another sitcom, I Love Lucy, also depicted a pregnancy and birth, proving significantly more impactful given that show’s popularity.
When actress Lucille Ball — the biggest television star at the time — became pregnant in 1952, producers needed to figure out how to interweave her real changing appearance with her character on I Love Lucy. At first, network executives suggested coming up with methods for concealing Ball’s baby bump, such as having her hide behind chairs. But Desi Arnaz — Ball’s husband and co-star — found those suggestions insulting, and fought back. After a conversation between Arnaz, CBS, and advertiser Philip Morris, the latter signed off on a plan to incorporate Ball’s real-life pregnancy into the plot of the show.
While the word “pregnancy” remained forbidden, the show’s characters spoke using synonyms such as “expecting” in reference to Lucy’s storyline baby. The character’s pregnancy was revealed during a December 8, 1952, episode titled “Lucy Is Enceinte” — “enceinte” being the French word for pregnant. With Ball scheduled to deliver her actual child on January 19, 1953, CBS scheduled the pretaped birth episode for that very same evening. “Lucy Goes to the Hospital” was a major television event, attracting more than 44 million American viewers and helping pave the way for talking about other previously taboo topics on television.
The term “secret society” encompasses a wide variety of exclusive and clandestine organizations, many of which have been in existence for centuries and count some of history’s most influential figures among their members. Secret societies pique our curiosity because they often keep their activities and objectives concealed from nonmembers and the public. Though there are exceptions, the intentions of these exclusive groups are generally not nefarious; for instance, some college clubs can be considered secret societies because they have private rituals and traditions whose symbolism and mystique serve to create a sense of belonging and shared purpose.
The most prominent secret societies have left their mark on history, from wielding their influence over governments to shaping the course of labor and religious movements. Because of the secretive nature of these organizations, their historical origins can be difficult to trace, and are often debated by historians and scholars. From the medieval beginnings of the Freemasons to the puzzling origins of Cicada 3301, here are six unusual facts about these mysterious groups.
One of the World’s Oldest Secret Societies Still Flourishes Today
The history of the Freemasons dates back to the Middle Ages and the guilds of skilled stonemasons who regulated the qualifications of stoneworkers. Their work required stonemasons to travel, encouraging a more open-minded worldview. The modern Freemasonry society was founded in England in 1717 and quickly spread throughout Europe and the American colonies. The organization established guidelines not only for stonework, but also for the moral and spiritual values of its members. Today, there are over 6 million Freemasons around the world. They still use the same system of secret rituals — including handshakes, passwords, and symbols — that has been in place since the 18th century, but in recent years the group has begun making moves toward modernization and transparency. In 2021, the Freemasons issued the first annual report in their 300-year history.
The Real Illuminati Was Interested in Enlightenment for All
The name “Illuminati” has been used to refer to various groups, both real and fictional, since the 15th century. But the group most closely linked to the name dates back almost 250 years to the Bavarian Illuminati, formally known as the Order of the Illuminati. The short-lived secret society was founded in Ingolstadt, Bavaria, in 1776 by German professor Adam Weishaupt, who wanted to create “a state of liberty and moral equality, freed from the obstacles which subordination, rank, and riches, continually throw in our way.” Taking inspiration from the Freemasons and French Enlightenment philosophers, Weishaupt formed a secret society that climbed to more than 2,000 members in Bavaria, France, Hungary, Italy, and other regions where Enlightenment ideas were taking hold. The Bavarian government eventually shut down the Illuminati in 1784, prohibiting the creation of any groups not authorized by law. But there were those who believed the society went underground, spawning a number of conspiracy theories that linked the group to world events, from the French Revolution to the 9/11 terror attacks.
Enslaved Women Founded America’s Oldest Secret Society of Black Women
The annals of history are filled with the names of secret societies whose membership was exclusive to men, but women have also had a role in creating these clandestine groups. The United Order of Tents is the oldest organization of Black women in the United States, founded by two formerly enslaved women, Annetta M. Lane and Harriet R. Taylor, in Norfolk, Virginia, in 1867. The organization, which still maintains chapters throughout the United States, is believed to have supported operations of the Underground Railroad. During the turbulence of the Reconstruction era, the group provided mutual aid and support to the Black community, serving as a “tent of salvation” during their time of need.
An Ancient Secret Society Inspired a Video Game Franchise
In the 11th century, the Nizari Ismailis were a powerful group of medieval Shiite Muslims in Persia and Syria. The group used guerrilla tactics to outwit their enemies, including Christian Crusaders arriving in the Holy Land. Hated by other Muslim groups, they were given the name Hashishin, a pejorative Arabic word meaning “hashish user,” which Crusaders later westernized as “Assassins” — a word that came to mean a paid killer in English. The group fell to the Mongols in the 13th century, but the legend of the Nizari Ismailis lives on. The video game franchise Assassin’s Creed creates a fictionalized world based around the Assassins and another ancient secret society, the Knights Templar, a military order established in the 12th century and endorsed by the Catholic Church. The Knights Templar served as protectors of Christian pilgrims and Crusader states in the Holy Land, but their objectives differed from those of the Nizari Ismailis. Though the timelines of the two groups overlap, there is no historical evidence that the two groups ever fought each other.
One Secretive Club Has Its Own Elite Summer Camp
Among exclusive societies, the Bohemian Club may be the only one with its own elite summer camp. Founded in 1872 in San Francisco as a gentlemen’s club for journalists, artists, and musicians, the Bohemian Club expanded to include international political and business leaders as well. Bohemian Grove, the club’s privately owned 2,700-acre campground in the redwood groves of Sonoma County, provides the setting for a two-week summer encampment that includes secret rituals, performances, and private discussions that have changed the course of history. At a September 1942 gathering at Bohemian Grove, physicist Robert Oppenheimer attended an S-1 Executive Committee planning meeting to finalize details for the Manhattan Project, which led to the development of the atomic bomb. At the 1967 encampment, Richard Nixon and Ronald Reagan reportedly decided which of them would pursue the Republican presidential nomination. Given the club’s privileged membership and private traditions, it’s not surprising that the group has been the subject of conspiracy theories, protests, and, most recently, a lawsuit alleging wage theft and labor violations.
The Internet Gave Rise to the Puzzling Cicada 3301
Though secret organizations have existed for centuries, the internet has given rise to new ones in recent years. One of the most enigmatic is Cicada 3301, which appeared online in 2012. The group claimed to be searching for “highly intelligent individuals” by presenting a series of complex digital puzzles based on cryptographic techniques including ciphers, codes, and steganography. Solving the puzzles required a wide range of knowledge spanning coding, programming, literature, art, and other disciplines. Some speculated that the puzzles were a recruitment tool for intelligence agencies, while others thought they might be promotion for a new game, though no attempt was ever made to monetize the puzzles. Those who solved the early puzzles were presented with additional challenges and login credentials for a darknet site. The group’s identity remains unknown, and it has been largely silent since 2014.
Under the watchful eye of the Statue of Liberty, Ellis Island was the entry point for countless immigrants who came to America at the turn of the 20th century. For a little over 60 years, from 1892 until its closure in 1954, the U.S. Immigration Station on Ellis Island processed more than 12 million immigrants, forever changing the culture of the United States. Today, Ellis Island is a place with a past as complicated as it is influential. Here are five facts about the singular role this 27.5-acre island played in American history.
The Island Has Had at Least Seven Different Names
Before Europeans colonized North America, Ellis Island was known as Kioshk, or Gull Island, by Mohegan Indigenous peoples. In 1630, the island was purchased by the Dutch, who went on to call it Little Oyster Island for its abundance of, you guessed it, oysters. Later, in the 1700s, the island became the site of a number of hangings and earned the nickname “Gibbet Island” (a gibbet being a gallows). Over the years, the site was also known as Bucking Island, Dyre Island, and Anderson’s Island, until 1774, when the land was purchased by Samuel Ellis, who ran a tavern on the little spit of mud. Ellis died in 1794, and ownership of his namesake island remained with the Ellis family until 1806, when it was sold to a man named John A. Berry, who then sold it to the U.S. government in 1808.
Inspection Took Half a Day — and Not Everyone Passed
For European immigrants who disembarked from their ships in good health and with papers in order, the inspection process lasted about half a day. Inspections consisted of a number of physical exams as well as a reading test, along with a series of questions, including whether they already had family in America, whether they had ever been to prison, and whether they were anarchists. (The wave of immigration through Ellis Island coincided with a rise in fears about communism and anarchy in the United States.) Up to 20% of the immigrants who passed through Ellis Island were detained for political, legal, or health reasons, and around 2% were sent home.
More Than 1 Million Immigrants Were Processed in 1907
On April 17, 1907, Ellis Island processed its highest number of immigrants in a single day: 11,747 individuals. That year was the immigration facility’s busiest overall, with just over 1 million new arrivals processed. The island’s heyday ended after 1924, when the National Origins Act (part of the Immigration Act of 1924) restricted the number of immigrants who could come to the United States.
Some 40% of Americans Can Trace Their Roots to Ellis Island
The majority of immigrants who came through Ellis Island arrived from Southern and Eastern Europe, escaping difficulties ranging from poverty to religious or ethnic persecution. Today, it’s believed that some 40% of Americans can trace part of their ancestry back to Ellis Island. Modern visitors can stop by the Family History Center at the Ellis Island National Museum of Immigration to explore their families’ roots.
Ellis Island Is Located in Both New York and New Jersey
Ellis Island sits in New York Harbor between the states of New York and New Jersey, and though it is technically owned by the federal government as a historically protected site, it is officially located in both the Empire State and the Garden State. In 1998, the U.S. Supreme Court ruled that both New York and New Jersey could lay claim to the island: The main building that tourists visit is located in New York, and a 21-acre portion of the island that was filled in later is located in New Jersey.
In 1967, San Francisco’s Haight-Ashbury district became the home base for a burgeoning counterculture. Known as the “Summer of Love,” the social movement was defined by a collective rejection of mainstream values and an embrace of ideals centered on peace, love, and personal freedom. An estimated 100,000 young people descended on the area; these artists, musicians, and drifters — collectively referred to as “hippies” — created an unforgettable cultural shift, touching everything from the way we view the self to innovations in music, fashion, and art, and even how we seek to make an impact on society. More than 50 years later, the Summer of Love still dances freely in America’s memory.
The Summer of Love Actually Started in the Winter
Contrary to its name, the Summer of Love actually kicked off in the wintertime. In January 1967, in San Francisco’s Golden Gate Park, more than 20,000 people who shared a desire for peace, personal empowerment, and unity gathered for an event called the Human Be-In. It was a loud and proud harbinger of the blossoming counterculture movement set to converge on Haight-Ashbury just a few months later.
The idea for the Human Be-In — also known as the “Gathering of the Tribes” — sprang from the similar, but much smaller, Love Pageant Rally held on October 6, 1966, the day California made LSD illegal. Organizers Allen Cohen and Michael Bowen, co-founders of the underground newspaper the San Francisco Oracle, wanted to re-create the peace and unity of that day on a larger scale. Their aim for the Human Be-In was to spread positivity and bridge the counterculture’s anti-war and hippie communities, while raising awareness of the pressing issues of the time: questioning authority, rethinking consumerism, and opposing the Vietnam War. On January 14, 1967, the idea came together. Counterculture icons such as Beat poet Allen Ginsberg and LSD advocate Timothy Leary spoke to the masses — the latter famously urged participants to “turn on, tune in, drop out” — and the Grateful Dead, Jefferson Airplane, and other legends performed at the event. The optimism that collective action could have a tangible impact on society felt stronger than ever. San Francisco Chronicle columnist Ralph Gleason said it was “truly something new,” calling it “an affirmation, not a protest… a promise of good, not evil.” The wheels for the Summer of Love were in motion.
The Summer of Love Gave Birth to the Modern Music Festival
The Summer of Love not only introduced a cultural revolution — it also marked a turning point in pop culture, making stars of some of music’s most enduring names and introducing major music festivals as we know them today. After the inaugural Human Be-In, similar events unfolded around the world, laying the groundwork for large outdoor live performances. The first event to specifically call itself a music festival took place on June 10 and 11, 1967, on Mount Tamalpais in Marin County, just north of San Francisco. The KFRC Fantasy Fair and Magic Mountain Music Festival featured performances by the Doors, Jefferson Airplane, the Byrds, Steve Miller Band, and many others, and is considered America’s first true rock festival. One week later, another pivotal event — the centerpiece of the Summer of Love — changed live music forever.
The Monterey Pop Festival took place across three days, June 16 to 18. Organized by influential figures in the music scene, including John Phillips of the Mamas and the Papas, former Beatles publicist Derek Taylor, and record producer Lou Adler, the event attracted upwards of 200,000 attendees over the weekend. Ahead of the festival, Phillips penned “San Francisco (Be Sure to Wear Flowers in Your Hair),” sung by Scott McKenzie, to promote the event; the song garnered significant global attention, becoming not only a chart-topping hit but also a driving force in enticing young people to join the hippies in Haight-Ashbury that summer. Press coverage turned Monterey Pop into a worldwide media spectacle, and iconic images from the event, captured in a 1968 documentary by D.A. Pennebaker, became lasting symbols of the hippie movement. The festival also catapulted artists such as Jimi Hendrix, Janis Joplin, Otis Redding, and The Who to fame on the strength of their legendary performances that weekend. Monterey became the template for the modern festival industry, showcasing emerging artists alongside blockbuster bands in a massive outdoor setting.
American Counterculture Was Catapulted Into the Mainstream
Although little attention had previously been paid to the burgeoning free-love community, national media flocked to the Human Be-In and the events that followed. During the Summer of Love, scenes from Haight-Ashbury were reported by major print and broadcast outlets around the world, instilling fear of the strange and unknown in some, inspiring others, and in either case planting counterculture ideals and imagery front and center for America to see.
The cultural revolution was further bolstered by the music of the era. Psychedelic rock, folk, and protest songs became anthems of the movement, resonating with younger and older generations alike. Eventually, the anti-establishment sentiments and activism of the counterculture began to influence mainstream politics and social movements. Issues such as civil rights, environmentalism, gender equality, and opposition to the Vietnam War gained broader support and attention as these ideas permeated mainstream discourse. Though the Summer of Love was itself short-lived, its legacy continued to shape popular culture, fashion, music, and social norms for decades to come.