The Macy’s Thanksgiving Day Parade began in 1924, and has since become an essential kickoff to the holiday season in the United States. The parade, organized by the retail giant Macy’s, is renowned for its massive character balloons; for most of the event’s existence, these balloons have taken on a life of their own. Some iconic balloon characters such as Snoopy and Pikachu have appeared in different variations every year for decades, while others have come and gone. Each year, the procession draws millions of spectators in person and tens of millions more watching at home. It’s a spectacle that has grown to be an integral part not just of the holidays, but of American culture. Here are five facts you might not know about the parade, from its Christmas origins, to its role in the war effort, to just how long it takes to inflate one of those famous balloons.
It Was Originally Known as the Macy’s Christmas Parade
In 1924, Macy’s flagship New York City department store completed a major renovation that made it the largest store in the world. To entice people into its more than 1 million square feet of shopping space at the start of the busy holiday season, the retailer planned a parade for Thanksgiving morning. This first parade, which took place on November 27, was called the Macy’s Christmas Parade.
The inaugural Macy’s Christmas Parade spanned 6 miles and featured live animals from the Central Park Zoo, including bears, elephants, monkeys, and more. Store employees didn’t just march in the parade: Many of them, immigrants from Europe, also helped plan it, incorporating elements of their traditional holiday festivities. The parade’s famous balloons weren’t around yet, but floats that year featured Mother Goose favorites Little Miss Muffet and Little Red Riding Hood, made to match the store’s holiday window display. The final float featured Santa on his sleigh, a tradition that remains today — even though the celebration was advertised as the Macy’s Thanksgiving Day Parade starting in 1935.
Felix the Cat Was the First Balloon Character
In 1927, after three years of using live animals, the parade introduced its first giant balloons. The visionary behind them was Anthony Frederick Sarg, a puppeteer and theatrical designer who also designed Macy’s elaborate holiday window displays and served as the parade’s artistic director at the time. Several of the inflatables made headlines that year: There was a towering “human behemoth” standing at 21 feet tall, a colossal dinosaur measuring 60 feet and escorted by an entourage of cavemen, a 25-foot dachshund, and what was described by The New York Times as “gigantic turkeys and chickens and ducks of heroic size.” The parade also introduced its first character balloon that year: Felix the Cat. Felix is largely considered to be the world’s first animated film star, and around the time of his unprecedented appearance in the Macy’s parade, the beloved cartoon cat also made history as the first cartoon character to be licensed for merchandise.
Macy’s Used to Let the Balloons Float Away After the Parade
Beginning in the late 1920s, some of the balloon floats were released into the sky at the end of the parade. Filled with helium instead of air, they were designed to slowly deflate over the course of about a week. Macy’s expected the balloons to eventually turn up in various parts of the country, and rewards were offered to those who found the balloons and returned them. The hope was that the balloons would help spread word of the parade — and its namesake retailer — as the event wasn’t yet broadcast on television.
Unfortunately, it didn’t go as smoothly as planned. In 1928, just days after that year’s parade, a tiger balloon came down on the roof of a Long Island home. It caused chaos in the neighborhood and a tug-of-war ensued; eventually, the balloon burst into “dozens of fragments,” The New York Times reported. In 1931, a giant dragon was hit by a plane before falling into Long Island’s Jamaica Bay and being torn apart by a crowd that had gathered at the shore. The same year, a cat balloon — largely believed to be Felix but reportedly just similar-looking — caught fire on a telephone wire. And a massive blue hippopotamus went missing and was subsequently hunted at sea. Despite the problems, balloons were released again in 1932, and once more in 1933, before the stunt was retired.
The Parade Was Canceled During World War II
In 1942, during World War II, Macy’s President Jack Straus announced that the parade would be canceled in order to devote resources to the war effort. At the time, the balloons were made of rubber, which was needed to manufacture everything from soldiers’ footwear to planes. Rubber rationing was in effect in the United States, and civilians were encouraged to donate any extra they had to the U.S. forces.
“We are turning ourselves over, body and soul, with no strings attached, to the New York City Salvage Committee,” read the cancellation notice, which was written from the perspective of the parade’s balloons. “Destined for the rubber scrap pile, we will perhaps find our way into tires for tanks, or maybe life rafts. Wherever we’re most needed, we’ll be glad to serve our country.” Straus made the announcement alongside New York City Mayor Fiorello LaGuardia, who dramatically deflated a green dragon balloon with a knife. Macy’s eventually donated 650 pounds of balloon rubber to the war effort. When the war ended in 1945, the parade returned.
Each Balloon Takes Months to Make and Up to 90 Minutes to Inflate
The team of skilled artists and designers who bring the parade’s surreal inflatable visions to life refer to themselves as “balloonatics.” The process of creating these massive balloons is a meticulous and highly coordinated effort that often begins as soon as the previous parade ends. More than 50 people work year-round at the Macy’s Parade Studio in New Jersey. Although the time it takes to make each balloon varies, experts have said it takes an average of five months. Depending on the size of the balloon, it can take up to 90 minutes just to fill it with helium.
In the fall of 1621, a group of Pilgrims and Wampanoag gathered in Plymouth, Massachusetts, for a harvest feast. This event celebrated the Pilgrims’ first successful corn harvest, a skill they had been taught by an Indigenous guide named Squanto, who helped the European settlers survive in the unfamiliar territory. The feast lasted for three days and occurred sometime between September 21 and November 11. The meal they shared is now considered to be the first Thanksgiving dinner, though the complicated legacy of this inaugural event can’t be ignored.
The Thanksgiving holiday today celebrates a myth of unity and friendship between Indigenous peoples and European colonists, but the reality is much more complex. While the Wampanoag did help Puritan settlers upon their arrival in 1620 and 1621, European colonists went on to massacre and displace millions of Indigenous people in the decades that followed. It’s a dark chapter in the nation’s history that we’ve only recently begun to reckon with, even as we celebrate gratitude and togetherness each Thanksgiving.
Another common myth associated with this holiday is the food itself. Today, more than 400 years later, dishes such as turkey and mashed potatoes are synonymous with Thanksgiving. But many of the modern holiday staples are more recent inventions. The first Thanksgiving dinner was notably different from today’s traditions, at least according to the scant historical accounts we have of the gathering, namely a letter from diplomat Edward Winslow and an account penned by William Bradford, the governor of Plymouth Colony.
While it’s difficult to know what exactly was eaten at the first Thanksgiving, it’s possible to piece together a menu based on these accounts and the crops that were available around Plymouth at the time. With that in mind, I, along with several friends, set out to recreate some of the dishes that were likely served at the first Thanksgiving feast.
Roasted Duck With Cranberry-Wine Sauce
The highlight of modern Thanksgiving meals is the turkey, but the inaugural harvest feast likely centered around roasted waterfowl. According to Winslow, the local governor “sent foure men on fowling” to collect meat for the meal. Though wild turkeys were present in the region, the centerpiece of a dinner at the time was more likely to be duck, goose, or a similar bird.
Ingredients:
One 5-pound duck
2 ½ teaspoons salt
½ teaspoon ground black pepper
10 black peppercorns
4 medium onions
1 handful of parsley
2 cups of red wine
⅓ cup minced parsley leaves
1 teaspoon ground ginger
¼ cup dried currants
½ teaspoon ground mace
¼ cup cranberries
1 tablespoon sugar
4 tablespoons unsalted butter
For our recreation, we used a recipe for roasted duck accompanied by a sauce made of cranberries and wine, as found in the 2005 book Giving Thanks: Thanksgiving Recipes and History, From Pilgrims to Pumpkin Pie. Since no actual recipes were written down at the first Thanksgiving meal, this recipe, like all the others I followed, approximates the original dish using ingredients that were likely available at the time.
Interestingly, this recipe calls for the duck to be boiled before roasting. Also, there’s no stuffing the bird like we do today — instead, it’s cooked in a broth with onions and parsley, lightly seasoned with salt and pepper, and then roasted over a bed of onions.
The result was a roasted duck that looked and tasted as if it had been created using modern cooking techniques. The skin was crisp and well seasoned, and though some of the meat was dry, the bird was largely moist and succulent. The sauce was a nice touch, providing a tangy and somewhat sweet complement to the bird.
It’s worth mentioning that there’s some historical argument about how much wine and sugar the Pilgrims may have had. They could have used what remained of the original provisions brought over on the Mayflower in late 1620, but it’s unlikely they had much of either available. Cranberries, on the other hand, were a seasonal highlight of local Wampanoag fare.
Samp
One of the first side dishes we tried making was samp, a kind of corn porridge. Based on a Wampanoag dish called nasaump — a traditional meal consisting of dried corn, berries, and nuts — samp was the Pilgrims’ attempt at creating a porridge with a consistency akin to modern oatmeal.
Ingredients:
2 cups coarse corn grits
4 cups water
1 cup milk
¼ cup sugar
We followed a simple recipe inspired by a description of the dish from the book Two Voyages to New England, written by English traveler John Josselyn in the 1600s. The recipe used corn-based grits, water, sugar, and milk. (I used cow’s milk, though it was more likely they used goat’s milk at the time.) The corn grits and water are stirred together until thick and warm, and milk and sugar are mixed in before serving.
The result was a thick, somewhat sweet, but otherwise rather bland mixture that was perfectly edible but largely uninteresting. On the table it looked like a bowl of mashed potatoes but boasted a thicker texture. Eaten on its own, samp was one of the more forgettable aspects of our meal, though it provided a nice and simple contrast to the tart sauce accompanying the duck.
Seethed Mussels With Parsley and Vinegar
Seafood may not be common at Thanksgiving dinners today, but it was a major part of the first feast. The Pilgrims and Wampanoag likely incorporated local fish such as cod and bass into their menu, as well as various shellfish harvested from nearby waters. For our seafood element, we followed a recipe for mussels “seethed” in parsley and vinegar, based on a description found in A Booke of Cookerie, published in 1620.
Ingredients:
4 pounds of mussels
2 tablespoons butter
½ cup chopped parsley
½ cup red wine vinegar
¾ teaspoon salt
¼ teaspoon ground black pepper
2 garlic cloves
For this recipe, the mussels are scrubbed clean while the other ingredients are brought to a boil. The mussels are then added and steamed until all of the shells are open, then immediately served.
One interesting note is that we used butter in this recipe, despite the fact that cows didn’t arrive in the region until 1623, a couple of years after the first Thanksgiving. In fact, there were few cows anywhere in what is now the United States until the late 16th century, when several thousand were brought from Mexico into modern-day New Mexico. However, early colonial ships including the Mayflower brought over European goods such as butter and oil, so it’s possible that butter was on hand for the feast.
The seethed mussels were surprisingly close to what you’d find in a modern seafood restaurant. The sauce was similar to a white wine sauce, though we didn’t use any actual wine. Instead, the shells were cooked in red wine vinegar, which provided a nice tang, as well as butter for added creaminess. The mussels were tender and the fresh parsley contributed a nice light, peppery taste. Even though I’m not particularly fond of most shellfish, I found this dish surprisingly enjoyable.
Stewed Pumpkin
In the 17th century, pumpkins were known as “pumpions,” and were prevalent throughout modern-day New England. Often they were diced and stewed until soft — but to make things simpler, we used pumpkin puree.
Ingredients:
4 cups pumpkin puree
3 tablespoons butter
2 teaspoons apple cider vinegar
1 teaspoon ground ginger
½ teaspoon salt
Much like the samp, our stewed pumpkin was based on Josselyn’s description of the dish in his 1672 book Two Voyages to New England. In this simple dish, pumpkin and butter are combined and mixed together until the butter melts. During the cooking process, ingredients such as apple cider vinegar and ginger are also added in order to make the dish “tart like an Apple,” as per the recipe.
The final product had a very light tart flavor, but largely tasted like warm pumpkin more than anything else. As a fan of pumpkin, I enjoyed the taste, though it was rather one-note and uninteresting. In terms of appearance, the stewed pumpkin looked just like a bowl of sweet potatoes. (Potatoes weren’t widely available in America at the time, so they were left out of our meal entirely. Potatoes were slowly introduced stateside throughout the 17th century and first grown on a large scale in 1719.)
Wampanoag Autumn Sobaheg
The fifth and final dish we prepared was a traditional Wampanoag stew known as sobaheg. This recipe comes from the same book as the roast duck recipe, Giving Thanks: Thanksgiving Recipes and History, From Pilgrims to Pumpkin Pie. It required the longest cooking time of any dish at nearly three hours, and called for venison (deer meat) as its central component. According to Winslow’s account, several hunters “went out and killed five Deer, which they brought to the Plantation and bestowed on our Governor.”
Ingredients:
½ cup dried white beans
½ cup coarse grits
1 pound of venison
1 teaspoon salt
1 small acorn squash
1 cup of peeled turnips
¼ cup powdered walnuts
This stew mixes venison, white beans, grits, squash (we used acorn squash), and turnips (which we substituted for the similar sunchoke). The meal cooks for several hours, during which it develops a thicker consistency, before being served in a bowl next to the main plates.
In my opinion, the sobaheg was the tastiest food we ate all night, a surprise given that it was seasoned only with salt. The soft and naturally sweet turnips and squash were a good counterpoint to the more savory venison flavor, as all the elements combined to create a hearty and warming stew.
On the surface, the dishes we cooked from the inaugural Thanksgiving were clearly lacking in both seasoning and leafy vegetation — at least by our modern standards. Still, the meal was surprisingly tasty. The dishes were all warm and filling, creating the perfect menu to consume during crisp autumn weather. It’s also surprising how modern many of the dishes felt, given they were cooked more than 400 years ago.
The first line of the preamble to the U.S. Constitution contains the oft-referenced statement of purpose, “to form a more perfect union.” Presidential elections have served as a significant (if not the most significant) part of the process behind that intention, as a quadrennial evaluation of the not-yet-perfect union’s direction. As with any growth process, though, there are bound to be some, well, awkward phases — and the United States certainly has had them. Entire political parties have come and gone, constitutional amendments have been necessitated, and there’s been all manner of outright oddity throughout the history of U.S. presidential elections. Here are some of the most bizarre moments.
1800: John Adams vs. Thomas Jefferson
If anything proves that partisan politics and electoral machinations are nearly as old as the United States itself, it’s the election of 1800, when Federalist Party incumbent President John Adams sought reelection against Democratic-Republican Vice President Thomas Jefferson. The already-bizarre premise of opposing parties holding the presidency and vice presidency was made possible at the time by a rule stipulating that the presidential candidate who earned the second-highest number of electoral votes became Vice President. In the election of 1796, Jefferson lost the presidency to Adams by only three electoral votes, and the 1800 election was a rematch between the political rivals.
That time, with another narrow margin likely, both parties turned toward influencing electors, whose votes decided the winning candidate in states where there was not yet a popular vote. Jefferson wrote of his intent to sway electors in New York, Pennsylvania, and New Jersey in a letter to James Madison. Federalist Senator Charles Carroll accused Jefferson and his supporters of also attempting to use “arts and lies” to manipulate votes in Federalist-leaning Maryland. From there, the accusations, well, escalated. Jefferson-supporting pamphleteer James Callender claimed that John Adams was a hermaphrodite. Federalist newspapers accused Jefferson of maintaining a harem at Monticello.
When the votes were finally cast, the election ended in a tie between Jefferson and… his intended running mate, Aaron Burr. How? Each elector had two votes to cast, but there was no distinction at the time between a vote for President versus a vote for Vice President. Casting one vote for Jefferson and one vote for Burr was in effect a vote for each as President. The Constitution called for resolving this tie between the Democratic-Republican candidates with a vote in the House of Representatives, which was controlled by, you guessed it, the Federalist Party.
The task at hand was to vote on who, between Jefferson and Burr, would be President, but the Federalists saw an opportunity to seize power, either by delaying the proceedings past the end of Adams’ term or by attempting to invalidate enough votes to give Adams the majority. Others advocated for supporting Burr. Between February 11 and February 16, 35 rounds of voting took place, each ending in deadlock. Finally, after much lobbying by Alexander Hamilton against Burr, the 36th ballot resulted in Jefferson being elected President. In the wake of the turbulent election, the 12th Amendment was ratified in order to prevent a repeat ordeal in 1804.
1840: William Henry Harrison vs. Martin Van Buren
If William Henry Harrison is known today, it’s for the brevity of his mere 31 days in office. But the campaign leading to his presidency was a rollicking and often rowdy phenomenon that sparked a voter turnout of more than 80%, an increase of nearly 23 percentage points from the previous election.
The election pitted Harrison and running mate John Tyler of the upstart Whig Party against incumbent Democratic President Martin Van Buren during a period of economic strife caused by the Panic of 1837. Harrison’s campaign played off of his military fame for his victory at the Battle of Tippecanoe, with the slogan “Tippecanoe and Tyler Too.” It also attacked Van Buren with accusations of living in aristocratic luxury. The Van Buren campaign and its supporters countered by painting the 67-year-old Harrison as too elderly and frail for the presidency. An editorial in the Baltimore Republican mocked Harrison with the line, “Give him a barrel of hard cider, and settle a pension on him… he will sit the remainder of his days in his log cabin by the side of the fire and study moral philosophy!”
The Whigs, however, embraced the hard cider and log cabin imagery, and built the rest of the campaign around it. They leaned into the association with the “everyman,” and organized cider- and whiskey-fueled mass rallies. There were songs, stump speeches, and all manner of bric-à-brac emblazoned with cider kegs and log cabins. There were also the 10- to 12-foot slogan-covered balls Whigs would roll down the streets while chanting in support of the candidates. It all led to Harrison shellacking Van Buren in the election, albeit not quite as might be expected: The lopsided victory was in the Electoral College, 234 to 60, but the popular vote margin was only about 150,000 votes. No need to pity Van Buren, though. He later remarked, “The two happiest days of my life were those of my entrance upon the office and my surrender of it.”
1872: Ulysses S. Grant vs. Horace Greeley
Incumbent President Ulysses S. Grant’s Republican Party was beginning to fracture leading into the June 1872 National Convention. A reform wing calling itself Liberal Republicans had held its own convention the previous month, nominating New York Tribune founder and editor Horace Greeley as its candidate. Overestimating the power of this new faction, the Democratic Party refrained from nominating its own candidate and instead threw its support behind Greeley, despite Greeley’s history of pointed criticism of the Democratic Party.
Almost immediately, Greeley was lambasted in the press. The New York Times called the Democratic Convention that nominated him “the ghastliest of political shows.” Political cartoons were especially harsh, depicting him as mousey or infantile. Greeley soldiered on, making campaign stops in New Jersey, Pennsylvania, Ohio, Kentucky, and Indiana between September 19 and 29, and giving nearly 200 speeches in that short span. Unfortunately for the candidate, his running mate Benjamin Gratz Brown completely undermined that effort by giving an incoherent drunken speech at Yale, and then fainting during an event in New York City.
The Greeley campaign never really mounted a serious threat to Grant. “I have been assailed so bitterly that I hardly knew whether I was running for the presidency or the penitentiary,” Greeley lamented. Grant won reelection easily with 55.6% of the popular vote. In a bizarre and tragic twist, Greeley died on November 29, before the Electoral College could cast its ballots. Because of this, the 63 votes he would have earned were dispersed among other candidates. It remains the only time in U.S. history that a candidate has died in the interim between the popular vote and the Electoral College vote.
The 1872 election was also notable for another reason: Though not a legal candidate (she was under 35 years old), Victoria Woodhull also ran in the 1872 election, making her the first woman to campaign for President of the United States.
1964: Lyndon B. Johnson vs. Barry Goldwater vs. a Fabricated “Jewish Mother”
The 1964 presidential election took place less than a year after the assassination of John F. Kennedy; the Vietnam War was approaching its midpoint, and segregationist Alabama Governor George Wallace was running a primary campaign in the northern U.S. on a platform of outright racism. Into this fraught atmosphere stepped a wisecracking independent write-in candidate named Yetta Bronstein, with slogans such as “We need a Jewish mother in the White House,” “A mink coat in every closet,” and “If you want simple solutions, then you gotta be simple.” Calling herself a “Jewish housewife” running for a political party called the “Best Party,” Bronstein managed to attract media attention and invitations for radio interviews, wherein she advocated for increasingly kooky things such as adding “truth serum” to the Senate drinking fountains, and putting a nude photo of Jane Fonda on postage stamps.
Yetta Bronstein was a complete fabrication, though. A character invented by husband-and-wife hoaxers Alan and Jeanne Abel (and played by Jeanne in radio interviews), Yetta was conceived as a way to poke fun at credulous media. In the real world, Lyndon B. Johnson and Barry Goldwater were engaging in some of the most grueling mudslinging yet, culminating in the legendary Johnson “Daisy” campaign ad (officially titled “Peace, Little Girl”), in which a young girl counts petals as a nuclear countdown cuts to a mushroom cloud. Ultimately, Johnson trounced Goldwater by more than 15 million votes in the popular election, and 486 Electoral College votes to Goldwater’s 52. Yetta Bronstein didn’t get a single vote.
1988: George H.W. Bush vs. Michael Dukakis
The 1988 presidential election was an open field, with Ronald Reagan finishing out his last term as President. The primary season on the Democratic side included a slate of relative upstart contenders referred to in overtly derisive political commentary as “the Seven Dwarfs”: Bruce Babbitt, Joe Biden, Michael Dukakis, Richard Gephardt, Al Gore, Jesse Jackson, and Paul Simon. On the Republican side, Vice President George H.W. Bush, Bob Dole, Jack Kemp, and televangelist Reverend Pat Robertson were the contenders, escaping a derisive nickname of their own for reasons that are lost to history.
Almost immediately, the Democratic side was beset with paparazzi-style scandals: Early, pre-Seven Dwarfs contender Gary Hart was caught having an affair with a woman who accompanied him on a luxury yacht called, all too on-the-nose, Monkey Business. Another scandal followed when a tape surfaced with footage of Joe Biden speaking at the Iowa State Fair and quoting British Labour Party leader Neil Kinnock without attribution. The resulting furor, and Biden’s subsequent mishandling of it, prompted him to drop out of the race. When TIME magazine reported that the tape came from the Dukakis campaign, the campaign initially denied the report, before eventually coming clean. Dukakis aides John Sasso and Paul Tully also stepped down.
Meanwhile, George H.W. Bush was emerging as the Republican nominee, despite receiving frequent criticism for not communicating the priorities of a Bush presidency (“the vision thing,” as Bush himself rather flippantly put it). Once Dukakis became the Democratic nominee, Bush fully dismissed “the vision thing” in favor of Lee Atwater-aided negative campaigning, successfully painting Dukakis as soft on crime with the viciousness of a Thomas Jefferson-John Adams-era series of invectives.
Another enduringly famous attack ad was one that the Dukakis campaign itself unintentionally provided the imagery for. The ad featured an unfortunately goofy Dukakis video op with the candidate perched upon an M1 Abrams tank and wearing a too-large helmet that looked more Great Gazoo than commander in chief. Bush won the election with 53.4% of the popular vote and a whopping 426 electoral votes.
First developed in the late 1820s, photography combined art and science into one medium capable of capturing an image in the moment. The innovation transformed recorded history into something that could be documented in pictures as well as text. As the technology advanced, the medium exploded in popularity, making it possible for families to create snapshots of memories for future generations to appreciate. These early photographic portraits transport us back in time, painting a picture of a different way of life: Families were larger, clothes were bulkier, and postures were noticeably stiff and formal. But perhaps the most conspicuous difference of all is that no one ever seemed to smile.
The somber expressions preserved in early photographs might lead us to assume that past generations led austere and joyless lives. However, the lack of joviality in these snapshots can be attributed to several other factors. Here’s the truth behind those stern expressions in old photos.
In the earliest days of photography, the lengthy exposure periods made it impractical to photograph people. For instance, French inventor Nicéphore Niépce’s 1826 “View from the Window at Le Gras,” credited as the oldest surviving photograph, required an exposure time of eight hours. It was more than a decade before Louis Daguerre’s 1839 invention of the daguerreotype made portrait photography practical. But even then, it was a relatively slow and meticulous process that required the subject to remain still for as long as 20 minutes.
By the early 1840s, photographic technology had advanced further, and the daguerreotype images that once required a 20-minute exposure needed only 20 seconds to process. Still, even modern photo subjects understand the difficulty of maintaining an open-mouthed smile for any amount of time. It only takes a few moments for a candid smile to turn into something more like an awkward grimace. And anyone who has dealt with a restless child can attest that more than a few seconds of remaining motionless is a formidable challenge. To minimize movement and guarantee a sharp image, children were sometimes put into restraints for the length of a photo shoot.
Additionally, until the 20th century, the expense of photographic equipment and the toxic and dangerous chemicals needed to process film meant that most photographs were taken by professional photographers working out of studios or traveling with their equipment. A photography session was a time-consuming and pricey undertaking; it could cost the average person three or more months’ salary, and a person might only be photographed a few times in their life. The requirement for stillness, combined with the novelty and cost of posing for a professional photographer, created an atmosphere where it was simply easier to maintain a neutral or serious expression. But even once the technology existed to capture more relaxed expressions, it was a long time before smiling in photos became the norm.
Though technological limitations are frequently cited as the reason for the solemn expressions in old photographs, they weren’t the only reason our ancestors so often appeared stern in front of the camera. One notable feature shared by painted portraits from the 17th and 18th centuries and photographs from the early 19th century is the presence of stoic, enigmatic expressions on the subjects’ faces. As portrait artist Miss La Creevy observes in Charles Dickens’ novel Nicholas Nickleby, only two types of expressions existed in portraiture: “the serious and the smirk.”
Before photography, a painted portrait was the only way to preserve someone’s image for posterity. Having your portrait painted was an activity associated with wealth and social status, and accordingly, the art form had its own rules and expectations. This formal portraiture proved to be a big influence on early photographers, who featured their subjects in ways that represented their social status, occupation, or other interests. The social mores associated with painted portraits carried over into photographic portraiture, and smiling was discouraged.
Social Etiquette Frowned Upon Smiling
Some historians believe that advancements and accessibility in dental care may have contributed to more smiles eventually being captured on film. Other experts disagree, noting that for centuries, a lack of dental care was the norm and thus wasn’t considered to detract from a person’s physical appeal. Still, smiling for a photograph wasn’t commonplace in the early days of photography. In fact, instead of the modern directive to “say cheese!” to produce a wide, toothy grin, some photographers in Victorian-era England asked people to say “prunes,” forcing them to tighten their lips for a more socially acceptable expression based on the beauty standards and etiquette of the time.
In an era when open-mouthed grins were considered unacceptable and a smile was believed to signify someone was poor, drunk, lewd, or otherwise corrupt, it was rare for someone to choose to smile in a portrait — and even less likely that a photographer would encourage it. That all changed, however, with Kodak’s democratization of photography in the early 20th century.
As photography became more accessible in the late 19th century, a wider variety of people took and sat for photographs, and what was acceptable in portrait photography became less rigid. In 1888, Kodak founder George Eastman started a photographic revolution that put cameras in the hands of amateur photographers and gave them an instruction manual on how to take good photos. In 1900, the Kodak Brownie camera was marketed for children and sold for just $1, creating a photography craze that appealed to adults as well.
By the 1920s, a century after the earliest photographs were captured, more relaxed postures and a greater variety of expressions, including closed- and open-mouthed smiles, were common in both amateur and professional photography. With the advent of color photography, the popularity of candid photos, and the rise of affordable personal cameras, capturing an array of expressions — including moments of genuine joy — became the gold standard.
There’s nothing more frustrating than working your socks off only to see someone else get all the credit for your efforts. Spare a thought, then, for the minds behind some of history’s most significant innovations, who, despite months, years, or in some cases lifetimes of work, find someone else’s name ignominiously attached to their invention.
Sometimes inventions are miscredited in the public consciousness simply because a more famous name becomes associated with the creation. For example, Thomas Edison and Henry Ford — two of modern history’s most well-known innovators — are often credited with things they didn’t actually invent, through no fault of their own. Then there are the more insidious misattributions. In some instances, an idea has been copied or outright stolen, robbing the true inventor of their glory; in others, a more senior or prominent member of a team is given credit despite not coming up with the original idea. See, for example, the Matilda effect, in which notable discoveries made by women have often been misattributed to the men they worked with.
Here are some notable inventions in history that are frequently credited to the wrong person, from the flush toilet to the iPod.
The Flush Toilet
No name in the history of toilets is more famous than that of plumber Thomas Crapper, partly because his name appeared on the once-ubiquitous Crapper brand of toilets, and partly because Crapper is a humorously appropriate name for a toilet (the slang word “crap” existed before Thomas Crapper). Crapper, however, did not invent the flushing device with which he is so associated. He did patent the U-bend and floating ballcock — key components of the modern toilet — in the late 1880s, but he never held a patent for the flush toilet. Much earlier, in 1596, John Harington, an English courtier and the godson of Queen Elizabeth I, described what can be considered the first flush toilet, which involved a 2-foot-deep bowl and a massive 7.5 gallons of water per flush. (Only two working models were made, one in Harington’s own home and one in Queen Elizabeth’s palace.) The first patent for a flushable toilet was granted to the Scottish inventor Alexander Cumming in 1775.
The Telescope
The Italian polymath Galileo Galilei is often credited with inventing the telescope, and it’s easy to see why. He gave birth to modern astronomy with his telescope-assisted discoveries about our moon, the moons of Jupiter, and other celestial bodies. Galileo made his first telescope in 1609 after hearing about the “perspective glasses” being made in the Netherlands. But the first person to apply for a patent for a telescope was Dutch eyeglass-maker Hans Lippershey in 1608, a year before Galileo. His telescope could magnify objects only three times, but it was nonetheless a landmark in the history of optics. (By comparison, by the end of 1609, Galileo had developed a telescope that magnified objects 20 times.) Whether Lippershey should be credited as the inventor of the telescope remains an open debate, as it is entirely possible that others created similar devices before he filed his patent.
The Lightbulb
Thomas Edison is often — and incorrectly — given all the credit for inventing the lightbulb. But the lightbulb was actually the result of a process that began before Edison was even born. In 1802, English chemist Humphry Davy used a voltaic pile (invented by Alessandro Volta, after whom the volt is named) to create the first “electric arc lamp” between charcoal electrodes. His rudimentary lamp was too bright and burned out too quickly, but it was nonetheless an important breakthrough. Other scientists worked to refine the lightbulb, but problems with filaments and batteries made these early bulbs impractical for everyday use. In 1860, English physicist Joseph Swan developed a primitive electric light that utilized a filament of carbonized paper in an evacuated glass bulb. Lack of a good vacuum and an adequate electric source ultimately made it inefficient, but it did pave the way for later innovations, including those by Edison. Edison purchased some of his predecessors’ patents, improved upon them, and came up with his own lightbulb, which, while not the first overall, was the first to be commercially viable.
The Automobile
One commonly held misconception is that Henry Ford invented the automobile. In reality, the development of the automobile can be traced back to Nicolas-Joseph Cugnot, a French military engineer who, in 1769, built a steam-powered tricycle for hauling artillery. Because it was steam-powered, not everyone accepts Cugnot’s invention as the first true auto. Instead, that distinction often goes to vehicles made by two Germans, Karl Friedrich Benz and Gottlieb Daimler, who — working entirely separately — developed their own gasoline-powered automobiles in 1886, in two different German cities. Benz actually drove his three-wheeled vehicle in 1885, and it is regarded as the first practical modern automobile and the first commercially available car in history. As for Henry Ford, his name is forever remembered in auto history for the Model T, which he mass-produced using an innovative moving assembly line, making automobiles available to middle-class Americans.
Monopoly
Since the 1930s, it’s been common knowledge that Charles Darrow invented Monopoly, an idea that both he and the game’s manufacturer, Parker Brothers, freely propagated (it was printed in the instructions for decades). But it’s not quite true. Darrow got the idea for the game — which made him a millionaire — from a left-wing feminist named Elizabeth Magie. Magie created and patented an early version of Monopoly, called The Landlord’s Game, in 1903, about three decades before Darrow. Darrow learned about the game from a couple who had played it in Atlantic City (which is where many of the game’s street names come from) and made a few changes. Magie’s original game included a wealth tax and public utilities, and was designed as a protest against the big monopolists of her time. It had two sets of rules: one that allowed players to create monopolies and crush their opponents, and an anti-monopolist version that rewarded all players when wealth was created (the latter demonstrating what Magie believed to be a morally superior path). It’s only in recent years that Magie has started to receive the credit for inventing one of the world’s most popular and iconic board games.
The iPod
Portable digital audio players have existed since the mid-1990s, but it was Apple’s iPod that revolutionized the industry upon its release in 2001. Yet it wasn’t the engineers at Apple who invented the iPod — not entirely, at least. British inventor Kane Kramer actually developed the technology behind the iPod as far back as 1979. His credit card-sized music player, which looked very similar to the iPod, could store only 3.5 minutes of music, but he was sure the storage capacity would increase over time. Unfortunately for Kramer, internal problems at his company ultimately led to his patent lapsing, at which point the technology became public. Apple later acknowledged Kramer’s involvement in inventing the technology behind the iPod.
6 Myths and Misconceptions About George Washington
George Washington undoubtedly led an extraordinary life, which makes it hard to separate legend from reality. He was the only U.S. President to be unanimously elected to office, despite having no formal schooling past the age of 15, and he remains one of the most famous military leaders in United States history. Common depictions of Washington include a young Virginian boy chopping down cherry trees and, later, a dignified statesman proudly posing in a powdered wig. But some of the best-known aspects of the former President’s life aren’t historically accurate. Here are six common myths about the famous founding father.
Myth: Washington Chopped Down His Father’s Cherry Tree
Washington telling his father, “I cannot tell a lie… I did cut it with my hatchet,” is ironically one of the biggest lies about this larger-than-life figure. Legend says that George Washington received a hatchet as a gift when he was 6 years old, and took it to one of his father’s beloved cherry trees. During the subsequent confrontation with his father, he came clean, unable to tell a lie. The encounter was recreated in artist John C. McRae’s 1867 engraving “Father, I Can Not Tell a Lie: I Cut the Tree.” There’s just one problem: None of this ever happened. The famous legend was devised by biographer Mason Locke Weems in the 1806 edition of his book “The Life of Washington.” Published shortly after Washington’s death, the book immortalized the founding father as a national hero with a steadfast moral compass despite his faults, including the ownership of hundreds of enslaved people at his Mount Vernon estate.
Myth: He Was the First President to Live in the White House
It’s a common misconception that George Washington lived in the White House — he was the first President, after all — but the building wasn’t completed until 1800, one year after Washington’s death. Washington’s successor, John Adams, was the first commander in chief to call the White House home. However, Washington did play a large role in the planning and construction of the famous residence. He chose the site of what was then called the “President’s House” at what is now 1600 Pennsylvania Avenue. The first cornerstone of the White House was laid in October 1792. During this time, Washington lived in executive residences in New York and Philadelphia, both of which served as the nation’s capital before it was moved to Washington, D.C., in 1800.
Myth: He Had Wooden Teeth
Half of this myth is correct: Washington definitely did wear dentures, and a set is even on display at Mount Vernon. He began losing teeth in his 20s and was forced to wear painful dentures for the rest of his life. Eighteenth-century dentures were a little different from modern versions and were made with all sorts of unique materials, including human teeth, cow and horse teeth, ivory (possibly of elephant or hippopotamus origin), and metal alloys such as lead-tin, copper, and silver. Wood, however, was never used in the construction of Washington’s dentures, though due to the discoloring of some of the materials, they might have appeared wooden, fueling this myth.
Myth: He Never Lost a Battle
Washington is often touted as a military genius and expert strategist, but he actually lost many battles to the British during the American Revolution. In fact, of the 17 Revolutionary War battles the general was present for, he won six, lost seven, and fought four to a draw. His unparalleled reputation was earned by the battles he did win, including the Battle of Trenton and the Siege of Yorktown, the latter of which was the decisive battle of the Revolutionary War. As commander of the Continental Army, Washington displayed resilience, determination, and leadership, and earned the respect of his soldiers and the rest of the country, who in turn selected him as the first President of the new nation.
Myth: He Wore a Powdered Wig
Portraits of Washington in his older years showcase his distinct, neatly coiffed hair, pulled into a ponytail in typical founding father fashion, but it’s a common misconception that this was a wig. Powdered wigs from this era looked identical to Washington’s hairdo, but the military commander still had a full head of hair. Instead, he used white powder on his hair to make it appear brighter — white hair was very fashionable during the 18th century. What’s more, Washington was one of five redheaded Presidents, as seen in his younger portraits.
Myth: He Skipped a Silver Dollar Across the Potomac
Legend says that Washington once tossed a silver dollar a mile across the Potomac River to the other side. This enduring claim plays into the mythos surrounding his physical strength and larger-than-average height (he was 6 feet, 2 inches tall). But the tall tale probably stemmed from stories of lesser feats, including Washington’s grandson’s claim that the former President tossed a piece of slate across the Rappahannock River, which is much narrower than the Potomac. Indeed, U.S. silver dollars didn’t even exist for most of Washington’s lifetime: The U.S. Mint didn’t strike its first silver dollars until 1794, just five years before his death.
World War II was one of the most transformative events of the 20th century. It was the largest war ever fought, with more than 50 nations and 100 million troops involved, and it reshaped geopolitics, resulting in the United States and Soviet Union emerging as major world powers leading into the Cold War. This far-reaching war also inspired new global peacekeeping efforts, including the creation of the United Nations, and it brought to light incredibly courageous acts of humanity from soldiers and civilians alike. Here are the stories of six daring heroes of the Second World War.
Calvin L. Graham was the youngest U.S. military member during WWII, and is still the youngest recipient of the Purple Heart and Bronze Star. It wasn’t unusual for boys to lie about their age to enlist, but Graham was just 12 years old when he forged his mother’s signature and headed to Houston to enlist. The 125-pound, 5-foot-2 boy was miraculously cleared for naval service and assigned to the USS South Dakota as an anti-aircraft gunner.
On November 14, 1942, the South Dakota was ambushed by Japanese forces at the Battle of Guadalcanal. Graham was severely burned and thrown down three stories of the ship, but still mustered the strength to tend to his severely wounded shipmates. He was honored for his heroism, but when his mother found out about the honor, she informed the Navy of his real age and he was stripped of his medals and thrown into the brig for three months. In 1978, President Jimmy Carter learned of Graham’s story and restored his medals, except for his Purple Heart, which wasn’t restored until two years after Graham’s death.
Polish soldiers stationed in Iran during the war were met with great surprise when a shepherd traded them a Syrian brown bear cub for a Swiss army knife and some canned goods. The cub’s mother was likely killed by hunters, so the soldiers adopted him, giving him the name “Wojtek,” meaning “joyful warrior” in Polish — a title he soon lived up to. His caretaker, a soldier named Peter Prendys, taught the bear how to salute, wave, and march, and Wojtek became a great morale booster.
In 1944, Wojtek was given the rank of private and a serial number (pets were banned in the Polish army), and he shipped off to Italy with his unit. That May, the bear even joined combat during the Battle of Monte Cassino, carrying supplies to his fellow troops, according to witnesses. He was promoted to the rank of corporal for his bravery. After the war, Wojtek found his forever home at the Edinburgh Zoo in 1947. A bronze statue of the bear and Prendys still stands in downtown Edinburgh today.
Army Colonel Ruby Bradley of the U.S. Army Nurse Corps was working at Camp John Hay in the Philippines when she was taken prisoner by the Japanese army in 1941. She became a POW at the Santo Tomas Internment Camp in Manila — but she didn’t let it break her spirit. Bradley immediately went to work helping her fellow POWs by offering medical aid and smuggling food and medicine to those in need. She assisted on 230 major surgeries and delivered 13 babies during her 37 months at the camp. Bradley and her fellow nurses became known as the “Angels in Fatigues.”
In February 1945, the camp was finally liberated, and Bradley — who was malnourished from giving her food rations to children — went home. She continued her career in the Army, amassing 34 decorations, medals, and awards (including the Bronze Star Medal), making her one of the most decorated women in U.S. military history.
General Benjamin O. Davis Jr. faced racial discrimination from the very beginning of his military career. He was only the fourth Black graduate in the history of the United States Military Academy at West Point, joining the Army as an officer upon his graduation in 1936. After being stationed in Alabama, he received the opportunity of a lifetime: squadron commander of the first all-Black unit in the Army Air Forces. This unit of 1,000 Black pilots became known as the Tuskegee Airmen, renowned for their exceptional achievements in combat despite the discrimination they faced.
Davis led the 99th Fighter Squadron during their 1943 deployment against Axis forces in North Africa, and later that year, he commanded the 332nd Fighter Group to fight on the front lines in Italy. During his two-year command of the Tuskegee Airmen, Davis and his crew sank more than 40 enemy ships and downed more than twice the number of aircraft they lost, earning them a reputation as a formidable fighting squadron. Their impressive record wasn’t just a message to the enemy; it broke racial barriers at home, furthering the fight for desegregation and equal rights. Davis led a life of public service and was promoted to four-star general by President Bill Clinton in 1998.
For Navy Lieutenant Susan Ahn Cuddy, entry into military service was personal. Her father, Dosan Ahn Chang Ho, died while imprisoned by the Japanese in 1938. He was incarcerated for anti-Japanese activism as a known leader for the Korean independence movement. Despite growing anti-Asian sentiments during WWII, Cuddy wanted to honor her father and fight against the Japanese, so she enlisted in the U.S. Navy in 1942. She was the first female Asian American naval officer and eventually became the first female gunnery officer, training pilots to fire a .50-caliber machine gun. She later worked with codebreakers at the Naval Intelligence Office while using her knowledge of the Korean language. Even there, Cuddy faced discrimination — one of her superiors wouldn’t let her access classified documents. After the war, Cuddy worked at the National Security Agency during the Cold War. She died peacefully in her sleep in 2015 at the age of 100.
On the morning of December 7, 1941, George Walters, a crane operator at the Pearl Harbor dockyard in Hawaii, awoke to a devastating surprise attack by Japanese forces. Walters ran to a massive crane next to the USS Pennsylvania and began moving it back and forth on its track to shield the ship from an onslaught of rounds from Japanese fighters and dive bombers. He even attempted to knock planes out of the sky with the boom. Protected by the moving crane, gunners onboard the Pennsylvania were able to return fire. Later, a bomb exploded on the dock next to Walters’ crane, knocking him out of the fight. He survived with a concussion, and it’s believed that his actions helped save the ship from certain destruction. The story of Walters’ heroism was featured in Walter Lord’s 1957 book “Day of Infamy.” Walters continued to work at the shipyard for 25 years following the attack. Lewis Walters, George’s son, was a young shipyard apprentice at the time who witnessed his father’s bravery firsthand.
Footwear is so integral to the human experience, it’s hard to imagine a time in history when it didn’t exist. To be without shoes in modern life would pose a significant problem — can you imagine leaving your home and walking even a single city block barefoot? The degree to which footwear is essential for enhanced mobility means that it arguably could even be considered our first vehicle. Whether you’re a bona fide shoe-lover or someone who takes footwear for granted, it’s worth thinking about the lineage of these things we put on our feet to carry ourselves through the world. Let’s go on a quick walkabout to explore the history of footwear.
How far back in human history do shoes go? Anthropologists estimate that humans first began wearing some form of sturdy foot covering at least 40,000 years ago, based on changes in toe bones. The oldest surviving footwear is what’s referred to as the Fort Rock sandals, woven sagebrush bark sandals made by Indigenous people in what’s now southeast Oregon and northern Nevada about 10,200 to 9,300 years ago (according to radiocarbon dating). Similar variants of these sandals were made by the Klamath Tribes up until the 20th century.
As for fully enclosed shoes, archaeologists made a surprising discovery during a 2010 dig in an Armenian cave: well-preserved shoes made from tanned cowhide that date back 5,500 years. In other words, the world’s oldest leather shoes. Aside from being made of a familiar modern material, the shoes were also laced along a center seam. Renowned designer Manolo Blahnik commented, “It is astonishing how much this shoe resembles a modern shoe!”
By the year 1305, King Edward I’s decree that an inch should equate to three dried barleycorns had become the basis for English shoe sizing. That reference standard soon became relevant beyond the size of the whole shoe, as a fashion craze for shoes with exaggeratedly long points gripped 14th-century Europe. Known as poulaines, or crakows, the shoes were a status symbol in the truest sense; the design was so impractical that it prevented the wearer from engaging in any kind of labor, which was, well, the point. The longer the poulaine, the more prosperity the shoe conveyed. Perhaps not surprisingly, poulaines also came to be considered racy, and clergymen disdained them as “claws of devils.” In 1463, English King Edward IV passed a sumptuary law limiting toe length to 2 inches (or six dried barleycorns). This law, combined with the changing tides of fashion, caused late-15th-century shoe style preferences to veer toward a wide-toe shoe (and yes, eventually the width of the shoe was restricted, too). But even as shoe designs changed, a link between footwear and status remained.
The high heel was originally a type of riding footwear worn for centuries by the Persian military, but in 1599, when Shah Abbas I sent a diplomatic envoy to the courts of Spain, Russia, and Germany, the heel caught on with nobility throughout Western Europe. Echoing the message conveyed in the past by the poulaine, the impractical (for anything other than gripping stirrups) design of the heel was an indication of the wearer’s leisurely lifestyle. In 1670, French King Louis XIV codified the status of the heel in an official capacity, declaring by edict that high heels were to be worn by noblemen only. Yes, men.
It was during the Enlightenment of the 18th century that the gendering of the high heel shifted. Individualism and ideas of merit and hard work rendered the idle aristocrat unfashionable. Productivity became intertwined with masculinity, and the symbol of leisure that was the high heel was then considered unmanly. As men adopted lower heels, heel heights for women increased, and the function shifted. For women, high heels were intended to conceal the wearer’s feet beneath her dress, making the feet appear smaller and daintier.
Photo credit: Joe Raedle/ Getty Images News via Getty Images
A Shoe for Every Use
The Industrial Revolution of the 19th century changed the course of design and function for shoes (like it did for nearly every material thing). Factories sped up manufacturing, sewing machines allowed for more embroidery and canvas work, and new processes (such as vulcanization) and dyes allowed for a greater variety of materials to be used in shoemaking. The North British Rubber Company (now Hunter) formed in 1856 and began making rubberized versions of Wellington boots that were eminently waterproof. Pumps emerged as fashionable women’s shoes in the 1870s.
The 21st century is still young, but dare we say the controversial Crocs (perhaps the shoe world’s version of pineapple on pizza) have earned a place in history since their inception in 2002? Or perhaps the equally polarizing “minimalist shoe”? Both designs are unconventional, arguably ostentatious, and seem to communicate something about the priorities of the wearer — a tale as old as a poulaine.
As we look back at American history, it’s crucial to recognize the contributions made by the nation’s Indigenous peoples, who are so often overshadowed by famous figures who came to the United States from other parts of the world. To commemorate this important part of America’s heritage, here’s a look at five notable Indigenous heroes and leaders who shaped the nation through their tireless efforts.
Photo credit: Hulton Archive/ Archive Photos via Getty Images
Geronimo (1829-1909)
A medicine man and leader of the Bedonkohe band of the Chiricahua Apache, Geronimo was born near the Gila River in New Mexico, where he was originally given the name Goyahkla, meaning “the one who yawns.” After the United States government forcibly relocated 4,000 Apaches to a reservation in San Carlos, Arizona, Geronimo led dozens of breakouts in an effort to return his community to their nomadic roots. Geronimo’s legacy is vast. His relationship with many American and Mexican civilians was complex, as he fought against colonialism but was made famous after appearing in Buffalo Bill’s “Wild West” show and, eventually, in Theodore Roosevelt’s 1905 inaugural parade. Geronimo’s tireless fight for Apache independence had cemented his reputation as a fearless crusader for freedom by the time of his death from pneumonia in 1909.
Sitting Bull (c. 1831-1890)
The son of a warrior, Sitting Bull was born in what is now South Dakota and was nicknamed “Slow” for his lack of fighting ability — that is, until he was branded Tatanka Yotanka (“Sitting Bull”) at age 14 after “counting coup” in a battle against the Crow Tribe. (“Counting coup” is a way to humiliate an enemy by riding close enough to touch them with a stick.) Sitting Bull eventually rose to become chief of the Hunkpapa Sioux, and he fought tirelessly against the U.S. military, which sought to seize Indigenous land.
After fleeing to Canada to escape a vengeful army in the wake of the defeat of General George Armstrong Custer (and his 210 troops) at the Battle of Little Bighorn in 1876, Sitting Bull returned to the U.S. in 1881 and was held prisoner at the Standing Rock Reservation in Dakota Territory. His impact, however, could not be contained: After an Indigenous mystic claimed in 1889 that the Ghost Dance would eliminate the threat of white settlers on Native land, Sitting Bull allowed his followers to practice the dance — much to the horror of federal officials, who feared another uprising. Sitting Bull was killed by gunfire during his arrest in 1890, and he is remembered as a martyr for freedom.
Crazy Horse (c. 1840-1877)
Born near the Black Hills of South Dakota, Lakota Chief Crazy Horse was the son of a warrior of the same name, and at a young age he began showcasing his capacity for battle and bravery. Having helped lead the Sioux resistance against the U.S. military’s attempts to colonize the Great Plains throughout the 1860s and ’70s, Crazy Horse led a band of Lakota warriors against General Custer’s 7th Cavalry Regiment during the Battle of Little Bighorn in 1876 (alongside Sitting Bull) before returning to the Northern Plains. Unfortunately, Crazy Horse and his community faced an unwavering enemy; forced to keep moving — and fighting — to evade federal resettlement, the chief and his 1,100 followers ultimately surrendered to the U.S. military at Fort Robinson in May 1877. There, in the wake of his arrest (and under a banner of truce), Crazy Horse was stabbed during a scuffle with U.S. soldiers and died of his injuries. He is remembered for his courage, his leadership, and his endless perseverance against colonizing forces.
Photo credit: MPI/ Archive Photos via Getty Images
Sacagawea (c. 1788-1812 or 1884)
Sacagawea was only around 16 years old when she carved her place in Native American history through her ability to communicate with different peoples. Kidnapped by the Hidatsa (Indigenous people of present-day North Dakota) at age 12, Sacagawea was then claimed by French Canadian trader Toussaint Charbonneau as one of his wives at age 13. Despite this treatment, upon the arrival of explorers Meriwether Lewis and William Clark in Hidatsa territory in 1804, the young woman proved herself invaluable. Chosen by her husband to serve as interpreter as he and the explorers moved west, she rescued records and supplies from the river when the crew’s boat tipped and took on water, helped acquire horses from her brother when the expedition passed through Idaho, and saved her counterparts from starvation as they faced food shortages. Most importantly, her role as translator helped assure safety for both her own team and the Indigenous communities they crossed paths with. Her knowledge and wherewithal earned her immense respect from the 45 white men who relied on her, and ultimately helped make the expedition a success. Her date of death remains a mystery: Following the expedition, Sacagawea and Charbonneau worked for the Missouri Fur Company in St. Louis beginning in 1810, and it was long believed that Sacagawea succumbed to typhus in 1812. However, some Native American oral histories claim that she lived until 1884 on the Shoshone lands where she was born.
Photo credit: Peter Turnley/ Corbis Historical via Getty Images
Wilma Mankiller (1945-2010)
For 10 years, Wilma Mankiller served as the principal chief of the Cherokee Nation, the first woman to do so. Mankiller was born on Cherokee land in Tahlequah, Oklahoma, in 1945; in the 1950s, she and her family were moved to a housing project in California, where they endured culture shock, racism, and the effects of poverty, experiences that shaped the future chief’s ethos. Mankiller returned to Cherokee territory in 1977, where she founded the Community Development Department for the Cherokee Nation and advocated endlessly for improved education, health care, and housing services.
For these efforts, then-Principal Chief Ross Swimmer asked her to run as his deputy in 1983. Two years later, Swimmer stepped down to lead the Bureau of Indian Affairs, and Mankiller became principal chief, serving until 1995. She was celebrated for lowering infant mortality rates, boosting education, and working to ensure financial and social equality. Mankiller was inducted into the National Women’s Hall of Fame in 1993, received the Presidential Medal of Freedom in 1998, and continued to advocate for women’s rights and Indigenous rights until her death in 2010 at age 64.
The Most Popular Baby Names Throughout the 20th Century
Depending on where you lived and when you grew up, you might have known more than one person with the same name. Maybe there was a Jennifer A. and a Jennifer L., or maybe you knew four different people named Michael. Year after year, decade after decade, trends in baby names draw on history, religion, and cultural references. Here are the most popular baby names in the United States during each decade of the 20th century.
1900s
Between 1900 and 1909, the most popular name for boys in the U.S. was John, and the most popular girls’ name, by a long shot, was Mary. This is according to data from the U.S. Social Security Administration, based on people applying for Social Security cards. There were 84,591 applications under the name John, and 161,504 entries for Mary. These two names popped up time and time again throughout the 20th century. Both names come from the Bible — John is one of Jesus’ disciples, and Mary is the name of both Jesus’ mother and Mary Magdalene. After John, the most popular boys’ names of this decade were William, James, George, and Charles, and the most popular girls’ names after Mary were Helen, Margaret, Anna, and Ruth.
Photo credit: FPG/ Archive Photos via Getty Images
1910s
Between 1910 and 1919, the most popular names were once again John and Mary. In this decade, there were 376,312 registered Johns and 478,637 Marys. Why the sudden jump? For one, the Social Security Administration began collecting data in 1937, so anyone born before that was counted only if they applied for a Social Security card after 1937. (That means the data for the 1900s, 1910s, and 1920s is based on people who listed their birthdays in these decades despite obtaining cards later in life, and doesn’t count anyone born in this period who didn’t apply for a Social Security card.) The U.S. also saw a population spike as infant mortality rates decreased throughout the 20th century, thanks to advances in health care and better access to clean water.
In the 1910s, for the second decade in a row, the second most popular names for boys and girls were William and Helen, respectively, followed by James, Robert, and Joseph for boys, and Dorothy, Margaret, and Ruth for girls. William has long been a popular English name dating back to William the Conqueror, who became the first Norman king of England in the 11th century. Helen, meanwhile, has its origins in Greek mythology: Helen of Troy was a famous beauty, known as the “face that launched a thousand ships.”
1920s
Between 1920 and 1929, John finally fell out of the top spot, as the most popular name for boys was Robert, with 576,373 entries. Robert, like William, dates back to English royalty and translates to “bright with fame” or “shining.” Mary stayed strong for girls, with 701,755 registered applications. The 1920s saw continued population increases both in the U.S. and worldwide. This is sometimes credited to a baby boom that occurred after World War I and the Spanish influenza, but is largely due, as in the previous decade, to better health care.
Photo credit: Hulton Archive/ Archive Photos via Getty Images
1930s
Between 1930 and 1939, Robert and Mary stayed at the top of the list, with 590,787 Roberts and 572,987 Marys. Though there were more Roberts born this decade than in the previous one, there was a decline in the birth rate overall due to the strain that the Great Depression placed on families. (The overall population was still higher in 1940 than in 1930, at roughly 132 million versus 123 million people.) A few interesting new names entered the runner-up positions in the 1930s. Among girls’ names, Betty and Barbara grew in popularity. Betty is a nickname for Elizabeth, a versatile name with Hebrew origins that is also found in English royalty (namely, Queen Elizabeth I). Barbara, like Helen, comes from Greek, and is also the name of St. Barbara, the patron saint of armorers, miners, and artillerymen. For boys’ names, the runners-up after Robert were James, John, William, and Richard.
1940s
Between 1940 and 1949, the name Robert fell to the second spot after James, which had 795,753 entries. Mary remained the most popular name for girls, at 640,066 entries. The name James derives from Hebrew and, like John, stems from a number of uses in the Bible. Like many other popular names, James is also found in the English monarchy, as well as the Scottish monarchy. Though it’s fallen out of the top slots in recent years in the United States, James remains one of the most popular baby names in Scotland. The next most popular boys’ names in the 1940s were Robert, John, William, and Richard; for girls, the list included Linda, Barbara, Patricia, and Carol. Interestingly, Linda (which translates to “beautiful” in Spanish and Portuguese) never topped a full decade’s list, but it was the single most popular girls’ name in the country every year from 1947 to 1952. Patricia, on the other hand, had been popular in England long before its time in the States, as it was the name of Queen Victoria’s granddaughter.
Photo credit: George Marks/ Hulton Archive via Getty Images
1950s
Between 1950 and 1959, the names James and Mary remained at the top of the list with 843,711 and 625,601 entries, respectively. Not far behind James, however, was a new popular name: Michael. Michael, like James, stems from the Hebrew Bible, and variations of the name exist across a number of languages, such as Miguel in Spanish and Micha in German. After James and Michael, Robert, John, and David topped the list for boys’ names, while Linda, Patricia, Susan, and Deborah followed Mary for the most popular girls’ names.
1960s
Between 1960 and 1969, everything changed, as is fitting for this revolutionary decade. Both James and Mary were unseated from the No. 1 slot: Michael became the most popular name for boys at 833,102 entries, and Lisa for girls at 496,975 entries. In fact, there were almost 150,000 more Lisas than Marys in the 1960s. The name is another variation on the popular moniker Elizabeth, and even Elvis Presley picked it for his daughter, Lisa Marie, who was born in 1968. While not much else changed in boys’ names this decade, popular girls’ names saw Karen and Kimberly rise to join Susan near the top of the list.
1970s
Between 1970 and 1979, Michael remained the most popular name for boys, topping out the decade with 707,458 entries, while Jennifer ended Lisa’s short-lived reign with 581,753 entries. More new names cropped up in the second and third slots, including Christopher and Jason for boys. The name Jennifer, meanwhile, grew so popular that it became known as the “standard” name for a baby girl. The initial spike in Jennifers began decades earlier, with the appearance of the name in George Bernard Shaw’s play The Doctor’s Dilemma. After Jennifer, the most popular ’70s girls’ names were Amy, Melissa, Michelle, and Kimberly.
1980s
Between 1980 and 1989, Michael retained its title as the most popular name for boys, with 663,827 entries, while Jessica just barely unseated Jennifer as the most popular name for girls — there were 469,518 Jessicas versus 440,896 Jennifers. Jessica stems from the Hebrew Bible, where its original spelling was “Jeska”; the common spelling in English comes from William Shakespeare’s play The Merchant of Venice. The top five boys’ names in the 1980s were Michael, Christopher, Matthew, Joshua, and David, and the top five for girls were Jessica, Jennifer, Amanda, Ashley, and Sarah.
1990s
Between 1990 and 1999, Michael and Jessica stayed the most popular names for each gender, with 462,390 Michaels and 303,118 Jessicas. Still, there were fewer entries for both than in the previous decade, in part because a handful of newer, trendy names cropped up as well, such as Matthew, Justin, and Andrew for boys and Ashley and Tiffany for girls. Andrew, like James, is a popular name with links to Scotland, while Matthew goes back to the Bible. Ashley and Tiffany, meanwhile, reflect the trend of girls’ names ending in “y” — names such as Brittany, Courtney, Emily, and Kelsey took off in the beginning of the 21st century.