One hundred years is a long time in the life of a city. New technologies emerge and wane, people come and go, cultural factors ebb and flow. But not all cities change at the same rate; some stay comparatively similar to their older incarnations, while others become drastically different. Here’s a glimpse at what a few iconic state capitals looked like a century ago.
Atlanta, Georgia
Atlanta was named after the Western and Atlantic Railroad, for which it was a terminus. In the early 20th century, the city was well established as a major railway hub, and the downtown was built around its first train station. Hotels were concentrated in an area near the station (called, fittingly, Hotel Row) in order to serve train travelers, and by the 1920s, masonry high-rises created the city’s skyline.
Like many cities during this period, Atlanta was beginning to expand its roads in order to accommodate increasing numbers of cars. In the 1920s, the city built three major viaducts to allow traffic to bypass the high number of railroad crossings. The Central Avenue, Pryor Street, and Spring Street (later renamed Ted Turner Drive) viaducts not only improved vehicle safety, but also led to development outside the city’s downtown core.
Boston, Massachusetts

Though Boston was established as a colonial port city as early as 1630, a wave of immigration between 1880 and 1921 fueled a population boom and a sense of transition similar to what many younger cities were facing at the time. The expanding population spurred a building boom, and changes wrought by the Industrial Revolution were at the forefront. The industrialization of nearby Springfield, Massachusetts, led to a high population of mechanics and engineers in that city, and it became a hub for the nascent automotive industry. Rolls-Royce selected Springfield as the site of its U.S. factory, and many other early auto manufacturers were based in the area. In fact, Massachusetts claimed to have manufactured more cars at the beginning of the 20th century than Detroit, Michigan. Cars were particularly popular in Boston — more so than in many other cities — and 1 in 8 Bostonians owned a car by 1913. This led to the construction of a large number of buildings dedicated to automobiles, including garages, repair shops, car dealerships, and more.
In terms of architecture, the city’s affluent Beacon Hill neighborhood appears very similar today to how it looked in the 1920s, with well-preserved colonial-style and Victorian buildings. However, little remains of Boston’s once-abundant theater district, which reached a peak count of 40 theaters by 1935.
Nashville, Tennessee

Nashville has a storied history as a center of American popular music, but that history was in its very infancy 100 years ago. The famous Grand Ole Opry didn’t begin until the middle of the 1920s, first broadcasting as “the WSM Barn Dance,” and at the time it was hardly the institution it would later become. In those days, it was purely a radio show broadcast out of the WSM studio on the fifth floor of the National Life and Accident Insurance building, with only as many spectators as could fit in the limited confines of the station’s Studio A.
Unlike other major capitals, Nashville wasn’t a city of high-rises — the 12-story Stahlman Building was the tallest building from the time of its completion in 1908 until the L&C Tower was built in the 1950s — and many of the low-rise brick and masonry buildings from the last century are preserved today. This is particularly true along First Avenue, fronting the Cumberland River, and along Second Avenue, formerly known as Market Street.
Austin, Texas

Though Austin’s population began steadily growing around the end of the 19th century, in 1920 it was only the 10th-largest city in Texas, with a population just under 35,000. Its visual focal point was the Texas State Capitol Building (the UT Tower didn't exist yet), and the surrounding downtown consisted of low- and mid-rise buildings with brick or stone facades — an aesthetic that was more “Main Street” than “metropolis.” Cars weren’t quite as dominant in Austin as in larger cities of the time, and horse-drawn carriages were still seen on the streets.
Phoenix, Arizona

Phoenix is another city that had a relatively small population in 1920 — just around 29,000 — but it was still the largest city in a state that had existed for only eight years. Because of this, Phoenix had the flashiness and bustle of an up-and-coming city, despite its small size. The city’s first skyscraper, the Heard Building, was even the site of a stunt climbing performance shortly after it was built. Nonetheless, at just seven stories, the Heard Building might not have qualified as a skyscraper in larger cities. The 10-story Luhrs Building surpassed it in height when it opened in 1924, and the 16-story Hotel Westward Ho became the city’s tallest building in 1928. It held that title for more than 30 years, as the abundance of land surrounding Phoenix encouraged outward expansion rather than vertical construction.
Sacramento, California

Sacramento is often overshadowed by other iconic California cities, but 100 years ago it boasted a downtown of ornate classical architecture, was home to the largest manufacturing train yard in the western United States, and served as a major retail hub for the region. Vital downtown structures of the time — such as Sacramento City Hall, Memorial Auditorium, the California State Life Building, and the Federal Building — were all built during a construction boom that occurred between 1912 and 1932. But there isn’t much evidence of this architectural period today, as even some surviving buildings, such as Odd Fellows Hall, have been remodeled with simpler midcentury-style facades.
Throughout the 1920s, New York City’s Harlem neighborhood served as the vibrant headquarters of a transformative period in African American art, literature, music, and social justice leadership. This movement, known as the Harlem Renaissance, was a catalyst for celebrating African American culture and heritage, giving the Black community newfound ownership of their experiences and pride in how their stories were told. It also sought to challenge racial stereotypes and forge social and political equality, planting ideas that would be meaningful for years to come. Here, told in six facts about the movement, is the story of the Harlem Renaissance.
From the 1910s until the 1970s, approximately 6 million Black Americans made their way from the Southern U.S. to Northern, Midwestern, and Western states, fleeing racial discrimination and economic hardships, and seeking better work and education opportunities. Known as the Great Migration, this mass movement transformed the country’s demographic landscape and was a major impetus for the Harlem Renaissance. By the 1920s, some 200,000 newcomers had made the New York City neighborhood of Harlem home; at just 3 square miles in size, the neighborhood had the largest concentration of Black people in the world, with people from all backgrounds, including artists, laborers, scholars, and writers. By the early ’20s, a vibrant cultural community was blossoming in this small corner of Upper Manhattan.
Magazines Were Crucial to the Movement
In March 1924, a dinner party at Harlem’s Civic Club brought together a group of emerging and established writers and publishers. That gathering is now widely regarded as kicking off the Harlem Renaissance. The movement encompassed a wide variety of creative arts, but at its core was the literary scene. Two publications in particular emerged as crucial platforms for burgeoning African American writers at the time. One was The Crisis, founded in 1910 as the official magazine of the National Association for the Advancement of Colored People (NAACP) by renowned civil rights activist and sociologist W.E.B. Du Bois. It initially focused on social and political issues but expanded its content to include a wider representation of African American life and ideas. The magazine provided space for then-unknown writers such as Langston Hughes, Claude McKay, and Jean Toomer to share their literary works, and by the start of the 1920s, The Crisis was distributing 100,000 copies a month. Another magazine, Opportunity, emerged in the early 1920s, and also amplified creative Black voices, including that of Zora Neale Hurston.
Some of the 20th century’s most important African American creatives and activists emerged during the Harlem Renaissance. In literature, leaders such as Zora Neale Hurston, Langston Hughes, and Countee Cullen, among others, sought equality while celebrating Black identity and heritage. Meanwhile, jazz and blues maestros such as Louis Armstrong, Duke Ellington, and Bessie Smith became powerful musical voices reflecting the social climate of the time. The Harlem Renaissance was also a battleground for social justice, with activists such as W.E.B. Du Bois and Marcus Garvey advocating for equal rights and laying crucial groundwork for the civil rights movement. Today, the era’s vibrancy is immortalized in the celebrated visual art of painters Beauford Delaney and Archibald Motley, and sculptor Augusta Savage. The collective impact of these icons continues to reverberate today.
The Harlem Renaissance Music Scene Was Boosted by Prohibition
While jazz music predates the Harlem Renaissance, its popularity soared during the 1920s “Jazz Age” — and one of the genre’s unlikely benefactors was Prohibition. In January 1920, the United States banned the manufacture and sale of “intoxicating liquors,” and the law quickly spurred the creation of underground bars known as speakeasies. By the mid-1920s, thousands of New York speakeasies were competing for business, and to attract more of the predominantly wealthy, white crowds, entertainment was brought in regularly — jazz music legends such as Louis Armstrong and Duke Ellington were among the most popular performers at the time. Speakeasy culture led to ample and well-paying opportunities for Black musicians, and it introduced jazz music to a broader audience beyond Harlem.
Despite its name, the Harlem Renaissance wasn't confined solely to Upper Manhattan. While Harlem was the symbolic capital, this revival of Black culture resonated as a nationwide movement in urban centers such as Chicago (whose own movement was known as the Chicago Black Renaissance) and Los Angeles, while also finding its way to smaller pockets throughout the country. In Texas, African American poet Bernice Love Wiggins had several works published in local papers, and self-published a poetry collection that has drawn comparisons to some of the era’s literary greats, including Hurston and Hughes. Short story writer and poet Anita Scott Coleman, from New Mexico, was considered one of the American West’s most important Harlem Renaissance contributors. Her work was published in the influential magazine The Crisis, and she eventually published more than 30 short stories in other periodicals. Aaron Douglas, one of the most renowned visual artists of the Harlem Renaissance, began his career in Kansas, but was eventually convinced by Opportunity magazine founder Charles S. Johnson to bring his talents to the movement’s epicenter in Harlem, extending his influence even further.
The 1929 stock market crash and the onset of the Great Depression plunged the United States into economic turmoil, and as a result, most of the Harlem Renaissance’s patronage and financial support — not to mention its artistic energy — waned. The end of Prohibition in 1933 dealt a final blow to the remaining nightlife scene, and by the mid-1930s, many of the onetime pillars of the community had moved on in search of other work to make ends meet. By 1935, the initial optimism and empowerment fostered by the Harlem Renaissance withered in the face of deep-rooted racial prejudices and inequalities that persisted despite the era’s advancements. In March of that same year, the tensions culminated in a defining event, when rumors about the arrest and police treatment of a Latino teenager accused of shoplifting spread throughout Harlem, sparking a deadly riot that has come to be seen as the official end of the Harlem Renaissance. But while the era was over, its influence lived on. African American thinkers, artists, and activists gained recognition and validation like never before during the Harlem Renaissance, and the consciousness that the movement cultivated helped fuel the Civil Rights Movement throughout the 1950s and ’60s.
The 1960s marked an exciting new era in air travel. The introduction of jet engines on commercial planes led to the emergence of larger, faster aircraft such as the Boeing 707, and made flying a more affordable and accessible way to travel. It was also a luxury experience: The golden age of air travel cultivated an in-flight atmosphere akin to a cocktail party as guests dined, drank, and smoked en route to their destination.
These glamorous flights were a far cry from the buslike airplanes of the 21st century. Aircraft interiors in the 1960s were roomier and more colorful, and boasted a cultivated aesthetic that applied to everything from flight attendants’ uniforms to dining utensils. Between 1958 and 1972, almost half of all Americans took to the friendly skies. Here’s a glimpse of what they would have experienced.
Airports Were Bright, Spacious, and a Testament to Midcentury Design
In the middle of the 20th century, airport guests were treated to state-of-the-art architecture. Features such as floating staircases, designer seats, and minimalist accents made terminals warm and comfortable; bars and lounges were bright and capacious. Today, some travelers can still live like it’s the 1960s: In 2019, JFK Airport in New York City opened the TWA Hotel, an homage to the airport’s original TWA terminal. The refurbished terminal was updated to accommodate overnight guests and diners; the retro space incorporates design elements from the terminal’s original rooms and lounges.
Flight Attendants Were at the Forefront of Fashion
In 1964, advertising executive Mary Wells Lawrence worked with Braniff International Airways to launch the airline’s new identity, a rebranding dubbed “End of the Plain Plane.” The stylish campaign started a movement. Airlines quickly began adopting Braniff’s approach to pops of color and bold prints, and flight attendant uniforms became canvases for fashion designers such as Emilio Pucci and Jean Louis.
Airlines Encouraged In-Flight Smoking and Served Gourmet Meals
Airline passengers in the 1960s could expect the full restaurant experience, including access to cigarettes. Smoking was only discouraged when on the ground (for fear of igniting refueling fumes), but guests were free to smoke cigarettes, pipes, and cigars while in the air. Gourmet food was just as available: Lobster, ham leg, and caviar were regularly served and even prepared tableside midflight, creating a fine dining experience.
The combination of free alcohol and a lack of smartphones led to an abundance of onboard drinking in the ’60s. Champagne was offered to first-class passengers, and upper-deck lounges were equipped with walk-up bars. In 1964, in-flight movies were introduced using closed-circuit technology, offering a necessary alternative to partying at 30,000 feet.
In the 1960s, there was no need to show up to the airport two hours before a flight, as security was much more lax than it is today. In fact, an ID wasn’t even needed to board an airplane until the early 21st century, and you were basically guaranteed to make a flight if you showed up 30 minutes before departure. Friends and family members could escort departing guests right up to their gate, and security screenings weren’t implemented until 1973, when passengers began submitting to metal detectors and bag searches.
In the golden age of flight, even passengers in economy experienced today’s business-class legroom and complimentary sustenance. First-class guests, meanwhile, had access to cocktail lounges, reclining seats, six-course meals, and fashion shows put on by flight attendants. Flying in the 1960s was also a much more social experience. Onboard, passengers could schmooze with fellow guests, including celebrities.
Passengers paid premium prices to experience air travel in the 1960s. Before the Airline Deregulation Act of 1978, the federal government controlled ticket rates and set fees high to ensure airlines would prosper. International flights were four to five times more expensive than they are now, and could cost as much as $500, the equivalent of more than $4,000 today. A round-trip flight within the United States was less, but could set passengers back roughly $100 to $400 (the equivalent of about $800 to $3,000), considerably more than a domestic flight today.
Elvis was on the radio, The Ed Sullivan Show was on the TV, and scores of people were hightailing it to the suburbs — this was 1950s America. It was a young nation, with 31% of its 151 million residents under age 18, and it was on the brink of change. Birth rates continued to rise at unprecedented levels, giving rise to a new generation of “baby boomers.” The “nuclear family” (describing married couples with kids at home) was ingrained in the culture; more than half of all people (68% of men and 66% of women) were married. By the time the ’60s rolled around, many of these cultural norms would be upended, but this generation left a lasting mark on American society. Here is a snapshot of family life in the 1950s, by the numbers.
Post-World War II America saw a rapid increase in birth rates lasting from 1946 through 1964. It became known as the “baby boom,” and the 1950s were smack dab in the middle of it. During the ’50s, around 4 million babies were born every year in America, a sharp increase from the previous average, around 2.7 million births annually between 1910 and 1945. By the end of the boom, around 77 million babies had been born. This influx of births was due to many positive aspects of the postwar era, including low unemployment rates, a burgeoning economy, low interest rates, and a strengthened middle class.
In alignment with the nuclear family mindset, most ’50s households consisted of a married couple, and typically only one spouse worked (generally the man). In 1950, only 29% of working-age women living in the U.S. held a job — but nearly half of single working-age women (46.4%) worked. The number decreased dramatically among married working-age women; less than a quarter of them (21.6%) held jobs.
By 1960, the number of working women in America increased from 16.5 million to nearly 22.5 million, a 35% increase (despite only a 14% increase in the population of working-age women). The five most popular jobs for women of this era were secretary (stenographer or typist), salesperson (retail), schoolteacher, bookkeeper, and apparel factory worker.
Mortgage Rates Averaged Around 2.5%
The housing market of the 1950s was booming. An increasing number of Americans were leaving busy urban lifestyles behind in favor of the suburbs. Mortgage rates ranged between 2.1% in 1950 and 2.6% in 1959. For the 16 million World War II veterans living in 1950, the G.I. Bill lowered mortgage rates even more. For many, it was the perfect time to purchase a home.
Among the most recognizable examples of 1950s suburban neighborhoods were the “Levittowns,” named after real estate developer William Levitt, who built thousands of houses in planned communities around the mid-Atlantic during the late 1940s and early ’50s. The most famous of these communities was on Long Island, New York, where during peak construction, one house was built every 16 minutes.
Around 4.4 million homes had television sets by 1950. This might sound like a lot for the era, but it was only 9% of households. By the end of the decade, the figure had spiked to 90% of households, marking a transformational decade for entertainment. Television programming, especially the American sitcom, became a staple of family life. These shows epitomized the stereotypical American family unit, from the Cleavers of Leave It to Beaver to the Andersons of Father Knows Best.
Although the golden age of Hollywood was nearing its end in the 1950s, cinemas were still as popular as ever, and fortunately for moviegoers, this pastime didn’t cost a fortune. In 1950, one theater ticket cost 46 cents, which was less than the price of a dozen eggs (60 cents). A family of four could go to the movies for the price of around two gallons of milk (one gallon cost 83 cents) — a feat that would be hard to pull off today.
Families flocked to theaters to see Disney’s Cinderella, the top-grossing film of 1950. Released on February 15, the film grossed more than $52 million that year and sold nearly 99 million tickets. Other top-grossing films of 1950 included King Solomon’s Mines, Father of the Bride, and All About Eve.
More Than Half of All Households Had Children at Home
Due to the ongoing baby boom, most American households had young children at home in the 1950s. Census records show that around 52% of households had children under 18 at home in 1950; in 2019, that number was down to 41%. Families were large during the decade: Around 58% of households had between three and five members, 21% had more than six members, 18% had two members, and only 3% had one member. The average family unit size has steadily declined since its peak during the late 1950s and early ’60s. In 2022, the average American family size was 3.13 people.
With so many Americans moving to the suburbs in the ’50s, more and more families depended on a car to get around. In 1954, most U.S. households (64%) owned one car. Between 1954 and 1960, the number of one-car families rose from 30.1 million households to 32.4 million. Multicar ownership wasn’t popular — a little more than 8% of households owned two cars in 1954, and only 0.9% had three or more cars. (Owning two cars became slightly more common by the end of the decade.) Just how much did a car set you back during the 1950s? Two popular family cars, the Cadillac DeVille and the Oldsmobile 88 Fiesta, cost around $3,523 and $3,541, respectively, which would be around $37,000 today.
7 Things You Would Find on a City Street 100 Years Ago
If we could travel back 100 years and land on a typical city street, we’d probably be mightily discombobulated. Some things would seem familiar: the buzz of the urban environment, people walking this way and that, and buildings with facades that could well still exist today. But looking around, we’d soon realize that we weren’t in Kansas anymore — or if we were, it would be Kansas City in the 1920s.
A century ago, America was going through a monumental change. For the first time in U.S. history, more people were living in urban areas than in rural areas. The cities were booming, and for many middle-class Americans, the 1920s were a decade of unprecedented prosperity. People were earning more and spending more, advertising had reached new levels of sophistication, and the automobile was changing the way we live.
So, before you step into that time machine, you’d better brace yourself. Here are seven things you’d find on a city street a century ago, back in the dizzying days of the Roaring ’20s.
Before the development of practical light bulbs, street lights typically used piped coal gas, oil, or kerosene as fuel. The first electric streetlights were installed in Paris in 1878, but these used unwieldy and harsh arc lamps. Then came inventors such as Joseph Swan in the U.K. and Thomas Edison in the U.S., both of whom patented revolutionary incandescent light bulbs in 1880. Incandescent street lamps became the norm in many cities throughout the world, and the 1920s saw a wave of patents filed for innovative new street lighting. These electric lights, however, were often placed where they were needed rather than lining a whole street. So, 100 years ago, a city street at night would not have been as brightly lit as it is today, and pedestrians would often find themselves walking from one pool of yellowish light to the next.
Public phones and phone booths began appearing in the U.S. not long after Alexander Graham Bell patented the first telephone in 1876. By the 1920s, wooden phone booths were a fairly common sight on many city streets, but the wooden construction meant they were hard to maintain, limiting their popularity. In some cities, you’d be more likely to come across a public telephone room, which contained multiple booths. Individual outdoor phone booths became truly commonplace in the 1950s, when glass and aluminum became the booth-building materials of choice. Today, of course, public phones are heading rapidly toward extinction, now that most everyone can carry a phone in their pocket.
The art deco style flourished in the United States during the 1920s, in the visual arts, architecture, and product design. Walking down a city street 100 years ago, you would have seen art deco everywhere, from the facades of grand buildings (a trend that would soon culminate in the Chrysler Building and the Empire State Building) to the window displays of newly emerging department stores such as Macy’s and Saks. The style, characterized by bold geometric patterns, vibrant colors, and glamorous details, became synonymous with the opulence and extravagance that defined the Roaring ’20s.
Thankfully, modern child labor laws ensure that we don’t see children working in the streets anymore. But 100 years ago, it was a common sight. In 1920, about a million children aged 10 to 15 were working in America, out of a total population of about 12 million children in that age range. The most visible were those working in city streets, in jobs such as flower seller, shoe shine, and courier. Children carried messages — and sometimes money and sales slips — throughout the city, facilitating daily commerce for banks, factories, and offices. Even more notable were the “newsies,” young children (some as young as 5) who sold newspapers in the street. But by the end of the decade, a growing preference for home delivery and tougher child labor laws led to the decline of the “newsie” in urban America.
If we traveled back 100 years, one of the first things we might notice is the fashion of the day. Men would be walking the streets wearing three-piece suits, thin bow ties, wingtip shoes, and the then-ubiquitous fedora hat. Sportswear was also becoming acceptable menswear, thanks in large part to the growing popularity of golf, which brought longer “plus four” trousers and wide-legged oxford bag pants to the urban milieu. Women’s fashion, meanwhile, reflected the newfound freedoms of the day. The dresses of the 1920s were loose, straight, and slender, with shorter hemlines. This was typified by the flapper style of the Jazz Age, with dropped waistlines and calf-revealing dresses — clothing that was stylish but also allowed women to move. New hairstyles completed the look, with bobs and waves becoming the defining cuts of the ’20s.
Today, we don’t encounter many horses on our city streets, but go back 100 years and you’d still occasionally see peddlers’ carts, milk wagons, coal wagons, and fire wagons being pulled by horses. The heyday of horses, however, was coming to an end. In 1916, for instance, there were 46,662 horse-drawn vehicles in Chicago. By the end of the 1920s, this number had plummeted, and by 1940 there were fewer than 2,000 horse-drawn vehicles in the city. In New York City, meanwhile, the last horse-drawn fire engine was retired in 1922. The rise of the automobile had begun in earnest, bringing about a permanent change in the very nature of city streets.
Arguably no invention changed the everyday lives of Americans in the 20th century more than the automobile. Between 1900 and 1920, the number of cars increased dramatically, from 8,000 to 8 million. By 1929, there were more than 23 million automobiles on American roads. These early vehicles, of course, looked very different from the cars we drive today. A hundred years ago, the car of choice was the Model T Ford, which brought driving to the masses. By the early 1920s, more than half of the registered automobiles in the world were Model Ts. They were so popular that Henry Ford himself once quipped, “There’s no use trying to pass a Ford, because there’s always another one just ahead.” Pedestrians, however, found it hard to adapt to the new laws of the street. With the introduction of jaywalking laws — facilitated by automobile manufacturers themselves — the streets became a place for cars, rather than for pedestrians, horse-drawn carts, and children at play, as they once had been.
The 20th century produced an array of iconic toys that captured the public’s imagination and, in some cases, continue to delight young people worldwide. The Slinky, originating in the 1940s, and the Rubik’s Cube, first sold in the United States in the early 1980s, have remained more or less the same since their invention, evoking a nostalgic simplicity. Other toys, such as LEGO and Barbie, have offered up countless iterations, weathering changing trends to endure in popularity and appeal. The legacy of these toys is in more than just their entertainment value — it’s in the way they reflected or even set cultural trends, interests, and technological advancements. Here are some of the most popular toys throughout the 20th century, many of which are still around today.
The Slinky

In the early 1940s, United States industry was largely focused on producing goods for the war effort, and it was during this time that the Slinky was accidentally invented. Richard James, a mechanical engineer, stumbled on the idea in 1943 while working with tension springs for naval equipment at a Philadelphia shipyard. After accidentally knocking some of his prototypes off a shelf, James couldn’t help but notice the way one of them “walked” down a stack of books on his desk. He worked on this strange spring — which his wife named “Slinky” after seeing the word in the dictionary — over the next two years. By the end of 1945, James got an initial run of 400 Slinkys into a local department store. It wasn’t until he staged a live demonstration, however, that the product’s popularity picked up, and the toy sold out. Within the first 10 years, he sold 100 million. The Slinky has endured for decades, not only as a popular toy on its own, but also through licensing and its iconic jingle — the longest-running jingle in television advertising history.
LEGO

LEGO is known for its colorful modular plastic bricks, but when the company started in Denmark in 1932, it made wooden toys such as cars and yo-yos. Plastic toys didn’t come along until the late 1940s, when founder Ole Kirk Christiansen developed the forerunner of the buildable bricks we know today, known at the time as Automatic Binding Bricks. In 1958, the modern LEGO brick was patented, with an updated interlocking design that became its signature.
Through a deal with Samsonite, LEGO made its way to Canada and the U.S. in the early 1960s, but the iconic toy didn’t truly find its footing in North America until the early 1970s. The New York Times claimed the toy had been “ineptly marketed” since its stateside arrival, and the then-head of LEGO’s U.S. operations called the deal with Samsonite “a disaster.” In 1973, however, the company took over its own U.S. production and sales and, per the Times, sales “soared.” LEGO grew to be much more than a toy in the ensuing decades — it became an entertainment empire. Throughout it all, the company has stood by its name, which also happens to be its guiding principle: LEGO is an abbreviation of the Danish words “leg godt,” meaning “play well.”
Barbie

When Mattel released the first Barbie doll on March 9, 1959, it was the first time that most children had seen a three-dimensional, adult-bodied doll — the norm at the time was baby dolls designed to be taken care of. Ruth Handler, the co-founder of Mattel and creator of Barbie, had a different idea. After watching her daughter Barbara, the toy’s namesake, play with paper dolls, Handler envisioned a doll that was a little bit older and could inspire more aspirational play: Young girls could see their future selves in the doll, instead of a child to nurture. Barbie’s initial launch at the New York Toy Fair faced some skepticism from other toy industry executives, but Handler’s instincts were right: Around 300,000 Barbies sold within the first year. As beloved as Barbie was, though, she also courted controversy. Early on, detractors were uncomfortable with the doll’s figure. Barbie was at times criticized for being too conventional; other times, too progressive. But the doll’s popularity endured as the company diversified her looks, skin tones, body types, and, of course, jobs: Throughout her lifetime, Barbie has explored more than 250 different careers. The cultural phenomenon continues to this day: Around 1 billion Barbie dolls have been sold, and in 2023, the first live-action movie based on Barbie became the year’s biggest release.
G.I. Joe
Following Mattel’s major Barbie breakout, rival toy company Hasbro sought a similar success story. Barbie thrived by marketing primarily to young girls, and Hasbro aimed to fill a gap in the market with a toy made for boys. In the early 1960s, toy maker Stan Weston approached Hasbro with an idea for a military toy, but was turned down. One Hasbro executive, Don Levine, saw the toy’s potential, however, and workshopped the idea until the company approved. It wouldn’t be called a doll, though — Hasbro created the term “action figure” to market the new product, and even forbade anyone in the company from referring to it as a doll. Released in 1964, the original G.I. Joe line consisted of four 12-inch figures, one for each of the U.S. military branches: the Army, Navy, Air Force, and Marines. The action figure took off, and within two years, G.I. Joe accounted for almost 66% of Hasbro’s overall profits. The franchise eventually created less military-centric characters, expanded to comic books and animated series, and embraced sci-fi, espionage, and team-based narratives that have carried the toy as a symbol of adventure and heroism across generations.
Rubik’s Cube

At first glance, a Rubik’s Cube appears simple, but the mathematically complex puzzle is anything but, and solving it is a problem that has captivated the public ever since the toy’s invention. Created by Hungarian architect and professor Ernő Rubik in 1974, the first “Magic Cube,” as he called it, resulted from months of work assembling blocks of wood with rubber bands, glue, and paper. After painting the faces of the squares, Rubik started twisting the blocks around, and it took him weeks to get it back to its original state; a month later, he finally did. He patented the toy as the “Bűvös Kocka,” or “Magic Cube,” and it first appeared in Hungarian toy shops in 1977. Within two years, 300,000 Hungarians had bought the puzzling cube. By 1980, an American toy company was on board, and international sales of the renamed Rubik’s Cube took off — 100 million were sold in three years. As quickly as the craze started, however, it seemed to fade. The New York Times reported in 1982 that it had “become passe,” replaced by “E.T. paraphernalia…[and] electronic video games.” But the toy has nonetheless endured, and to date, an estimated 350 million colorful cubes have been sold, making it one of the bestselling puzzles in history.
Cabbage Patch Kids
Known for their one-of-a-kind features, unique names, and adoption certificates, Cabbage Patch Kids caused a full-on frenzy in the 1980s, leading to long lines at stores — and even riots. Although the dolls are known as the invention of Xavier Roberts, whose signature is on every doll, the origin story reportedly starts with a folk artist named Martha Nelson Thomas. In the late 1970s, Thomas was selling her handmade “doll babies” at craft fairs in Louisville, Kentucky. Roberts reportedly resold the doll babies at his own store for a while, but eventually remade and renamed them Cabbage Patch Kids. (Thomas eventually took Roberts to court over the copyright, but the pair settled in 1985.) In 1982, Roberts licensed his dolls to the Coleco toy company, and the following year, thanks to a robust advertising campaign, demand was much greater than supply, sparking angry mobs of disappointed parents that holiday season. Around 3 million Cabbage Patch Kids had been “adopted” by the end of 1983, and over the next two years, sales topped half a billion dollars. The doll’s popularity faded quickly after that, but Cabbage Patch Kids remain toy store fixtures to this day.
Tamagotchi
In the early 1990s, video game consoles were household staples, and by the end of the decade, tech toys such as Tickle Me Elmo and Furbies caused consumer crazes. But one pocket-sized toy that combined the best of both worlds was a ’90s must-have: the virtual pet. The handheld pixelated companions required regular feeding and playing, teaching users responsibility and fostering emotional attachment, and engaging them in a type of continual play that was relatively new at the time.
The most popular virtual pet was the Tamagotchi, created by Japanese toy company Bandai. It was released in the United States on May 1, 1997, six months after it launched in Japan, where it had already sold 5 million units. After the first day of the toy’s U.S. release, some stores were already sold out. Within the year, there were several competing virtual pets: GigaPets and Digimon offered different pet options and more gameplay. The constant connectivity of the virtual pets led to school bans, and as the internet gained traction in the late ’90s and early 2000s, online versions such as Neopets all but replaced the Tamagotchi. Virtual pets had an undeniable influence on future trends in gaming and handheld electronic devices, and while the toy has gone through several iterations and relaunches over the years, the original Tamagotchi largely remains a nostalgic relic of the ’90s.
The Most Popular Baby Names Throughout the 20th Century
Depending on where you lived and when you grew up, you might have known more than one person with the same name. Maybe there was a Jennifer A. and a Jennifer L., or maybe you knew four different people named Michael. Year after year, decade after decade, there are trends in baby names that draw on history, religion, and cultural references. Here are the most popular baby names in the United States during each decade of the 20th century.
1900s

Between 1900 and 1909, the most popular name for boys in the U.S. was John, and the most popular girls’ name, by a long shot, was Mary. This is according to data from the U.S. Social Security Administration, based on people applying for Social Security cards. There were 84,591 applications under the name John, and 161,504 entries for Mary. These two names popped up time and time again throughout the 20th century. Both names come from the Bible — John is one of Jesus’ disciples, and Mary is the name of both Jesus’ mother and Mary Magdalene. After John, the most popular boys’ names of this decade were William, James, George, and Charles, and the most popular girls’ names after Mary were Helen, Margaret, Anna, and Ruth.
1910s
Between 1910 and 1919, the most popular names were once again John and Mary. In this decade, there were 376,312 registered Johns and 478,637 Marys. Why the sudden jump? For one, the Social Security Administration began collecting data in 1937, so anyone born before that was only counted if they applied for a Social Security card after 1937. (That means the data for the 1900s, 1910s, and 1920s is based on people who listed their birthdays in these decades despite obtaining cards later in life, and doesn’t count anyone born in this period who didn’t apply for a Social Security card.) The U.S. also saw a population spike as infant mortality rates decreased throughout the 20th century, thanks to advances in health care and better access to clean water.
In the 1910s, for the second decade in a row, the second most popular names for boys and girls were William and Helen, respectively, followed by James, Robert, and Joseph for boys, and Dorothy, Margaret, and Ruth for girls. William has long been a popular English name dating back to William the Conqueror, who became the first Norman king of England in the 11th century. Helen, meanwhile, has its origins in Greek mythology: Helen of Troy was a famous beauty, known as the “face that launched a thousand ships.”
1920s

Between 1920 and 1929, John finally fell out of the top spot, as the most popular name for boys was Robert, with 576,373 entries. Robert, like William, dates back to English royalty and translates to “bright with fame” or “shining.” Mary stayed strong for girls, with 701,755 registered applications. The 1920s saw continued population increases both in the U.S. and worldwide. This is sometimes credited to a baby boom that occurred after World War I and the Spanish influenza, but is largely due, as in the previous decade, to better health care.
1930s
Between 1930 and 1939, Robert and Mary stayed at the top of the list, with 590,787 Roberts and 572,987 Marys. Though there were more Roberts born this decade than in the previous one, there was a decline in the birth rate overall due to the strain that the Great Depression placed on families. (The overall population was still higher in 1940 than in 1930, at roughly 132 million versus 123 million people.) A few interesting new names entered the runner-up positions in the 1930s. Among girls’ names, Betty and Barbara grew in popularity. Betty is a nickname for Elizabeth, a versatile name with Hebrew origins that is also found in English royalty (namely, Queen Elizabeth I). Barbara, like Helen, comes from Greek, and is also the name of St. Barbara, the patron saint of armorers, miners, and artillerymen. For boys’ names, the runners-up after Robert were James, John, William, and Richard.
1940s

Between 1940 and 1949, the name Robert fell to the second spot after James, which had 795,753 entries. Mary remained the most popular name for girls at 640,066 entries. The name James derives from Hebrew, and, like John, stems from a number of uses in the Bible. Like many other popular names, James is also found in the English monarchy, as well as the Scottish monarchy. Though it’s fallen out of the top slots in recent years in the United States, James remains one of the most popular baby names in Scotland. The next most popular boys’ names in the 1940s were Robert, John, William, and Richard; for girls, the list included Linda, Barbara, Patricia, and Carol. Interestingly, while Linda never topped the list for a full decade, it has been called the most popular American baby name of all time, translating to “beautiful” in Spanish and Portuguese. Patricia, on the other hand, had been popular in England long before its time in the States, as it was the name of Queen Victoria’s granddaughter.
1950s
Between 1950 and 1959, the names James and Mary remained at the top of the list with 843,711 and 625,601 entries, respectively. Not far behind James, however, was a new popular name: Michael. Michael, like James, stems from the Hebrew Bible, and variations of the name exist across a number of languages, such as Miguel in Spanish and Micha in German. After James and Michael, Robert, John, and David topped the list for boys’ names, while Linda, Patricia, Susan, and Deborah followed Mary for the most popular girls’ names.
1960s

Between 1960 and 1969, everything changed, as is fitting for this revolutionary decade. Both James and Mary were unseated from the No. 1 slot: Michael became the most popular name for boys at 833,102 entries, and Lisa for girls at 496,975 entries. In fact, there were almost 150,000 more Lisas than Marys in the 1960s. The name is another variation on the popular moniker Elizabeth, and even Elvis Presley picked it for his daughter, Lisa Marie, who was born in 1968. While not much else changed in boys’ names this decade, popular girls’ names saw the addition of newcomers Susan, Karen, and Kimberly.
1970s

Between 1970 and 1979, Michael remained the most popular name for boys, topping out the decade with 707,458 entries, while Jennifer ended Lisa’s short-lived reign with 581,753 entries. More new names cropped up in the second and third slots, however, including Christopher and Jason for boys. The name Jennifer, meanwhile, grew so popular that it became known as the “standard” name for a baby girl. The initial spike in Jennifers started 50 years prior with the appearance of the name in a George Bernard Shaw play called The Doctor’s Dilemma. After Jennifer, the most popular ’70s girls’ names were Amy, Melissa, Michelle, and Kimberly.
1980s

Between 1980 and 1989, Michael retained its title as the most popular name for boys, with 663,827 entries, while Jessica just barely unseated Jennifer as the most popular name for girls — there were 469,518 Jessicas versus 440,896 Jennifers. Jessica stems from the Hebrew Bible, where its original spelling was “Jeska”; the common spelling in English comes from William Shakespeare’s play The Merchant of Venice. The top five boys’ names in the 1980s were Michael, Christopher, Matthew, Joshua, and David, and the top five for girls were Jessica, Jennifer, Amanda, Ashley, and Sarah.
1990s

Between 1990 and 1999, Michael and Jessica stayed the most popular names for each gender, with 462,390 Michaels and 303,118 Jessicas. Still, there were fewer entries for both than in the previous decade, in part because a handful of newer, trendy names cropped up as well, such as Matthew, Justin, and Andrew for boys and Ashley and Tiffany for girls. Andrew, like James, is a popular name with links to Scotland, while Matthew goes back to the Bible. Ashley and Tiffany, meanwhile, reflect the trend of girls’ names ending in “y” — names such as Brittany, Courtney, Emily, and Kelsey took off at the beginning of the 21st century.
Over the past century, the typical home kitchen has undergone a significant transformation, reflecting both social changes and new technology. In the 1920s and ’30s, kitchens were primarily utilitarian spaces with a focus on functionality and easy-to-clean surfaces. Appliances were limited, hand mixers had cranks, and gas ovens, which had replaced wood- or coal-burning stoves in most homes, were themselves starting to be replaced by electric ovens.
The post-World War II consumerism of the late 1940s and 1950s brought bigger kitchens for entertaining and more labor-saving appliances, including blenders, mixers, and dishwashers. The kitchen space became more streamlined and functional, and the 1960s and 1970s brought countertop food processors and microwave ovens into the mainstream.
Open-plan kitchens and islands became increasingly popular in home design throughout the 1980s and ’90s, indicative of the kitchen’s role as a hub for family and friends to gather. That trend continued into the 21st century, along with a significant shift toward high-tech kitchens, smart appliances, and a focus on sustainability. Today’s kitchens — reflecting the changing ways we prepare, store, and consume food — look dramatically different than they did a century ago, making many once-popular items obsolete. Here are six things that your grandparents and great-grandparents might have had in their own home kitchens a century ago.
An Icebox
Before the widespread availability of electric refrigerators, iceboxes were used to keep perishable food cool. These wooden or metal boxes had a compartment for ice at the top, and fresh ice was delivered each week by an iceman. The design of the icebox allowed cold air to circulate around the stored items, while a drip pan collected the water as the ice melted. Naturally, iceboxes fell out of fashion as electric fridges went mainstream. In 1927, General Electric introduced the first affordable electric refrigerator, which relied on a refrigerant for cooling rather than ice.
A Butter Churn
Before commercial butter production made it possible to buy butter at the market, churning cream into butter was an activity done at home. The hand-crank butter churn was introduced in the mid-19th century, and it became the most commonly used household butter churn until the 1940s. In the early 20th century, the Dazey Churn & Manufacturing Company began producing glass churns that could make smaller quantities of butter much quicker than the larger, time-intensive churns. Once the butter was churned, it could then be poured or pressed into decorative molds for serving.
A Hoosier Cabinet

A Hoosier is a freestanding, self-contained kitchen cabinet that was popular in the early 1900s, named after the Hoosier Manufacturing Company that made it. Also known as a “Kitchen Piano” due to its shape, this kitchen necessity offered homemakers ample storage space and an additional work surface. Hoosier cabinets had numerous drawers and shelves for storing cookware and utensils, as well as features such as a flour bin with a built-in sifter, a sugar bin, a spice and condiment rack, a bread bin, a pull-out cutting board, and a cookbook holder. The all-in-one cabinet fell out of favor as kitchen designs began to incorporate built-in cabinets and islands for additional storage and counter space, but these cabinets are still sometimes used for decorative storage.
A Manual Hand Mixer
While the iconic KitchenAid stand mixer was patented more than 100 years ago in 1919, electric hand mixers weren’t commercially available until the 1960s. Before then, beating eggs or mixing other ingredients was done by hand, often with a manual hand mixer (also called a rotary egg beater). First developed in the 1850s, hand mixers had two beaters that rotated when you turned a crank. Though the style and mechanisms evolved over the years, manual hand mixers were still widely used in the 1920s, when only two-thirds of American households had electricity.
A Coffee Grinder

Even though ground coffee was available in bags and cans in the 1920s, and instant coffee was gaining popularity, household coffee grinders, such as a wall-mounted coffee grinder (or mill), were still a common kitchen appliance. According to a 1918 New-York Tribune article on the art of making perfect coffee, “The real coffee lover will always have a mill in the kitchen.” The wall-mounted, hand-crank style had a glass container that could hold a pound of coffee beans, and a container with tablespoon markings to catch the ground coffee.
A Recipe Box

There was a time when treasured family recipes were written on 3-by-5-inch index cards and stored in a box on the kitchen counter. Before the 1920s, most recipes were passed on by example — young women would learn how to make their grandmother’s pot roast by helping her in the kitchen. As such, handwritten recipes were generally a list of ingredients, often without quantities, and vague directions. As kitchen science developed, magazines began advertising recipe subscriptions delivered as preprinted, perforated cards. Women also started writing their own recipes on blank cards to collect and exchange, and the recipe box proved to be a more decorative and lasting storage solution than a shoebox. Like many vintage kitchen items, this nostalgic throwback still has novelty appeal, but the recipe box has largely been replaced by digital recipes stored on apps and websites.
The United States may not have a royal family, but it has a number of influential family dynasties that have intrigued the public for centuries. Two families in particular, the Vanderbilts and the Rockefellers, are household names whose self-made fortunes made them two of the richest and most powerful American families in history. The Vanderbilts, led by Cornelius Vanderbilt’s railroad empire, amassed staggering wealth during the Gilded Age in the late 19th century. The Rockefellers, meanwhile, propelled by John D. Rockefeller’s dominance of the oil industry, made a large impact with their philanthropy and preservation efforts. Although their ascents were similar, their legacies ended up looking quite different.
Rockefeller’s and Vanderbilt’s massive business ventures not only brought them unprecedented personal wealth, but also boosted the country’s industrial economy. At the same time, their reputations as “robber barons” emerged, amid criticisms that their successes came at the expense of fair competition, workers’ rights, and ethical standards. At the time, many Americans were living in poverty, a stark contrast to the glitzy guise of the Gilded Age that these wealthy families propped up.
John D. Rockefeller Was America’s First Billionaire
A century before Bill Gates and Jeff Bezos, there was John D. Rockefeller. When he was just 12 years old in rural New York, Rockefeller loaned a neighbor $50 of his own hard-earned money. When he received it back the next year with interest, he decided at that moment to let his money work for him instead of the other way around. This foresight and financial acumen lasted him a lifetime, helping him shape the landscape of American business and become the country’s first billionaire.
Trained and working as a bookkeeper by 16 years old, Rockefeller started his own company in agricultural trade within a few years. Through that business, he decided that the true future of industry was in moving raw materials, and at 24 years old, he moved into the oil business. Rockefeller went on to pioneer the American oil industry by founding Standard Oil (later broken up into companies that became Exxon, Chevron, and others). Although his business practices faced their fair share of accusations and criticisms over the years — including colluding to control the price of oil and creating a monopoly by buying competing refineries — Rockefeller amassed an unprecedented $1.4 billion net worth by the time of his death in 1937 (almost $30 billion today). As much as he made, he gave plenty away, too — his philanthropic gifts over the years totaled $530 million.
He’s a towering figure in American business history, but Cornelius Vanderbilt had little formal education. Born the fourth of nine children on Staten Island, New York, in 1794, Vanderbilt was pulled out of school to work on his father’s shipping boat when he was just 11 years old. By the time he was 16, “the Commodore,” as he became known, had bought his own boat to ferry cargo around New York Harbor. He got a job in the steamship industry and eventually went into business for himself.
Vanderbilt’s aggressive professional approach helped him accrue wealth quickly, and in the 1840s, he built the first of many large homes the family owned in New York (and elsewhere). When the California gold rush struck, Vanderbilt saw an opportunity: He launched a shorter steamship route from New York to San Francisco than had previously existed. It was an instant success, earning more than $1 million in one year (that’s almost $40 million today). Around this time, Vanderbilt also began to manage the railroads that connected textile mills on the East Coast to shipping ports. The shipping tycoon with no formal education also became a railroad tycoon.
John D. Rockefeller Celebrated the Anniversary of His First Job Every Year
It was referred to as his “most joyful holiday of the year” — the anniversary of the day John D. Rockefeller got his first job. On September 26, 1855, Cleveland company Hewitt and Tuttle hired a 16-year-old Rockefeller as an assistant bookkeeper. Every September 26 from that year forward, the magnate celebrated “job day.”
“All my future seemed to hinge on that day,” Rockefeller once said. “I often tremble when I ask myself the question: What if I had not got the job?” When the business magnate was 83 years old, The New York Times reported the events of his “job day” celebrations: Festivities included a round of golf (just nine holes, lest he spend too much time on the greens), working on his philanthropic ventures, and a luncheon with friends in which he detailed his job search so many years ago. Rockefeller said that just over three months after he started the job, on New Year’s Eve in 1855, he collected his first pay: $50.
On Christmas Eve in 1895, Cornelius Vanderbilt’s grandson George Vanderbilt invited friends and family into his opulent new home for the first time. George first visited Asheville, North Carolina, in 1888. Taken by the region’s picturesque mountainous charm, he began acquiring land, eventually amassing 125,000 acres for his future estate. It took six years for the Biltmore Estate, designed by architect Richard Morris Hunt, to take its iconic form. The 250-room French Renaissance chateau — now the largest home in the United States — even required its own brick factory, woodworking shops, and railway for transporting materials during construction.
Spanning 175,000 square feet — more than 4 acres of floor space — the famous mansion includes 35 bedrooms, 43 bathrooms, and 65 fireplaces. Despite the ostentatious footprint of the estate, George was also committed to sustainable land use practices, and following his death in 1914, his wife, Edith Vanderbilt, sold 87,000 acres to the federal government, creating the Pisgah National Forest.
The Rockefellers Made Their Fortune in the Oil Industry — Then Denounced It
At its peak, Standard Oil controlled more than 90% of the petroleum products market in the United States. Yet in the mid-2010s, two of the Rockefeller family charities announced plans to divest from the oil industry, walking away from the very product that made them billionaires. The organizations cited concerns about the environmental impact of fossil fuels and their contribution to climate change as the reason. While the move was largely symbolic, it certainly reaped rewards. In 2020, five years after announcing the divestment, the Rockefeller Brothers Fund — which puts approximately $15 million each year toward climate change efforts — reported that its oil-free portfolio posted bigger returns than a comparable portfolio with oil holdings over the same period. “Oil is obviously a definitional part of my family’s past,” Valerie Rockefeller, the great-great-granddaughter of John D. Rockefeller and chair of the RBF board of trustees, said in a press release at the time. “But it has no place in our future.”
Credit: FPG/ Archive Photos via Getty Images
The Vanderbilt and Rockefeller Dynasties Live on in Different Ways
Despite sharing historic and cultural space as two of the wealthiest and most powerful families in American history, the Vanderbilt and Rockefeller dynasties played out quite differently. The vast fortune built by Cornelius Vanderbilt — upwards of $100 million at the time of his death — deteriorated in subsequent generations. “When 120 of the Commodore's descendants gathered at Vanderbilt University in 1973 for the first family reunion, there was not a millionaire among them,” wrote Arthur T. Vanderbilt II in Fortune's Children: The Fall of the House of Vanderbilt. Journalist Anderson Cooper, one of the most famous living Vanderbilt descendants (his mother was Gloria Vanderbilt), has also detailed the lavish spending and poor planning that led to the family’s financial fall.
In contrast, the Rockefellers’ wealth has endured. Across several generations and almost 100 descendants, the family maintains a fortune in the billions. Unlike the Vanderbilts, the Rockefellers have relied on careful financial planning, strategic philanthropy, and diversified investments to keep the family thriving, along with, according to the modern-day Rockefellers, a system of family values and traditions that includes family get-togethers of upwards of 100 people at least twice a year.
Credit: Chicago History Museum/ Archive Photos via Getty Images
As early as the colonial era, the consumption of alcoholic beverages was a contentious issue in America. Drunkenness was generally frowned upon, and certain sectors of society believed that alcohol was nothing short of the devil’s juice. Tensions came to a head in the early 20th century, when the temperance movement (which advocated for moderation in all things), supported by groups such as the Anti-Saloon League, the National Prohibition Party, and women’s suffrage activists, convinced lawmakers to curtail what they saw as the calamitous and ungodly effects of alcohol.
The result was the 18th Amendment to the U.S. Constitution, ratified on January 16, 1919. One year after the ratification, the prohibition of alcohol in the United States began, and breweries, wineries, and distilleries across the country were shuttered.
Initially, the signs were positive. There was a significant reduction in alcohol consumption, booze-related hospitalizations declined, and there were notably fewer crimes related to drunkenness. But one thing never changed: Many people still enjoyed an occasional drink and weren’t willing to live completely dry lives. Enter bootleggers, speakeasies, and organized crime. The Prohibition era lasted until 1933, and marked a period of colorful characters, clandestine operations, and government corruption. Here are seven facts from this fascinating time in U.S. history.
The 18th Amendment prohibited “the manufacture, sale, or transportation of intoxicating liquors” within the United States, but it didn’t ban the consumption of alcohol at home. So, during the one-year grace period before Prohibition began, people — those who could afford it, at least — began stockpiling wine and liquor while it was still legal to buy. Once the cellars had been stocked and Prohibition began, there was a notable rise in home entertaining and dinner parties — a shift that transformed America’s drinking culture in a way that’s still felt to this day.
Despite the constitutional ban, certain legal loopholes made it possible to acquire alcohol. Doctors could prescribe whiskey for medicinal purposes, making a friendly neighborhood pharmacist a handy source of booze — not to mention an ideal front for bootlegging operations. Religious congregations were allowed to purchase communion wine, which led to an increase in church enrollment. Winemakers, meanwhile, began selling “wine bricks,” rectangular packages of entirely legal concentrated grape juice that could be used to make wine at home. The packaging even came with a handy “warning”: “After dissolving the brick in a gallon of water, do not place the liquid in a jug away in the cupboard for twenty days, because then it would turn into wine.”
The main source of liquor during Prohibition was industrial alcohol, the kind of stuff used to make ink, perfume, and camp stove fuel. Bootleggers could make about 3 gallons of barely drinkable — and dangerous — “gin” or “whiskey” from 1 gallon of industrial alcohol. But industrial alcohol was denatured, meaning it had additives to make it foul-smelling, awful-tasting, and poisonous. And while bootleggers found a way to recondition the denatured alcohol into cheap booze — colloquially known as “rotgut” — that was drinkable, it was still capable of causing blindness or death. On average, about 1,000 Americans died every year during the Prohibition era from drinking tainted liquor. Many estimates put the number even higher, with up to 50,000 total deaths from unsafe alcohol during Prohibition.
Like thousands of other Americans, members of the House and Senate, including many of those who had voted in favor of Prohibition, often sought out illegal alcohol. One of their main suppliers was a bootlegger named George Cassiday, who started off supplying hooch to two members of the House of Representatives. Demand for his services soon increased, and before long he was making 25 deliveries a day to House and Senate offices. A dapper gentleman, Cassiday was easily recognized by his emerald fedora, and he soon became known as the “man in the green hat.” He was arrested in 1930 and sentenced to 18 months in prison, but during his time in jail he was allowed to sign out every night and return the next morning. The same year he was arrested, Cassiday wrote a series of articles for The Washington Post in which he estimated that 80% of Congress drank illegally.
Al Capone’s Oldest Brother Was a Prohibition Enforcement Agent
Al Capone was the most famous of all the gangsters who came to prominence during the Prohibition era. Capone’s brothers Frank and Ralph were also mobsters. Then there was James Vincenzo Capone, the oldest of the Capone brothers, who later changed his name to Richard James Hart. He took a decidedly different path from his siblings: He became a Prohibition agent. By most accounts he was a daring and effective law enforcer, and his tendency to carry two ivory-handled pistols earned him the nickname “Two-Gun” Hart.
The End of Prohibition Made U.S. Constitutional History
Prohibition was, ultimately, a failure. At least half of the adult population wanted to carry on drinking, the policing of Prohibition was marred by contradictions and corruption, and with no actual ban on consumption, the whole thing became untenable. So, on December 5, 1933, the 18th Amendment was repealed by the 21st Amendment, bringing about the end of the Prohibition era. The 18th Amendment made constitutional history, becoming the first — and, to this day, only — constitutional amendment to be repealed in its entirety.
If for some reason you yearn for the days of Prohibition, you can always vote for the Prohibition Party. Yes, the anti-alcohol party, formed in 1869, still exists. Not only has it championed the cause of temperance for more than 150 years, but it’s also the oldest existing third party in the United States. And while the Democrats have their donkey and the Republicans their elephant, the Prohibition Party’s mascot is the camel — an animal that can survive without drinking for almost seven months.