Minnesota Historical Society/ Corbis Historical via Getty Images
Author Tony Dunnell
December 20, 2023
If we could travel back 100 years and land on a typical city street, we’d probably be mightily discombobulated. Some things would seem familiar: the buzz of the urban environment, people walking this way and that, and buildings with facades that could well still exist today. But looking around, we’d soon realize that we weren’t in Kansas anymore — or if we were, it would be Kansas City in the 1920s.
A century ago, America was going through a monumental change. For the first time in U.S. history, more people were living in urban areas than in rural areas. The cities were booming, and for many middle-class Americans, the 1920s were a decade of unprecedented prosperity. People were earning more and spending more, advertising had reached new levels of sophistication, and the automobile was changing the way we live.
So, before you step into that time machine, you’d better brace yourself. Here are seven things you’d find in a city street a century ago, back in the dizzying days of the Roaring ’20s.
Before the development of practical light bulbs, street lights typically used piped coal gas, oil, or kerosene as fuel. The first electric streetlights were installed in Paris in 1878, but these used unwieldy and harsh arc lamps. Then came inventors such as Joseph Swan in the U.K. and Thomas Edison in the U.S., both of whom patented revolutionary incandescent light bulbs in 1880. Incandescent street lamps became the norm in many cities throughout the world, and the 1920s saw a wave of patents filed for innovative new street lighting. These electric lights, however, were often placed where they were needed rather than lining a whole street. So, 100 years ago, a city street at night would not have been as brightly lit as it is today, and pedestrians would often find themselves walking from one pool of yellowish light to the next.
Public phones and phone booths began appearing in the U.S. not long after Alexander Graham Bell patented the first telephone in 1876. By the 1920s, wooden phone booths were a fairly common sight on many city streets, but the wooden construction made them hard to maintain, limiting their popularity. In some cities, you’d be more likely to come across a public telephone room, which contained multiple booths. Individual outdoor phone booths became truly commonplace in the 1950s, when glass and aluminum became the booth-building materials of choice. Today, of course, public phones are heading rapidly toward extinction, now that almost everyone carries a phone in their pocket.
The art deco style flourished in the United States during the 1920s, in the visual arts, architecture, and product design. Walking down a city street 100 years ago, you would have found art deco everywhere, from the facades of grand buildings to the window displays of newly emerging department stores such as Macy’s and Saks. By the decade’s end, the style would crown landmarks such as the Chrysler Building (completed in 1930) and the Empire State Building (completed in 1931). Characterized by bold geometric patterns, vibrant colors, and glamorous details, art deco became synonymous with the opulence and extravagance that defined the Roaring ’20s.
Thankfully, modern child labor laws ensure that we don’t see children working in the streets anymore. But 100 years ago, it was a common sight. In 1920, about a million children aged 10 to 15 were working in America, out of a total population of about 12 million children in that age range. The most visible were those working in city streets, in jobs such as flower seller, shoe shine, and courier. Children carried messages — and sometimes money and sales slips — throughout the city, facilitating daily commerce for banks, factories, and offices. Even more notable were the “newsies,” young children (some as young as 5) who sold newspapers in the street. But by the end of the decade, a growing preference for home delivery and tougher child labor laws led to the decline of the “newsie” in urban America.
If we traveled back 100 years, one of the first things we might notice is the fashion of the day. Men would be walking the streets wearing three-piece suits, thin bow ties, wingtip shoes, and the then-ubiquitous fedora hat. Sportswear was also becoming acceptable menswear, thanks in large part to the growing popularity of golf, which brought longer “plus four” trousers and wide-legged oxford bag pants to the urban milieu. Women’s fashion, meanwhile, reflected the newfound freedoms of the day. The dresses of the 1920s were loose, straight, and slender, with shorter hemlines. This was typified by the flapper style of the Jazz Age, with dropped waistlines and calf-revealing dresses — clothing that was stylish but also allowed women to move. New hairstyles completed the look, with bobs and waves becoming the defining cuts of the ’20s.
Today, we don’t encounter many horses on our city streets, but go back 100 years and you’d still occasionally see horse-drawn vehicles at work: peddlers’ carts, milk wagons, coal wagons, and even fire wagons. The heyday of horses, however, was coming to an end. In 1916, for instance, there were 46,662 horse-drawn vehicles in Chicago. By the end of the 1920s, this number had plummeted, and by 1940 there were fewer than 2,000 horse-drawn vehicles in the city. In New York City, meanwhile, the last horse-drawn fire engine was retired in 1922. The rise of the automobile had begun in earnest, bringing about a permanent change in the very nature of city streets.
Arguably no invention changed the everyday lives of Americans in the 20th century more than the automobile. Between 1900 and 1920, the number of cars increased dramatically, from 8,000 to 8 million. By 1929, there were more than 23 million automobiles on American roads. These early vehicles, of course, looked very different from the cars we drive today. A hundred years ago, the car of choice was the Model T Ford, which brought driving to the masses. By the early 1920s, more than half of the registered automobiles in the world were Model Ts. They were so popular that Henry Ford himself once quipped, “There’s no use trying to pass a Ford, because there’s always another one just ahead.” Pedestrians, however, found it hard to adapt to the new laws of the street. With the introduction of jaywalking laws — a campaign promoted by automobile manufacturers themselves — the streets became a place for cars, rather than for pedestrians, horse-drawn carts, and children at play, as they once had been.
Queen Victoria ruled Britain from 1837 until her death in 1901. Her reign of 63 years and 216 days was longer than that of any of her predecessors, and was exceeded only by Elizabeth II’s time on the throne. This period, known as the Victorian era, saw the British Empire expand to become the first global industrial power.
Fueled by the industrial revolution that began the previous century — which reshaped almost every existing sector of human activity — the era saw many breakthroughs in the arts and sciences (perhaps most notably, Charles Darwin’s theory of evolution) as well as great social change and political reforms. And, as people from the countryside began to move to urban industrial hubs in search of work, there was a rise in both education and affluence, further driving the wave of ideas and innovation.
Victorian-era Brits were avid inventors, and many of the creations from this time had a major impact not only in Britain but across the globe. That’s not to say that all Victorian innovations were a hit. The hat cigar holder, ventilating top hat, anti-garroting cravat, reversible trousers, and “corset with expansible busts” all rank among the less successful ideas. These failures, however, were far outweighed by the era’s many influential developments, some of which laid the foundation for our modern age, and are still used every day. Here are some of the greatest innovations of the Victorian era, from the telephone to the electric light bulb.
Scottish-born inventor Alexander Graham Bell is considered the father of the telephone, but a degree of controversy remains over who exactly invented the world-changing device. The American electrical engineer Elisha Gray filed a patent caveat for a similar device at the same Washington, D.C., patent office on the very same day, February 14, 1876. We’ll never quite know how things played out in the patent office, but Bell’s documents were filed first, and he was awarded the patent on March 7. A few days later, he made the first-ever telephone call. He called his assistant, Thomas Watson, with the now-famous words, “Mr. Watson, come here. I want you.”
Bell, who had lived in Boston since 1871, was keen to introduce his invention to Britain, where, as a young man, he had received an expansive Victorian education in Scotland and London, and where he first began his experiments in sound. In August 1877, he toured Britain with his wife Mabel (it was supposed to be their honeymoon), promoting his invention as he went. He even demonstrated the newly invented telephone to Queen Victoria herself, who was so impressed she asked to keep the temporary installation in place.
Photo credit: Miles Willis/ Getty Images Entertainment via Getty Images
Adhesive Postage Stamp
In 1837, English inventor Rowland Hill submitted a number of reforms to the British government regarding the existing postal system. Among his ideas was the use of an adhesive postage stamp. At the time, the postal service was unwieldy and rates were high — they were based on distance and the number of sheets in a letter, and the recipient paid for the delivery. Hill proposed a low-cost stamp based on weight, with the cost prepaid. This resulted in the Penny Black, the world’s first postage stamp, which cost a flat rate of one penny, regardless of distance. The idea was simple, but revolutionary. His adhesive stamp and associated reforms were soon adopted by other countries, and ultimately paved the way for modern postal systems around the world.
Photo credit: Print Collector/ Hulton Archive via Getty Images
Underground Railway
On January 10, 1863, the Metropolitan Railway opened to the public, becoming the world's first underground railway. Now part of the extensive London Underground, the original line ran for 3.75 miles between Farringdon Street and Bishop’s Road, Paddington. Hailed by The Times as “the great engineering triumph of the day,” it consisted of open wooden carriages pulled by steam locomotives. Between 30,000 and 40,000 people attempted to travel on the railway’s opening day, causing chaos on the fledgling underground system. Despite the initial bedlam — and the sulfurous fumes released by the locomotives — the line was a huge success. In its first year, it carried 9.5 million passengers, showing cities around the world that underground railways were an excellent solution to growing congestion problems.
Basic public toilets were part of the sanitation system of ancient Rome, but it wasn’t until the Victorian era that they became widespread — and flushable. This was all thanks to George Jennings, an English sanitary engineer who introduced his flushable public toilets — which he called “monkey closets” — at the Great Exhibition of 1851, held at London’s Crystal Palace. Visitors could spend one penny to use the facilities, and records show that 675,000 pennies were spent (this led to the expression “to spend a penny,” which Brits still use today to refer to a trip to the loo). Public toilets were soon installed in various cities across Britain. Typically, however, these facilities were designed only for men. Because of this, women were far more restricted than men when it came to traveling, something referred to as the “urinary leash.” But by the end of the era — thanks to campaigning and women’s role in the ever-growing economy — women’s amenities began opening in major cities.
Decades before Thomas Edison produced his famous incandescent light bulb in 1879, British inventors had already been working on the problem. James Bowman Lindsay and Warren de la Rue both created early versions of the light bulb, in 1835 and 1840, respectively. Then, in 1841, Frederick de Moleyns was granted the first patent for an incandescent lamp, which used powdered charcoal heated between two platinum wires. Next came English physicist and chemist Joseph Swan, who produced a primitive electric light in 1860 and, 20 years later, a practical light bulb. Both Swan and Edison applied for patents for their incandescent lamps in 1880. Litigation ensued, but was resolved when the two men formed a joint company in 1883. From there, there was no looking back. By 1893, even the chandeliers in Queen Victoria’s residence, Osborne House, had been wired for electricity.
Photo credit: Three Lions/ Hulton Archive via Getty Images
Pneumatic Tires
Robert William Thomson was a Scottish engineer who invented the pneumatic tire at the age of 23. He was granted the patent in 1846, for a hollow leather tire that enclosed a rubberized fabric tube filled with air. A set of his “aerial wheels” ran successfully for 1,200 miles on a horse-drawn carriage. At the time, however, the rubber required to make the inner tubes was prohibitively expensive, and so Thomson returned to solid tires. For 43 years, air-filled tires were all but forgotten.
Then came a second Scotsman, John Boyd Dunlop, who developed his own version of the pneumatic tire while trying to make his son’s bicycle more comfortable to ride. Dunlop patented his pneumatic bicycle tire in 1888. A year later, the competitive cyclist Willie Hume became the first man to fit his bike with Dunlop’s pneumatic tires, and he started winning races across Britain. Dunlop later lost his main patent when Thomson’s original patent was rediscovered. While Thomson is rightfully considered the inventor of the pneumatic tire, it’s Dunlop’s name that remains synonymous with tires today.
Our planet is home to many talented engineers. Termites, for example, build complex structures that rise up to 10 feet in height, their “bricks” bonded by bio-cementation. Spiders, meanwhile, weave intricate webs, which, like suspension bridges, are capable of bearing heavy loads in even the stormiest weather. Then there are beavers and their well-engineered dams, bees and their cellular hives, and industrious ants whose largest recorded contiguous colony stretches a truly incredible 3,700 miles.
Humans, of course, are in a league of their own when it comes to construction. For millennia, we have been building structures of awesome size and complexity: roads and bridges, cathedrals and stadiums, tunnels and skyscrapers. Among the innumerable structures built by humankind, some stand out for their sheer size and magnificence. Here are six of the greatest engineering marvels in history.
The Great Wall of China is widely considered one of the greatest engineering feats of all time. Built in stages from the third century BCE to the 17th century CE, this series of walls and natural barriers stretches for around 13,000 miles. (Still, despite a persistent myth, it is not visible from the moon or space, at least not with the naked eye.) The Great Wall was originally the idea of Emperor Qin Shi Huang, the founder of the Qin dynasty and the first emperor of a unified China, who wished to protect the country from barbarian attacks from the north.
Under his orders, many older, unconnected state walls and fortifications were removed and a number of existing walls were joined into a single system stretching from the eastern Hebei province to Gansu province in the west. The wall itself was 15 to 30 feet high, topped with ramparts of at least 12 feet, with guard towers distributed at regular intervals. Much of the Great Wall that we now see was constructed during the powerful Ming dynasty, starting from around 1474. Today, the most famous and iconic section of the Great Wall of China is located 43 miles northwest of Beijing. This section was rebuilt in the late 1950s and now attracts thousands of tourists every day.
The awe-inspiring Hagia Sophia was built under Byzantine Emperor Justinian I between 532 and 537 CE — a remarkably quick construction considering the size of the project. With its vast central basilica, complex system of vaults and semi-domes, and high central dome spanning more than 101 feet in diameter, it was the world’s largest cathedral for nearly a thousand years (until the completion of Seville Cathedral in 1520). Considered one of the greatest examples of Byzantine architecture, Hagia Sophia represents the region’s shifting political and religious affiliations. Built as a Christian church, in later centuries it became a mosque, a museum, and a mosque once again. These changes are reflected in the building’s design, which features minarets and inscriptions from Islam as well as extravagant Christian mosaics. Hagia Sophia was the inspiration for many other Ottoman mosques, and has long been considered a unique architectural masterpiece.
Perched high above the Urubamba River valley in a narrow saddle between two mountain peaks, with tropical cloud forests tumbling down below, Machu Picchu is simply mind-blowing. It’s hard not to look around and think, “How the heck did they build this here?” The site is situated at an elevation of 7,710 feet, and contains about 200 structures spread across zones dedicated to religious, ceremonial, astronomical, and agricultural activities, all surrounded by incredible farming terraces, connected by some 3,000 stone steps. Many mysteries still surround the construction and precise purpose of Machu Picchu. Historians estimate that it was built around 1450 CE and abandoned around 100 years later with the arrival of the Spanish conquistadors. The conquistadors never found Machu Picchu, and it stood for centuries known only to the peasant farmers who lived in the immediate area. It wasn’t until 1911, when American academic and explorer Hiram Bingham rediscovered the site, that the remarkable citadel was brought to the world’s attention.
The idea of building a waterway across the isthmus of Panama to link the Atlantic and Pacific oceans dates back to at least the 1500s, when Spanish explorer Vasco Núñez de Balboa realized that the two oceans were separated by a 50-mile stretch of land. It wasn’t until 1880, however, that excavations for a canal first began, led by France. Unfortunately, the original nine-year project was devastated by malaria, yellow fever, and other tropical diseases, and ended with bankruptcy and the loss of some 20,000 lives. In 1904, the United States began the project anew. A major sanitation effort was put in place to help minimize disease, and the entire infrastructure was modernized.
Perhaps most importantly, the project’s chief engineer, John Stevens, convinced President Theodore Roosevelt that the concept of the waterway must be changed from a sea level canal to a lock canal. The original locks (the system has since been expanded), each measuring 110 feet wide by 1,000 feet long, made it possible to lift ships 85 feet above sea level to the artificial Gatún Lake, and then back down to the sea. This incredible feat of engineering radically altered maritime travel and trade. By using the canal to cut across the isthmus, ships could avoid the lengthy trip around Cape Horn in South America, shortening sea voyages by about 8,000 nautical miles.
After six years of construction, the Channel Tunnel opened in May 1994, becoming the only fixed link between the island of Great Britain and the European mainland. The idea of building a tunnel between England and France was not new. French engineer Albert Mathieu had proposed such a project in 1802, his design featuring an artificial island halfway across for changing horses. Back then, the idea simply wasn’t technologically feasible, but by the 1980s, the tunnel project was given the green light. Digging began on both sides of the Channel, and the initial connection was completed in 1991. Three years later, the 31.5-mile tunnel was opened. It actually consists of three tunnels: two for rail traffic and a central tunnel for services and security. The tunnel descends to a depth of 246 feet below the sea bed. It is the third-longest railway tunnel in the world, and with 23.5 miles running under the English Channel, the world's longest undersea tunnel.
Built between 1998 and 2011, the International Space Station (ISS) is the largest single structure humans ever put into space. It also represents arguably the greatest multinational collaborative construction project in history, involving Europe, the United States, Russia, Canada, and Japan. Elements of the station were built in multiple countries beginning in the late 1980s. In 1998, Russia’s Zarya control module and the U.S.-built Unity connecting node were launched into orbit, where they were connected by American space shuttle astronauts. In 2000, following the installation of the Russian-built habitat module Zvezda, the ISS received its first resident crew. Since then, it has been continuously occupied. Altogether, the space station is larger than a football field and weighs almost 450 tons. Maintaining an orbit with an average altitude of 250 miles, it circles the Earth in about 93 minutes, completing 15.5 orbits per day. The ISS is a truly unique engineering masterpiece, and its sophisticated science laboratory allows us to carry out research that is simply not possible on Earth — paving the way for future missions to Mars and beyond.
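The orbital figures quoted above hang together arithmetically. As an illustrative sanity check (a rough sketch using standard textbook constants for Earth's radius and gravitational parameter, not official NASA values), Kepler's third law for a circular orbit reproduces roughly the stated period and daily orbit count:

```python
import math

# Back-of-the-envelope check of the ISS orbit figures quoted above.
# Assumed textbook constants (not official NASA values):
MU_EARTH = 3.986e14         # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6.371e6    # mean Earth radius, m
ALTITUDE_M = 250 * 1609.34  # average altitude of ~250 miles, in meters

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / mu)
a = EARTH_RADIUS_M + ALTITUDE_M
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
orbits_per_day = 86400 / period_s

print(f"Orbital period: {period_s / 60:.1f} minutes")  # ~92.5, close to the quoted 93
print(f"Orbits per day: {orbits_per_day:.1f}")         # ~15.6, close to the quoted 15.5
```

The small differences come from rounding the altitude and ignoring atmospheric drag, but the quoted numbers are consistent with basic orbital mechanics.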
In 1903, a Vermont doctor named Horatio Nelson Jackson drove from San Francisco to New York in a Winton touring car and became the first person to traverse the United States in an automobile. At the time, there were no more than 150 miles of paved road in the country, mostly concentrated within cities. The path that Jackson traveled was along rivers, mountain passes, flatlands, and the Union Pacific Railroad, and what roads he did encounter between cities were, in his description, “a compound of ruts, bumps, and ‘thank you m’ams’ [sic].” The trip took 63 days, 12 hours, and 30 minutes, but it inspired auto companies and other early car adopters to arrange trips of their own, sparking demand for long-distance highways.
The first automobile highways weren’t construction projects, and were referred to as “auto trails.” They were essentially suggested routes made up of existing thoroughfares, conceived of by private associations and codified with names such as Lincoln Highway, Victory Highway, National Old Trails Road, and so on. The associations marked the trails with signs or logos, and promoted the improvement of the routes, sometimes collecting dues from towns and businesses. Eventually, the U.S. government grew dissatisfied with this patchwork of private routes, and proposed the construction of a paved, nationally administered system of numbered highways. The proposal was adopted on November 11, 1926.
The numbered highways were a marked improvement over the auto trails, but nearly 30 years after their adoption, Congress approved the Federal-Aid Highway Act of 1956, revolutionizing the highway system by authorizing 41,000 miles of interstate roads. The interstates repurposed existing numbered highways, connecting and extending them for greater efficiency, and these roads remain our primary means of long-distance auto travel to this day. Let’s look at when some of the country’s biggest and most vital interstates were built.
I-70 is arguably the oldest interstate in the U.S. Among the interstate projects initiated by the Federal-Aid Highway Act of 1956, I-70 was the first both by date of initial construction (August 13, 1956, in St. Charles County, Missouri) and initial paving (September 26, 1956, just west of Topeka, Kansas). The highway runs through 10 states as it spans the center of the country west from Baltimore, connecting Pittsburgh, Columbus, Indianapolis, St. Louis, Kansas City, and Denver. I-70 also includes the highest car tunnel in the world, the Eisenhower-Johnson Memorial Tunnel near Denver, which sits at an average elevation of 11,112 feet. The most recently completed segment of the highway is the 12.5-mile stretch through the narrow Glenwood Canyon, finished in 1992.
Interstate 70 may be the first of the federal interstates to begin construction, but Interstate 80 likely has the oldest antecedents, as it approximates the route of Nebraska’s Mormon Trail (aka Great Platte River Road), dating back to the 1840s, and also parts of the Lincoln Highway auto trail from the late-1910s to mid-1920s. Its transcontinental span runs through 11 states. Construction of the modern-day I-80 began in Nebraska in 1957 and in Pennsylvania in 1958 (though the Delaware Water Gap Toll Bridge that later became part of I-80 was opened on December 16, 1953). A final 5-mile connecting segment was completed near Salt Lake City on August 17, 1986.
Interstate 90 is another federal interstate that traces its origins to an older auto trail: the Yellowstone Trail, founded in 1912, which ran from Plymouth, Massachusetts, to Seattle, Washington. The first segment of newly constructed road for I-90 was opened in Spokane, Washington, in November 1956. I-90 has the distinction of being the longest interstate, at 3,085 miles, and covers 13 states. The last link to its western terminus in Seattle was completed in 1993.
Route 66 was perhaps the most famous highway in the United States during the first half of the 20th century, inspiring a song and even a TV show. Interstate 40 is the longest of the five federal interstates that gradually replaced it, and it was I-40 bypassing Route 66’s final segment in 1984 that led to the iconic highway being decommissioned the following year. Construction of I-40 began in 1957 in North Carolina. Though the interstate stretches more than 2,500 miles between its eastern and western ends, its final segment was completed in 1990 in Wilmington, North Carolina — just 220 miles from its first segment’s completion in Kernersville, North Carolina.
Interstate 10 is the transcontinental highway with the southernmost span, running through eight states across the southern U.S. Similar to I-40, it served as a replacement for Route 66, primarily for the stretch between California and Arizona. Exact details about the first newly constructed stretch of I-10 are sparse, but construction most likely began in El Paso in 1960. The Papago Freeway Tunnel completed I-10’s final segment when it opened in August 1990.
Interstate 95’s 1,920-mile span from Houlton, Maine, to Miami, Florida, makes it the longest north-south interstate in the country. It crosses 15 states and Washington, D.C. (the most of any interstate), and it also established the first bus/carpool lanes in 1969. Since the route traverses more densely populated cities than any other interstate, its construction was often contentious, particularly in Philadelphia. The first new construction for I-95 began in the summer of 1956 in Richmond, Virginia, though the Connecticut Turnpike was the first stretch of I-95 to open. The final stretch of I-95, a long-unresolved gap on the border between Pennsylvania and New Jersey, was finally completed in the summer of 2018. The event also marked a larger milestone: the completion of the original federal interstate system first planned in 1956.
7 Items You Would Find in a Doctor’s Office 100 Years Ago
In many historical contexts, 100 years isn’t a very long time. But when it comes to science, technology, and medicine — particularly in the last century — it’s a veritable eternity. The seeds of modern medicine were just being planted in the early 20th century: Penicillin was discovered in 1928, physicians were still identifying vitamins, and insulin was a new breakthrough.
The doctor’s role itself was different than it is today, as preventative care was not yet an established practice; there was no such thing as a routine visit to a doctor’s office 100 years ago. A visit to the doctor typically meant that you were ailing (though in some cases during the Prohibition era, it meant that you and your doctor had agreed on a way around the alcohol ban). Thanks to advances in technology, doctors’ offices in the 1920s were also stocked with very different items than we see today. These are a few things you likely would have found there a century ago.
A metallic disc attached to a headband is generally considered part of a classic doctor costume, but what is the genuine article, exactly? It’s called a head mirror, and your doctor 100 years ago would’ve been wearing one. It wasn’t just an emblem; it served a core function: illuminating the ear, nose, or throat during an examination. The patient would be seated next to a lamp that was pointed toward the doctor, and the head mirror would focus and reflect the light onto the intended target. Today, the easier-to-use penlight and fiber-optic headlamp have largely replaced the head mirror, though some ENT specialists argue that the head mirror’s lighter weight and lower cost mean it may still have a place in contemporary medicine.
Photo credit: Marka/ Universal Images Group via Getty Images
Floor-Standing Spirometer
One hundred years ago, a spirometer was a large floor-standing unit made of metal, used to evaluate pulmonary function. The patient would breathe into a tube, and a dial on the top would indicate lung capacity and respiratory volume, allowing the doctor to diagnose pulmonary ailments. Today, spirometers are still very much in use, but they are much smaller and made of plastic. In fact, they’re so compact nowadays that patients can hold the entire unit themselves while they’re in use.
Photo credit: FPG/ Archive Photos via Getty Images
Electric Vaporizer
The electric vaporizer was similar to today’s at-home humidifiers, but it was more complex, and could be used to make vapor out of water or other liquid medication. In the doctor’s office, vaporizers were used to treat sinus or bronchial illnesses. Vaporizers were also used in hospital settings in order to administer anesthesia.
Considering that a doctor’s workspace is referred to as a “doctor’s office,” it follows that a classic wooden desk was generally found there 100 years ago. There is something especially archaic-looking about a doctor seated at a wooden desk, though, since today we’re used to nonporous antiseptic surfaces in any space where medical exams or procedures take place. Indeed, it didn’t take long for the doctor’s office to shift in that direction: By the 1930s, most doctors’ offices contained furniture made of enamel-coated metal.
Sometimes referred to as vinegar of ipecac, syrup of ipecac was (in small doses) an early form of cough syrup used as an expectorant to treat respiratory illnesses. In larger doses, it was used as a poison control agent to induce vomiting, especially in pediatric medicine. Available by prescription only in the early 20th century, it was eventually approved by the FDA for over-the-counter sale and recommended as an essential item for households that included young children. In 2004, the FDA began discouraging use of syrup of ipecac due to its lack of efficacy as a treatment for poison ingestion.
The doctor’s bag (also referred to as a physician’s bag or a Gladstone bag) was an essential item in an era where house calls were still part of a general practitioner’s array of services. The bag was usually made of black leather, and carried the most important and portable medical equipment: a stethoscope, thermometer, bandages, syringes, a plexor for testing reflexes, a sphygmomanometer to test blood pressure, and more.
The image of an early 20th-century doctor sharpening a knife may seem foreboding, even sinister. But in the doctor’s office setting, the sharpening stone was used to sharpen scissors or knives for cutting bandages, not for any sort of medical procedure. Surgeons, meanwhile, used a cold sterilization method that prevented scalpel blades from dulling. Rest assured: Since at least 1915, scalpels have used a two-piece design that enables the blade to be discarded and replaced with a new one after each use, so there’s no need for sharpening.
Photo credit: Transcendental Graphics/ Archive Photos via Getty Images
Author Kristina Wright
November 9, 2023
First developed in the late 1820s, photography combined art and science into one medium capable of capturing an image in the moment. The innovation transformed recorded history into something that could be documented in pictures as well as text. As the technology advanced, the medium exploded in popularity, making it possible for families to create snapshots of memories for future generations to appreciate. These early photographic portraits transport us back in time, painting a picture of a different way of life: Families were larger, clothes were bulkier, and postures were noticeably stiff and formal. But perhaps the most conspicuous difference of all is that no one ever seemed to smile.
The somber expressions preserved in early photographs might lead us to assume that past generations led austere and joyless lives. However, the lack of joviality in these snapshots can be attributed to several other factors. Here’s the truth behind those stern expressions in old photos.
In the earliest days of photography, the lengthy exposure periods made it impractical to photograph people. For instance, French inventor Nicéphore Niépce’s 1826 “View from the Window at Le Gras,” credited as the oldest surviving photograph, required an exposure time of eight hours. It was more than a decade before Louis Daguerre’s 1839 invention of the daguerreotype made portrait photography practical. But even then, it was a relatively slow and meticulous process that required the subject to remain still for as long as 20 minutes.
By the early 1840s, photographic technology had advanced further, and the daguerreotype images that once required a 20-minute exposure needed only 20 seconds to process. Still, even modern photo subjects understand the difficulty of maintaining an open-mouthed smile for any amount of time. It only takes a few moments for a candid smile to turn into something more like an awkward grimace. And anyone who has dealt with a restless child can attest that more than a few seconds of remaining motionless is a formidable challenge. To minimize movement and guarantee a sharp image, children were sometimes put into restraints for the length of a photo shoot.
Additionally, until the 20th century, the expense of photographic equipment and the toxic and dangerous chemicals needed to process film meant that most photographs were taken by professional photographers working out of studios or traveling with their equipment. A photography session was a time-consuming and pricey undertaking; it could cost the average person three months’ salary or more, and a person might only be photographed a few times in their life. The requirement for stillness, combined with the novelty and cost of posing for a professional photographer, created an atmosphere where it was simply easier to maintain a neutral or serious expression. But even once the technology existed to capture more relaxed expressions, it was a long time before smiling in photos became the norm.
Though technological limitations are frequently cited as the reason for the solemn expressions in old photographs, they weren’t the only reason our ancestors so often appeared stern in front of the camera. One notable feature shared by painted portraits from the 17th and 18th centuries and photographs from the early 19th century is the presence of stoic, enigmatic expressions on the subjects’ faces. As portrait artist Miss La Creevy observes in Charles Dickens’ novel Nicholas Nickleby, only two types of expressions existed in portraiture: “the serious and the smirk.”
Before photography, a painted portrait was the only way to preserve someone’s image for posterity. Having your portrait painted was an activity associated with wealth and social status, and accordingly, the art form had its own rules and expectations. This formal portraiture proved to be a big influence on early photographers, who featured their subjects in ways that represented their social status, occupation, or other interests. The social mores associated with painted portraits carried over into photographic portraiture, and smiling was discouraged.
Photo credit: Heritage Images/ Hulton Archive via Getty Images
Social Etiquette Frowned Upon Smiling
Some historians believe that advancements and accessibility in dental care may have contributed to more smiles eventually being captured on film. Other experts disagree, noting that for centuries, a lack of dental care was the norm and thus wasn’t considered to detract from a person’s physical appeal. Still, smiling for a photograph wasn’t commonplace in the early days of photography. In fact, instead of the modern directive to “say cheese!” to produce a wide, toothy grin, some photographers in Victorian-era England asked people to say “prunes,” forcing them to tighten their lips for a more socially acceptable expression based on the beauty standards and etiquette of the time.
In an era where open-mouthed grins were considered unacceptable and a smile was believed to signify someone was poor, drunk, lewd, or otherwise corrupt, it was rare for someone to choose to smile in a portrait — and even less likely that a photographer would encourage it. That all changed, however, with Kodak’s democratization of photography in the early 20th century.
As photography became more accessible in the late 19th century, a wider variety of people took and sat for photographs, and what was acceptable in portrait photography became less rigid. In 1888, Kodak founder George Eastman started a photographic revolution that put cameras in the hands of amateur photographers and gave them an instruction manual on how to take good photos. In 1900, the Kodak Brownie camera was marketed for children and sold for just $1, creating a photography craze that appealed to adults as well.
By the 1920s, a century after the first photographs were captured, more relaxed postures and a greater variety of expressions, including closed- and open-mouthed smiles, were common in both amateur and professional photography. With the advent of color photography, the popularity of candid photos, and the rise of affordable personal cameras, capturing an array of expressions — including moments of genuine joy — became the gold standard.
There’s nothing more frustrating than working your socks off only to see someone else get all the credit for your efforts. Spare a thought, then, for the minds behind some of history’s most significant innovations, who, despite months, years, or in some cases lifetimes of work, find someone else’s name ignominiously attached to their invention.
Sometimes inventions are miscredited in the public consciousness simply because a more famous name becomes associated with the creation. For example, Thomas Edison and Henry Ford — two of modern history’s most well-known innovators — are often credited with things they didn’t actually invent, through no fault of their own. Then there are the more insidious misattributions. In some instances, an idea has been copied or outright stolen, robbing the true inventor of their glory; in others, a more senior or prominent member of a team is given credit despite not coming up with the original idea. See, for example, the Matilda effect, in which notable discoveries made by women have often been misattributed to the men they worked with.
Here are some notable inventions in history that are frequently credited to the wrong person, from the flush toilet to the iPod.
No name in the history of toilets is more famous than that of plumber Thomas Crapper, partly because his name appeared on the once-ubiquitous Crapper brand of toilets, and partly because Crapper is a humorously appropriate name for a toilet (the slang word “crap” existed before Thomas Crapper). Crapper, however, did not invent the flushing device with which he is so associated. He did patent the U-bend and floating ballcock — key components of the modern toilet — in the late 1880s, but he never held a patent for the flush toilet. Much earlier, in 1596, John Harington, an English courtier and the godson of Queen Elizabeth I, described what can be considered the first flush toilet, which involved a 2-foot-deep bowl and a massive 7.5 gallons of water per flush. (Only two working models were made, one in Harington’s own home and one in Queen Elizabeth’s palace.) The first patent for a flushable toilet was granted to the Scottish inventor Alexander Cumming in 1775.
The Italian polymath Galileo Galilei is often credited with inventing the telescope, and it’s easy to see why. He gave birth to modern astronomy with his telescope-assisted discoveries about our moon, the moons of Jupiter, and other celestial bodies. Galileo made his first telescope in 1609 after hearing about the “perspective glasses” being made in the Netherlands. But the first person to apply for a patent for a telescope was Dutch eyeglass-maker Hans Lippershey in 1608, a year before Galileo. Lippershey’s telescope could magnify objects only three times, but it was nonetheless a landmark in the history of optics. (By comparison, by the end of 1609, Galileo had developed a telescope that magnified objects 20 times.) Whether Lippershey should be credited as the inventor of the telescope remains an open debate, as it is entirely possible that others created similar devices before he filed his patent.
Thomas Edison is often — and incorrectly — given all the credit for inventing the lightbulb. But the lightbulb was actually the result of a process that began before Edison was even born. In 1802, English chemist Humphry Davy used a voltaic pile (invented by Alessandro Volta, after whom the volt is named) to create the first “electric arc lamp” between charcoal electrodes. His rudimentary lamp was too bright and burned out too quickly, but it was nonetheless an important breakthrough. Other scientists worked to refine the lightbulb, but problems with filaments and batteries made these early bulbs impractical for everyday use. In 1860, English physicist Joseph Swan developed a primitive electric light that utilized a filament of carbonized paper in an evacuated glass bulb. Lack of a good vacuum and an adequate electric source ultimately made it inefficient, but it did pave the way for later innovations, including those by Edison. Edison purchased some of his predecessor’s patents, improved upon them, and came up with his own lightbulb, which, while not the first overall, was the first to be commercially viable.
Photo credit: Culture Club/ Hulton Archive via Getty Images
The Automobile
One commonly held misconception is that Henry Ford invented the automobile. In reality, the development of the automobile can be traced back to Nicolas-Joseph Cugnot, a French military engineer who, in 1769, built a steam-powered tricycle for hauling artillery. Because it was steam-powered, however, not everyone accepts Cugnot’s invention as the first true automobile. Instead, that distinction often goes to vehicles made by two Germans, Karl Friedrich Benz and Gottlieb Daimler, who — working entirely separately in two different German cities — developed their own gasoline-powered automobiles in the mid-1880s. Benz drove his three-wheeled vehicle in 1885, and it is regarded as the first practical modern automobile and the first commercially available car in history. As for Henry Ford, his name is forever remembered in auto history for the Model T, which he mass-produced using an innovative moving assembly line, making automobiles available to middle-class Americans.
Photo credit: Maurice Ambler/ Picture Post via Getty Images
Monopoly
Since the 1930s, it’s been common knowledge that Charles Darrow invented Monopoly, an idea that both he and the game’s manufacturer, Parker Brothers, freely propagated (it was printed in the instructions for decades). But it’s not quite true. Darrow got the idea for the game — which made him a millionaire — from a left-wing feminist named Elizabeth Magie. Magie created and patented an early version of Monopoly, called The Landlord’s Game, in 1903, about three decades before Darrow. Darrow learned about the game from a couple who had played it in Atlantic City (which is where many of the game’s street names come from) and made a few changes of his own. Magie’s original game included a wealth tax and public utilities, and was designed as a protest against the big monopolists of her time. It had two sets of rules: one that allowed players to create monopolies and crush their opponents, and an anti-monopolist version that rewarded all players when wealth was created (the latter demonstrating what Magie believed to be a morally superior path). It’s only in recent years that Magie has started to receive credit for inventing one of the world’s most popular and iconic board games.
Photo credit: Justin Sullivan/ Getty Images News via Getty Images
The iPod
Portable digital audio players have existed since the mid-1990s, but it was Apple’s iPod that revolutionized the industry upon its release in 2001. Yet it wasn’t the engineers at Apple who invented the iPod — not entirely, at least. British inventor Kane Kramer actually developed the technology behind the iPod as far back as 1979. His credit card-sized music player, which looked very similar to the iPod, could store only 3.5 minutes of music, but he was sure the storage capacity would increase over time. Unfortunately for Kramer, internal problems at his company ultimately led to his patent lapsing, at which point the technology became public. Apple later acknowledged Kramer’s involvement in inventing the technology behind the iPod.
7 Fascinating Facts From the History of the World’s Fair
The first world’s fair, known as the Great Exhibition, took place in London in 1851. Held in the Crystal Palace — a massive exhibition hall made of glass and iron — the fair displayed marvels of industry and science as well as works of craftsmanship and art from around the world. Since then, more than 100 world’s fairs have been held in over 20 countries, and countless inventions have made their debut at these massive events, from the telephone to cotton candy. Though the world’s fair has declined in popularity in the United States, it remains popular throughout much of the rest of the world. Here are seven highlights from the history of these fascinating exhibitions.
The 1893 World’s Columbian Exposition in Chicago (named in honor of Christopher Columbus) was ripe with opportunity for food sellers. But H.J. Heinz — an American purveyor of pickles and ketchup — was frustrated with his booth placement. While the main floor showcased food exhibits from Germany, Great Britain, Mexico, and other nations, Heinz was stuck on the second floor where there was little foot traffic. He devised a marketing plan that promised a free prize to anyone who visited his booth: a small green plastic pickle pin. The pins were a massive hit; the crowds that flocked to his booth were so large that the floor reportedly sagged around the display. By the end of the exhibition, Heinz had given away more than 1 million pickle pins, paving the way for his brand to become a household name.
The baby incubator — a lifesaving device in which premature or sick infants can develop — was invented in the 19th century by French obstetrician Stéphane Tarnier, who got the idea after seeing baby chicks being incubated at a zoo. The invention was widely adopted decades later, thanks to the work of two men, Pierre Budin and Martin Couney. Determined to popularize the groundbreaking technology, Budin and Couney displayed six incubators complete with real premature babies at the 1896 Great Industrial Exposition of Berlin, in an exhibit they dubbed “Child Hatchery.” The exhibit was so popular that Couney went on to set up a permanent exhibit in an unlikely location: the Coney Island amusement park in New York. For the next four decades, Couney managed a neonatal intensive care unit that saved thousands of babies while doubling as a carnival attraction. Despite not being a licensed doctor, Couney is now widely credited with the adoption of the baby incubator into mainstream medicine.
Thomas Edison’s New X-Ray Machine Was Almost Used When President McKinley Was Shot
The 1901 Pan-American Exposition was held in Buffalo, New York, and showcased many cutting-edge advancements in science and technology. But it was also the site of tragedy. While greeting the public at the fair, U.S. President William McKinley was shot twice by an anarchist named Leon Czolgosz. The first bullet only grazed McKinley, but the second bullet hit him in the stomach, and the medical team could not locate it. As fate would have it, one of Thomas Edison’s new X-ray machines was on display at the fair. Edison had an assistant bring a machine to the house where McKinley was staying, but the medical team decided the President’s condition was too unstable to undergo the X-ray procedure, and the device was never used. McKinley passed away a week later, leading some to wonder whether Edison’s invention might have saved his life.
Chicago Almost Became Home to a Bigger Eiffel Tower, But Got a Ferris Wheel Instead
Four years after the Eiffel Tower was built for the Paris International Exposition of 1889, a Chicago committee started to plan its own world’s fair, soliciting ideas from U.S. architects that would “out-Eiffel Eiffel.” Proposals included a 1,500-foot tower made of logs and what would have been the first bungee tower. The architect of the Eiffel Tower, Gustave Eiffel, even offered to build a larger version of his namesake landmark. Instead, the Chicago committee opted for something unique: the world’s first Ferris wheel, built by and named for George Washington Gale Ferris Jr. At 250 feet in diameter, and sitting atop 140-foot-tall towers, the Ferris wheel took riders higher than the crown of the Statue of Liberty. While the original Ferris wheel in Chicago has since been replaced by the Centennial Wheel, visitors from around the world continue to enjoy the architectural legacy of the world’s fair at Navy Pier.
A World’s Fair Helped a Woman Inventor Launch the Modern Dishwasher
In 1883, an American socialite named Josephine Cochrane grew frustrated with the tedious task of washing the fine china she used to entertain guests. She vowed, “If nobody else is going to invent a dishwashing machine, I’ll do it myself.” In 1886, Cochrane received a patent for her dishwashing machine, which could wash and dry up to 240 dishes in two minutes with its innovative use of water pressure. However, Cochrane struggled to sell her invention due to the high cost of manufacture, as well as the sexism of the time; potential investors wanted Cochrane to resign and turn over control of her company to men. The 1893 World's Columbian Exposition in Chicago gave her the platform she needed — after publicly demonstrating her machine, Cochrane was awarded the event’s highest prize, for “best mechanical construction, durability, and adaptation to its line of work.” Orders from restaurants and hotels throughout the region skyrocketed, paving the way for the modern dishwasher.
It’s easy to assume that video chatting is a recent invention that came along with the advent of the internet. But video chat technology has a history going back more than half a century. The public’s first contact with video chat was at the 1964 World’s Fair in New York, where Bell Labs debuted a “picturephone” that enabled fairgoers in New York to make video calls to strangers across the country at Disneyland in California. Long lines formed, and Bell Labs (along with its parent company, AT&T) believed the technology would be a commercial hit, with executives projecting that a million picturephone sets would be sold by 1980. Alas, the device failed to take off, largely due to the high price tag. Bell Labs attempted to roll out various iterations of the picturephone in the following decades, but it wasn’t until the advent of the internet that video chat finally took off.
Today, the term “snake oil” signifies fraudulent goods and deceptive marketing, and it all started with a man named Clark Stanley, nicknamed “the Rattlesnake King.” Stanley introduced a “snake oil” product to the American public at the 1893 World's Columbian Exposition in Chicago, claiming to have learned about it from Hopi medicine men. As part of a dramatic live demonstration, he cut open a rattlesnake and submerged it in boiling water, skimming off the fat that rose to the surface to create “Stanley’s Snake Oil.” Spectators were wowed, and Stanley’s product became an immediate hit. But while the oil from certain snakes, such as Chinese water snakes, does have medicinal properties, oil from most snakes native to the U.S. does not. What’s more, Stanley’s product was later found by the FDA to not contain any snake oil at all, but rather beef fat, red pepper, and turpentine, forever making snake oil synonymous with fraud.
Over the past century, the typical home kitchen has undergone a significant transformation, reflecting both social changes and new technology. In the 1920s and ’30s, kitchens were primarily utilitarian spaces with a focus on functionality and easy-to-clean surfaces. Appliances were limited, hand mixers had cranks, and gas ovens, which had replaced wood or coal-burning stoves in most homes, were themselves starting to be replaced by electric ovens.
The post-World War II consumerism of the late 1940s and 1950s brought bigger kitchens for entertaining and more labor-saving appliances, including blenders, mixers, and dishwashers. The kitchen space became more streamlined and functional, and the 1960s and 1970s brought countertop food processors and microwave ovens into the mainstream.
Open-plan kitchens and islands became increasingly popular in home design throughout the 1980s and ’90s, indicative of the kitchen’s role as a hub for family and friends to gather. That trend continued into the 21st century, along with a significant shift toward high-tech kitchens, smart appliances, and a focus on sustainability. Today’s kitchens — reflecting the changing ways we prepare, store, and consume food — look dramatically different than they did a century ago, making many once-popular items obsolete. Here are six things that your grandparents and great-grandparents might have had in their own home kitchens a century ago.
Photo credit: George Rinhart/ Corbis Historical via Getty Images
An Icebox
Before the widespread availability of electric refrigerators, iceboxes were used to keep perishable food cool. These wooden or metal boxes had a compartment for ice at the top, and fresh ice was delivered each week by an iceman. The design of the icebox allowed cold air to circulate around the stored items, while a drip pan collected the water as the ice melted. Naturally, iceboxes fell out of fashion as electric fridges went mainstream. In 1927, General Electric introduced the first affordable electric refrigerator, which relied on a refrigerant for cooling rather than ice.
Photo credit: FPG/ Archive Photos via Getty Images
A Butter Churn
Before commercial butter production made it possible to buy butter at the market, churning cream into butter was an activity done at home. The hand-crank butter churn was introduced in the mid-19th century, and it became the most commonly used household butter churn until the 1940s. In the early 20th century, the Dazey Churn & Manufacturing Company began producing glass churns that could make smaller quantities of butter much quicker than the larger, time-intensive churns. Once the butter was churned, it could then be poured or pressed into decorative molds for serving.
A Hoosier is a freestanding, self-contained kitchen cabinet that was popular in the early 1900s, named after the Hoosier Manufacturing Company that made it. Also known as a “kitchen piano” due to its shape, this kitchen necessity offered homemakers ample storage space and an additional work surface. Hoosier cabinets had numerous drawers and shelves for storing cookware and utensils, as well as features such as a flour bin with a built-in sifter, a sugar bin, a spice and condiment rack, a bread bin, a pull-out cutting board, and a cookbook holder. The all-in-one cabinet fell out of favor as kitchen designs began to incorporate built-in cabinets and islands for additional storage and counter space, though surviving Hoosiers are still sometimes used for decorative storage.
Photo credit: Camerique/ Archive Photos via Getty Images
A Manual Hand Mixer
While the iconic KitchenAid stand mixer was patented more than 100 years ago in 1919, electric hand mixers weren’t commercially available until the 1960s. Before then, beating eggs or mixing other ingredients was done by hand, often with a manual hand mixer (also called a rotary egg beater). First developed in the 1850s, hand mixers had two beaters that rotated when you turned a crank. Though the style and mechanisms evolved over the years, manual hand mixers were still widely used in the 1920s, when only two-thirds of American households had electricity.
Even though ground coffee was available in bags and cans in the 1920s, and instant coffee was gaining popularity, household coffee grinders, such as the wall-mounted coffee grinder (or mill), were still common kitchen fixtures. According to a 1918 New-York Tribune article on the art of making perfect coffee, “The real coffee lover will always have a mill in the kitchen.” The wall-mounted, hand-crank style had a glass container that could hold a pound of coffee beans, and a container with tablespoon markings to catch the ground coffee.
There was a time when treasured family recipes were written on 3-by-5-inch index cards and stored in a box on the kitchen counter. Before the 1920s, most recipes were passed on by example — young women would learn how to make their grandmother’s pot roast by helping her in the kitchen. As such, handwritten recipes were generally a list of ingredients, often without quantities, paired with only vague directions. As kitchen science developed, magazines began advertising recipe subscriptions delivered as preprinted, perforated cards. Women also started writing their own recipes on blank cards to collect and exchange, and the recipe box proved to be a more decorative and lasting storage solution than a shoebox. Like many vintage kitchen items, this nostalgic throwback still has novelty appeal, but the recipe box has largely been replaced by digital recipes stored on apps and websites.
As tensions rose on Earth during the Cold War, the United States and Soviet Union also vied for celestial supremacy. The space race between the two superpowers began shortly after World War II, and captivated the public until tensions finally eased in the 1970s. With the help of top scientists and talented pilots, Americans, Soviets, and other nations sought to do the seemingly impossible by conquering the final frontier. These decades were marked by scientific achievements and setbacks that make this space-obsessed era one of the most fascinating periods in the 20th century. Here are six facts about the space race.
Fruit Flies Became the First Animal Sent Into Space in 1947
Long before humans reached the stars, fruit flies became the first living organisms to be intentionally blasted into space. Beginning in 1946, the U.S. military conducted a series of experiments in New Mexico’s White Sands Missile Range with future space flight in mind. Utilizing V-2 ballistic missiles — which had been seized from Germany by the U.S. after World War II — the government propelled biological samples such as corn and rye seeds as far as 80 miles into the sky — well beyond the 66-mile distance that NASA now considers the limits of outer space. On February 20, 1947, a capsule containing fruit flies was affixed to one of these missiles and launched to a height of 67 miles above the ground. The flies were chosen to test the effects of cosmic radiation on living beings, and were perfect candidates for a number of reasons, including their small size, minimal weight, and a genetic code analogous to that of humans, containing similar disease-causing genes. As the rocket began its descent, the capsule detached and drifted back down to Earth using a parachute, and the flies remained alive and unaffected.
In November 1969, just four months after Apollo 11 landed on the moon, the Apollo 12 mission took to the skies. But what was scheduled to be a standard launch experienced near-disaster just 36.5 seconds into the flight, as lightning struck the Saturn V rocket. The unexpected event disrupted the onboard control panels, causing astronaut Dick Gordon to confusedly exclaim, “What the hell was that?” before yet another bolt struck at the 52-second mark. With alarms blaring and equipment malfunctioning, the puzzled astronauts continued to troubleshoot the spacecraft while not fully understanding what had happened. Ultimately, the crew shifted the craft to an auxiliary power supply that allowed the mission to continue as planned. Around three minutes into the flight, astronaut Pete Conrad wondered aloud if they’d been struck by lightning, and by the 11-minute-and-34-second mark, the crew was successfully floating in space. With disaster averted, the Apollo 12 astronauts became the second group of individuals to walk on the moon.
Photo credit: Space Frontiers/ Archive Photos via Getty Images
Alan Shepard Played Golf on the Moon
While the harrowing stories of Apollo 12 and Apollo 13 are widely known, the Apollo 14 mission produced one of the more lighthearted moments of the space race. On February 6, 1971, during a live broadcast of the Apollo 14 spacewalk, astronaut Alan Shepard produced a retractable six-iron golf club and took four swings on the moon’s surface. Given his bulky spacesuit, Shepard couldn’t grip the club with both hands and swung it solely with his right, causing him to miss the golf ball and connect directly with the lunar surface on both of his first two swings. Shepard hit the ball with his third swing, though it only traveled 24 yards. With his fourth and final shot, Shepard made flush contact and claimed the ball traveled “miles and miles and miles”; in reality, it only reached a distance of about 40 yards, though it remained airborne far longer than it would have on Earth, given the moon’s weaker gravity. In 1974, Shepard donated the golf club to the USGA Golf Museum in New Jersey, where it remains a popular artifact.
The First Joint U.S.-Soviet Space Mission Occurred in 1975
Despite a heated rivalry that lasted more than two decades, the United States and U.S.S.R. worked together on a joint space mission in 1975. The space race had been in full swing since at least 1955, when the two global powers announced their intention to launch satellites into orbit. The Soviets made history by sending the first man into space in 1961, and America landed the first man on the moon in 1969. After years of one-upmanship, tensions began easing in 1972 with the signing of a space cooperation agreement. On July 15, 1975, the nations jointly embarked on the Apollo-Soyuz mission, which served as a symbolic end to the decades-long space race. This mission included two separate spaceflights led by American astronaut Tom Stafford and Soviet cosmonaut Alexei Leonov, who later docked their crafts together in space and exchanged an international handshake. In the 1980s, talk of an International Space Station jointly managed by the United States and Soviet Union began, and assembly of the station started in 1998.
Photo credit: Hulton Deutsch/ Corbis Historical via Getty Images
Nations Around the World Joined the Space Race
While the history of the space race largely focuses on the United States and Soviet Union, other nations joined the fray in the late 1970s. The first non-American and non-Soviet pilot to reach outer space was Czech cosmonaut Vladimír Remek, who studied flight in Moscow. On March 2, 1978, Remek took off aboard the Soviet Soyuz 28 spacecraft and headed for the Salyut 6 space station, where he and his co-pilot conducted research for eight days before returning to Earth. Upon his return, Remek was heralded as a hero by his native Czechoslovakia (now Czechia and Slovakia), paving the way for other nations to send humans into space shortly thereafter. Later that year, Polish pilot Mirosław Hermaszewski and East Germany’s Sigmund Jähn both boarded Soyuz missions of their own. In 1980, the Soyuz program also sent the first pilots from Latin America (Arnaldo Tamayo Méndez of Cuba) and Southeast Asia (Phạm Tuân of Vietnam) into outer space.
If you were alive at the time, it’s hard to forget where you were when humans first walked on the moon. While those vivid memories remain, it’s actually been more than 50 years since someone last set foot on the lunar surface. In 1969, Neil Armstrong became the first of the 12 people (all Americans) who have set foot on the moon. The final moon walk to date occurred just three years later as part of the Apollo 17 mission, with astronauts Eugene Cernan and Harrison H. Schmitt. They landed their spacecraft in the Taurus-Littrow valley, a narrow lunar gorge deeper than the Grand Canyon. The pair explored the region for seven hours a day over the course of three straight days, and even suffered a minor accident in the process after Cernan accidentally dropped a hammer on their lunar rover. The astronauts also discovered orange soil as evidence of lunar volcanic activity, which proved to be one of the most important discoveries of any Apollo mission. Before leaving, they left behind a plaque that reads, “Here man completed his first explorations of the moon.” Nobody has returned to the moon since.