Andrew Jackson’s parrot was kicked out of his funeral for swearing.

Andrew Jackson
Photo credit: Library of Congress/ Unsplash
Author Michael Nordine

June 9, 2023

Before cats and dogs became the go-to, a number of U.S. Presidents had unusual pets. Thomas Jefferson had bear cubs, Calvin Coolidge kept a raccoon in the White House, and Theodore Roosevelt counted guinea pigs, a bear, and a hyena among his dozens of animals. Against that backdrop, Andrew Jackson’s African grey parrot, Poll, seems almost ordinary; what is strange is that the bird was kicked out of Jackson’s funeral for swearing. This knowledge comes to us courtesy of Samuel G. Heiskell’s book Andrew Jackson and Early Tennessee History, which quotes Reverend William Menefee Norment as observing: “Before the sermon and while the crowd was gathering, a wicked parrot that was a household pet got excited and commenced swearing so loud and long as to disturb the people and had to be carried from the house.”

Sadly, the good reverend didn’t detail the specifics of Poll’s outburst beyond saying that it was “excited by the multitude and… let loose perfect gusts of ‘cuss words,’” causing some to be “horrified and awed at the bird’s lack of reverence.” What became of Poll following the funeral is unknown, but it isn’t uncommon for African greys to outlive their owners — they’ve been known to live up to 60 years in captivity.

Victorian-era women ate arsenic as a beauty treatment.

Arsenic on periodic table
Credit: magnetix/ Shutterstock
Author Sarah Anne Lloyd

March 27, 2024

In the United States and Europe, a ghostly pallor was the height of fashion among Victorian-era women. Pale skin signaled high class, both because it meant that you never had to work in the sun and because wasting away from consumption (the disease now known as tuberculosis) had become associated with beauty in certain affluent circles. In the late 18th century, wealthy women started romanticizing the extreme thinness, near-translucent skin, and rosy cheeks of those who suffered from the disease, an attitude that peaked in the mid-19th century. Tuberculosis, while devastating, brought out features that some already considered attractive, and beautiful women were even thought, falsely, to be particularly vulnerable to the illness. Women chasing the fashion wore tight corsets and full skirts to show off their tiny waists and made their faces as pale as possible. And if they didn’t already have a ghostly complexion, they could get the look in other ways — such as through long-term arsenic exposure.

In 1851, a Swiss physician published a report in a medical journal about the “toxicophagi,” a group of people in modern-day Austria who routinely consumed arsenic; they knew it was poison, but thought they could develop an immunity to it by starting with small doses and gradually increasing the intake. The report’s author claimed that arsenic gave them great energy, sparkling eyes, and wonderful complexions, but noted that after long-term use, unsurprisingly, “most arsenic eaters end with an inevitable infirmity of the body.”

After the article was published, beauty writers told stories of women in Bohemia (modern-day Czechia) who bathed in “arsenic springs” as skin care. It became a popular treatment for women chasing a naturally wan look — as opposed to the “painted ladies” of the day, who used heavy makeup to appear pale. Arsenic-based “complexion wafers” started hitting store shelves; Dr. James P. Campbell’s Safe Arsenic Complexion Wafers promised relief from blemishes and “a deliciously clear complexion,” and were sold well into the 20th century. These wafers reportedly had a very low dose of the toxin, but because there is no such thing as “safe” arsenic, the wafers were still fatal for some consumers. Legitimate doctors warned against their use, and at least one physician in San Francisco worried that arsenic poisoning was going undiagnosed because women neglected to tell their doctors they were taking it. 

Limping was a fad in Victorian England.

Queen Alexandra
Credit: Alpha Stock/ Alamy Stock Photo
Author Sarah Anne Lloyd

March 13, 2024

In Victorian-era Britain, few fashion icons were as influential as Alexandra of Denmark, aka Alexandra, Princess of Wales, who became Queen Alexandra in 1901 when her husband took the throne after the death of Queen Victoria. Alexandra married Albert Edward, Prince of Wales — later King Edward VII — in 1863, joining the royal family just as photography rose to prominence, and her style spread through Britain like wildfire. When she started wearing broad, elaborate, choker-style necklaces — supposedly to cover up a scar on her neck from childhood — they caught on, and stayed in fashion for the next several decades.

After the birth of her third child in 1867, the princess developed a severe case of rheumatic fever that left her with a stiff knee and a pronounced limp, and she sometimes used mobility aids such as walking sticks to get around. It was far from a style choice, but high-society ladies were so eager to imitate her that they even adopted her gait in a trend known as the “Alexandra limp.” At first, Alexandra’s able-bodied imitators wore mismatched shoes to get the walk, but eventually retailers took notice. Shops started stocking pairs of shoes with two different heel heights to capitalize on the limping craze. Sometimes a walking stick completed the look. Even at the time, the trend was considered in poor taste. “There must be a line at which even fashionable folly may be expected to stop short… at the caricaturing of human infirmity,” read a column published in the Scottish newspaper Courier and Argus. Ultimately, the Alexandra limp didn’t have the same longevity as thick chokers, and the trend passed quickly.

St. Patrick was originally associated with the color blue, not green.

St. Patrick illustration
Credit: duncan1890/ iStock
Author Bennett Kleinman

March 7, 2024

Long before St. Patrick’s Day became synonymous with the color green, its namesake saint — and even Ireland as a whole — was more closely associated with various shades of blue. St. Patrick is often credited with spreading Christianity throughout Ireland, and he became known as the patron saint of the country. The earliest known depiction of the saint — found in a 13th-century French manuscript — shows him clad in a blue robe, and he was often associated with the cool hue. 

The color blue was also used to represent Ireland itself, starting with the actions of King Henry VIII in 1541. The English monarch declared himself king of Ireland and presented the Irish kingdom with a new coat of arms featuring a golden harp on a dark blue background — the first link between Ireland and the color blue. Two centuries later, in 1783, Britain’s King George III established the Order of St. Patrick in Ireland, whose members wore outfits in a shade known as “St. Patrick’s blue.” However, these symbols were imposed upon the Irish people by their English oppressors, and the color blue never reflected the true Irish identity. It wasn’t until the Irish Rebellion of 1798 that the Irish adopted the color green — a shade also embraced by nationalists during an earlier 1642 rebellion — as a symbol of national pride, replacing the old blue colors found in many Irish flags and emblems. Around this time, green was also introduced to St. Patrick’s Day festivities, and it became the standard hue of the holiday shortly after.

Jimmy Carter was once attacked by a “swamp rabbit.”

Jimmy Carter in boat, 1979
Credit: Zuri Swimmer/ Alamy Stock Photo
Author Michael Nordine

March 7, 2024

Most Presidents are known to have favorite vacation spots when they need to temporarily escape the hustle and bustle of Washington, and for Jimmy Carter it was his farm in his hometown of Plains, Georgia. During one such getaway on April 20, 1979, however, the commander in chief was relaxing on a boat in a pond when what has since become known as the “killer rabbit attack” occurred. The swamp rabbit in question, which, according to The New York Times, was said to be “hissing menacingly, its teeth flashing and its nostrils flared,” swam toward Carter until he shooed it away with his paddle. As unbelievable as the incident might sound — including to Carter’s own staff, whose skepticism stung him — it was photographed for posterity.

“It was a killer rabbit,” one skeptical staff member said upon seeing the picture. “The President was swinging for his life.” Carter himself downplayed the incident, describing his supposed attacker as “just a nice, quiet, typical Georgia rabbit” that was likely fleeing from a predator. The media had its fun with the offbeat story nevertheless. The Washington Post ran a front-page article titled “Bunny Goes Bugs: Rabbit Attacks President,” accompanied by a Jaws-style poster titled Paws, and the Associated Press published an even more sensationalized cartoon of the encounter. 

Doctors warned women of developing “bicycle face” from cycling in the 19th century.

1800s woman cycling
Credit: ZU_09/ iStock
Author Sarah Anne Lloyd

March 7, 2024

The modern bicycle — originally called a “safety bicycle” because it wasn’t as treacherous as a big-wheel penny-farthing — was invented in the 1880s, ushering in the 1890s bike craze in America. Cycling was especially popular with women, as it offered freedoms they hadn’t had before, such as the ability to travel where they pleased, go on unchaperoned dates, or skip church. Female cyclists also began wearing bloomers under skirts, which, in the eyes of some who disapproved, were a little too close to pants. The popularity of cycling (and its implications for women’s empowerment) caused something of a moral panic. Men weren’t immune — some religious leaders worried about physical exertion, competitiveness, and performance-enhancing drugs — but women got the bulk of the ire. Cycling, some medical authorities claimed at the time, could lead to uterine displacement, or a new condition called “bicycle face.”

Descriptions and alleged causes of bicycle face varied; according to one magazine, a woman suffering from the malady would be “usually flushed, but sometimes pale, often with lips more or less drawn, and the beginning of dark shadows under the eyes, and always with an expression of weariness.” One physician said that those suffering from bicycle face have “an anxious look and an unwholesome pallor.” Others said that symptoms include a clenched jaw and bulging eyes. Nobody was immune to bicycle face, but women were considered much more susceptible. Theories as to the cause included overexertion from trying to keep the bike balanced, bad posture, or even a more spiritual cause: riding bikes on the Sabbath. Fortunately for cyclists, the crisis subsided in the early 1900s as the bicycle became more commonplace and hand-wringers turned their anxiety toward automobiles — and, naturally, “horseless carriage face.”

A woman named “Diot Coke” was born in the Middle Ages.

Diet Coke bottles on ice
Credit: Steve Cukrov/ Alamy Stock Photo
Author Michael Nordine

February 27, 2024

Roughly 600 years before a certain sugar-free soda was created, a newborn entered the world with a most bubbly name: Diot Coke. The woman’s existence was rediscovered many centuries later by George Redmonds of the British National Archives, who happened upon Ms. Coke while researching 14th-century names. His contention is that her first name was a diminutive version of “Dionisia,” which at the time was a rather popular name that evolved into “Denise,” while her last name was a corruption of “Cook.”

Though little is known about Diot — her hopes, her dreams, her beverage of choice — beyond the time and place (Yorkshire, England) in which she lived, plenty is known about the world’s most popular diet soda. Diet Coke was introduced in 1982, nearly two decades after Coca-Cola’s initial low-calorie product, Tab, entered the marketplace in 1963; it eschewed sugar in favor of artificial sweeteners and originally carried the slogan, “Just for the taste of it, Diet Coke!” Tab was discontinued in 2020 after 57 years of production, but sales figures suggest Diet Coke won’t leave shelves anytime soon.

The world’s oldest film is two seconds long.

Louis Le Prince, 1880s
Credit: Svintage Archive/ Alamy Stock Photo
Author Kerry Hinton

February 27, 2024

Although Thomas Edison is often credited as the “father of motion pictures,” he wasn’t the first inventor to make a movie. In 1888, a French photographer named Louis Le Prince made his mark on cinema with a two-second, 24-frame celluloid film titled Roundhay Garden Scene, widely believed to be the oldest film in existence. This debut predated better-known works such as Edison’s Monkeyshines (1889) and the Lumière brothers’ Workers Leaving the Lumière Factory (1895).

In the late 19th century, Le Prince and dozens of other inventors were in a race to create and project moving pictures. Inventions such as the Phenakistoscope (Joseph Plateau, 1833) and the Zoopraxiscope (Eadweard Muybridge, 1879) were able to simulate movement, but they relied on images created with multiple cameras. In 1888, Le Prince patented his Single-lens Cine Camera, the first device to record and reproduce pure motion — arguably the first movie camera. In October that year, he used the groundbreaking device to shoot Roundhay Garden Scene in Leeds, England, where he had relocated with his family. The cast wasn’t brought on for its acting chops: It included Le Prince’s son (quite possibly cinema’s first nepo baby), parents-in-law, and a friend. The resulting film could be called one of the earliest examples of slow cinema: Four people walk in circles in front of a camera for 2.11 seconds. 

Sadly, before he could demonstrate his inventions to the public, Le Prince disappeared on September 16, 1890, after boarding a train in Dijon, France. He was never seen again, but his pioneering inventions give a glimpse of what else he could have accomplished. According to British filmmaker David Wilkinson, “He would have done what Edison and then the Lumieres did, but before them. He would have been known.” 

In 1712, Sweden had a February 30.

Swedish flag
Credit: fretschi/ Shutterstock
Author Bennett Kleinman

February 21, 2024

February is normally the shortest month of the year, but in 1712, Sweden extended the month all the way to February 30. This calendrical anomaly occurred as the country awkwardly shifted between the Julian and Gregorian calendars, which had about a 10-day gap between them. Pope Gregory XIII had introduced the latter calendar in 1582 to fix large discrepancies between the solar year and the calendar date that had accumulated under the Julian calendar. Nations around the world slowly adopted the new calendar, and Sweden finally opted to do so in 1700.

The year 1700 happened to be a leap year in the Julian calendar, but not in the Gregorian version, widening the gap even further; March 1 in the Julian calendar corresponded to the Gregorian March 12. Sweden planned to gradually switch to the Gregorian calendar by omitting 11 leap days over the course of 40 years, but that plan was derailed when leap years were still mistakenly observed in 1704 and 1708. By 1712, Sweden’s timekeeping was such a mess that the country planned to shift back to the Julian calendar starting on March 1. (It also wanted to ensure that Easter would be celebrated on a Sunday.) To accomplish this, Sweden added February 29 — as 1712 was already a leap year to begin with — plus an extra day, February 30, to make up for the leap day it had omitted back in 1700. The country finally made the permanent shift to the Gregorian calendar in 1753, bridging the 11-day difference by jumping from February 17 to March 1.
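The 11-day gap described above can be sanity-checked with the standard Julian Day Number formulas, which convert a date in either calendar into a single running day count. The sketch below is purely illustrative — the function names are made up for this example, not taken from any library — but the arithmetic is the textbook conversion.

```python
def jdn_gregorian(year, month, day):
    """Julian Day Number of a date in the (proleptic) Gregorian calendar."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045

def jdn_julian(year, month, day):
    """Julian Day Number of a date in the Julian calendar."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - 32083

# Julian March 1, 1700 and Gregorian March 12, 1700 name the same physical day:
assert jdn_julian(1700, 3, 1) == jdn_gregorian(1700, 3, 12)

# So after the Julian leap day of 1700, the calendars sat 11 days apart:
gap = jdn_julian(1700, 3, 1) - jdn_gregorian(1700, 3, 1)  # gap == 11
```

The only difference between the two functions is the Gregorian century correction (`- y // 100 + y // 400`), which is exactly the rule that made 1700 a leap year in one calendar but not the other.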

The “New York Times” logo originally had a period at the end.

The New York Times, 1942
Credit: Everett Collection Historical/ Alamy Stock Photo
Author Bennett Kleinman

February 14, 2024

When The New York Times ran its first issue on September 18, 1851, there was a period at the end of the now-iconic logo. However, after more than a century of use, that period was removed — in part to save money on ink. The company was initially called the New-York Daily Times before undergoing its first visual revamp in 1857, when its name changed to just The New-York Times. The masthead was modified even further in 1896, when the paper removed the hyphen from its logo. But it wasn’t until 71 years later that the period was finally axed as well.

Around the middle of the 20th century, the Times sought out ways to modernize its look. In 1967, art director Louis Silverstein redesigned the logo to appear stronger and more visually appealing, by making the thicker parts of the font even thicker and the thinner parts thinner. At the same time, he did away with the period at the end, which he believed made the overall logo look less sleek and more archaic. The change had an added financial benefit, too: Silverstein estimated that dropping the period would save roughly $600 in ink costs each year (around $5,500 today). The paper’s logo has remained largely unchanged since then, outside of slight modifications to the thickness of the lettering.