“Computers” used to be people.
In today’s English lexicon, the word “computer” almost exclusively refers to electronic devices — but it used to be a human job. For centuries, “computer” meant “one who computes,” often someone working in an astronomical observatory or on a surveying team. This definition dates all the way back to the early 1600s, long before even the most primitive digital computing machines existed.
The role of computer was, more often than not, filled by women. Although the work demanded considerable skill and produced major contributions to the field of astronomy, computing was considered clerical work. Beginning in the late 1870s, the Harvard College Observatory hired several dozen women as computers to compare photographic plates of the night sky and painstakingly measure the differences in stars’ positions. Among them were Williamina Fleming, who pioneered a system for classifying stars by their spectra; Annie Jump Cannon, who refined that work into the lettered spectral classification scheme scientists still use today; and Henrietta Swan Leavitt, who discovered roughly half of all variable stars (stars whose brightness appears to change when viewed from Earth) known at the time.
Perhaps the best-known human computers were those employed by NASA to make calculations by hand during critical space missions. Katherine Johnson, one of the three African American NASA computers featured in the book and film Hidden Figures (along with Dorothy Vaughan and Mary Jackson), performed calculations for the Mercury and Apollo missions, including the first moon landing.