Long before laptops and smartphones, a 19th-century Englishwoman named Ada Lovelace created what many consider to be the world’s first computer program. Lovelace was born in 1815 to famed poet Lord Byron and philanthropist Annabella Milbanke Byron, though she never had a relationship with her father and was raised alone by her mother. Annabella feared that Ada would inherit her artistic father’s perceived “insanity,” and so encouraged her to study grounded disciplines such as logic and math. Lovelace grew fond of those pursuits and developed a keen interest in the inventions of English mathematician Charles Babbage, whom she met in 1833. Babbage told Lovelace of his plan to create a complex calculating machine known as the Analytical Engine — the precursor to the modern computer — and Lovelace was eager to contribute to the project.
In 1843, Lovelace was asked to translate a French account of one of Babbage’s lectures overseas, and Babbage encouraged her to expand the paper with her own thoughts. In August of that year, Lovelace published the 66-page translation, which included 41 pages of appendices containing additional theories and formulas. The most famous of these notes is “Note G,” which has been deemed the world’s first computer program. In the table accompanying this note, Lovelace showed how the machine could theoretically calculate a sequence of rational numbers known as Bernoulli numbers. Though the machine was never built, and thus was never able to execute Lovelace’s calculations, her work laid the groundwork for the future of computer programming.
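To give a sense of the calculation Lovelace targeted: the Bernoulli numbers can be computed with a standard recurrence derived from their defining identity, where each number is a combination of all the earlier ones. The sketch below is a modern Python illustration of that recurrence using exact fractions; it is not a reconstruction of Lovelace’s actual Note G program, which expressed a related computation as a sequence of operations for the Analytical Engine.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 through B_n as exact fractions.

    Uses the classical recurrence (with the B_1 = -1/2 convention):
        B_0 = 1
        B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j   for m >= 1
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Each new number depends on every earlier one, weighted
        # by binomial coefficients.
        s = sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m))
        B[m] = -s / (m + 1)
    return B
```

For example, `bernoulli(4)` yields 1, -1/2, 1/6, 0, -1/30. Exact `Fraction` arithmetic is used because the Bernoulli numbers are rationals, and floating point would silently lose precision as the numerators and denominators grow.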