Happy 200th Birthday, Ada Lovelace



Augusta Ada King, Countess of Lovelace was born 200 years ago today.

Ada, as she would be known throughout her life, was born on December 10, 1815. She was the daughter of Anne Isabella Milbanke and the poet Lord Byron, although her parents separated when she was only four months old and she never met her famous father.

From an early age, Ada found mathematics fascinating, something her mother was all too happy to encourage in the hopes that it would keep Ada from becoming too much like her philandering father.

A modern reproduction of the Babbage Analytical Engine at the Computer History Museum. Photo by Rob Huddleston.

Ada was introduced to Charles Babbage in 1833. Babbage was then developing what he called a “Difference Engine”: a giant mechanical calculator that could quickly and accurately tabulate polynomial functions using the method of finite differences. Ada, however, saw much more potential in machines like it. Over the next several years, she and Babbage exchanged dozens of letters. In 1842, Ada translated Italian engineer Luigi Menabrea’s article on Babbage’s next great design, the Analytical Engine, and at Babbage’s suggestion added her own annotations. Those notes ended up three times longer than the article itself, and in them she argued that the Engine could become a general purpose computer. She went so far as to lay out, step by step, how the machine could compute Bernoulli numbers, which today is considered the world’s first computer program, and by that logic, Ada is considered by many to be the first programmer.
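The principle behind the Difference Engine, the method of finite differences, is simple enough to sketch in a few lines of modern code. This is a loose illustration of the mathematics, not of Babbage’s mechanism: once the initial differences of a polynomial are seeded, every further value comes from additions alone.

```python
# The method of finite differences: seed each difference column once,
# then produce every subsequent polynomial value using only addition --
# exactly the economy Babbage's gears exploited.

def difference_table(f, degree, start=0):
    """Seed the leading entry of each difference column for f."""
    row = [f(start + i) for i in range(degree + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(f, degree, count, start=0):
    """Produce `count` consecutive values of f using additions only."""
    diffs = difference_table(f, degree, start)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # Advance every column one step: each entry absorbs the
        # (old) value of the next-higher difference.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

# Euler's prime-generating polynomial x^2 + x + 41.
print(tabulate(lambda x: x * x + x + 41, 2, 5))  # [41, 43, 47, 53, 61]
```

Note that after the table is seeded, no multiplication ever occurs, which is what made a purely mechanical, gear-driven implementation feasible.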

She married William King in 1835. When he was made Earl of Lovelace in 1838, Ada became Countess of Lovelace. They had three children: Byron, Annabella, and Ralph. Ada died of uterine cancer at age 36 and is buried next to her father in Nottinghamshire.

It’s worth noting that two other computer pioneers also have birthdays this week:

Admiral Grace Hopper, born December 9, 1906

Grace Hopper on her promotion to Commodore. Image by Pete Souza [Public domain], via Wikimedia Commons
Grace Hopper was born Grace Murray in New York City. She was curious from the start, famously dismantling her family’s clocks as a child to figure out how they worked. She earned her bachelor’s degree in mathematics and physics from Vassar at age 22, then her master’s and PhD from Yale over the following four years.

In World War II, she enlisted in the Navy Reserve. In 1944, she was assigned to work on the Mark I computer with Howard Aiken, with whom she co-authored three papers on the machine. After the war, she moved on to the UNIVAC project, where she developed the world’s first compiler. She became convinced that programming languages should read more like English than like machine language. In 1959, she served as a technical consultant to the conference that developed COBOL, which built upon FLOW-MATIC, an earlier language she had created.

She remained in the Navy until 1986, when, thanks to a special act of Congress, she retired with the rank of Rear Admiral. From that time until her death at age 85, she worked as a consultant for DEC, where she lectured on computers and how they should make life easier for users.

She is buried at Arlington National Cemetery. The USS Hopper (DDG-70), an Arleigh Burke class destroyer, is named for her–one of the very few combat vessels in the Navy named for a woman.

Robert Noyce, born December 12, 1927

Robert Noyce (left) and Gordon Moore at Intel. Image by Intel Free Press [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons
Noyce was born in Burlington, Iowa, the son of a Congregational minister. While working on an undergraduate degree from Grinnell College, Noyce studied under Grant Gale, who happened to be in possession of some of the earliest transistors. Noyce was instantly fascinated by the devices. At Gale’s urging, Noyce applied for and was accepted to MIT to pursue a PhD.

In 1956, William Shockley, co-inventor of the transistor, personally invited Noyce to move to California and join his company. The two had a rocky relationship, which eventually led Noyce and seven of his colleagues, including Gordon Moore (of Moore’s Law fame), to leave and start their own business, Fairchild Semiconductor. While this is common practice today, particularly in Silicon Valley, it was practically unheard of at the time; Shockley never forgave the “traitorous eight.” At Fairchild, Noyce pioneered open work spaces, casual work environments, an absence of executive perks, and many of the other workplace innovations we associate today with tech companies. Unlike Shockley, he encouraged his people to go on and found other companies, which became known as “Fairchildren.”

In July of 1959, Noyce filed US Patent 2,981,877 for a “Semiconductor Device and Lead Structure”: the monolithic integrated circuit, or silicon chip. Texas Instruments’ Jack Kilby had demonstrated a working integrated circuit in germanium months earlier, but Noyce’s design, built in silicon, was the one the industry could actually manufacture at scale. While others were working on the same problem and the integrated circuit was arguably inevitable, Noyce deserves credit for the version that made it practical.

In 1968, Noyce and Moore left Fairchild to found yet another company, which they called Integrated Electronics, a name very soon shortened to Intel. While at Intel, he oversaw the invention of the next great step forward in modern computing: the microprocessor.

Noyce suffered a heart attack and died on June 3, 1990. When Kilby accepted the Nobel Prize in Physics in 2000, he credited Noyce and said the two of them should have received the prize together. (Nobel Prizes are never awarded posthumously.)
