In 1843, Ada Lovelace published what is now recognised as the first algorithm — a set of instructions for calculating Bernoulli numbers using Charles Babbage’s proposed Analytical Engine. The engine was never built. The algorithm was never run. But Lovelace’s notes, which extended the paper she was translating from French for the benefit of English readers, contained something Babbage himself had not written: the insight that a machine capable of manipulating numbers according to rules could manipulate any symbol according to rules, and could therefore, in principle, compose music, process language, and do anything else that could be described in sufficiently precise instructions.
She was describing, in 1843, what we now call software. She was 27. She died of uterine cancer at 36. Her name was largely absent from the history of computing until the 1970s, when feminist historians began recovering it.
This is the history of women and technology: a history of foundational contribution followed by systematic erasure, followed by the necessity of recovery.
The Women Who Built Computing
Before computing was a profession — before there were university departments, career paths, and salaries — it was work done by women.
During World War II, the human computers who performed the calculations that guided artillery, tracked aircraft, and cracked enemy codes were largely female. The ENIAC — the first general-purpose electronic computer, completed in 1945 — was programmed by six women: Kathleen McNulty Mauchly Antonelli, Jean Jennings Bartik, Betty Snyder Holberton, Marlyn Wescoff Meltzer, Frances Bilas Spence, and Ruth Lichterman Teitelbaum. When ENIAC was publicly demonstrated, the men who built the hardware posed for photographs with the machine. The women who programmed it were not in the photographs.
Grace Hopper may be the most significant figure in the history of software whose name is least known outside the technology industry. A Navy rear admiral and mathematician, Hopper created the first compiler — a program that translates human-readable language into machine code — and was instrumental in the development of COBOL, the programming language that ran (and in many cases still runs) the financial and governmental systems of the Western world. She popularised the term “debugging” after extracting an actual moth from a relay in the Harvard Mark II computer. The bug is preserved in the Smithsonian.
The women of NASA’s Langley Research Center — Black mathematicians who performed the calculations that underpinned America’s space program — were described in Margot Lee Shetterly’s Hidden Figures (2016) and subsequently in the film adaptation. Katherine Johnson, Dorothy Vaughan, and Mary Jackson calculated trajectories, programmed IBM computers, and produced work without which the astronauts they supported could not have flown. They worked in segregated facilities. Their bathrooms were separate. Their names were not on the plaques.
The Gender Flip and Its Causes
The proportion of women in computing rose steadily through the 1960s and 1970s, reaching a peak in the mid-1980s. Then it fell — sharply, consistently, and in a way that is distinct from every other STEM field.
By 2021, women represented approximately 26% of computing professionals in the United States, down from around 37% in 1985. The proportion of computer science bachelor’s degrees awarded to women fell from 37% in 1984 to approximately 18% in 2021.
What happened? The research on this is substantial and its conclusions are reasonably clear.
The personal computer — which became a consumer product in the late 1970s and was mainstream by the mid-1980s — was marketed almost exclusively to boys and men. The family computers of the 1980s were positioned in boys’ bedrooms. Boys arrived at university computer science courses having already used computers extensively; girls often arrived with no prior experience and encountered curricula that assumed the knowledge they didn’t have. The gap was not cognitive. It was experiential. And it was manufactured.
The “nerd” culture that emerged around computing in the 1980s and 1990s was explicitly male in its references and its social codes — the science fiction, the war gaming, the social dynamics that valued a specific kind of aggressive intellectual competition. Women who entered this culture often described feeling not merely excluded but actively unwelcome: the “booth babes” of tech trade shows, the gendered jokes, the casual dismissal of female technical competence that is now well-documented in interview studies and internal company reports.
The pipeline problem is real but it is not the primary problem. When women leave the tech industry at twice the rate of men — which research consistently shows they do — the problem is not that there aren’t enough women entering. The problem is a workplace culture that is hostile to them in specific and documented ways.
The Bias Built In
The consequences of building technology without women are not merely corporate. They are embedded in the products.
Facial recognition systems trained predominantly on white male faces perform significantly worse on women and people of colour. The 2018 Gender Shades study by MIT Media Lab researchers found error rates of up to 35% for darker-skinned women, compared with less than 1% for lighter-skinned men. These systems are used by law enforcement, by banks, by employers.
Voice recognition systems have historically performed worse on female voices — a consequence of being trained predominantly on male speech. Medical diagnostic algorithms trained on clinical data collected primarily from male patients make worse diagnoses for women. The Apple Watch’s health monitoring features, at launch, tracked heart health extensively but could not track menstrual cycles. The omission was not malicious. It was the product of building health technology without women in the room.
Algorithmic hiring tools, trained on historical hiring data from industries where men were hired for technical roles at higher rates, have been documented to systematically downrank female candidates. Amazon reportedly scrapped its experimental AI recruiting tool — which, it emerged in 2018, had learned to penalise résumés containing the word “women’s” — for exactly this reason.
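The mechanism is worth seeing concretely. The following is a deliberately toy sketch with invented data — not Amazon’s actual system — showing how even a naive scorer, trained on historically skewed hire/reject decisions, learns to penalise a gendered word that has nothing to do with competence:

```python
# Toy illustration with hypothetical data: a naive résumé scorer trained on
# historically skewed hiring outcomes reproduces the skew in those outcomes.
from collections import Counter

# Past résumés of hired candidates: skewed toward male applicants,
# as the historical data in such cases typically was.
hired = [
    "captain chess club software engineer",
    "executed software projects chess captain",
    "software engineer competitive programming",
]
# Past résumés of rejected candidates.
rejected = [
    "women's chess club captain software engineer",
    "software engineer women's coding society",
]

def term_weights(hired_docs, rejected_docs):
    """Weight each term by how much more often it appears in hired
    résumés than in rejected ones -- a crude stand-in for what a
    trained model infers from the same data."""
    h = Counter(w for d in hired_docs for w in d.split())
    r = Counter(w for d in rejected_docs for w in d.split())
    return {w: h[w] - r[w] for w in set(h) | set(r)}

def score(resume, weights):
    """Sum the learned weights of a candidate's terms."""
    return sum(weights.get(w, 0) for w in resume.split())

weights = term_weights(hired, rejected)

# Two candidates identical except for one gendered word:
a = score("chess club captain software engineer", weights)
b = score("women's chess club captain software engineer", weights)
print(a, b)  # → 3 1 : the "women's" résumé scores lower
```

The scorer never sees a gender field; it simply learns that “women’s” correlates with past rejection and passes that correlation forward. Real systems are far more sophisticated, but the failure mode is the same.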
The problem of biased AI is not separable from the problem of who builds it. This is why the gender gap in technology is not a diversity-and-inclusion issue. It is a product quality issue. It is a public safety issue.
What Is Being Done and What Is Working
The interventions that research shows are effective are specific and often counter-intuitive.
Representation in curricula works — but only when it includes active role models, not just historical mentions. Telling girls about Ada Lovelace in a textbook has limited effect. Connecting them with working female technologists has measurable effect on interest and persistence.
Changing the classroom culture — specifically, eliminating the competitive speed-and-performance culture of many CS classrooms in favour of collaborative, project-based learning — significantly increases female retention without reducing male performance.
Parental leave equality reduces attrition of women at the critical mid-career point where most dropout occurs. Most of that dropout is not about preference; it is about the impossibility of combining the demands of technical careers with the unequal domestic labour distribution that kicks in around child-rearing.
Pay transparency — the requirement that pay scales be visible — reduces the gender pay gap in companies where it’s implemented. The UK’s Gender Pay Gap Reporting requirement, introduced in 2017, has produced measurable narrowing in reporting companies.
What Women Are Building
The binary of “women in tech” (the topic to be discussed) versus “tech” (the actual work) is itself a problem. Women building technology are building technology, not “women’s technology.”
Fei-Fei Li, founding co-director of Stanford’s Institute for Human-Centered AI, created ImageNet — the dataset that underpinned the deep learning revolution in computer vision. The AI systems that power image recognition in your phone, in medical imaging, in autonomous vehicles, descend from architectures proven on a dataset she built.
Dr. Timnit Gebru’s work on algorithmic bias — and her contested exit from Google in late 2020 after a dispute over a research paper on the harms of large language models — is the most important story in AI ethics of the past five years. She founded the DAIR (Distributed AI Research) Institute to continue the work outside corporate constraints.
Joy Buolamwini founded the Algorithmic Justice League after discovering that facial recognition software could not recognise her face. Her research changed the commercial decisions of IBM, Microsoft, and Amazon on facial recognition deployment.
These women are not adjacent to the technology story. They are, in different ways, the technology story. The history of computing has always included them. The history told about computing has excluded them. Correcting this is not an act of charity toward women. It is an act of accuracy about what actually happened.