An Abbreviated History of Computer Coding: From the Analytical Engine to Ruby on Rails
Written by wyncode on 16th April 2014, 12:03 PM
<div id="attachment_1531" style="width: 950px" class="wp-caption aligncenter img-fluid"><img class="size-large wp-image-1531" src="/uploads/2014/04/earlycomputercoding_wyncode-940x770.jpg" alt="Early census computing"><p class="wp-caption-text">Via census.gov: “FOSDIC (Film Optical Sensing Device for Input to Computers). First developed in 1953 for the 1960 Census of Population and Housing.”</p></div>
Coding has come a long way since it was first invented by an English woman in the 1800s. And women have played a significant role in its advancement ever since. Today, many significant aspects of the modern world are run by algorithms written in the code of increasingly powerful computer programming languages.
While Harry Potter may be powerful in his world, coders are the wizards of ours, and coding school is our Hogwarts. What follows is a short history of the events that brought coders to power.
English mathematician Ada Lovelace is credited with creating the world’s first computer program, written for calculating Bernoulli numbers with Charles Babbage’s Analytical Engine. Way back in the mid-1800s, she wrote the first algorithm specifically designed to be carried out by a machine. The Analytical Engine was never finished, so her program couldn’t actually run at the time — but Lovelace had essentially written the first computer program all the same.
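Lovelace’s program tabulated Bernoulli numbers step by step on mechanical hardware. As a modern illustration only (not a transcription of her Note G diagram), the same numbers can be computed with the standard recurrence, here sketched in Python using exact fractions:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions,
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B

# B_1 = -1/2, B_2 = 1/6, B_4 = -1/30 (odd ones past B_1 are zero)
```

A few lines today, where Lovelace needed a long table of operations for punched-card hardware that was never built.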
Herman Hollerith is the inventor of the world’s first true programming language. Inspired by the way that train conductors encoded tickets using punched holes, he created his own system of encoding information on punch cards. In 1890, he used this system to encode and tabulate U.S. census data, and the programming language was born.
Beginning with Hollerith’s punch cards, computing technology quickly became popular for its ability to process enormous amounts of data.
This led to the creation of companies like the Computing-Tabulating-Recording Company (C-T-R). Under the guidance of Thomas J. Watson (and later his son), the company flourished, focusing on offering tabulating solutions for businesses. In 1924, the company was renamed the International Business Machines Corporation (IBM). IBM went on to become a leader in the field and contribute to such technological innovations as the electronic computer.
Once the first electronically powered computers began operating in the 1940s, they were able to store program instructions in their electronic memory. These stored-program computers required low-level instructions to efficiently and directly manipulate their hardware and memory. Writing those instructions by hand was tedious and error-prone, and so assembly languages were born: they use utility programs called assemblers to convert human-readable instructions into the operation codes that control the hardware itself.
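The job of an assembler — turning mnemonics into numeric operation codes — can be sketched in a few lines. This toy example uses an entirely made-up instruction set (the mnemonics and opcode values are hypothetical, not any real machine’s):

```python
# Hypothetical opcode table for a toy machine (not a real ISA).
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    """Translate 'MNEMONIC [operand]' lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        parts = line.split()
        opcode = OPCODES[parts[0]]                     # look up the mnemonic
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((opcode, operand))
    return program

machine_code = assemble(["LOAD 5", "ADD 3", "HALT"])
# → [(1, 5), (2, 3), (255, 0)]
```

Real assemblers also handle labels, addressing modes, and binary encoding, but the core idea — a mechanical translation from names to numbers — is the same.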
In the 1940s, Alan Turing, widely considered the father of computer science, was hard at work in Britain fighting the Nazi scourge through government cryptanalysis. He also developed the idea of the general-purpose computer and the concept of artificial intelligence.
The next leap forward came with high-level programming languages, which allow for a greater level of symbolic abstraction from the specific details of the computer’s architecture. The first widely used, high-level, general-purpose programming language was developed by IBM in the mid-1950s. Called FORTRAN (derived from FORmula TRANslating system), it is still in use today, regularly appearing in computationally intensive settings that require high-performance computing, such as the benchmarking of supercomputers.
COBOL (an acronym for COmmon Business-Oriented Language) is another early yet significant programming language that is still in use today. Developed in 1959 by the Conference on Data Systems Languages (CODASYL), a committee of researchers, COBOL was inspired greatly by Grace Hopper’s FLOW-MATIC language. With a focus on administrative and financial systems, COBOL was at one time in use by roughly 80% of the world’s companies, as well as by many government and military agencies.
Looking back at this history, it grows increasingly obvious that the stereotypical view of the male-dominated technological fields must be corrected. This is one of many significant points raised by Dr. Sadie Plant, whose philosophical work explores themes related to women’s roles in technological fields.
She points out that some of the world’s most influential programmers were women (such as Ada Lovelace and Grace Hopper, mentioned earlier), and argues that women are well suited for coding and other technology fields. These fields require skills at which women excel, such as plurality and polyvocality (to name just a few).
Given the significant role that women have played in the history of electronics and technology, her points seem likely to become increasingly prevalent mindsets as our digital age moves forward — and it is a certainty that women will continue to play momentous roles in its course.