©2007, 2014 Martin Rinehart
| Year | Language | Designer(s) | Descendants and influence |
|------|----------|-------------|---------------------------|
| 1957 | Fortran | Backus | ALGOL (Fortran influences, directly or indirectly, every other language on this page excepting Lisp, COBOL and APL) |
| 1958 | Lisp | McCarthy | Scheme, all functional languages, Ruby |
| 1959 | COBOL | Hopper + committee | |
| 1964 | APL | Iverson | (small family of descendants) |
| 1964 | BASIC | Kemeny / Kurtz | Apple and Microsoft Basics, scripting language in MS Office, Lotus Notes and many others |
| 1983 | C++ | Stroustrup | Java (and numerous others) |
| 1987 | Perl | Wall | influences many, Ruby |
| 1995 | Java | Gosling | (most later languages) |
In the beginning (ENIAC, 1946¹) there was machine code. The programmer entered the right combination of 1s and 0s (commonly as octal digits) and a computation was performed. It was immediately obvious that this was exceptionally tedious work.
Assembly language was born. Mnemonics such as ADD substituted for the bit pattern of the machine instruction. It was a step in the right direction, but not enough. The early computers used such small words that even a simple addition might require a non-trivial group of instructions.
In 1951 and 1952 Grace Hopper, an employee of Remington Rand, wrote the first compiler, A-0. It evolved into the commercial products ARITH-MATIC, MATH-MATIC and FLOW-MATIC, the last of which was to be a major influence on the design of COBOL.
Some said these compilers could not compete with code written by hand. IBM's representatives, who did not have a compiler product, might have made this claim. An IBM employee, John Backus, wrote a proposal in 1954 that turned into a manual in 1956 that became a working compiler in 1957 for the first "third-generation" language, Fortran (FORmula TRANslator). Fortran became the language of choice for "scientific" computing—programs that were characterized by intense computational work. Fortran (thoroughly updated) is still used today for computational applications.
The generational language hierarchy (machine language, assembly language, compiled language) has seen many attempts at "fourth" and "fifth" generation languages. There has never been a clean definition of what constitutes a sufficient advance to earn a generational increment. Most subsequent language developments have been in the direction of improving on the third generation compilers.
Compilers, which produced machine code, would later be joined by interpreters, which read the source language, then compiled and executed it in a single step. Interpreters were easier to use, but interpreted programs ran significantly slower than compiled ones. 21st century languages blur this distinction beyond recognition.
The original Fortran was not "structured." Branching was done by "GOTO" statements that specified transfer of control to specific statements. Within a year after Fortran's debut, ALGOL (ALGOrithmic Language) was designed. It introduced structured concepts such as "begin" and "end" pairs around blocks of code. ALGOL caught on as a way of describing algorithms, but not as a computer language. It was, however, a major influence on BCPL, a minor language that would be forgotten had it not been the grandparent of C.
In 1959, "Amazing" Grace Hopper appeared again as part of the COBOL (COmmon Business-Oriented Language) design team. COBOL was to "business" computing as Fortran was to "scientific" computing. It would have little influence on future languages but it became the standard language of 20th century business data processing.
In 1964, IBM created PL/I, to unite the worlds of scientific and business programming. It was the first of several attempts to be all things to all people, none of which have succeeded. Ada, the U.S. government's attempt to settle on a single language, repeated this failure.
The same year saw a language designed by two Dartmouth professors, Kemeny and Kurtz, as a tool for beginners, BASIC (Beginner's All-purpose Symbolic Instruction Code, the nadir of forced acronyms in language naming). BASIC synthesized and simplified Fortran and ALGOL. It spread widely on minicomputers in the '70s and microcomputers in the '80s. It is still pervasive as a scripting language in products such as Lotus Notes and Microsoft Office.
In 1966, Martin Richards's BCPL (Basic Combined Programming Language) was introduced. Despite its name, BCPL was specifically designed as a small language useful for writing compilers. In 1969, Ken Thompson at Bell Labs stripped all he could from BCPL to create B. B is now extinct but it is remembered as the parent of the most influential language of the late 20th century, C.
In 1972, Dennis Ritchie, also at Bell Labs, created C (simply the successor to B). It was designed as a small, fast tool for writing systems programs (operating systems, language compilers) that otherwise would need to be written in assembler to get acceptable speed. It ran on Bell Labs' Unix® operating system. As Bell Labs gave Unix away free to academic customers, Unix and C spread rapidly in academia. C was ported to many minicomputers and later to microcomputers, where its twin virtues of small size and high performance were valued by professional programmers.
C reigned as the preeminent tool for software development until 1983, when Bjarne Stroustrup, also at Bell Labs, introduced C++, which was to become the first widely used object-oriented programming language. (In C, the "++" operator adds one to a variable. C++ thus is a "better" C.) Discussion of the object-oriented programming (OOP) paradigm is beyond the scope of this paper. OOP is such a significant advance that all languages that didn't incorporate it (including Fortran, COBOL and Basic) have now been extended to enable OOP.
As C++ was extended, it left its base as a small, high-performance language and became a heavyweight, though still high-performance, language. In 1995, Sun released Java, a much lighter language based on C++ syntax and targeted toward applications for the Internet. Originally successful as a language for "applets," small applications that ran within browser pages, Java is now replacing COBOL as the tool of choice for much business programming.
Scripting languages began to become important in the late 20th century. The dominant solution on the Internet for database-backed applications (shopping, for instance) is the so-called LAMP stack: Linux OS, Apache web server, MySQL database and one of Perl, PHP or Python as the programming language to connect the database and the web pages. Perl has some enthusiastic supporters and many detractors. (Google "python vs. perl".) PHP is an unpretentious, simple and successful way to get data from web pages to DBMSs and back. Python also performs this function, but it does so as part of a complete programming language. Supporters of Ruby claim to have a superior version of Python. Unlike the Python/Perl debate, both sides of the Python/Ruby debate have strong claims.
Of the big six, COBOL was born independently of Fortran; the other four are all descended from Fortran. The overwhelming majority of all other computer languages are also descended from Fortran. I'll close with a mention of two that are completely original works.
Lisp (LISt Processing language, John McCarthy at MIT, 1958) is the second oldest language still in use. Most artificial intelligence research is done in Lisp. APL (A Programming Language, Ken Iverson, Harvard and IBM, 1964) is an array programming language, popular with actuaries and Wall Street analysts². These languages have almost nothing in common except that both have avid supporters and both can write and then execute their own source code. A famous example creates Conway's Game of Life from just one line of APL.
Describing one language as the offspring of another is in some cases highly accurate: C++ emerges from C; C from B; B from BCPL. In other cases it's very loose: BCPL from ALGOL.
¹ Ask Google about "the first computer" for contradictory claims.
² While at a major investment bank in the late '70s I converted my group from Fortran to APL. We considered it a major competitive advantage, so kept it to ourselves. Later I would learn that almost all our competitors had done the same.