The History of Computer Programming

Posted on 2009-02-20 14:50
While doing some research for one of my classes at Kaplan, I came across the following article cached in
Google's search engine. Since the page is no longer available and will likely
be de-indexed over time, I have decided to re-print the information/research here.
The research covers the history of programming languages and presents some
cited information that does not appear in the current Wikipedia entry on the
same topic. The author is Andrew Ferguson, who I assume did this research
while a student at Princeton University.
The History of Computer Programming Languages
by Andrew Ferguson, Princeton University
Ever since the invention of Charles Babbage's difference engine in 1822,
computers have required a means of instructing them to perform a specific task.
This means is known as a programming language. Computer languages were
first composed of a series of steps to wire a particular program; these morphed
into a series of steps keyed into the computer and then executed; later these
languages acquired advanced features such as logical branching and object
orientation. The computer languages of the last fifty years have come in two
stages, the first major languages and the second major languages, which are in
use today.
In the beginning, Charles Babbage's difference engine could only be made to
execute tasks by changing the gears
which executed the calculations. Thus, the earliest form of a computer language
was physical motion. Eventually, physical motion was replaced by electrical signals
when the US Government built the ENIAC in 1942. It followed many of the same
principles of Babbage's engine and hence, could only be "programmed"
by presetting switches and rewiring the entire system for each new
"program" or calculation. This process proved to be very tedious.
In 1945, John Von Neumann was working at the Institute for Advanced Study. He
developed two important concepts that directly affected the path of computer
programming languages. The first was known as "shared-program technique" (http://www.softlord.com).
This technique stated that the actual computer hardware should be simple and
not need to be hand-wired for each program. Instead, complex instructions
should be used to control the simple hardware, allowing it to be reprogrammed
much faster.
The second concept was also extremely important to the development of
programming languages. Von Neumann called it "conditional control transfer" (http://www.softlord.com).
This idea gave rise to the notion of subroutines, or small blocks of code that
could be jumped to in any order, instead of a single set of chronologically
ordered steps for the computer to take. The second part of the idea stated that
computer code should be able to branch based on logical statements such as IF
(expression) THEN, and looped such as with a FOR statement. "Conditional
control transfer" gave rise to the idea of "libraries", which
are blocks of code that can be reused over and over.
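To make these ideas concrete, here is a minimal sketch in C (a much later language, used here purely for illustration; the function and variable names are my own, not from the article) showing a reusable subroutine, an IF-style branch, and a FOR-style loop:

  #include <stdio.h>

  /* A small reusable block of code -- a "subroutine" in the article's terms. */
  static int square(int x) {
      return x * x;
  }

  int main(void) {
      for (int i = 0; i < 5; i++) {          /* looping, like a FOR statement   */
          if (square(i) > 5) {               /* branching: IF (expression) THEN */
              printf("%d squared is greater than 5\n", i);
          } else {
              printf("%d squared is at most 5\n", i);
          }
      }
      return 0;
  }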
In 1949, a few years after Von Neumann's work, the language Short Code appeared (http://www.byte.com).
It was the first computer language for electronic devices and it required the programmer
to change its statements into 0's and 1's by hand. Still, it was the first step
towards the complex languages of today. In 1951, Grace Hopper wrote the first
compiler, A-0 (http://www.byte.com).
A compiler is a program that turns the language's statements into 0's and 1's
for the computer to understand. This led to faster programming, as the
programmer no longer had to do the work by hand.
In 1957, the first of the major languages appeared in the form of FORTRAN. Its name stands for FORmula TRANslating
system. The language was designed at IBM for scientific computing. The
components were very simple, and provided the programmer with low-level access
to the computer's innards. Today, this language would be considered restrictive
as it only included IF, DO, and GOTO statements, but at the time, these
commands were a big step forward. The basic types of data in use today got
their start in FORTRAN; these included logical variables (TRUE or FALSE) and
integer, real, and double-precision numbers.
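As a rough, modern illustration (my own example, not from the article), these same basic data types look like this in C; the variable names are made up for the sketch:

  #include <stdbool.h>
  #include <stdio.h>

  int main(void) {
      bool   flag    = true;                 /* logical variable (TRUE or FALSE) */
      int    count   = 42;                   /* integer number                   */
      float  ratio   = 1.5f;                 /* real (single-precision) number   */
      double precise = 2.718281828459045;    /* double-precision number          */

      printf("%d %d %f %.15f\n", flag, count, ratio, precise);
      return 0;
  }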
Though FORTRAN was good at handling numbers, it was not so good at handling
input and output, which mattered most to business computing. Business computing started to take off in
1959, and because of this, COBOL
was developed. It was designed from the ground up as the language for
businessmen. Its only data types were numbers and strings of text. It also
allowed for these to be grouped into arrays and records, so that data could be
tracked and organized better. It is interesting to note that a COBOL program is
built in a way similar to an essay, with four or five major sections that build
into an elegant whole. COBOL statements also have a very English-like grammar,
making it quite easy to learn. All of these features were designed to make it
easier for the average business to learn and adopt it.
In 1958, John McCarthy of MIT created the LISt Processing (or LISP) language.
It was designed for Artificial Intelligence (AI) research. Because it was
designed for such a highly specialized field, its syntax has rarely been seen
before or since. The most obvious difference between this language and other
languages is that the basic and only type of data is the list, denoted by a
sequence of items enclosed by parentheses. LISP programs themselves are written
as a set of lists, so that LISP has the unique ability to modify itself, and
hence grow on its own. The LISP syntax was known as "Cambridge
Polish," as it was very different from standard Boolean logic (Wexelblat,
177) :

  
  x V y - Cambridge Polish, what was used to describe the LISP program
  OR(x,y) - parenthesized prefix notation, what was used in the LISP program
  x OR y - standard Boolean logic
  

LISP remains in use today because of its highly specialized and abstract nature.
The Algol language was created by a
committee for scientific use in 1958. Its major contribution is being the root
of the tree that has led to such languages as Pascal, C, C++, and Java. It
was also the first language with a formal grammar, known as Backus-Naur Form or
BNF (McGraw-Hill Encyclopedia of Science and Technology, 454). Though
Algol implemented some novel concepts, such as recursive calling of functions,
the next version of the language, Algol 68, became bloated and difficult to use
(http://www.byte.com). This led to the
adoption of smaller and more compact languages, such as Pascal.
Pascal was begun in 1968 by Niklaus Wirth. Its development was mainly out of
necessity for a good teaching tool. In the beginning, the language designers
had no hopes for it to enjoy widespread adoption. Instead, they concentrated on
developing good tools for teaching such as a debugger and editing system and
support for common early microprocessor machines which were in use in teaching
institutions.
Pascal was designed with a very orderly approach; it combined many of the best
features of the languages in use at the time, COBOL, FORTRAN, and ALGOL. While
doing so, many of the irregularities and oddball statements of these languages
were cleaned up, which helped it gain users (Bergin, 100-101). The combination
of solid input/output and mathematical features
made it a highly successful language. Pascal also improved the
"pointer" data type, a very powerful feature of any language that
implements it. It also added a CASE
statement that allowed instructions to branch like a tree in the following manner:

  
  CASE expression OF
    possible-expression-value-1:
      statements to execute...
    possible-expression-value-2:
      statements to execute...
  END
  

Pascal also helped the development of dynamic
variables, which could be created while a program was being run, through
the NEW and DISPOSE commands. However, Pascal did not implement dynamic arrays,
or groups of variables, which proved to be needed and led to its downfall (Bergin,
101-102). Wirth later created a successor to Pascal, Modula-2, but by the time
it appeared, C was gaining popularity and users at a rapid pace.
C was developed in 1972 by Dennis
Ritchie while working at Bell Labs in New Jersey. The transition in usage
from the first major languages to the major languages of today occurred with
the transition between Pascal and C. Its
direct ancestors are B and BCPL, but its similarities to Pascal are quite
obvious. All of the features of Pascal, including the new ones such as the CASE
statement, are available in C. C uses pointers extensively and was built to be
fast and powerful at the expense of being hard to read. But because it fixed
most of the mistakes Pascal had, it won over former-Pascal users quite rapidly.
Ritchie developed C for the new Unix
system being created at the same time. Because of this, C and Unix go hand in
hand. Unix gives C such advanced features as dynamic variables, multitasking,
interrupt handling, forking, and strong, low-level, input-output. Because of
this, C is very commonly used to program operating systems such as Unix,
Windows, the MacOS, and Linux.
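As a small sketch of the C features mentioned above (illustrative only, not code from the article): a pointer to a dynamically created variable, released when no longer needed, and a switch statement playing the role of Pascal's CASE:

  #include <stdio.h>
  #include <stdlib.h>

  int main(void) {
      /* A dynamic variable: created at run time, much like Pascal's NEW. */
      int *value = malloc(sizeof *value);
      if (value == NULL)
          return 1;
      *value = 2;                    /* the pointer is dereferenced to store data   */

      switch (*value) {              /* C's switch is the equivalent of CASE ... OF */
      case 1:
          printf("one\n");
          break;
      case 2:
          printf("two\n");
          break;
      default:
          printf("something else\n");
          break;
      }

      free(value);                   /* release the variable, much like DISPOSE     */
      return 0;
  }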

In the late 1970's and early 1980's, a new programming
method was being developed. It was known as Object Oriented Programming, or
OOP. Objects are pieces of data that
can be packaged and manipulated by the programmer. Bjarne Stroustrup liked
this method and developed extensions to C known as "C With Classes."
This set of extensions developed into the full-featured language C++, which was
released in 1983.
C++ was designed to organize the raw power of C using OOP, but maintain the
speed of C and be able to run on many different types of computers. C++ is most often used in simulations, such
as games. C++ provides an elegant way to track and manipulate hundreds of
instances of people in elevators, or armies filled with different types of
soldiers. It is the language of choice in today's AP Computer Science courses.
In the early 1990's, interactive TV was the technology of the future. Sun Microsystems
decided that interactive TV needed a special, portable (can run on many types
of machines), language. This language eventually became Java. In 1994, the Java
project team changed their focus to the web, which was becoming "the cool
thing" after interactive TV failed. The next year, Netscape licensed Java
for use in their internet browser, Navigator. At this point, Java became the
language of the future and several companies announced applications which would
be written in Java, none of which came into use. 
Though Java has very lofty goals and is a text-book example of a good language,
it may be the "language that wasn't". It has serious optimization
problems, meaning that programs written in it run very slowly. And Sun has hurt
Java's acceptance by engaging in political battles over it with Microsoft. But
Java may wind up as the instructional language of tomorrow as it is truly
object-oriented and implements advanced techniques such as true portability of
code and garbage collection.
Visual Basic is often taught as a first programming language today as it
is based on the BASIC language developed in 1964 by John Kemeny and Thomas
Kurtz. BASIC is a very limited language and was designed for non-computer
science people. Statements are chiefly run sequentially, but program control
can change based on IF..THEN and on GOSUB statements, which execute a certain
block of code and then return to the original point in the program's flow.
Microsoft has extended BASIC in its Visual Basic (VB) product. The heart of VB
is the form, or blank window on which you drag and drop components such as
menus, pictures, and slider bars. These items are known as "widgets."
Widgets have properties (such as their color) and events (such as clicks and
double-clicks) and are central to building any user interface today in any
language. VB is most often used today to create quick and simple interfaces to
other Microsoft products such as Excel and Access without needing a lot of
code, though it is possible to create full applications with it.
Perl has often been described as the "duct tape of the Internet,"
because it is most often used as the engine for a web interface or in scripts
that modify configuration files. It has very strong text matching functions
which make it ideal for these tasks. Perl was developed by Larry Wall in 1987
because the Unix sed and awk tools (used for text manipulation) were no longer
strong enough to support his needs. Depending on whom you ask, Perl stands for
Practical Extraction and Reporting Language or Pathologically Eclectic Rubbish
Lister.
Programming languages have been under development for years and will
remain so for many years to come. They got their start with a list of steps to
wire a computer to perform a task. These steps eventually found their way into
software and began to acquire newer and better features. The first major
languages were characterized by the simple fact that they were intended for one
purpose and one purpose only, while the languages of today are differentiated
by the way they are programmed in, as they can be used for almost any purpose.
And perhaps the languages of tomorrow will be more natural with the invention
of quantum and biological computers.
Bibliography
"A Brief History of
     Programming Languages."
http://www.byte.com/art/9509/se7/artl9.htm
.
     Cited, March 25, 2000.
"A Short History of the
     Computer." 
http://www.softlord.com/comp/
.
     Jeremy Myers. Cited, March 25, 2000.
Bergin, Thomas J. and Richard G.
     Gibson, eds. History of ProgrammingLanguages-II. New York: ACM
     Press, 1996.
Christiansen, Tom and Nathan
     Torkington. Perlfaq1 Unix
     Manpage
. Perl 5
     Porters, 1997-1999.
Christiansen, Tom and Nathan
     Torkington. Perlhist Unix
     Manpage
. Perl 5
     Porters, 1997-1999.
"Java History." 
http://ils.unc.edu/blaze/java/javahist.html
.
     Cited, March 29, 2000.
"Programming Languages." McGraw-Hill
     Encyclopedia of Science and Technology
. New York: McGraw-Hill, 1997.
Wexelblat, Richard L., ed. History of
     Programming Languages
. New
       York: Academic Press, 1981.

EXTRA:
A BRIEF HISTORY OF PROGRAMMING
http://www.lingoworkshop.com/Articles/A_brief_history_of_computer_programming.php
In the 17th century, the first calculating machines were invented by Wilhelm Schickard
and Blaise Pascal (who created the "Pascaline" in 1642). These mechanical devices were
remarkable creations but they could only perform specific calculations. Arguably the
first programmable computer was the Analytical Engine by Charles Babbage, conceived in
1835 but never completed.
Analytical Engine - 1835
With the Analytical Engine (an early mechanical general-purpose computer), Babbage
conceived of a machine that could be programmed to solve any logical or
computational problem. This project came to the attention of Lady Ada Lovelace
(who, incidentally, was the only legitimate child of Lord Byron, the British poet).
Lovelace became obsessed with the project and wrote notes on programming techniques,
sample programs, and the potential for programmable machines to play chess and
compose music. She is regarded as the world's first computer programmer and is
credited with the invention of the programming loop and the subroutine.
Babbage’s ideas were conceived in
terms of mechanical technology, and it wasn’t until a century later when
advances in electronic technology would enable many of his ideas to be fully
realized.
Z-3, Robinson and Mark I
During the Second World War, the British invested significant resources into the
'Ultra' project based at Bletchley Park. This top-secret project utilized machines
built by Alan Turing to decode German military messages encoded using the 'Enigma'
enciphering machine. One such machine, called Robinson, was built in 1940 and is
generally regarded as the first operational (although non-programmable) computer.
The first programmable computer was actually built in Germany by Konrad Zuse in 1941.
In contrast to the British, the German military apparently overlooked the significance
of Zuse's achievements and his work only ever got minor support and very little
recognition after the war (the original Z-3 machine was destroyed during the war, but
a replica is on display at the Deutsches Museum in Munich).
In the US, a team of Harvard and IBM scientists led by Howard Aiken was also working
on a programmable computer. This computer, called Mark I, was completed in 1944. The
person who is credited with fully harnessing the power of this programmable computer
is Captain Grace Murray Hopper. She was one of the first to recognize the value of
reusable libraries of subroutines, is credited with inventing the term 'debug' (when
she removed a dead moth stuck in a relay), and wrote the first high-level compiler
(A-0). She also led the effort to develop COBOL, a programming language not identified
with a particular manufacturer. Among Hopper's many achievements was winning the
"Computer Science Man-of-the-Year Award" in 1969 (despite not being a 'man').
Programming Languages
The language a computer can
understand (called "machine code") is composed of strings of zeros
and ones. This smallest element of a computer’s language is called "a
bit" – 0 or 1. Four bits are a
nibble. Two nibbles (8 bits) equal a byte. The ‘words’ of a computer
language are the size of a single instruction encoded in a sequence of bits
(for example, many computers speak a language with words that are '32-bits'
long).
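As an illustration of these terms (my own example, not part of the original article), the following C snippet picks apart a byte into its two nibbles and reports how many bits wide an int is on the machine it runs on:

  #include <stdio.h>

  int main(void) {
      unsigned char byte = 0xA5;                  /* one byte = 8 bits = two nibbles */
      printf("high nibble: 0x%X\n", (byte >> 4) & 0x0F);
      printf("low nibble:  0x%X\n", byte & 0x0F);
      printf("an int here is %zu bits wide\n", sizeof(int) * 8);   /* often 32 */
      return 0;
  }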
As machine code is extremely
difficult to work with, a type of language called “Assembly” was soon
developed. Using an assembly language, programmers write a series of mnemonics that
are then translated by a program into machine code that the computer can understand.
However, assembly is very similar to machine code in that all procedures have
to be spelt out in exact detail in a process that is extremely difficult, slow
and prone to errors.
Translators (Compilers and Interpreters)
Grace
Hopper is credited with pioneering the idea of a ‘compiler’ to translate some
more human-friendly language into the language of a computer.
These more ‘human-friendly’ languages are called ‘higher level languages’ and
were developed to allow programmers to concentrate more closely on the abstract
problem to be solved rather than all the painful detail required for machine
code or assembly language programming.
A compiler converts source code written in some high-level language
into executable machine code (also called binary code or object code). The
resulting machine code can only be understood by a specific processor, such as
a Pentium or PowerPC.
An interpreter translates either source code or tokens into machine
code, one instruction at a time, as the program is run. Unlike a compiler, an
interpreter does not generate a stand-alone machine-code program from the source.
High level languages
One of the first 'higher level' languages to get wide use was FORTRAN, first
released in 1957. This language is very good at number crunching, but not so
good at input and output. COBOL, released soon after, was "designed from the
ground up as the language for businessmen" and used "a very English-like grammar" [1].
These languages are generally
considered to reflect a 'procedural' paradigm of programming. In 1958, John
McCarthy at M.I.T. began work on LISP, which went on to become one of the
most important languages in the area of "Artificial Intelligence". LISP, which
gets its name from LISt Processing, reflects a language model based on recursive
functions. Another language, PROLOG, invented in the 70s, used a model
based on 'logic programming' with predicate calculus.
The next most significant language
to appear, at least from a Director/Lingo perspective, was Smalltalk, which was
developed by Alan Kay at Xerox PARC.

This article is from the ChinaUnix blog. To view the original, visit http://blog.chinaunix.net/u2/62361/showart_1836182.html