Chapter 5
5-1 The Way in the Artificial
The more closely we look
at human behaviour, and at how the world around us behaves, the
less we can ignore the fact that technology has a major impact in
many ways, or is perhaps a major factor in the whole!
That is why I think it
is good to have a deeper look into the technology of computers.
Here we go.
5-1a The computer language
Algorithm
In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/)
is a finite sequence of well-defined, computer-implementable instructions,
typically to solve a class of specific problems or to perform a
computation. Algorithms are always unambiguous and are used as specifications
for performing calculations, data processing, automated reasoning,
and other tasks. In contrast, a heuristic is a technique used in
problem solving that uses practical methods and/or various estimates
in order to produce solutions that may not be optimal but are sufficient
given the circumstances.
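To make the contrast concrete, here is a small sketch in Python (the
coin-change task and the particular coin values are only illustrative
assumptions): the exhaustive search is an algorithm that always finds
the fewest coins, while the greedy rule is a heuristic that is fast
but not always optimal.

    from itertools import combinations_with_replacement

    def fewest_coins_exact(amount, coins):
        # Algorithm: try 1 coin, then 2, and so on; the first combination
        # that reaches the amount uses the fewest coins, so it is optimal.
        for n in range(1, amount + 1):
            for combo in combinations_with_replacement(coins, n):
                if sum(combo) == amount:
                    return list(combo)
        return None

    def fewest_coins_greedy(amount, coins):
        # Heuristic: always take the largest coin that still fits.
        # Quick and practical, but the answer may not be optimal.
        result = []
        for coin in sorted(coins, reverse=True):
            while amount >= coin:
                result.append(coin)
                amount -= coin
        return result if amount == 0 else None

    coins = [1, 3, 4]
    print(fewest_coins_exact(6, coins))   # [3, 3] -- the optimal answer
    print(fewest_coins_greedy(6, coins))  # [4, 1, 1] -- sufficient, not optimal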
As an effective method,
an algorithm can be expressed within a finite amount of space and
time, and in a well-defined formal language for calculating a function.
Starting from an initial state and initial input (perhaps empty),
the instructions describe a computation that, when executed, proceeds
through a finite number of well-defined successive states, eventually
producing output and terminating at a final ending state.
The transition from one state to the next is not necessarily deterministic;
some algorithms, known as randomized algorithms, incorporate random
input.
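As a minimal sketch of an algorithm that incorporates random input,
consider a Monte Carlo estimate of pi (the number of samples is an
arbitrary choice): it starts from an initial state, passes through a
finite number of steps, and terminates with an output, but the output
varies from run to run.

    import random

    def estimate_pi(samples=100_000):
        # Randomized algorithm: draw random points in the unit square and
        # count how many fall inside the quarter circle of radius 1.
        inside = 0
        for _ in range(samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4 * inside / samples

    print(estimate_pi())  # roughly 3.14, slightly different on every run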
The concept of algorithm has existed since antiquity. Arithmetic
algorithms, such as a division algorithm, were used by ancient Babylonian
mathematicians c. 2500 BC and Egyptian mathematicians c. 1550 BC.
Greek mathematicians later used algorithms in 240 BC in the sieve
of Eratosthenes for finding prime numbers, and the Euclidean algorithm
for finding the greatest common divisor of two numbers. Arabic mathematicians
such as al-Kindi in the 9th century used cryptographic algorithms
for code-breaking, based on frequency analysis.
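The sieve of Eratosthenes mentioned above can be sketched in a few
lines of Python (the upper bound of 30 is just an example):

    def sieve_of_eratosthenes(limit):
        # Return all prime numbers up to and including limit.
        is_prime = [True] * (limit + 1)
        is_prime[0] = is_prime[1] = False
        for n in range(2, int(limit ** 0.5) + 1):
            if is_prime[n]:
                for multiple in range(n * n, limit + 1, n):
                    is_prime[multiple] = False  # cross out every multiple of n
        return [n for n, prime in enumerate(is_prime) if prime]

    print(sieve_of_eratosthenes(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]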
The word algorithm itself
is derived from the name of the 9th-century mathematician Muḥammad
ibn Musa al-Khwarizmi, whose nisba (identifying him as from Khwarazm)
was Latinized as Algoritmi. A partial formalization of the modern
concept of algorithm began with attempts to solve the Entscheidungsproblem
(decision problem) posed by David Hilbert in 1928. Later formalizations
were framed as attempts to define "effective calculability"
or "effective method". Those formalizations included the
Gödel–Herbrand–Kleene recursive functions of 1930,
1934 and 1935, Alonzo Church's lambda calculus of 1936, Emil
Post's Formulation 1 of 1936, and Alan Turing's Turing
machines of 1936–37 and 1939.
Etymology
The word algorithm has its roots in Latinizing the nisba,
indicating his geographic origin, of the name of Persian mathematician
Muhammad ibn Musa al-Khwarizmi to algorismus. Al-Khwarizmi (Arabized
Persian, c. 780–850) was a mathematician, astronomer,
geographer, and scholar in the House of Wisdom in Baghdad, whose
name means "the native of Khwarazm", a region that was
part of Greater Iran and is now in Uzbekistan. About 825, al-Khwarizmi
wrote an Arabic-language treatise on the Hindu–Arabic numeral
system, which was translated into Latin during the 12th century.
The manuscript starts with the phrase Dixit Algorizmi ("Thus
spake Al-Khwarizmi"), where Algorizmi was the translator's
Latinization of Al-Khwarizmi's name. Al-Khwarizmi was the most
widely read mathematician in Europe in the late Middle Ages, primarily
through another of his books, the Algebra. In late medieval Latin,
algorismus, English "algorism", the corruption of his
name, simply meant the "decimal number system". In the
15th century, under the influence of the Greek word ἀριθμός
(arithmos), "number" (cf. "arithmetic"), the
Latin word was altered to algorithmus, and the corresponding English
term "algorithm" is first attested in the 17th century;
the modern sense was introduced in the 19th century.
In English, it was first
used in about 1230 and then by Chaucer in 1391. English adopted
the French term, but it wasn't until the late 19th century
that "algorithm" took on the meaning that it has in modern
English.
Another early use of
the word is from 1240, in a manual titled Carmen de Algorismo composed
by Alexandre de Villedieu. It begins with:
Haec algorismus ars praesens
dicitur, in qua / Talibus Indorum fruimur bis quinque figuris.
which translates to:
Algorism is the art by
which at present we use those Indian figures, which number two times
five.
The poem is a few hundred
lines long and summarizes the art of calculating with the new style of
Indian dice (Tali Indorum), or Hindu numerals.
Informal definition
For a detailed presentation of the various points of view on the
definition of algorithm, see Algorithm characterizations.
An informal definition
could be "a set of rules that precisely defines a sequence
of operations", which would include all computer programs (including
programs that do not perform numeric calculations), and (for example)
any prescribed bureaucratic procedure or cook-book recipe.
In general, a program
is only an algorithm if it stops eventually, even though infinite
loops may sometimes prove desirable.
A prototypical example
of an algorithm is the Euclidean algorithm, which is used to determine
the greatest common divisor of two integers; an example (there are
others) is described by the flowchart above and as an example in
a later section.
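A minimal sketch of the remainder-based form of the Euclidean
algorithm, one of the several variants alluded to above:

    def gcd(m, n):
        # Euclidean algorithm: repeatedly replace the pair (m, n) by
        # (n, m mod n); when the remainder reaches zero, the other
        # value is the greatest common divisor.
        while n != 0:
            m, n = n, m % n
        return m

    print(gcd(1071, 462))  # 21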
Boolos and Jeffrey (1974, 1999) offer an informal meaning of the
word "algorithm" in the following quotation:
No human being can write
fast enough, or long enough, or small enough ("smaller
and smaller without limit ... you'd be trying to write on
molecules, on atoms, on electrons") to list all members of
an enumerably infinite set by writing out their names, one after
another, in some notation. But humans can do something equally useful,
in the case of certain enumerably infinite sets: They can give explicit
instructions for determining the nth member of the set, for arbitrary
finite n. Such instructions are to be given quite explicitly, in
a form in which they could be followed by a computing machine, or
by a human who is capable of carrying out only very elementary operations
on symbols.
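To make the point concrete, here is one possible set of such explicit
instructions for a particular enumerably infinite set, the prime
numbers (the trial-division method used here is only a simple
illustrative choice):

    def nth_prime(n):
        # Explicit instructions a machine could follow to produce the nth
        # member of the (enumerably infinite) set of primes, for any finite n >= 1.
        count, candidate = 0, 1
        while count < n:
            candidate += 1
            if all(candidate % d != 0 for d in range(2, int(candidate ** 0.5) + 1)):
                count += 1
        return candidate

    print([nth_prime(k) for k in range(1, 8)])  # [2, 3, 5, 7, 11, 13, 17]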
An enumerably infinite
set is one whose elements can be put into one-to-one correspondence
with the integers. Thus Boolos and Jeffrey are saying that an algorithm
implies instructions for a process that creates output
integers from an arbitrary input integer or integers
that, in theory, can be arbitrarily large. For example, an algorithm
can be an algebraic equation such as y = m + n (i.e., two arbitrary
input variables m and n that produce an output y), but
various authors' attempts to define the notion indicate that
the word implies much more than this, something on the order of
(for the addition example):
Precise instructions
(in a language understood by "the computer") for a fast,
efficient, "good" process that specifies the "moves"
of "the computer" (machine or human, equipped with the
necessary internally contained information and capabilities) to
find, decode, and then process arbitrary input integers/symbols
m and n, symbols "+" and "=",
and "effectively" produce,
in a "reasonable" time, output-integer y at a specified
place and in a specified format.
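The gap between the bare equation y = m + n and such precise
instructions can be illustrated by spelling out the addition itself,
symbol by symbol; the digit-string representation below is an
assumption made only for this sketch:

    def add_decimal_strings(m, n):
        # Explicit, symbol-level instructions for producing y = m + n:
        # scan the digit strings right to left, adding digit pairs with a carry.
        m, n = m.zfill(len(n)), n.zfill(len(m))  # pad to equal length
        digits, carry = [], 0
        for a, b in zip(reversed(m), reversed(n)):
            total = int(a) + int(b) + carry
            digits.append(str(total % 10))       # write the output digit
            carry = total // 10                  # remember the carry
        if carry:
            digits.append(str(carry))
        return "".join(reversed(digits))

    print(add_decimal_strings("172", "49"))  # '221'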
The concept of algorithm
is also used to define the notion of decidability, a notion
that is central for explaining how formal systems come into being
starting from a small set of axioms and rules. In logic, the time
that an algorithm requires to complete cannot be measured, as it
is not apparently related to the customary physical dimension. From
such uncertainties, which characterize ongoing work, stems the unavailability
of a definition of algorithm that suits both concrete (in some sense)
and abstract usage of the term.