Computer Chemistry

Author
Mario Marsili
This book contains information obtained from authentic and highly regarded sources. Reasonable
efforts have been made to publish reliable data and information, but the author and publisher cannot
assume responsibility for the validity of all materials or the consequences of their use. The authors and
publishers have attempted to trace the copyright holders of all material reproduced in this publication
and apologize to copyright holders if permission to publish in this form has not been obtained. If any
copyright material has not been acknowledged please write and let us know so we may rectify in any
future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced,
transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying, microfilming, and recording, or in any information storage
or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.
copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222
Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that
provides licenses and registration for a variety of users. For organizations that have been granted a
photocopy license by the CCC, a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are
used only for identification and explanation without intent to infringe.
Publisher's Note
The publisher has gone to great lengths to ensure the quality of this reprint but points out that some
imperfections in the original copies may be apparent.
Disclaimer
The publisher has made every effort to trace copyright holders and welcomes correspondence from those
they have been unable to contact.
Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the
CRC Press Web site at http://www.crcpress.com
To My Mother
and the KL Eagles
PREFACE
In the last decade we have witnessed a blooming activity in the field of computer
applications in chemistry. The reason for this wide acceptance of computer methodologies
among chemists may be seen in the particular structure of chemical problems, which can
be easily recognized as having strong combinatorial features. It is well known that such
problems often resemble puzzles in which each piece must be placed in one, and
only one, proper place to yield the correct final picture. The same happens in chemistry when
trying to assemble molecular “fragments”, the substructures derived from visual interpretation
of spectral data, to form a complete molecule. Similarly, the mental dissection of a
molecular structure usually performed by the synthetic chemist to conceive possible synthesis
routes is one more classic example where the human brain must tackle monumental combinatorial
and permutatorial problems. It was these two main branches of chemical research
that stimulated, at the beginning of the 1970s, the birth of the first attempts to combine
artificial intelligence and chemistry. We could say that computer chemistry originated in
the wish to emulate human chemical thinking within a computer. For this reason, as explained
in great depth in the text, computer chemistry must not be regarded as computational
chemistry, which is primarily dominated by quantum chemistry. This fact is demonstrated
by the history of computer chemistry and its pioneers, the majority of whom were organic
chemists. This proves that it was the attempt to reproduce chemical “thinking”, and not
chemical “computing”, that provided the driving force in the primary efforts to compile
chemically intelligent computer programs.
The first important schools of computer chemistry were founded at illustrious universities
in the U.S., Germany, and Japan; this young science had a merely academic character, and
many observers just shrugged their shoulders when hearing about “synthesis design programs”
or “autodeductive structure elucidation programs”. They were somehow annoyed
by the possibility that a computer could “think”. Computer chemists were considered
daydreamers, chemistry hippies not worthy of any serious consideration.
However, the importance of computer chemistry was soon recognized by the chemical
industry. Its intrinsic potential to enhance laboratory performance quickly became evident,
and since then substantial funds have been invested in the large-scale computerization of
industrial chemical research, in both software and hardware.
Recent years have seen computer chemistry accepted even among its former
opponents. Courses are taught today in many universities around the
world, and learning programming languages has become customary among many chemistry
students.
It is further interesting to note how the necessary formulation of chemistry by means of
algorithms has been reflected in a clearer view of our conceptual chemical models. The
advent of extremely fast computers has cleared the way for the treatment of chemical problems
of a complexity unthinkable just 5 years ago. Protein modeling and retrieval of chemical
information from data bases containing millions of structural data also have become feasible
due to dramatic improvements in hardware architecture. Parallel processors are introducing
a revolution in chemical software design and application. Tabletop supercomputers will be
available soon, and what appears to be impracticable today will be obvious in a few years.
Computer chemistry is evolving at such a speed that any book can seem obsolete if it has
to report on the technology. For this reason, this volume aims at a conceptual and
even philosophical presentation of computer chemistry, emphasizing its peculiar psychological
aspects; the author has attempted to focus the description on how our human knowledge of
chemistry can be transformed into formal schemes, the chemical rules, and then expressed
in a form that makes their representation in a computer program possible. This volume is
therefore neither a collection of descriptions of the most important computer chemistry
software packages nor the exaltation of some specific programs described in more detail
than others. It merely attempts to introduce the graduate student, the industrial chemist, the
analytical chemist, and the pharmacologist to the world of computer methods in chemical
research, which are not alternatives to, but complements of, the currently adopted tools of
investigation.
The author has spent more time on the explanation of specific software systems on
which he has worked or which he has used frequently. This does not mean that these systems
are superior to others that are only cited here: no quality ranking is given for any achievement
whatsoever, and judgments are limited strictly to chemical and technical characterizations
of the introduced software systems. This book also does not substitute for more specific original
literature, but tries to act as a primer for the student approaching computer-assisted methods
in chemical research.
Mario Marsili
Rome, Italy
April 1989
THE AUTHOR
Mario Marsili, Ph.D., was born in Rome in 1953. He left his home country at the age
of 18 to study chemistry at the Technical University, Munich, Federal Republic of Germany.
In 1977 he obtained the “Diplom” degree in chemistry with research work on fast algorithms
for the computation of partial atomic charges in molecules based on orbital electronegativity.
He earned his Ph.D. at the Technical University 3 years later in the area of computer-assisted
synthesis design, where he had expanded the charge calculational models to pi electron
systems and first derived bond reactivity functions to be utilized as “deductive” means
inside the synthesis design program EROS, to whose development he contributed
under the leadership of Professor Gasteiger.
He spent one postdoctoral year at the University of Zurich in Switzerland with Professor
A. Dreiding, where he worked in the area of molecular graphics and molecular modeling,
creating a computerized method for morphological comparison of three-dimensional molecular
structures. In 1982 he was appointed Lecturer in Computer Chemistry at the University
of Zurich. At the end of 1982 he was called back to Italy by the National Research Council
of Italy and joined the team of the Project on Fine Chemistry, directed by Professor L.
Caglioti; there he established the first Italian Computer Chemistry research unit. In 1985 he
was nominated Assistant Professor of Computer Chemistry at the Rome University “La
Sapienza”, where he stayed for 3 years. In 1986 he was elected Director of the Strategic
Project on Computer Chemistry within the National Research Council. At the same time,
Italian industry took up the challenge in computer chemistry and an important research
project was launched, supported jointly by the Istituto Mobiliare Italiano and 15 Italian
chemical and pharmaceutical industries. The project, carried out in the Tecnofarmaci laboratories,
was led by Mario Marsili for the scheduled 4 years, ending in the creation of a
global molecular modeling system, SUPERNOVA. Currently, he is Professor of Computer
Chemistry at the University of L’Aquila and team leader of several industrial research projects
in Italy, Germany, and Japan. His current major fields of interest are molecular modeling
and chemometrics.
Dr. Marsili is the author of more than 30 original papers in computer chemistry. He
was President of the Ninth International Conference on Computers in Chemical Research
and Education, held in Italy in May 1989.
TABLE OF CONTENTS
Chapter 1
Introduction......................................................................................................................... 1
I. Man and Computers................................................................................................. 1
II. Computers in Chemistry.............................................................................................2
A. Computational Programs................................................................................2
B. Semantic Programs...........................................................................................3
C. Computer Chemistry and Human Psychology................................................4
III. Areas of Application of Computer Chemistry Methods........................................... 7
Chapter 2
The Computer as a Laboratory........................................................................................11
I. Hardware...................................................................................................................11
A. Architecture of a Computer......................................................................... 11
B. Bits, Chips, and Microprocessors................................................................12
C. Memory and Storage.....................................................................................14
1. Main Memory................................................................................... 14
2. Auxiliary Memory Devices.............................................................. 15
II. Software.....................................................................................................................15
A. Operating Systems (OS)................................................................................. 16
1. Event-Driven Multiprogramming...................................................... 17
2. Memory Management........................................................................18
3. Device Handlers................................................................................18
4. Higher Level Programming Languages........................................... 19
III. Binary Representation of Numbers...........................................................................20
References..............................................................................................................................22
Chapter 3
Problem Solving and Artificial Intelligence.................................................................... 25
I. Boolean Operations...................................................................................................25
II. Methodology in Problem Solving............................................................................ 26
A. Definitions.................................................................................................... 27
B. Nonheuristic Methods.................................................................................... 27
1. Random Search.................................................................................. 27
2. Algorithmic Methods....................................................................... 27
C. Heuristic Methods......................................................................................... 28
1. Trees and Graphs..............................................................................29
2. Generating Paths: Breadth-First and Depth-First
Searches............................................................................................ 30
References..............................................................................................................................34
Chapter 4
Molecular Modeling............................................................................................................ 35
I. Fundamentals of Molecular Modeling....................................................................... 35
A. Introduction...................................................................................................35
B. Generation and Representation of Two-Dimensional Molecular
Models...........................................................................................................36
1. Topological Encoding....................................................................... 37
2. Ring Perception.................................................................................42
3. Canonical Numbering....................................................................... 46
4. Display of Two-Dimensional Molecular Models............................ 47
C. Generation and Representation of Three-Dimensional Molecular
Models........................................................................................................... 48
1. Three-Dimensional Molecular Structures from Data
Banks................................................................................................. 50
2. Molecular Atomic Coordinates from Bond Parameters................... 50
3. Assembly of Structural Fragments................................................... 52
4. Stereochemistry................................................................................. 53
5. Display Techniques of Three-Dimensional Molecular
Models................................................................................................55
6. Manipulation of Three-Dimensional Molecular Models................. 56
II. Generation of Physicochemical Parameters by Molecular Modeling
Techniques................................................................................................................. 61
A. Molecular Volumes, Molecular Surface Areas, and Shape
Similarity........................................................................................................61
1. Boolean Encoding of Three-Dimensional Space-Filling
Molecular Models.............................................................................. 61
2. Boolean Tensor Operations.............................................................. 62
B. Molecular Energetics.....................................................................................64
1. Introduction......................................................................................... 64
2. Molecular Mechanics: Empirical Force-Field
Calculations........................................................................................65
3. Molecular Dynamics......................................................................... 73
C. Electronic Molecular Descriptors................................................................. 76
1. Introduction........................................................................................76
2. A Model for Sigma Charges.............................................................77
3. The Model for Pi Electrons.............................................................. 81
a. Delocalized Systems.............................................................83
4. Correlations with Experimental Quantities...................................... 85
5. Effective Polarizability...................................................................... 88
References.............................................................................................................................. 89
Chapter 5
Autodeductive Systems for Reaction Kinetics...................................................................95
I. Introduction................................................................................................................95
II. Principles of Numeric Autodeductive Systems......................................................... 95
III. The CRAMS System................................................................................................. 96
A. Semantic Input.............................................................................................. 97
B. Predictive Questions...................................................................................... 98
C. Computing Questions.....................................................................................99
IV. Designing an Experiment........................................................................................100
A. Example 1.....................................................................................................100
B. A Computational Example......................................................................... 101
C. An Equilibrium System................................................................................102
D. A More Complex Example......................................................................... 104
References............................................................................................................................ 105
Chapter 6
Structure Elucidation Systems......................................................................................... 107
I. General Principles of Computer-Assisted Structure Elucidation............................ 107
A. The PLANNING Phase................................................................................109
B. The GENERATING Phase.......................................................................... 110
C. The TESTING Phase................................................................................... 112
II. Structure Generation............................................................................................... 112
A. Definitions................................................................................................. 113
B. The Generating Algorithm......................................................................... 115
1. An Interactive Structure Generation Session..................................116
III. SES with Autodeductive Interpretation of Spectral Data.....................................121
A. The 13C-NMR Interpreter........................................................................... 121
B. TESTING with CHEMICS..........................................................................125
IV. The CASE System...................................................................................................130
A. An Example with CASE: Monochaetin.....................................................131
B. Another Example: Forgetting Structures.................................................... 133
V. TESTING Using Mass Spectra Predictors.............................................................. 134
A. The Half-Order Theory and Mass Spectroscopy Simulation.................... 134
B. Rule-Based Theory and Mass Spectroscopy Knowledge
Systems....................................................................................................... 137
VI. TESTING Using Simulated Chemical Transformations........................................138
References............................................................................................................................ 140
Chapter 7
Computer Simulation of Organic Reactions................................................................. 141
I. Introduction..............................................................................................................141
A. The Connection Between Chemical Mind and Computer
Reasoning.................................................................................................... 142
II. Synthesis Design Systems Based on Reaction Libraries........................................144
A. Structure and Terminology of a Synthesis Design System......................... 144
B. Transformations (R).................................................................................... 146
C. Evaluation Strategies and Tactics................................................................148
1. Strategic Bonds................................................................................148
2. Recognition of Functional Groups................................................. 150
3. Strategic Routes................................................................................155
III. Synthesis Design Systems Based on Formal Reactions ........................................158
A. Matrix Representation of Organic Reactions.............................................. 158
1. Ensembles of Molecules and BE Matrices.................................... 159
2. The Chemical Distance...................................................................161
B. Reaction Generators.................................................................................... 162
C. Evaluation Tactics in EROS........................................................................163
1. Evaluation of Reaction Enthalpies................................................. 166
2. The SOHIO Process Discovered with EROS................................. 169
3. Retrosynthesis of a Prostaglandin-Like Compound........................169
IV. Chemical Reactivity and Forward Search.............................................................. 173
A. Electronic Effects........................................................................................173
B. The Reactivity Space...................................................................................176
V. Other Approaches Based on Mechanistic Steps.................................................... 179
A. Synthesis Design and Reaction Prediction: Artificial
Intelligence, Expert Systems, or...?...........................................................180
References............................................................................................................................ 180
Appendix.............................................................................................................................. 183
Index.................................................................................................................195
Chapter 1
INTRODUCTION

I. MAN AND COMPUTERS
Computers have entered most areas of scientific research, industrial production, and
educational activities to such an extent that an impact has even been made on the social
life, mental attitude, and the psychology of people. Computers can often replace or support
many human activities at low cost: cars are assembled by robots; teachers are replaced
by computer programs, experienced instructors by simulators. This has occurred because
computers are millions of times faster than man. Speed is the name of the game, and speed
means competitiveness on the market, low financial investments, and better overall performance.
On the other hand, a certain number of disappearing human activities, obsolete
and no longer considered profitable, are transformed into new equivalents under a different
perspective: the computer perspective. Somebody who in the past manufactured coil springs
for wristwatches is scarcely needed any longer, having been replaced by somebody constructing
the integrated circuits on which modern watches rely.
Computers have opened new frontiers in medicine, improving diagnostic techniques
(e.g., imaging in computerized axial tomography). They have caused a real revolution in
data management and communication and allow modeling of extremely sophisticated systems
like astrophysical events or weather forecasts.
Computers undoubtedly provide a number of astonishing improvements in several sectors
of the modern world, but are at the same time the backbone of modern warfare, which has
created the most incredible array of annihilating weapons ever (pattern-recognizing “intelligent”
missiles, for example). For the single human, this double-faced process of technological
evolution has bloomed into a wealth of new professions, all of them connected to
computer science, be it theoretical or applied.
Computers are neither good nor bad; a knife is neither good nor bad. Each depends on
its use. Philosophical fights are raging everywhere on the role of man in a computer-
dominated world in which a few selected specialists have the knowledge and the power to
press strategic buttons on keyboards, and no final solution is expected soon. The question
whether human intuition (in other words, the artistic gift, the invention, the intellectual
breakthrough) can be replaced by computer simulation, once computers have enough memory
and speed to tackle such problems, is indeed a central question and contains even a touch
of moral texture.
If a computer simulation based on artificial intelligence systems leads to some unexpected
brilliant scientific discovery, is this the merit of the human programmer or of the “ thinking”
computer?
Chemistry is no exception within the framework of this discussion. The introduction of
computer-assisted research techniques into chemistry over the last 15 years has caused a
split pattern of reactions among chemists. Whenever computers have been used in a kind
of subordinate, secondary, concealed way, they have been accepted as a precious and powerful
help. This has especially been the case with regard to chemical information and in analytical
chemistry. On the contrary, as soon as computers entered an apparent role of equality with
the human chemist in solving problems of a more decisional type, exerting a primary, direct
influence on man-tailored research strategies and methods, an evident anxiety arose among
traditional-minded chemists. Chemists saw (and still see) their leading role as “masters of
the art” endangered by an “idiot made of steel”. Rooted in a serious misunderstanding of
the role of computers in chemistry, this attitude in some cases has led to mental rejection
of this new technology at the level of its cultural root. On the other hand, enthusiasts are
readily found who expect immediate, successful solutions to a variety of difficult problems,
believing that “the computer can do everything.” They forget that computers still depend
primarily on man’s performance.
To understand the reasons for a methodology called computer chemistry, to correctly
place it among modern research methods, and to detect its benefits and limitations — these
points must be discussed in some depth.
II. COMPUTERS IN CHEMISTRY

A. COMPUTATIONAL PROGRAMS
A distinction was postulated above between a direct, or primary, influence of computer
action on chemical research and a subordinate, secondary one. Historically this distinction,
caused by an independent growth of what is called computer chemistry from other traditional
fields of computer applications in chemistry, was rooted in two main facts: the attempt to
create computer programs to emulate chemical thinking, and the parallel development of a
new, fascinating, and promising branch of computer science, artificial intelligence (AI). AI,
which will be discussed later to some extent, is the part of computer science dealing with
the computer-generated perception and solution of complex symbol-oriented and semantic
problems.
In the early 1970s, chemists were acquainted with a purely numerical use of computers
in chemistry. Quantum chemistry and X-ray structure determination were the poles of heaviest
exploitation of the fast computational capacity of a computer. In both of these important
research fields, the investigator faces such an enormous quantity of bare numbers that their
successful treatment would be utterly unfeasible without electronic data processing. The
main role of computers in performing these tasks simply consists of managing huge arrays
of numbers following a user-implemented, rigid, predetermined prescription. The result of
what in a joking manner is termed “number crunching” is in all of these situations a mere
numerical result. In other words, the computer delivers a certain number of specific magnitude
that interests the user, and the path along which such a number is generated is a one-way
road within the codified program. Solving iteratively thousands of Coulomb or exchange
integrals and refining Fourier coefficients are examples of such a path. Here the computer
follows a fixed scheme of data processing. The final result, for example, could be the energy
of some specific electronic state of a given molecule or an array of Cartesian coordinates
for atoms in a molecule. That is what we expect. The magnitudes of energy and coordinates
will change if the investigated substrate is different, but this is obvious. They will also
change if a different degree of approximation, refinement, or parameterization is chosen by
the user. What does not change is the certainty that some number will come out as the
unique result. We might not know in advance what energy value a certain molecule will
show at its conformational minimum, but that is the main reason for using a computer: to
do the necessary calculations according to user-determined equations which already contain
the solution to the problem in all its principles. Due to its advantage in speed, the computer
offers a numerical result for final interpretation by man. The program run by the computer
contains no alternatives other than to produce quantitative numerical answers of one and the
same kind, repetitively, as it has been instructed to do. Truly, there are no alternatives to
atomic coordinates for a program that calculates atomic coordinates. The statement “I shall
ask the computer to tell me the energy of formation of this molecule” appears to be
conceptually and semantically wrong. Justified questioning anticipates the potential existence
of an answer; answering demands the a priori existence of choice elements among which
a suitable answer can be found.
A quantum mechanical program, once implemented according to a particular approach,
is geared solely to calculate a set of numerical quantities, and it has no choice
elements on which to exert any kind of deductive evaluation for constructing an answer.
Thus, the actual calculation is just a reproduction of the equations contained in the program,
substituting real numbers for symbols: no influence is exerted by the computer on the strategic
content of the program, on its structure, or on its meaning, and the computer will not be
able to change the structures of the equations themselves during execution. Question and
answer are like vectors: each has a magnitude and a direction in space. The direction
determines the difference between a vector and a scalar. Selecting a direction (i.e., including
deduction in the formulation of a certain answer by considering the nature of the available
choice elements) means adding a qualitative element to a purely quantitative response.
Calculating orbital energies cannot produce chemical answers within the conceptual framework
just expounded because programs tackling these kinds of computational problems yield
scalar numbers (e.g., energies) as results. The direction that we miss in such results, which
is nothing less than the general structure of the solution scheme, is called the solution model.
In the lucky case of a known theory, this direction is known in advance by the investigator
and formulated as a sequence of instructions in a computer program. We can finally assert
the following:
Assertion I — Computational programs in chemistry rely on predefined solution schemes,
the models, which are known in their qualitative essence by the user. The output of such
programs is a quantitative response, a scalar, for the model under specific, user-given
conditions. The generation of such responses follows a rigid, unbranched, and constant
data processing mechanism. No strategy evaluation is involved.
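As a deliberately trivial illustration of Assertion I (a sketch not taken from the book, with an assumed model and invented parameters), the program below applies one fixed, user-chosen solution scheme — here a Lennard-Jones 12-6 potential — to input coordinates and returns a single scalar. Only the numbers change with the input; the scheme itself never does.

```python
# A minimal sketch of a "computational" program in the sense of Assertion I.
# The model (a Lennard-Jones 12-6 potential) and its parameters are assumed
# for illustration only; the program can deliver nothing but one scalar.
import math

def lennard_jones_energy(coords, epsilon=0.2, sigma=3.4):
    """Sum pairwise 12-6 interactions over a list of (x, y, z) coordinates."""
    energy = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r = math.dist(coords[i], coords[j])
            energy += 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return energy  # a quantitative answer of one and the same kind, every time

print(lennard_jones_energy([(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (0.0, 3.8, 0.0)]))
```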
It clearly now appears that computer support in this fashion does not scratch the polished
image of any scientist devoting his time to the discovery of fundamental theories or models.
He remains master of the situation and welcomes computer aid as a fast and reliable processor
of numbers in a kind of subordinate position. In the final analysis, the computer will not teach
him anything.
B. SEMANTIC PROGRAMS
What would happen to human psychology and to scientific research if a computer started
to deliver qualitative answers, to give strategic advice, to propose models, to change the
structure of user input equations, or to emulate chemical reasoning?
To do this, a computer perception of quality must be created. Quality involves comparison;
comparison involves rules for judgment; using rules involves the capacity of autonomous
acting; acting involves effects; effects involve interpretation and ranking, which
finally contribute to the establishment of quality. Quality and quantity together build our
response vector, the answer.
Computer chemistry started off right at this point: it provided programs, along with the
first blooming achievements and concepts in AI, that were able to help chemists discover
strategies. These programs had to be organized flexibly enough to deal with varying mechanisms
for making choices. This key term requires the questions addressed to the computer
to have, in principle, a manifold set of possible outcomes, which undergo evaluation and
ranking.
The intrinsically different response vectors may differ in probability (the magnitude of
the vector) and in direction (the quality, the conceptual content of the computer-generated
solution, the strategic orientation). Such programs are well suited, in general terms, to
provide alternative models, thus enhancing knowledge. That is exactly the complementary
(not the opposite) situation to computational programs. The latter apply established models,
while the former use general rules (empirical or theoretical) to produce models and ranking
strategies. For example, calculating the energy in calories that one needs to move one’s arm
while playing chess (i.e., to pick up a piece, move it to its new position, and lower the arm
again) corresponds to the use of a program belonging to the computational class. However,
asking the computer that has been “taught” chess rules to predict all possible sequences of
moves leading to checkmate, starting from a user-given initial pattern, is an example of the
use of programs of the AI class. Here the process of establishing strategies, predicting
countermoves, and ranking sequences of moves according to chance of success is the principal
feature of such an autodeductive program.
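A hedged sketch of the generate/evaluate/rank loop just described, with a toy non-chemical state and invented rules and target; the only point is that the output is an ordered list of alternatives rather than a single scalar.

```python
# A minimal sketch of the AI-class program structure described above:
# generate candidate answers by applying rules, score each one with a
# heuristic, and return the alternatives ranked by chance of success.
def generate_candidates(state, rules):
    """Each applicable rule proposes one candidate successor of the state."""
    return [rule(state) for rule in rules]

def rank(candidates, score):
    """Attach a heuristic score to every candidate and sort best-first."""
    return sorted(((score(c), c) for c in candidates), reverse=True)

rules = [lambda s: s + 3, lambda s: s * 2, lambda s: s - 1]   # hypothetical rules
best_first = rank(generate_candidates(4, rules), score=lambda c: -abs(10 - c))
print(best_first)   # [(-2, 8), (-3, 7), (-7, 3)]: ranked alternatives, not one number
```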
In computer chemistry, chemical rules are transformed into a program inside a computer,
making the electronic device look like it is thinking chemically and therefore turning it into
a seeming threat, a cold, stainless steel rival of any human chemist. Computer answers of
the following kind are common today, and they make the instinctive repulsion among a few,
if not justifiable, at least comprehensible; for example, “Your mass spectrum belongs with
96% probability to a molecule with three chlorine atoms,” or “There are 24 different reaction
routes within an exothermic range of 0 to 10 kcal/mol that can lead to your desired product;
I will draw them for you,” or “After interpreting all your spectral data, three molecular
structures were found compatible and were generated; here they are,” or “You don’t have
to care about the temperature parameter while running your chemical reactor; adjust the pH to
5.5 instead.”
These answers clearly go far beyond those to which chemists had typically been accustomed.
They offer direct intervention into operational strategy, as well as tactical realization.
They lead to a redesign of a certain experimental setup or to a new, unexpected
conceptual insight. Thus, a revised model can be developed. We finally can assert the
following:
Assertion II — Semantic programs are the core of computer chemistry systems. They
are tailored to reproduce schemes of human reasoning — in our case, of chemical thinking.
They use chemical rules to treat the strategic, decisional kind of problem. They have a
primary influence on subsequent methodologies, the establishment of models, the creation
of alternatives, and the intelligent interpretation of data in chemical research.
The benzene symbol automatically includes the six hydrogen atoms not drawn explicitly,
and the ring inside the hexagon is immediately understood as symbolizing six delocalized
π electrons. Even the concept of delocalization is recalled in the brain and is readily
formulated as the (4n + 2) π-electron Hückel rule. This happens at an astonishingly high
speed in the human mind. The reason for it is that symbols and their correlated chemical
and physical properties are already stored in the brain; they represent our chemical knowledge
base. Recalling chemical data (retrieving structural formulas) is a procedure that we do every
day while discussing chemistry. A computer does very similar work when used for chemical
data retrieval, one of the first applications of computer technology in chemistry. Conceptually,
data retrieval is remotely connected to semantic programming, as it generally deals
with the matching of input character strings (the name of a molecule, for example) with
corresponding strings inside the data base. A relation to truly semantic systems is to be
found just in the ability of modern retrieval systems to accept symbols as input, to perform
sophisticated logical search and matching operations, and to return the results in an equally
sophisticated, symbol-oriented manner. However, no additional original material is generated
by the computer during a search session. Autogenous creation of something new must occur
by different paths, both in the brain and in computers. Searching for a chemical structure
in an array of collected structures stored on some magnetic device can have only one of two
possible outcomes: found or not found. In the “not found” situation, the computer cannot
augment the data base with the one missing datum because it does not “know” it until an
operator supplies the new entry. The unquestionable usefulness of data banks is exemplified
by the evident speed in gathering available data as compared to man. The simple psychological
experiment of visualizing the benzene symbol and automatically attaching to it all
of the chemistry we know (from learning and from practice) highlights the parallelism of
our power of perception, our memory, and our retrieving and correlative capabilities with
the computer equivalents. These are engineered and emulated inside specific software and
deal with a finite set of known elements.
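A tiny hedged sketch of the retrieval situation just described, with two invented data base entries: matching an input string against the stored strings can only answer “found” or “not found”; the program never creates the missing entry on its own.

```python
# A minimal sketch of name-based retrieval: matching an input character
# string against the strings stored in a data base (entries are hypothetical).
structure_db = {"benzene": "C6H6", "toluene": "C7H8"}

def retrieve(name, db):
    """Return the stored record, or report that the data base cannot help."""
    return db.get(name.lower(), "not found")

print(retrieve("Benzene", structure_db))        # found: the stored record is returned
print(retrieve("Dewar benzene", structure_db))  # not found: the program cannot invent it
```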
We shall continue this psychological investigation, shifting to problems where new, still
unknown elements must be deductively inferred and linked to the previous set. The following
argument is one example of the many possible paradigmatic representations highlighting
the differences between man and computer in the autogenous creation and
manipulation of symbolic elements. It justifies the inclusion of computer
chemistry tools in modern chemical research.
Let us use a different symbol for the representation of benzene, which now will be
C6H6. This tells us that six carbon and six hydrogen atoms, connected through chemical
bonds, form what we call a molecule. Now, in this fictitious experiment, the problem put
both to man and computer is to generate all possible structures with the given set of atoms
(i.e., generate all isomers of benzene).
The problem is of a semantic/symbol-oriented nature, and according to assertion II its
solution requires a number of rules to build the skeleton of the AI procedure. Organic
chemistry supplies the rules.
Rule 1. A carbon atom must have four bonds, regardless of its arrangement with connecting
partners.
Rule 2. Each hydrogen atom must have one bond connecting it to the next partner.
Rule 3. The molecules must be in a neutral state.
Rule 4. Structures obeying Rules 1 and 2 are valid whether or not they are thermodynamically
stable.
Rule 5. No disconnected atoms are allowed.
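The rules above can be checked mechanically. The sketch below (an assumed illustration, not code from the book) represents a candidate structure as a list of atoms and a list of bonds with orders and tests Rules 1, 2, and 5; Rule 3 holds because this representation carries no charges, and Rule 4 means stability is deliberately not examined.

```python
# A minimal sketch of Rules 1-5 as a validity test for a candidate C6H6
# structure.  Atoms are "C" or "H"; bonds are (atom_i, atom_j, order) triples.
from collections import defaultdict

def is_valid_c6h6(atoms, bonds):
    valence = defaultdict(int)
    neighbours = defaultdict(set)
    for i, j, order in bonds:
        valence[i] += order
        valence[j] += order
        neighbours[i].add(j)
        neighbours[j].add(i)
    # Rules 1 and 2: every carbon must reach valence 4, every hydrogen exactly 1.
    required = {"C": 4, "H": 1}
    if any(valence[k] != required[sym] for k, sym in enumerate(atoms)):
        return False
    # Rule 5: every atom must be reachable from atom 0 (no disconnected pieces).
    seen, stack = {0}, [0]
    while stack:
        for nb in neighbours[stack.pop()]:
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return len(seen) == len(atoms)   # Rule 4: thermodynamic stability is not checked

# Benzene itself: a six-membered ring with alternating double bonds, one H per carbon.
atoms = ["C"] * 6 + ["H"] * 6
ring = [(i, (i + 1) % 6, 2 if i % 2 == 0 else 1) for i in range(6)]
c_h = [(i, i + 6, 1) for i in range(6)]
print(is_valid_c6h6(atoms, ring + c_h))   # True
```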
With these rules at our disposal, one can attack the problem of generating as many topological
isomers of benzene as possible. Looking at benzene, our imagination searches for a
new arrangement of the graphical elements (the lines representing bonds) that constitute the
pieces of the game (consider, for example, the analogy to a chess game). The first attempt
likely would be to transpose the “localized” double bonds to obtain a new image, as in the
case of Dewar benzene (structure b below). Another scheme of bond shifting leads to the
Sooner or later, man’s intuition will lead to other images, like open-chain compounds
or isomers with five- or four-membered rings in them. The reader may wish to exert himself
by finding other elements in the finite set of benzene isomers.
A major difficulty arises when a certain number of isomers have been derived by hand.
Suppose that 35 different isomers have been drawn on paper. A 36th is born in the chemist’s
mind, and in order to validate it he will have to compare the new structure with the other
35. As the mind cannot keep track of so many different images simultaneously, and as they
are not perceived and stored in a unique, canonical way, the chemist will in many cases
find that the 36th isomer is one that he has generated already. As an example, he might
have deduced as the 36th isomer the following open-chain structure,
and, going back through a one-by-one structural check, realized that it is the same as
which he had found long before. The reason is that his mind works on images (symbols),
which are remembered not in their abstract, intrinsic nature, but simply as they have been
perceived visually; thus, the first linear code given above, once reflected, is at first judged
as a different molecule. The brain is not trained for immediate recognition of asymmetrical
structures.
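This is precisely where a canonical representation helps. The sketch below (assumed, not from the book) computes a brute-force canonical form — the lexicographically smallest bond list over all atom relabellings — which is affordable only for very small graphs; real systems rely on Morgan-type canonical numbering, discussed in Chapter 4. Two differently drawn and differently numbered versions of the same skeleton collapse to one canonical form, so the duplicate is caught at once.

```python
# A minimal sketch of duplicate detection through a canonical form.
from itertools import permutations

def canonical_form(n_atoms, bonds):
    """Smallest sorted bond list over all relabellings (brute force, tiny graphs only)."""
    best = None
    for perm in permutations(range(n_atoms)):
        relabelled = sorted(tuple(sorted((perm[i], perm[j]))) + (order,)
                            for i, j, order in bonds)
        if best is None or relabelled < best:
            best = relabelled
    return tuple(best)

# The same branched four-atom skeleton numbered in two different ways:
drawn_a = [(0, 1, 1), (0, 2, 1), (0, 3, 1)]   # central atom labelled 0
drawn_b = [(2, 0, 1), (2, 1, 1), (2, 3, 1)]   # central atom labelled 2
print(canonical_form(4, drawn_a) == canonical_form(4, drawn_b))   # True: one structure, not two
```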
The reader interested in knowing how many different structures can be assembled from
C6H6 and who does not wish to spend the next 6 months doing it without computer help
can find them all in the Appendix at the end of this volume. This task takes only a few
seconds on a modern mainframe computer.
The human mind seems to be the very best instrument for conceptual breakthroughs,
but is slow at the exhaustive solution of combinatorial problems. Can the speed at
which a computer performs operations be a masked kind of intuition? The great steps in
intellectual achievement in man’s history were obtained by intuition and not by fast treatment
of data according to known rules, as was the case with the benzene isomers. Going from
the geocentric concept of the world of the Middle Ages to a heliocentric concept, recognizing
the four dimensions of space-time with time being no longer absolute, and conceiving particles
as waves and waves as particles are examples of the sublime flower of pure intuition, which
breaks rules! Breaking rules is only in the realm of human thought. Our chemical example
proved valuable in understanding the power of a computer in managing data according to
rules, but no computer could have such a complete perception of any complex system that
it could invent new fundamental rules and, thus, change the boundaries of validity of our
rules. This is left to man.
We are now able to confine the role of computers to a well-determined region in chemical
research. The computational use of computers requires data to produce data; the use according
to AI concepts takes data and rules to produce information, and our minds use intuition to
interpret information to finally produce knowledge.
The path between data and information is the area of application of computer chemistry
programs.
To end our philosophical digression, we could say that the proper use of knowledge
produces wisdom, but this still seems a distant goal for mankind.
Computers can then be instructed to deal with chemical problems where the following
hurdles appear to burden human efficiency:
The reason why computer chemistry diverged from classical computer applications in
chemistry (quantum chemistry, physical chemistry, chemical kinetics, X-ray analysis, etc.)
and separate journals and conferences were established is rooted in the necessity to deal
with formal problems regarding the symbolic perception of molecular structure by computers.
Many years were spent generating programs for the perception of rings and aromaticity, for
the canonical numbering of atoms in a molecule, for effective user-friendly input and output
interfaces, for the recognition and storage of particular substructural features, for the encoding
of reaction schemes in reaction data bases, for the fast and compact storage and retrieval of
molecular structures, and for the codification of chemical rules. Later, when these basic
problems were overcome, a shift toward a more refined introduction of physicochemical
parameters into semantic models, enhancing the chemical quality of computer simulations,
took place. Today, due to the enormous speed of mainframe computers (some of them array
processors), the use of computationally oriented software to feed the semantic, AI-oriented
systems with the necessary, more sophisticated data is becoming increasingly common.
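As a small, hedged illustration of one of the perception tasks listed above — deciding which bonds of a molecular graph belong to a ring — the sketch below uses the fact that a bond is a ring bond exactly when its two atoms remain connected after the bond is removed. Full ring perception, as used in real systems, is treated in Chapter 4; this example and its data are assumptions for illustration only.

```python
# A toy sketch of ring-bond perception in a molecular graph (assumed example).
def connected(n_atoms, bonds, start, goal):
    """Depth-first search: is 'goal' reachable from 'start' over the given bonds?"""
    adj = {k: set() for k in range(n_atoms)}
    for i, j in bonds:
        adj[i].add(j)
        adj[j].add(i)
    seen, stack = {start}, [start]
    while stack:
        node = stack.pop()
        if node == goal:
            return True
        for nb in adj[node] - seen:
            seen.add(nb)
            stack.append(nb)
    return False

def ring_bonds(n_atoms, bonds):
    """A bond is in a ring if its endpoints stay connected without it."""
    return [b for b in bonds
            if connected(n_atoms, [x for x in bonds if x != b], *b)]

# Cyclopropane carbon skeleton (atoms 0-2) with one exocyclic substituent (atom 3):
print(ring_bonds(4, [(0, 1), (1, 2), (2, 0), (2, 3)]))   # [(0, 1), (1, 2), (2, 0)]
```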
The present stages of evolution show computer chemistry as an established research
area constantly propelled by two major mutually supporting thrusts: semantic programs and
computational programs.
COMPUTATIONAL PROGRAMS ---\
                           >-----------> COMPUTER CHEMISTRY
SEMANTIC PROGRAMS ---------/
[IR], 1H- or 13C-nuclear magnetic resonance spectroscopy [NMR], elemental analysis, and
UV), and if enough substance is available he will then perform some chemical degradation
reaction to obtain smaller fragments or target derivatives. All of these investigative techniques
provide him with a large batch of raw data that must be interpreted. He knows the rules
that link the data (shifts, peak patterns, integrated areas, etc.) to some more or less specific
structural elements, the substructures, of the investigated molecules. In an unlucky, difficult
case, he may not be able to derive an unambiguous final structure easily, be it due to a
possible uncertainty in the molecular formula (MS and elemental analysis do not always
guarantee a unique molecular formula; high resolution MS may not be available; etc.) or to
the actual combinatorial complexity of assembling the identified substructures. In such a
case, the investigator finds an ally in structure elucidation systems: programs for computer
generation of molecular structures from spectral and substructural data. These programs
belong to the first historic phase of development of computer chemistry tools.
Once the structure of the unknown compound has been elucidated, this information is
conveyed to the next laboratory, where pharmacologists, medicinal chemists, and organic
chemists work together to find new drugs. The situation can arise where obtaining enough
substance for a complete series of pharmacological tests, necessary to evaluate the overall
potency of the new drug, becomes cumbersome and expensive because of difficulties in
isolation and purification from the natural source. A synthetic approach is consequently
decided upon, and by inspection of the target structure some synthesis pathways are proposed
by the organic chemist, who proceeds by literature inspection (to find established reaction
routes for similar structures) and by intuition. Too often the latter consists of modifications
of memorized reactions recalled from the chemical data base in his mind rather than original
and innovative contributions. To ensure maximum efficiency in the search for known reactions
and to enhance the probability of success in the search for new reaction schemes,
he will find it advisable to spend a short time in front of a computer running synthesis design
programs. These powerful software systems attempt to model organic reactions, to predict
reaction routes retrosynthetically by strategic disconnections of a target compound, and, in
a few systems, even to predict the products of an organic reaction from given educts.
Computer and man will cooperate to finally find a suitable way to synthesize a certain
amount of the drug on a laboratory scale, not focusing so much at this stage on optimization
of yield. The drug is tested in vivo and in vitro, and the pharmacologists become interested
in a number of chemical modifications of the current structure to tune its behavior toward
a better and lasting biological activity. The design of a first series of analogues of the lead
compound includes choosing substitution positions on the parent structure and selecting the
type of substituents. Molecular modeling programs provide a multitude of methodologies
to carry out these selections in an optimized manner, and they ensure a means to visualize,
manipulate, compare, and describe (by physicochemical parameters) the structures of the
analogues.
The analogues will have to be synthesized, and synthesis design systems might be
necessary in turn. The analogues are tested extensively, and a number of biological responses
are collected (e.g., pharmacological activity, toxicity, time/activity contours, and metabolism).
The formation of metabolic products can be simulated by reaction modeling systems
in a forward search strategy and their structure inferred by structure elucidation systems, if
required. The wish of the investigator now will be to detect a latent link, a structure-activity
relationship, between the measured multiple responses and the varying structural features
of the analogues. If such a significant mathematical relationship can be found, a second set
of more specifically tailored analogues can be postulated by structural modifications which,
according to the strategy implied in the structure-activity model, should correlate with
increased drug potency, lower toxicity, longer persistence to metabolic breakdown, transport
characteristics, and every other drug feature of interest. These kinds of studies, aiming at
confirmatory and predictive models, are realized through methods and programs offered by
chemometrics. Chemometrics is the science of statistics as applied to chemistry.
Chemometrics is probably the one direction of computational chemistry that evolved quite
independently in the last decade and showed rare connections to the more semantic, strategically
operating philosophies described in this book. However, although almost exclusively
based on computational programs, chemometrics in its most recent advances seems
to gain strategic performances rapidly. Its recurrent application to other systems and the
acquisition of semantic outfits rank it among the most prospectively fruitful and promising
tools in computer-assisted chemical research. Depending on a variety of circumstances, the
chemometrical analysis can be reiterated using pharmacological data measured for the second
set of analogues. Suppose that the combined effort of the analytical chemists, the pharmacologists,
and the organic chemists seems to converge on a well-defined structure candidate
among those tested. It will be necessary at this point to synthesize larger amounts of
the substance, and normally this is accomplished in a pilot plant. Optimization of the synthesis
procedure suddenly becomes exceedingly important, as it must point to the best conditions
for a future scaleup to industrial production and, finally, commercialization of the medicament.
In their most recent versions (autodeductive systems, expert systems), chemometrical
programs again help the researchers to select those particular experimental parameters which
the computer judges to be responsible for the best possible response — in our example, the
yield. These selected parameters, the predictors, are then adjusted in practice by the experimenter
at predicted trim values corresponding to maximum yield.
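A hedged sketch of the simplest chemometric step mentioned above — an ordinary least-squares fit linking experimental parameters to a measured response — with entirely invented data and parameter names; real chemometric systems use far richer designs, validation, and multivariate methods.

```python
# A minimal sketch of fitting a linear response model: which predictors push
# the yield up, and by how much?  All numbers here are invented.
import numpy as np

# Hypothetical design matrix: columns are pH, temperature (deg C), catalyst load (%).
X = np.array([[5.0, 60.0, 1.0],
              [5.5, 60.0, 2.0],
              [6.0, 70.0, 1.0],
              [6.5, 70.0, 2.0],
              [7.0, 80.0, 1.5]])
y = np.array([52.0, 61.0, 58.0, 66.0, 63.0])      # measured yields (%)

X1 = np.column_stack([np.ones(len(X)), X])         # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)      # ordinary least squares
print(dict(zip(["intercept", "pH", "T", "catalyst"], coef.round(2))))
```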
This imaginary walk along the several research steps involved in drug design loops back
to analytical chemistry when production and quality control actions are requested in an
industrial environment to guarantee a high standard of product quality. Once more, chemometrical
programs intervene to sharpen the precision of the collected analytical control data and to
ease human interpretation.
In the past, the foundations of structure elucidation systems, synthesis design systems,
molecular modeling systems, and related software were established separately. Times were
not yet ripe for interdisciplinary overlap, as each field had its own problems finding an inner
cultural consolidation, a propositional coherency in the definition of contents and objectives
to pursue, and, in many cases, a scientific justification to induce broad acceptance in an
initially reluctant chemical audience. Later, the justification was provided by the rising need
for more sophisticated drugs, by increasing research times and costs, and by stiff market
competition. Credit must be given, more to the chemical and pharmaceutical industries
around the world than to academic institutions, for the fact that an overlap has taken place and that a
solid framework of methods in computer chemistry is present today which, although still
evolving, successfully operates on a broad spectrum of real problems.
The current architecture of computer chemistry can be represented by Figure 1. Man
still rules from the top; at the center is the object of interest, the molecule, around which
the various disciplines are positioned, and at the bottom is the computer. All elements of
Figure 1 are mutually connected by a conceptual or a real flow of data, by an exchange of
information, by some operational necessity originating from, or by a service action addressed
to, another of the linked elements of the computer chemistry framework.
This chapter has attempted to offer a general introduction to the subject matter of this
book, beginning with the mysterious combination of words which forms its title. In the
following chapters, the previously mentioned subfields of computer chemistry will be discussed
in detail. However, as the laboratory of a computer chemist is a computer and his
equipment consists of paper, pencil, and diskettes, a homeopathic amount of knowledge
about computer science will be introduced first for readers who are not yet very familiar
with computer configurations. Those of you who are comfortable with computer terminology
and concepts should proceed to Chapter 3.
FIGURE 1. The conceptual framework of computer chemistry. Its main areas of research
are positioned around the object of all chemical investigations, the molecule, and are mutually
interconnected. Man rules from the top and is supported by the computer (still his subordinate).
Chapter 2
I. HARDWARE
For the computer chemist, computer hardware represents what traditional laboratory
equipment represents for the experimentalist: the physical means (and their location) for
serving scientific research. Although someone wishing to become a professional computer
chemist does not need a knowledge of computers comparable to that of a full-time hardware
specialist, he will certainly produce computer chemistry programs of higher final quality
if he knows in general terms what can be demanded from modern hardware.
A computer differs from a calculator in self-controlled linking and processing of com
putational steps, which are contained in one or more programs, called software. A calculator
needs a human at all stages of computation. Computers can be divided into two main families:
analog computers and digital computers. Analog computers are machines fed with continuous
data, like changing electric currents or other physical time-dependent variables (temperature,
light, pressure, etc.) which are emulated internally in analogy to the real physical time-
dependent phenomenon. Any input signal to an analog computer can be manipulated and
rephrased directly in various fashions by intervention of electronic components of the
computer related to a specific mathematical function or operator (multiplication, addition,
integration, etc.). Since such a type of computer does not contain logic circuits, as digital
computers do, programming is done not at the software level, but through the assembly of
electronic parts in a desired sequence. The output is normally some transformed electrical
signal whose amplitude can be visualized in several ways, e.g., on the familiar oscilloscope
display or on scaled charts. Their use is found primarily in process control: chemical and
physical monitoring sensors emit instructional signals to the controlled machine, governing
its proper functioning. When improper operating conditions are detected, appropriate
counteractions are issued, or an alarm is raised if safety limits are exceeded.
Analog computers operate in real time and are devoted to the study of dynamic, time-
evolving continuous systems. They have no memory and are thus completely neglected in
computer chemistry.
Large memory capacity and processing logic are fundamental requirements in scientific
computing and simulation of complex systems. They are provided by digital computers,
which process discrete electrical impulses encoding numbers, symbols, and operational
instructions. The discrete states of these impulses can be represented simply by two states:
(1) CURRENT and NO CURRENT, (2) YES and NO, or (3) 1 and 0. The latter representation
is a binary representation. Every number and symbol can be transformed into a binary
equivalent by binary (base 2) arithmetic. The majority of digital computers are binary
machines. The following section will deal specifically with digital machines.
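To make the two-state idea concrete, the following lines (in the same Python notation used for the other sketches in this rewrite) convert a number and a keyboard symbol into their binary equivalents; the eight-bit width chosen for the symbol is an assumption for the example.

# Minimal sketch: numbers and symbols reduced to two-state (1/0) patterns.
number = 202
symbol = "Q"

# A number is rewritten by base-2 arithmetic...
print(format(number, "b"))         # prints 11001010

# ...and a keyboard symbol is first mapped to a numeric code,
# then that code is written in binary (8 bits assumed here).
print(format(ord(symbol), "08b"))  # prints 01010001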
A. ARCHITECTURE OF A COMPUTER
A digital computer is defined as an electronic multiunit system consisting of a central
processor unit (CPU), an input unit, and an output unit. The central processor consists of
a core memory, a control unit, and a mathematical/logical processor.
                 MEMORY
                   |
INPUT ----> CONTROL UNIT ----> OUTPUT
                   |
             MATHEMATICAL
              PROCESSOR
The input and output (I/O) units allow communication between the external world
(human, robot, any data storage device) and the central processor. Input devices can be
magnetic tapes, disks, keyboards (with visual control through a video terminal), and sensors.
In the romantic pioneer era of computers, I/O devices also worked with punched cards and
paper tapes. The atmosphere inside a user's room, filled with the “ack-ack” noise of
rattling card readers and punchers, was more mechanical than electronic. The output unit
consists of printers, video terminals, and plotters for direct, human-readable output, whereas
fast magnetic or optical alternatives like disks, drums, tapes, and laser-scanned disks allow
permanent digitized mass storage of data.
The mathematical/logical unit must be able to manipulate data under the constant
supervision of the control unit. Temporarily generated data are stored in accumulators, which
are the heart of this unit. In addition, the unit contains the logic circuitry responsible for
performing the arithmetical and logical operations required by the running programs. Within
the core memory, each program instruction is memorized in a codified, machine-dependent
numerical form, including all ancillary data. At any time during the data processing, the
control unit has direct and fast access to the data contained in the core memory. However,
for large calculations, the size of the core memory is in some cases not sufficient to allocate
the bulk of incoming data; a memory extension is therefore simulated by modern computers
through virtual memory expansion. This technique consists of a dynamic, computer-controlled
partitioning and allocation of the requested total amount of memory over core memory
and fast-access magnetic disks. Thus, programs of a size much larger than the theoretical
core memory limit can be processed without forcing the user to cut his program code into
subsections small enough to fit the core memory storage boundary. The decision to allocate
portions of running programs on virtual memory areas is taken by the control unit, which
directs and keeps track of every action inside a computer. The unit reads the current instruction
to be addressed to the core memory, interpreting and coordinating all implied operations
within the CPU or directed to specific I/O units.
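The following toy model mimics, in a deliberately simplified way, the dynamic partitioning just described: a fixed number of core "frames" is backed by disk storage, and the oldest resident page is swapped out whenever a newly requested page does not fit. The capacity, the replacement rule, and all names are illustrative assumptions, not the mechanism of any actual operating system.

# Toy model of virtual memory: a small core backed by disk storage,
# with the oldest resident page swapped out when the core is full.
from collections import OrderedDict

CORE_FRAMES = 3          # assumed core capacity, in pages
core = OrderedDict()     # pages currently held in core memory
disk = {}                # pages swapped out to fast-access disk

def touch_page(page_id):
    """Ensure page_id is in core, swapping another page out if necessary."""
    if page_id in core:
        return "already in core"
    if len(core) >= CORE_FRAMES:
        victim, data = core.popitem(last=False)   # oldest page leaves core...
        disk[victim] = data                       # ...and is written to disk
    core[page_id] = disk.pop(page_id, "data of " + page_id)
    return "loaded into core"

# A program larger than the core touches five pages; only three fit at a time.
for p in ["P1", "P2", "P3", "P4", "P1"]:
    print(p, "->", touch_page(p), "| core now:", list(core))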
Table. Year, number of elementary components, volume (m³), and price ($).
Transistors exhibited high reliability and low energy consumption. A trend to
miniaturize computers began at that time and still continues today. The current price collapse
of hardware components makes the purchase of a powerful home computer, a personal
computer (PC), very attractive. The integration of many transistors and of other electronic
elements such as resistors was soon postulated, but for its practical realization more
sophisticated silicon purification and doping techniques had to be developed. Doping means
the controlled introduction of trace amounts of foreign atoms into the silicon lattice in order to
obtain semiconducting behavior. The intricate design of the integrated circuits, i.e.,
of single quadrangular silicon plates of about 5 mm side length having on their surfaces
thousands of transistors, is the result of a repeated overlay of stencils, of masks reproducing
one particular scheme of the total circuit. The design is done first on a relatively large scale
and then is reduced photographically. Photolithography and other miniaturization techniques
make it possible to print many integrated circuits on small slices of single-crystal silicon.
They are subdivided into minute plates called chips, each carrying one integrated circuit.
The integrated circuit is the strategic elementary unit of modern microelectronics and
computer technology. The number of components mounted on a single silicon plate has increased
exponentially. In 1965, about 10 transistors could be mounted; after 1980, up to 10,000
transistors became the rule.
If one includes resistors, diodes, capacitors, and other parts, over 100,000 elements
are packed on a single chip. The classification of integrated circuits depends on the number
of logic gates, i.e., of functions that can be performed: small-scale integrated circuits (SSI,
ca. 10 components), medium-scale integrated circuits (MSI, from 64 to 1,024 components),
large-scale integrated circuits (LSI, from 1,024 to 262,144 components), and, recently, very
large-scale integrated circuits (VLSI, over 262,144 components).
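A trivial sketch of this classification, using the component counts quoted above as thresholds (the exact boundaries vary from source to source):

# Classify an integrated circuit by its number of logic components,
# using the approximate thresholds quoted in the text.
def ic_class(components):
    if components < 64:
        return "SSI"       # small-scale integration, ca. 10 components
    if components < 1024:
        return "MSI"       # medium-scale integration
    if components < 262144:
        return "LSI"       # large-scale integration
    return "VLSI"          # very large-scale integration

for n in (10, 500, 100000, 1000000):
    print(n, "->", ic_class(n))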
Chips storing data as 1s or 0s are used to construct the core memory and the logic circuitry
of the CPU of a computer. This last application belongs to microprocessor technology.
Advanced microprocessors contain all the fundamental parts of a computer CPU and can be
programmed in hard-wired form for a broad spectrum of purposes. The specific architecture
of a microprocessor determines its speed and the overall system efficiency. Microprocessors
are classified according to the width of the data words they process. Within one full work
cycle, a microprocessor based on an 8-bit architecture can handle integers spanning 256
distinct values (0 through 255, the range obtainable in binary arithmetic with 8 binary
digits); in the same period of time, a 16-bit processor can handle integers spanning 65,536
values. However, the number of necessary components increases, too (ca. 100,000).
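The relation between word width and the number of representable unsigned integers can be checked directly; a minimal verification of the figures quoted above:

# Number of distinct unsigned integers representable in an n-bit word.
for bits in (8, 16, 32):
    top = 2 ** bits
    print(bits, "bits:", top, "values, from 0 to", top - 1)
# 8 bits: 256 values, from 0 to 255
# 16 bits: 65536 values, from 0 to 65535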
Eight bits in a row form what is called a byte. One byte is enough to translate all symbols
of a standard keyboard into a binary machine code. High-performance PCs work with 16-
bit microprocessors. In some models, a mathematical coprocessor is linked to the CPU to
increase calculation speed. Large computers (mainframe computers) have a 32-bit architecture,
and the CPUs of some advanced floating-point systems (for example, the FPS-164 and
FPS-264) reach the 64-bit level for multiplication and addition operations (vectorial machines,
array processors).
The advantage of processors designed on a larger bit basis is rooted in their higher speed
in managing a fixed amount of data or, conversely, in processing more data in a given
reference time period. They also permit a more compact program structure, with fewer lines
of code, owing to their own inherent instruction patterns.
A memory cell can be engaged by the processor in two elementary ways:
1. The processor can memorize data inside a memory cell, deleting its former content.
2. The processor can retrieve data from a memory cell, leaving its original content
unaltered. This action generates a copy of the cell content within the processor.
1. Memorizing procedure
A. The processor sets an address number in the MAR, puts the data in the MDR, and
switches the SFS to “00”.
B. The memory removes the data from the MDR, transposing them into a cell, and
switches the SFS to “10”.
2. Retrieval procedure
A. The processor sets an address number in the MAR and switches the SFS to “01”.
B. The memory makes a copy of the content of the addressed cell, sending the copy
to the MDR while the SFS is switched to “10”.
C. The processor retrieves the data from the MDR.
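The handshake listed above can be paraphrased in a few lines of code. Everything here is a didactic assumption: MAR and MDR are taken to be the memory address and memory data registers, and SFS a two-position status flag, as the listed steps suggest; no real machine is being modeled.

# Didactic paraphrase of the memorizing/retrieval handshake listed above.
memory = {}      # the addressed cells of the core memory
MAR = None       # assumed: memory address register
MDR = None       # assumed: memory data register
SFS = "10"       # assumed: status flag, "10" meaning "memory ready"

def memorize(address, data):
    """The processor writes data into the cell selected by the address."""
    global MAR, MDR, SFS
    MAR, MDR, SFS = address, data, "00"   # processor side (step 1.A)
    memory[MAR] = MDR                     # memory side    (step 1.B)
    MDR, SFS = None, "10"

def retrieve(address):
    """The processor reads a copy of the addressed cell, leaving it unaltered."""
    global MAR, MDR, SFS
    MAR, SFS = address, "01"              # processor side (step 2.A)
    MDR, SFS = memory[MAR], "10"          # memory side    (step 2.B)
    return MDR                            # processor side (step 2.C)

memorize(42, "instruction word")
print(retrieve(42), "| cell still holds:", memory[42])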
The core memory is the working area inside a computer which contains the programs
and the data; it must supply the processor with a flow of instructions. The processor (the
CPU), which comprises the control unit and the mathematical/logical unit, has the task of
processing the instructions. Each instruction is split into four parts: the first specifies what
kind of operation has to be performed, the second and third give the memory addresses
whose contents are processed, and the last provides the address where the result is finally
stored.
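Read this way, every instruction resembles a small record with four fields. The sketch below, with its invented operation name and addresses, merely illustrates that description; it is not the instruction format of any real machine.

# Illustrative four-field instruction: an operation, two source addresses,
# and the address where the result is finally stored.
memory = {100: 7, 101: 5, 102: None}

instruction = ("ADD", 100, 101, 102)   # (operation, source 1, source 2, destination)

op, src1, src2, dest = instruction
if op == "ADD":
    memory[dest] = memory[src1] + memory[src2]

print(memory[102])   # prints 12, now stored at the destination address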