
Chapter I

Enterprise Risk Management: An Introduction


"A business has to try to minimise risks. But if its behaviour is governed by the attempt to escape risk, it will end up by taking the greatest and least rational risk of all: the risk of doing nothing."
- Peter Drucker[1]
On September 11, 2001, terrorists crashed two hijacked airplanes into the World Trade Centre (WTC) in New York and a third into the Pentagon building in Washington. The incident sent shock waves through the US and indeed the rest of the world. The unprecedented terrorist attack on the soil of the world's most powerful nation was an event which few had anticipated. The tragedy has seriously affected the US economy, which was already teetering on the brink of recession. Consumer confidence has been severely eroded. Airlines have suffered a sharp decline in profitability due to reduced demand and increased spending on security measures. Some smaller airlines are on the verge of bankruptcy. Even the bigger US and European airlines seem to be in trouble. The insurance industry has been hit hard: it has to pay out about $30 billion. Until the WTC terrorist strike, Hurricane Andrew, which hit South Florida in August 1992, was the biggest liability ($16 billion) the US insurance industry had faced. Stocks of insurance companies have already started crashing. In the currency markets, many traders have been hit. With the Federal Reserve cutting interest rates three times between September 11 and November, the calculations of treasurers have also been upset. Then there are the unquantifiable losses. Thousands of talented employees who were in the twin towers at the time of the strike lost their lives. Companies will struggle to find replacements for them.
Quite clearly, companies could not have done much to prepare for the WTC event,
except probably take insurance cover, which many seem to have done. Fortunately for
companies, not all risks are so unpredictable or unexpected. By closely monitoring the
environment, companies can anticipate risks associated with changing technology,
changing customer tastes, changing interest and currency rates, changing competitive
conditions, etc. This book provides a conceptual framework for dealing with some of
these risks in a systematic and coordinated way across an organization. Hence, the name
Enterprise Risk Management.

Understanding risk management


People interpret the word risk in several ways. According to the world famous risk management guru, Harold Skipper[2]: "No universally accepted definition of risk exists. Risk is commonly used to refer to insured items, to causes of loss and to the chance of loss. Statisticians and economists associate risk with variability. A situation is risky if a range of outcomes exists and the actual outcome is not known in advance."
But there is no doubt that risk management has become a favorite topic of discussion these days. Bankruptcies and huge losses have re-emphasised the importance of identifying and managing risks effectively. Companies such as Procter & Gamble, investment banks such as Barings and government organisations like Orange County have all burnt their fingers due to faulty risk management practices. Closer home, we have seen many Non Banking Finance Companies (NBFCs) such as CRB winding up after taking risks totally inconsistent with their resources or capabilities. Quite clearly, companies need to develop and apply an integrated risk management framework that can inspire the confidence of shareholders by stabilising earnings and lowering the cost of capital.

[1] Managing for Results.
[2] International Risk and Insurance: An Environmental-Managerial Approach, Irwin/McGraw-Hill, p. 6.
Organisations face various types of risk. Unfortunately, in many organisations, much of the focus of risk management has been on fluctuations in financial parameters such as interest rates and exchange rates. As Butterworth[3] puts it: "A strong appreciation of finance and accounting is useful, since all risk effects will have an impact on the profit and loss account and the balance sheet. But this focus on finance as an important core skill may have been overemphasized." Just as the field of knowledge management has been dominated by software companies, risk management has been strongly associated with treasury, forex and portfolio management. The risk management agenda has been hijacked by investment bankers, corporate treasurers and insurance companies, and dominated by the use of financial derivatives and insurance cover. This is clearly not the way it should be. As Bernstein[4] puts it, "Risk management guides us over a vast range of decision-making, from allocating wealth to safeguarding public health, from waging war to planning a family, from paying insurance premiums to wearing a seat belt, from planting corn to marketing cornflakes."
Risk is all about vulnerability and taking steps to reduce it. Several factors contribute to this vulnerability, not just fluctuations in financial parameters. As the Economist[5] has put it: "Top managers often fail to understand properly the firm's sensitivity to different types of risk. This is because the technology for identifying risk exposures in non-financial firms is as yet fairly primitive, but more fundamentally because managers and boards too often regard risk management as a matter for financial experts in the corporate treasury department rather than as an integral part of corporate strategy."
Many organisations make the mistake of dealing with risk in a piecemeal fashion.
Within the same company, the finance, treasury, human resources and legal departments
cover risks independently. An organisation-wide view of risk management can greatly
improve efficiencies and generate synergies. That is why many companies are taking a
serious look at Enterprise Risk Management (ERM), which addresses some fundamental
questions:
- What are the various risks faced by the company?
- What is the magnitude of each of these risks?
- What is the frequency of each of these risks?
- What is the relationship between the different risks?
- How can the risks be managed to maximise shareholders' wealth?
[3] Financial Times Mastering Risk, Volume I.
[4] Bernstein's book Against the Gods is a must-read for anyone interested in understanding the evolution of risk management techniques. The quotes of Bernstein in this chapter are drawn from this book, unless otherwise mentioned.
[5] The Economist, February 10, 1996.

Prudent risk management ensures that the firm's cash flows are healthy, so that its immediate obligations and future investment needs are both adequately taken care of. Firms typically run into cash flow problems because they fail to anticipate or handle risks efficiently. These risks include huge R&D investments which do not pay off, excessive premiums paid for acquisitions, costly litigation (especially class action lawsuits) by aggrieved stakeholders, excessive dependence on a single or a few customers, and vulnerability to interest rate, stock index and exchange rate movements. In 1993, Metallgesellschaft, which tried to cover the risk associated with its long-term contracts through oil futures, ended up losing a huge amount. In the same year, Philip Morris had to cut the price of Marlboro sharply due to unexpectedly stiff competition from cheaper private labels. Nick Leeson, the rogue trader, drove Barings to bankruptcy when Japan's Nikkei Index collapsed in early 1995. In 1997, the chemicals giant Hoechst incurred substantial expenses due to a product recall. The star-studded team at the hedge fund Long-Term Capital Management could do little as unexpected interest rate and currency movements brought the fund to the edge of bankruptcy in 1998. Coca-Cola faced a big crisis when its bottles in Europe were found to be contaminated and had to be recalled in the middle of 1999.

Exploding some myths


Risk Management is not something new. One of the earliest examples of risk management
appears in the Old Testament of the Bible. An Egyptian Pharaoh had a dream which
Joseph interpreted as seven years of plenty to be followed by seven years of famine. To
deal with this risk, the Pharaoh purchased and stored large quantities of corn during the
good times. As a result, Egypt prospered during the famine.
The modern era of risk management probably goes back to the Hindu-Arabic numbering system, which reached the West about 800 years ago. Without numbers, it would have been simply impossible to quantify uncertainty. Mathematics alone was, however, not sufficient. What was needed was a change in mindset. This happened during the Renaissance, when long held beliefs were challenged and scientific enquiry was encouraged. As theories of probability, sampling and statistical inference evolved, the risk management process became more scientific. Many of the risk management tools used by traders today originated during the period 1654-1760. These ideas were later supplemented by advances such as the discovery of regression to the mean by Francis Galton in 1885 and the concept of portfolio diversification by Harry Markowitz in 1952.
Risk can neither be avoided nor eliminated completely. Indeed, without taking risk, no business can grow. And if there were no risks, managers would not be needed. The Pharaoh in the earlier example was obviously taking a risk, in the sense that his investment would have been unproductive had there been no famine. As Dan Borge, the former managing director of Bankers Trust, puts it[6]: "Many people think that the goal of risk management is to eliminate risk - to be as cautious as possible. Not so. The goal of risk management is to achieve the best possible balance of opportunity and risk. Sometimes, achieving this balance means exposing yourself to new risks in order to take advantage of attractive opportunities."

[6] Read Borge's book The Book of Risk, written in a very simple, narrative style.

Risk management is all about making choices and trade-offs. These choices and trade-offs are closely related to a company's assumptions about the external environment. The word risk has its origins in the Italian word risicare, which means "to dare". So, risk is about making choices rather than waiting passively for events to unfold. Consider two leading global pharmaceutical companies, Merck and Pfizer. Merck is betting on a scenario in which HMOs[7], rather than doctors, will dominate the drug-buying process. Hence its acquisition of the drug distribution company Medco. On the other hand, Pfizer has invested heavily in its sales force on the assumption that doctors will continue to play an important role. Each company is working out its strategies on the basis of an assumption and, consequently, taking a risk. Similarly, a company which bets on a new technology could be diverting a lot of resources from its existing business. If the new technology fails to take off, it may become a severe drain on the company's resources. But if the firm decides not to invest in the new technology and it does prove successful, the very existence of the company is threatened. So, not taking a risk may turn out to be a risky strategy in many cases.
All risks cannot be attributed to external factors. Many of the risks which
organizations assume have more to do with their own strategies, internal processes,
systems and culture than any external developments. For example, the collapse of
Barings Bank had more to do with poor management control systems than unfavourable
developments in the external environment.

Uncertainty and risk


From time immemorial, human beings have attempted to master uncertainty. While it is impossible to anticipate and deal with uncertainty in a perfect manner, man has succeeded over the years in developing various tools to keep uncertainty within reasonable limits. As Bernstein puts it, "The revolutionary idea that defines the boundary between modern times and the past is the mastery of risk… Until human beings discovered a way across that boundary, the future was a mirror of the past or the murky domain of oracles and soothsayers who held a monopoly over knowledge of anticipated events."
Organisations face various types of uncertainty. The challenge they face is to understand uncertainty, quantify it, weigh the consequences of different actions and then take appropriate decisions. Milliken[8] has classified uncertainty into three broad categories:

A. State Uncertainty: This refers to the unpredictability of the environment. Causes of state uncertainty are:
   a) Volatility in the environment
   b) Complexity in the environment
   c) Heterogeneity in the environment

B. Effect Uncertainty: This is the uncertainty about the impact of external events on the organization.

C. Response Uncertainty: This refers to the unpredictability of the organization's responses to external developments.

[7] An HMO (Health Maintenance Organization) is appointed by organizations to manage the health care needs of employees. For a more detailed understanding, see the case on Merck-Medco in Chapter II.
[8] Academy of Management Review, 1987, Volume 12.

Williamson[9] has drawn a distinction among environmental/external uncertainty, organisational/internal uncertainty and strategic uncertainty. Environmental uncertainty arises due to random acts of nature and unpredictable changes in consumer preferences. Organisational uncertainty refers to the lack of timely communication among decision-makers, each of whom has only incomplete information. This leads to lack of coordination and, consequently, poor decisions. Strategic uncertainty is created by misrepresentation, non-disclosure and distortion of information, and results in uncertainty for firms in their relations with suppliers, customers and competitors.
Peter Drucker, the venerable management guru, has identified four types of risk[10] at a macro level:
- The risk that is built into the very nature of the business and which cannot be avoided.
- The risk one can afford to take.
- The risk one cannot afford to take.
- The risk one cannot afford not to take.
The dividing line between risk and uncertainty is thin. Some scholars use the word risk to describe situations where it is possible to construct probability distributions[11] for different outcomes. They prefer the word uncertainty for situations where such distributions cannot be constructed. Others argue that this distinction is not really needed. I agree with them. More than semantics, what is important is to collect more information, analyse it carefully and deal with uncertainties more efficiently.
Figure I
Prioritising Risks

                              Likelihood of Risk
  Impact of Risk      Low                          High
  High                Create contingency plans     Take immediate action
  Low                 Conduct periodic review      Conduct ongoing review
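The matrix above maps each combination of impact and likelihood to a response. A minimal sketch in Python of how it could be encoded; the dictionary, function name and high/low ratings are my illustration, not part of the original text:

```python
# Hypothetical encoding of the Figure I risk-prioritisation matrix.
# Each (impact, likelihood) pair maps to the response in the table above.
ACTIONS = {
    ("high", "low"): "Create contingency plans",
    ("high", "high"): "Take immediate action",
    ("low", "low"): "Conduct periodic review",
    ("low", "high"): "Conduct ongoing review",
}

def prioritise(impact: str, likelihood: str) -> str:
    """Return the suggested response for a risk rated high/low on each axis."""
    return ACTIONS[(impact.lower(), likelihood.lower())]

print(prioritise("High", "Low"))  # -> Create contingency plans
```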

When we think of risk management, we immediately think of how to cut losses or protect ourselves against vulnerability. But superior risk management processes also hold tremendous potential for generating sustainable competitive advantages in the long run. So, the dividing line between risk management and value creation is much thinner than we imagine. Indeed, the ultimate objective of Enterprise Risk Management is to maximise shareholders' wealth.

[9] Handbook of Industrial Organization, Volume I, 1989.
[10] Managing for Results.
[11] A probability distribution can be defined as a list of the expected outcomes, with the probability of each outcome.
Table I
The Enterprise Risk Management process

- Identify the risk.
- Quantify the risk to the extent possible.
- Prevent or avoid the risk wherever possible.
- Take on new risks if they are associated with attractive opportunities.
- Transfer the risk if holding it is not consistent with the company's business strategy.
- Diversify the risk by tapping a portfolio of opportunities.
- Assess the risk intelligently and decide whether it is more important to preserve the possibility of extremely good outcomes or to reduce the possibility of very bad outcomes.
- Hedge the risk by acquiring a new risk that exactly offsets the unwanted risk.
- Leverage the risk and magnify the outcomes, both bad and good.
- Insure the risk.

Dealing with risk


For any company, Enterprise Risk Management is closely linked to business strategy. The purpose of this book is to examine the link between business strategy and risk management. Every company needs to grow and generate adequate profits to survive in the long run. Unprofitable or stagnating companies are doomed to failure. So, investments, which are needed to stay ahead of competitors, cannot be avoided. And any investment does carry some amount of risk. Risk management aims to generate sufficient cash flows to keep the company going even if some of the investments run into rough weather. It also ensures that the company holds only those risks it is comfortable with and transfers the remaining risks to other parties. A systematic risk management process ensures that people are encouraged and trained to take calculated risks. By understanding and controlling risk, a firm can take better decisions about pursuing new opportunities and withdrawing from risky areas. As Butterworth[12] puts it: "Good risk awareness and management will give organizations the confidence to take on new ventures, develop new products and expand abroad. Indeed, risk assessment may well suggest that doing nothing might be the most risky strategy of all."
How does a company decide which risks to retain in-house and which to transfer? In general, retaining risks makes sense when the cost of insuring the risk is out of proportion to the probability and impact of any damage. So, the first step for managers is to understand which risks they are comfortable with and which they are not. Often, companies are not comfortable with risks caused by external factors. This is probably why financial risk management, which deals with volatility in interest and exchange rates, has become popular in the past few decades. Companies also tend to transfer those risks which are unmanageable. A good example is earthquakes, where an insurance cover often makes sense. Managers often prefer to retain risks closely connected to their core competencies. Thus, software companies would, in normal circumstances, not transfer technology risk. These are only general guidelines. Ultimately, whether to retain the risk or to transfer it should be decided on a case-by-case basis.

[12] Financial Times Mastering Risk, Volume I.
Enterprise Risk Management at Infosys Technologies:
An interview with Nandan M Nilekani, CEO
Infosys Technologies is one of India's most admired companies. The company has been a trendsetter in risk management. CEO Nandan Nilekani explains how Infosys handles risk.
On the mechanisms to manage risk at a strategic level
The following mechanisms need to be in place to manage risks at the strategic level:
(i) The Board of Directors of the company needs to take ultimate bottom-line responsibility for Risk Management, thus ensuring that Risk Management is part of the charter for the company.
(ii) The business portfolio of a company needs to be diverse so that vagaries in one segment do not affect the company's business performance adversely. This is done by putting in place prudential norms restricting business exposure, especially in business segments where there is high volatility.
(iii) Management Control Systems that ensure timely aggregation of inputs from the external and internal environment, enabling quick top management decision making on Risk Management, are required. These mechanisms should cascade to the level of line managers so that the company can implement these decisions quickly.
On the ideal business model
There is no "one size fits all" business model. The specific aspects of the derisking model for each company depend on the nature of the business the company is in, its capability in different areas, etc.
The Infosys business model rests on four pillars: Predictability, Sustainability, Profitability and De-risking (the PSPD model). This model helps management evaluate risk-return trade-offs and make effective strategic choices. This leads to a predictable and sustainable revenue stream for the company. Infosys' pioneering global delivery model has helped the company to consistently be among the most profitable IT services companies in the world. Derisking provides the company with the strength and stability to effectively handle variations in the business environment.
On enterprise risk management in India
In the past, the software industry in India has grown exponentially. There are risks inherent in this kind of growth, and managing them requires strong risk management practices. Since the software sector in India has had to compete with global companies, its exposure to global best practices is significant. The visionary managements of some software companies in India have implemented these global best practices in their companies. One area in which global best practices have been implemented is enterprise-wide risk management.

On the short-term focus of risk management
Any successful derisking model should be balanced keeping in mind long-term as well as short-term, financial as well as non-financial aspects. So focusing only on short-term financial impact can lead to sub-optimal solutions, which may be counter-productive.
On globalization and the increase in risks
Globalization means that the war for talent no longer respects geographical boundaries. Hence, the risk of attrition of highly talented employees is an important factor that companies need to manage. Further, companies are faced with the challenge of ensuring that their knowledge base, technology and processes are robust enough to meet changing global market requirements. Risks associated with the international political environment also have a bearing on the company's performance.

On the Infosys model of derisking
We ensure that we do not become overly dependent on any single segment of our business. For example, we had put a cap of 25% on our Y2K revenues. We try to diversify our risk by operating in multiple technologies and multiple market segments. We make sure that no one customer provides more than 10% of our business. We ensure that we operate in a variety of vertical domains. The whole idea is that one should not become overly dependent on any one segment and that we broad-base our operations so as to de-risk the company.
Expansion into under-penetrated markets is part of the derisking strategy at Infosys. Infosys has already entered markets in Europe and Asia-Pacific by opening marketing offices in Paris, Frankfurt, Brussels, Stockholm, Tokyo, Hong Kong, Sharjah, Sydney and Melbourne. Our aim is to have multiple development centers across the globe to respond instantly to our customers' needs and to take advantage of the talent pools available in cost-competitive economies. This strategy also reduces the risk to our operations due to changes in geo-political equations.
Source: Chartered Financial Analyst, July 2000, Reprinted with permission.
Figure II
Sources of typical business risks

- External, non-recurrent risks: consumer boycotts, technology, patent infringement
- External, recurrent risks: demand/supply cyclicality, competition, supply chain
- Internal risks: failed/delayed product launches, product recall/default, plant safety
- Financial risks: interest rate fluctuations, exchange rate fluctuations, stock market fluctuations

Types of risk
What are the various risks a company can face? The Economist Intelligence Unit divides risks into four broad categories:
- Hazard risk is related to natural hazards, accidents, fire, etc. that can be insured.
- Financial risk has to do with volatility in interest rates and exchange rates, defaults on loans, asset-liability mismatches, etc.
- Operational risk is associated with systems, processes and people, and deals with issues such as succession planning, human resources, information technology, control systems and compliance with regulations.
- Strategic risk stems from an inability to adjust to changes in the environment, such as changes in customer priorities, competitive conditions and geopolitical developments.
The method of classifying risks is not as important as understanding and analysing them. Indeed, the very nature of uncertainty implies that it is difficult to identify all risks, let alone classify them. Moreover, the objective of this book is not to provide prescriptive solutions but to encourage and motivate companies to think more deeply, clearly and consistently about the risks they face. Each company should carefully examine its value chain and come up with its own way of categorising the uncertainties associated with its important value adding activities. Then, it can quantify these uncertainties to the extent possible and decide which risks to hold and which to transfer. In this book, we have categorised risks as follows:
- Capacity expansion risks
- Vertical integration risks
- Diversification risks
- Technology risks
- Mergers & Acquisitions risks
- Environmental risks
- Political risks
- Ethical, Legal & Reputation risks
- Financial risks
- Marketing risks
- Human Resources risks

Outline of the book


A brief description of some of the important risks covered in this book follows:
Capacity expansion involves risks. If demand does not rise with capacity, the
company may find itself burdened with overheads. At the same time, if capacity is not
built in time, competitors may move ahead and grab market share. So, capacity expansion
decisions have to be made carefully.
To what extent must a company integrate vertically? This is a crucial strategic
issue for most companies. While vertical integration reduces uncertainty, it also makes
management tasks more complicated. Outsourcing increases flexibility, but if
relationships with external partners are not managed carefully, coordination becomes a
big problem.
Excessive dependence on a single or a few products, or a single or a few regions, for generating revenues results in risk. A diversified product portfolio or geographical base can stabilise revenues and profits. At the same time, diversification also makes management tasks more complex. So, understanding the risks associated with diversification is extremely important.

[Figure: Enterprise Risk Management - A Holistic Perspective. Enterprise Risk Management sits at the centre, linked to capacity expansion, vertical integration, diversification, technology, M&A, environmental, political, ethical/legal/reputation, financial, marketing and human resources risks.]

Technology risk has become important in this age of rapid innovation. Companies
which do not have a strategy to cope with changing technology will find themselves at a
severe disadvantage. The key decision in technology risk management is whether to
move early or to wait and see the impact of a new technology as it emerges.
Many companies today look at mergers and acquisitions as a way of generating
fast growth by gaining quick access to resources such as people, products, technology
and facilities. But, mergers and acquisitions have to be planned and executed carefully, to
ensure that the integration of the pre-merger entities takes place smoothly and the
projected synergies are realised. Otherwise, they may prove to be a severe drain on the
existing resources and even ruin a company in some cases.
Another type of risk is environmental risk. Companies which do not take steps to protect the natural environment face the risk of resistance and hostility from society. In some cases, poor environmental performance may even threaten the very existence of the company, as illustrated by the example of Union Carbide in Bhopal. In other cases, such as the Exxon Valdez oil spill in Alaska, the reputation of the company can be severely damaged.
Political risks also need to be managed carefully. Governments may suddenly change their policies or may interfere with a company's operations. Understanding the nature of political instability and anticipating problems is important, especially for multinational corporations operating in emerging markets. Dabhol Power Corporation is a good example.
In recent times, legal risks have also become important. Product liability suits and class action suits by employees or shareholders can pose grave problems. Similarly, anti-trust proceedings by the government can take a company's attention away from its core business. A significant proportion of senior management's time at Microsoft has been consumed by the anti-trust suit, which is only now reaching the settlement stage.
In the modern business world, companies are expected to maintain high standards of ethics and corporate governance. Unethical practices and low standards of corporate governance can severely erode not only the reputation of a company but also its market capitalisation. A good example of a company which has seen a severe decline in its business owing to unethical and illegal disclosure practices is the famous insurance market, Lloyd's of London. In India, Shaw Wallace has faced similar problems.
The most commonly discussed form of risk is financial risk. When interest or
foreign exchange rates fluctuate, there is an impact on cash flows and profits. Risk also
increases as the debt component in the capital structure increases. This is because debt
involves mandatory cash outflows, while equity holders can be paid dividends at the
discretion of the company. Today, sophisticated hedging tools like derivatives are
available to manage financial risk.
There are various marketing risks which companies have to deal with carefully. A
careful understanding of the marketing activities and the associated risks is extremely
important. Branding, pricing, distribution and product development have all become very
complicated in the contemporary business environment. Unless the marketing mix is
carefully managed, customers may switch over to competitors.
Risks associated with human resources too need to be managed effectively. Succession planning is probably the most strategic of these risks. Even such well-known companies as Coca-Cola and Procter & Gamble have struggled in the recent past due to ineffective succession planning. Other challenges in human resources management include minimising employee turnover and shaping a corporate culture that aligns individual aspirations with the company's goals.

Concluding Notes
In their seminal paper, "The Balanced Scorecard - Measures that Drive Performance",[13] Robert Kaplan and David Norton have emphasised the need for evaluating the performance of an organisation from four different angles: the customer perspective, internal perspective, innovation and learning perspective, and shareholder perspective. The Balanced Scorecard considers financial measures that represent the outcome of past actions. At the same time, it incorporates operational measures relating to customer satisfaction, internal processes and attempts at innovation and improvement, all of which drive future financial performance. Similarly, when we talk of risk management, the various business risks which organisations face must be considered along with the financial risks. Ultimately, financial risks are the outcome of business strategy. The role of financial risk management is to minimise uncertainty regarding cash flows; but the very source of these cash flows is the type of business which the company runs and the type of strategic decisions it makes.
In today's competitive and complex environment, events are unfolding with a degree of uncertainty and speed never seen before. The magnitude and nature of risks faced by companies are constantly changing. Enterprise Risk Management (ERM) has become more critical than ever before. It is all about changing the way decisions are made, by systematically collecting and processing information. ERM is not a purely defensive tool, as many believe, and does not imply excessive caution. Rather, it is about creating conditions which encourage managers to achieve the right balance between minimising risks and exploiting new opportunities. Indeed, the ultimate aim of ERM is to make available a steady stream of cash flows that can be utilised to maximise shareholders' wealth.
Each chapter in this book discusses a particular type of risk, closely examining the
key issues involved. Discussing management problems is of little use, unless an attempt
is made to develop practical solutions. So, wherever possible, live examples have been
provided to illustrate the concepts. A sufficient number of box items and small cases are
provided in each chapter. Ultimately, there is no better way to understand risk
management than by learning from the successful and not-so-successful experiences of
various companies. Some of the useful models developed by eminent scholars and
industry experts and the best practices of organisations have also been included.
In this book, we shall from time to time look at some of the broader philosophical issues that risk management raises. We can use past data to understand and draw inferences, but to what extent can these inferences be applied to the future? Do numbers at all make sense in an uncertain world? To what extent should we depend on intuition to deal with risks which are difficult to quantify? Is risk management an art or a science?
Many feel that in an attempt to master risk, man has become a slave to mathematical tools, techniques and models. As Bernstein puts it: "Our lives teem with numbers, but we sometimes forget that numbers are only tools. They have no soul; they may indeed become fetishes. Many of our most critical decisions are made by computers, contraptions that devour numbers like voracious monsters and insist on being nourished with ever greater quantities of digits to crunch, digest and spew back." Of course, a total reliance on intuition may not be advisable. In this book, we shall try to understand how organisations can strike the right balance between intuitive thinking and quantitative tools while managing risk.

[13] Harvard Business Review, January-February 1992.


Annexure 1.1 - Risk Management: A historical perspective[14]

Until the Renaissance, the common man took most of his decisions by instinct and believed in luck. The Renaissance, a time of discovery, encouraged investigation, experimentation and demonstration of knowledge. Mathematical advances took place, gradually allowing risk management to evolve into a science.
In 1494, Luca Pacioli wrote a remarkable book which covered the basic principles of algebra and also provided multiplication tables all the way up to 60 × 60. Pacioli drew attention to the problem of dividing the stakes between two players after an unfinished game of cards. This was one of the earliest attempts to quantify risk.
A sixteenth century physician, Girolamo Cardano, published a book, Ars Magna (The Great Art), in 1545. The book covered advanced topics such as solutions to quadratic and cubic equations and the square roots of negative numbers. Cardano wrote another book, Liber de Ludo Aleae (Book on Games of Chance), probably the first scientific attempt to develop the principles of probability. Cardano defined probability as the number of favourable outcomes divided by the total number of possible outcomes. Galileo, who was born in 1564, also worked in this area. He dealt with the problem of throwing one or more dice and estimating the probability of the various outcomes. Interest in the subject also spread to other countries like Switzerland, Germany and England. Within 50 years of Galileo's death, major problems in probability analysis had been solved.
Three Frenchmen, Blaise Pascal, Pierre de Fermat and the Chevalier de Méré, made immense contributions to the development of probability theory. When the Chevalier raised the problem of how to divide the stakes in an unfinished game of cards, Fermat turned to algebra while Pascal used a combination of geometry and algebra. Pascal's work later evolved into decision theory. Seven letters exchanged by Pascal and Fermat between July and October of 1654 formed the genesis of probability theory. The Dutch scientist Christiaan Huygens, drawing on this correspondence, published the first book on probability in 1656, covering problems associated with gambling.
In 1662, a book was published by some associates of a monastery with which Pascal was associated. The book referred to probability explicitly and explained how to calculate it. The ideas in this book led to the important conclusion that a decision depends on the strength of one's desire for a particular outcome as well as one's estimate of the probability of that outcome.
Meanwhile, sampling was also emerging as an important subject. One of the
earliest applications of sampling was in the testing of coins produced by the Royal Mint
in England. The coins were selected at random and compared to a standard to ensure that
the variation was within specified limits.
In 1662, John Graunt published a book covering statistical and sociological research. Graunt was the first person to condense data into tables and to do descriptive statistical analysis. Graunt was supported in his efforts by an Irish intellectual, William Petty. Without being aware of it, Graunt laid the foundation of sampling theory. What he did was to reason in a systematic way about raw data, in a manner no one had done before. Graunt and Petty can be called the founders of modern statistics. Graunt did a lot of work on the causes of death. He made a scientific estimate of the population of London and explained the importance of demographic data. Graunt's work gradually led to concepts such as sampling, averages and the notion of what is "normal". These concepts later formed the basis of statistical analysis. The line of analysis pursued by Graunt is today known as statistical inference, i.e., inferring an estimate about a population from a sample.

[14] The quotes in this section are drawn from Peter Bernstein's book, Against the Gods, unless otherwise mentioned.
In 1692, John Arbuthnot's translation of Huygens' work became the first publication on probability in the English language. The book had a long title: Of the Laws of Chance, or a Method of Calculation of the Hazards of Game, Plainly Demonstrated and Applied to Games at Present Most in Use.
Edmund Halley, the famous British astronomer, also made a significant contribution. He developed tables that facilitated the calculation of annuities, published in the Philosophical Transactions in 1693. Halley's work became the basis for the modern life insurance business.
A coffee house which Edward Lloyd opened in London in 1687 was the birthplace of Lloyd's, the famous insurance market. In 1696, he prepared Lloyd's List, which provided details about the arrival and departure of ships and conditions at sea. Ship captains frequented the coffee house and compared notes on the hazards associated with different sea routes.
Lloyd's List was subsequently expanded to provide daily news on stock prices, foreign markets and high-water times at London Bridge. The London insurance industry grew rapidly, fuelled by various innovations. Underwriters wrote policies to cover various types of risk. In 1771, 79 underwriters came together to set up the Society of Lloyd's. The members of the society came to be known as the Names. An insurance industry also began to emerge in the American colonies. Benjamin Franklin set up a fire insurance company called First American in 1752. In 1759, the Presbyterian Ministers' Fund wrote the first life insurance policy.
As trade expanded, judgments about consumer needs, pricing and the cost of financing became important. For adventurous traders, business forecasting became essential. Indeed, business forecasting was a major innovation of the late 17th century. Till then, the principles of probability had been applied to areas like gambling, far removed from business.
In 1713, Jacob Bernoulli's law of large numbers showed how probabilities and statistical significance could be inferred from limited information. Suppose we toss a coin repeatedly. The law states that the ratio of the number of heads to the total number of throws will tend towards 0.5 as the number of throws becomes large. In statistical terms, increasing the number of throws increases the probability that the ratio of heads to the total number of throws will vary from 0.5 by less than some stated amount.
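A quick way to see the law of large numbers at work is to simulate coin tosses; the sketch below is my illustration, not from the text:

```python
import random

# Simulate coin tosses and watch the ratio of heads converge towards 0.5,
# as Bernoulli's law of large numbers predicts.
random.seed(42)  # fixed seed so the run is reproducible

heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5  # one fair coin toss
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(f"after {n:>6} tosses: ratio of heads = {heads / n:.4f}")
```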
In 1738, Daniel Bernoulli published a paper that covered both the subject of risk and human behavior. Bernoulli introduced a very important idea: the utility resulting from any small increase in wealth will be inversely proportional to the quantity of wealth previously possessed. For example, all people want to become rich, but the intensity of the desire to become rich reduces as they become richer. While probability theory set up the choices, Bernoulli considered the motivations of the person who did the choosing. Utility varies across individuals. This has profound implications for the field of risk management. Rational decision makers attempt to maximise expected utility, not expected value. As Bernstein puts it so well: "If everyone valued every risk in precisely the same way, many risky opportunities would be passed up. Venturesome people place high utility on the small probability of huge gains and low utility on the larger probability of loss. Others place little utility on the probability of gain because their paramount goal is to preserve their capital. Where one sees sunshine, the other finds a thunderstorm. Without the venturesome, the world would turn a lot more slowly… We are indeed fortunate that human beings differ in their appetite for risk." Bernoulli's theory of utility later led to the laws of demand and supply.
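Bernoulli's assumption that marginal utility is inversely proportional to wealth, u'(w) = k/w, integrates to a logarithmic utility function, u(w) = k·ln(w). A small sketch (my illustration; the wealth figures are hypothetical) shows why such a decision maker declines a fair gamble:

```python
import math

def utility(wealth: float) -> float:
    # Bernoulli's u'(w) = k/w integrates to u(w) = k*ln(w); take k = 1.
    return math.log(wealth)

wealth = 100_000.0
gain, loss, p = 50_000.0, 50_000.0, 0.5  # a fair 50:50 gamble

expected_value = p * (wealth + gain) + (1 - p) * (wealth - loss)
expected_utility = p * utility(wealth + gain) + (1 - p) * utility(wealth - loss)

print(expected_value)        # 100000.0: the gamble leaves expected wealth unchanged
print(expected_utility)      # ~11.37
print(utility(wealth))       # ~11.51 > 11.37, so the sure thing is preferred
```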
A French mathematician, Abraham de Moivre, also made impressive contributions to the field of risk management. These are documented in his book, The Doctrine of Chances, first published in 1718. De Moivre demonstrated how observations distribute themselves around their average value. This led to the normal distribution. He also developed the concept of the standard deviation, which made it possible to evaluate the probability that a given number of observations would fall within some specified bound. In addition, de Moivre showed the normal distribution to be an approximate form of the binomial distribution.
In the 1760s, an Englishman, Richard Price, did some pioneering work in the construction of mortality tables. Based on the work of Halley and de Moivre, Price published two articles on the subject. In 1771, he published a book titled Observations on Reversionary Payments. For this work, Price is generally acknowledged as the founding father of actuarial science. Price's work, however, had some errors. He overestimated mortality rates at younger ages and underestimated them at later ages. He also underestimated life expectancies. Consequently, life insurance premia were much higher than they needed to be.
Thomas Bayes, an Englishman born in 1701, worked on determining the probability of the occurrence of an event given that it had already occurred a certain number of times and not occurred a certain number of times. In other words, Bayes focussed attention on using new information to revise probabilities based on old information. In a dynamic environment characterised by a high degree of uncertainty, this can be a very useful tool. As more and more information becomes available, earlier probabilities can be revised. Bayes' best-known paper was "An Essay towards Solving a Problem in the Doctrine of Chances". The Bayes theorem of conditional probability was first published in 1763.
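Bayes' theorem states that P(H|E) = P(E|H)·P(H) / P(E). A short sketch of the updating idea, using hypothetical numbers I have chosen for illustration:

```python
# Bayesian updating: revise the probability that a machine is faulty
# after observing a defective item. All numbers are hypothetical.

p_faulty = 0.10              # prior: P(machine is faulty)
p_defect_if_faulty = 0.50    # P(defective item | faulty machine)
p_defect_if_ok = 0.05        # P(defective item | healthy machine)

# Total probability of observing a defective item.
p_defect = (p_defect_if_faulty * p_faulty
            + p_defect_if_ok * (1 - p_faulty))

# Bayes' theorem: posterior = likelihood * prior / evidence.
posterior = p_defect_if_faulty * p_faulty / p_defect
print(f"P(faulty | defect) = {posterior:.3f}")  # 0.526: the 10% prior jumps past 50%
```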
Carl Friedrich Gauss published Disquisitiones Arithmeticae in 1801, which dealt with the theory of numbers. Gauss rapidly emerged as one of the leading mathematicians in the world. One of his early attempts to deal with probability was in the book Theoria Motus (Theory of Motion), published in 1809. In this book, Gauss attempted to estimate the orbits of heavenly bodies based on the path that appeared most frequently over many separate observations. Gauss was also involved in geodesic measurements, the use of the curvature of the earth to improve the accuracy of geographic measurements. These measurements involved making estimates based on sample distances within the area being studied. Gauss noticed that the observations tended to distribute themselves symmetrically around the mean.
In 1810, Pierre Laplace spotted the weakness in Gauss' work. Before Laplace, probability theory had been concerned mainly with games of chance. Laplace applied it to many scientific and practical problems. In 1809, Laplace also framed the Central Limit Theorem, which states that the sampling distribution of the mean approaches the normal as the sample size increases. In 1812, Laplace published his book, Théorie analytique des probabilités.
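The Central Limit Theorem can be checked numerically: means of samples drawn from even a heavily skewed distribution concentrate around the true mean, with spread shrinking as 1/√n. A minimal simulation (my illustration, using NumPy):

```python
import numpy as np

# Draw samples from a skewed (exponential) distribution and show that the
# distribution of sample means tightens around the true mean (1.0) as n grows.
rng = np.random.default_rng(0)

for n in (2, 10, 100):
    sample_means = rng.exponential(scale=1.0, size=(10_000, n)).mean(axis=1)
    print(f"n={n:>3}: mean of sample means={sample_means.mean():.3f}, "
          f"std={sample_means.std():.3f} (theory: {1 / np.sqrt(n):.3f})")
```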
Simeon Denis Poisson came up in 1837 with what is now known as the Poisson distribution. It is quite useful in situations where a discrete random variable takes on non-negative integer values. The distribution can be used to estimate the probability of a certain number of occurrences in situations such as the number of telephone calls going through a switchboard per minute, the number of patients coming for a check-up at a hospital on a given day or the number of accidents at a traffic intersection during a week.
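The Poisson probability of observing k events when λ are expected on average is P(k) = e^(−λ)·λ^k / k!. A small sketch with hypothetical numbers:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(exactly k events) when lam events are expected on average."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 3.0  # hypothetical: 3 accidents expected at an intersection per week
for k in range(6):
    print(f"P({k} accidents) = {poisson_pmf(k, lam):.4f}")

# Probability of more than 5 accidents in a week.
print(1 - sum(poisson_pmf(k, lam) for k in range(6)))  # ~0.0839
```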
In 1867, Pafnuty Chebyshev developed another important theorem. He established that no matter what the shape of the distribution, at least 75% of the values will fall within (plus or minus) two standard deviations of the mean of the distribution, and at least 89% of the values will lie within (plus or minus) three standard deviations of the mean.
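Chebyshev's inequality guarantees at least 1 − 1/k² of the values within k standard deviations of the mean, whatever the distribution. A quick numerical check on a deliberately non-normal sample (my illustration):

```python
import numpy as np

# Verify Chebyshev's bound 1 - 1/k**2 on a skewed, non-normal sample.
rng = np.random.default_rng(1)
x = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
mu, sigma = x.mean(), x.std()

for k in (2, 3):
    within = np.mean(np.abs(x - mu) < k * sigma)
    print(f"within {k} std devs: {within:.3f} (Chebyshev floor: {1 - 1 / k**2:.3f})")
```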
Francis Galton tried to build on the foundation provided by Gauss and others. In 1885, his work led to the formulation of a general principle that has come to be known as regression, or reversion, to the mean. Galton's analysis later led to the concept of correlation. Using the normal distribution and regression to the mean, Galton worked on problems such as estimating the rate at which tall parents produced children who were tall relative to their peers but shorter relative to their parents. Galton also computed the average diameter of 100 seeds produced by different sweet pea plants. He found that the smallest pea seeds had larger offspring and the largest seeds had smaller offspring. Similarly, in another study he found that if parents were short, the children were slightly taller, and vice versa. These two experiments led Galton to develop the term regression, the process of returning to the mean.
Bernstein has explained the importance of Galton's work: "Regression to the mean motivates almost every kind of risk taking and forecasting. It's at the root of homilies like 'what goes up must come down', 'pride goeth before a fall', and 'from shirtsleeves to shirtsleeves in three generations'." Probably Joseph had this in mind when he predicted to the Pharaoh that seven years of famine would follow seven years of plenty. In stock markets, regression to the mean is applied when we talk of over-valuation and under-valuation of stocks. We imply that a stock's price is certain to return to its intrinsic value. According to Bernstein, Galton "transformed the notion of probability from a static concept based on randomness and the Law of Large Numbers into a dynamic process in which the successors to the outliers are predestined to join the crowd at the centre."
In the late 19th century, the importance of statistics was recognised by the scientific world. Many advances were made in statistical techniques, including the standard deviation, the correlation coefficient and the chi-square test. In 1893, Karl Pearson introduced the concept of the standard deviation. In 1897, he developed the concept of the correlation coefficient. In 1900, Pearson presented the idea of the chi-square distribution, useful for understanding the similarity of different populations. Consider a politician campaigning for elections. His manager has found out that 30, 40 and 50 percent of the voters surveyed in each of three regions recognize his name. The chi-square distribution enables him to determine whether the differences in these proportions are significant. That will be a crucial piece of information in understanding the impact his speech will make on a particular region. Similarly, marketers would find the test useful in determining whether the preference for a certain product differs from state to state or region to region. If a population is classified into several categories with respect to two attributes, the chi-square test can be used to determine if the two attributes are independent of each other. For very small numbers of degrees of freedom, the chi-square distribution is severely skewed to the right. But as this number increases, the curve rapidly becomes more symmetrical; for large values, the distribution can be approximated by the normal. A large value of chi-square indicates a substantial difference between observed and expected values. On the other hand, if chi-square is zero, observed values exactly match expected values.
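As a sketch of how such a test might be run today (my illustration, using SciPy's chi2_contingency; the survey counts are hypothetical, chosen to match the 30%, 40% and 50% recognition rates above):

```python
from scipy.stats import chi2_contingency

# Hypothetical survey: name recognition in three regions, 200 voters each.
observed = [
    [60, 140],   # region 1: recognise / do not recognise (30%)
    [80, 120],   # region 2 (40%)
    [100, 100],  # region 3 (50%)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p-value = {p_value:.4f}")
# A small p-value suggests recognition really does differ across regions.
```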
In 1908, William Gosset presented his work on the t distribution. It is useful for estimation whenever the sample size is less than 30 and the population standard deviation is not known. While using the t distribution, we assume that the population is normal or approximately normal. A t distribution is lower at the mean and higher at the tails than a normal distribution. From 1915, another period of development of statistical theory began, led by people like R A Fisher. They worked on sampling theory, the development of the distributions of many sample statistics, the principles of hypothesis testing and the analysis of variance. Analysis of variance is a technique to test the equality of three or more sample means and thus make inferences as to whether the samples come from populations having the same mean. It is useful in applications such as comparing the scholastic performance of graduating students from different schools or comparing the effectiveness of different training methods. Essentially, in this technique, the means of more than two samples are compared. In 1925, Fisher published his book, Statistical Methods for Research Workers, the first textbook presentation of the analysis of variance.
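A sketch of a one-way analysis of variance comparing three training methods (my illustration; the scores are hypothetical, and SciPy's f_oneway does the work):

```python
from scipy.stats import f_oneway

# Hypothetical test scores from three training methods.
method_a = [78, 82, 75, 80, 79]
method_b = [85, 88, 84, 90, 86]
method_c = [77, 74, 79, 76, 78]

f_stat, p_value = f_oneway(method_a, method_b, method_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value indicates the three methods do not share the same mean score.
```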
Yet another period of development of statistical theory began in 1928. Led by Jerzy Neyman and Egon Pearson, the work of this period included concepts such as the Type II error[15], the power of a test and confidence intervals. Statistical quality control techniques were also developed during this period.
In 1939, Abraham Wald developed statistical decision theory. This is useful in situations where the decision maker wants to reach an objective, there are several courses of action each having a certain value, events are beyond the control of the decision maker, and there is uncertainty regarding which outcome or state of nature will occur. Essentially, managers decide among alternatives by taking into account the financial implications of their actions. The use of statistical techniques accelerated in the 1940s with vast increases in computing power, which allowed large sets of data to be processed.
Before the First World War, researchers had concentrated on the inputs that went into decision making. Later, they realised that a decision was only the beginning of a chain of events. Gradually, they recognised the need to examine the consequences of their decisions.
Frank Knight, an economist at the University of Chicago, published a book, Risk, Uncertainty and Profit, in 1921, probably the first work to deal with decision making under uncertainty. Knight attempted to draw a distinction between uncertainty and risk: "Uncertainty must be taken in a sense radically distinct from the familiar notion of risk, from which it has never been properly separated… It will appear that a measurable uncertainty, or 'risk' proper, is so far different from an unmeasurable one that it is not in effect an uncertainty at all." Knight argued that it was difficult and not always appropriate to apply mathematical techniques for forecasting the future. He was also doubtful whether the frequency of past outcomes could be any guide to the future. As Knight put it: "(Any given) instance is so entirely unique that there are no others or not a sufficient number to make it possible to tabulate enough like it to form a basis for any inference of value about any real probability in the case we are interested in."

[15] The assumption being tested is called the null hypothesis. Rejecting a null hypothesis when it is true is called a Type I error, and accepting it when it is false is called a Type II error.
In 1921, John Maynard Keynes published a book, A Treatise on Probability. Keynes differentiated what was definable from what was undefinable when thinking about the future. Like Knight, Keynes was also not in favour of taking decisions based on the frequency of past occurrences. He felt that there was no certainty an event would occur in the future just because a similar event had been observed repeatedly in the past. Keynes preferred the term "proposition" to "event" as it more accurately reflected degrees of belief about future events.
In the decades that followed, understanding of risk and uncertainty advanced in
the form of game theory. The utility theory of Daniel Bernoulli had assumed that
individuals made choices in isolation. Game theory, on the other hand, accepted that
many people might try to maximise their utility simultaneously. The true source of
uncertainty lay in the intentions of others. Decisions were made through a series of
negotiations in which people tried to minimise uncertainty by trading off what others
wanted with what they themselves wanted. Since the potentially most profitable
alternative often led to very strong retaliation by competitors, compromises made sense.
Von Neumann, who invented game theory, first presented a paper on the subject in 1926. Later, he teamed up with the German-born economist Oskar Morgenstern and published a book, Theory of Games and Economic Behaviour. They advocated the use of mathematics in economic decision making and argued that the human and psychological elements of economics did not stand in the way of mathematical analysis. A monograph developed by the Russian mathematician Andrei Kolmogorov in 1933 became the basis for modern probability theory.
In 1952, Harry Markowitz published an article called "Portfolio Selection" in the Journal of Finance. It brought Markowitz the Nobel Prize in 1990. Markowitz's key insight was the important role of diversification. The return on a diversified portfolio of stocks is equal to the average of the rates of return on its individual holdings, but its volatility is less than the average volatility of its individual holdings. In a way, Markowitz was describing a type of game theory in which an individual plays against the stock market. So, instead of going for a killing by investing in a single stock, an investor could decrease his risk by diversifying. Markowitz's work elevated risk to the same level of importance as expected return. Markowitz used the term "efficient" to describe portfolios that offered the best returns for a given risk. Each efficient portfolio gives the highest expected return for any given level of risk, or the lowest level of risk for a given expected return. Rational investors can choose the portfolio that best suits their appetite for risk. Later, Sharpe developed the Capital Asset Pricing Model, which explained how financial assets would be valued if investors religiously followed Markowitz's instructions for building portfolios.
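Markowitz's insight can be verified with a two-stock example: portfolio variance is w'Σw, so unless the stocks are perfectly correlated, portfolio volatility falls below the average of the individual volatilities. A sketch with hypothetical volatilities and correlation:

```python
import numpy as np

# Two hypothetical stocks, each with 20% annual volatility, correlation 0.3.
w = np.array([0.5, 0.5])        # equal weights
vols = np.array([0.20, 0.20])
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])
cov = np.outer(vols, vols) * corr   # covariance matrix

port_vol = np.sqrt(w @ cov @ w)     # sqrt(w' cov w)
print(f"average volatility:   {vols.mean():.1%}")   # 20.0%
print(f"portfolio volatility: {port_vol:.1%}")      # ~16.1%: lower, through diversification
```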
Two Israeli psychologists, Daniel Kahneman and Amos Tversky, conducted in-depth research into how people manage risk and uncertainty. Their Prospect Theory, published in 1979, documented behavioural patterns that had not been recognised by proponents of rational decision making. Kahneman and Tversky argued that human emotions, and the inability of people to understand fully what they were dealing with, stood in the way of rational decision making. One of the most important insights from Prospect Theory was the asymmetry between decision making involving gains and that involving losses. Where significant sums were involved, most people rejected a fair gamble in favour of a certain gain.
When Kahneman and Tversky offered a choice between an 80% chance of losing $4,000 (with a 20% chance of breaking even) and a 100% chance of losing $3,000, 92% of the respondents chose the gamble, even though its expected loss, at $3,200, was higher. But when they had to choose between an 80% chance of winning $4,000 (with a 20% chance of winning nothing) and a 100% chance of winning $3,000, 80% of the respondents preferred the certain outcome.
According to Tversky: "Probably the most significant and pervasive characteristic
of the human pleasure machine is that people are much more sensitive to negative than to
positive stimuli ... Think about how well you feel today and then try to imagine how
much better you could feel ... There are a few things that would make you feel better, but
the number of things that would make you feel worse is unbounded."
Kahneman and Tversky coined the term "failure of invariance" to describe the
inconsistent choices people make when the same problem is expressed in different ways.
The failure of invariance is an important insight and has far greater applicability than is
commonly perceived. For example, the way a question is framed in an advertisement may
persuade people to buy something with negative consequences. Later work by some
psychologists revealed that there were circumstances in which additional information got
in the way and distorted decisions, leading to failures of invariance. In a 1992 paper
summarising the advances in prospect theory, Kahneman and Tversky commented:
"Theories of choice are at best approximate and incomplete ... Choice is a constructive
and contingent process. When faced with a complex problem, people use computational
shortcuts and editing operations."
Even as efforts continued to develop a better understanding of risk and new risk
management techniques, new uncertainties emerged in the 1970s and 1980s. Financial
deregulation, inflation, and volatility in interest rates, exchange rates and commodity
prices combined to create an environment for which the conventional forms of risk
management were ill-equipped. US dollar long-term interest rates, which had been in the
range of 2-5% since the Depression, rose to 10% by the end of 1979 and to more than
14% by the autumn of 1981. Economic and financial uncertainty also had an impact on
commodity prices. Fortunately, the growing sophistication of information technology
enabled managers to manipulate huge quantities of data and execute complex strategies.
The term "risk management" became more commonly used in the 1970s. The first
educational qualifications in risk management were offered in the US in 1973. The US
Professional Insurance Buyers Association changed its name to the Risk and Insurance
Management Society (RIMS) in 1975. Non-financial companies began to use derivatives
in the early 1970s, and the corporate treasury function also began to develop in that
decade.
The Black & Scholes Option Pricing Model

C = S N(d1) − K e^(−rT) N(d2)

where
C = price of the call option
S = current stock price
T = time until option expiration
K = option strike price
r = risk-free interest rate
N = cumulative standard normal distribution
σ = standard deviation of stock returns

d1 = [ln(S/K) + (r + σ²/2) T] / (σ√T)
d2 = d1 − σ√T

The model makes the following assumptions:
a) The stock pays no dividends during the option's life.
b) The option is of the European type.
c) Markets are efficient.
d) No commissions are charged.
e) Interest rates remain constant and known.
f) Returns are lognormally distributed.
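
As a minimal sketch, the formula in the box translates into a few lines of Python; the
figures in the example call are illustrative assumptions:

    from math import exp, log, sqrt
    from statistics import NormalDist

    def black_scholes_call(S, K, T, r, sigma):
        """Black & Scholes price of a European call, per the box above."""
        d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        N = NormalDist().cdf  # cumulative standard normal distribution
        return S * N(d1) - K * exp(-r * T) * N(d2)

    # Example: S = 100, K = 95, six months to expiry, r = 5%, sigma = 20%
    print(round(black_scholes_call(100, 95, 0.5, 0.05, 0.20), 2))  # about 9.87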
In the early 1970s, Fischer Black, Myron Scholes and Robert C Merton completed
the development of an option pricing model. Their paper was rejected by some reputed
journals before being published in the May/June 1973 issue of The Journal of Political
Economy. The forerunner of the model was a 1962 dissertation by A James Boness, "A
theory and measurement of stock option value". As stock options began to be traded on
the Chicago Board Options Exchange and electronic calculators came into the market, the
Black & Scholes model found rapid acceptance. The model made the assumptions listed
in the box above. In 1973, Robert Merton relaxed the assumption of no dividends. Three
years later, Jonathan Ingersoll relaxed the assumption of no taxes and transaction costs. In
1976, Merton removed the restriction of constant interest rates. All these efforts have
created fairly accurate option pricing models today, though they do go awry at times of
extreme volatility. This is exactly what happened in the case of Long Term Capital
Management in August-September 1998.
In the 1980s, political risk came into the limelight; the overthrow of the Shah of
Iran was a major factor in this regard. MNCs began to realise the need for establishing
in-house political risk management divisions. Even in the late 1980s, however, risk
management remained unsystematic in large companies. The establishment of risk
management departments in the late 1980s and early 1990s was driven mainly by the
desire to cut costs. Non-financial risk management essentially meant managing insurable
risks such as physical hazards and liability risks.
The use of derivatives by corporates, banks and speculators increased, resulting in
some classic cases of misuse and fraud. In the early 1990s, many companies burnt their
fingers in derivative deals, among them Procter & Gamble, Gibson Greetings and
Metallgesellschaft. In 1995, Barings, Britain's oldest merchant bank, went bankrupt
because of the risky deals of a reckless trader, Nick Leeson. Between 1984 and 1995, the
actions of the trader Toshihide Iguchi, while trading in US bonds, cost Daiwa Bank more
than $1 billion. Due to unauthorised dealing by Peter Young, Deutsche Morgan Grenfell
lost a similar amount. Similarly, the actions of Yasuo Hamanaka, who tried to manipulate
copper prices, cost Sumitomo Corporation dearly. On January 4, 2000, Electrolux lost
$11.25 million because of unauthorised currency trading by an unnamed employee. Most
of these disasters resulted because corporate executives considered low-probability events
impossible and, given a choice between a certain loss and a gamble, chose the gamble.
As the understanding of derivatives improves, the current paranoia will disappear and
derivatives will find their rightful place in the corporate treasurer's tool kit.
From the mid-1990s, a new approach to risk management began to take shape. No
longer were companies obsessed only with external hazards. Managers became
increasingly concerned about the unexpected consequences of their own decisions, and
realised they needed more information about the risks involved while taking crucial
decisions. Risk management became more proactive and attempted to ensure better
cross-functional coordination. The range of risks companies faced also increased
significantly: branding, mergers and acquisitions, succession planning, intellectual
property rights and antitrust rules are all areas where sophisticated risk management has
become crucial in recent times.
Thus, we have come a long way in our attempts to develop new and more
sophisticated techniques for dealing with risk. Yet, as Bernstein puts it16: "Mathematical
innovations are only tools, mere instruments to be employed in the search for a more
exciting objective. The more we stare at the jumble of equations and models, the more we
lose sight of the mystery of life, which is what risk is all about. Knowing how and when
to use these tools is the introduction to wisdom." In other words, there is no magic
formula yet available to eliminate risk. Managers will continue to be respected for their
intuitive skills. But the aim of ERM will be to ensure that intuition is backed by numbers
wherever possible.
16 Financial Times Mastering Risk, Volume I.
Annexure 1.2 - Some commonly used formulas in probability theory

A. Probability of r successes in n trials (binomial distribution):

   P(r) = [n! / (r!(n − r)!)] p^r q^(n−r)

   where r = number of successes
         n = number of trials
         p = probability of success
         q = 1 − p

B. Mean of binomial distribution: μ = np

C. Standard deviation of binomial distribution: σ = √(npq)

D. Probability of exactly x occurrences in a Poisson distribution:

   P(x) = λ^x e^(−λ) / x!

   where λ = mean number of occurrences per interval of time

E. Standard error of the mean of an infinite population:

   σx̄ = σ / √n

   where σ = standard deviation of population
         n = sample size

F. Standard error of the mean of a finite population:

   σx̄ = (σ̂ / √n) √((N − n)/(N − 1))

   where N = size of population
         n = size of sample
         σ̂ = estimate of the population standard deviation
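
As a quick check, these formulas translate directly into Python. A sketch using only
the standard library; the sample figures echo Illustrations 4 and 5 later in this annexure:

    from math import comb, exp, factorial, sqrt

    def binomial_pmf(r, n, p):
        """Formula A: probability of r successes in n trials."""
        return comb(n, r) * p**r * (1 - p)**(n - r)

    def poisson_pmf(x, lam):
        """Formula D: probability of exactly x occurrences, mean lam per interval."""
        return lam**x * exp(-lam) / factorial(x)

    n, p = 5, 0.6
    print(binomial_pmf(3, n, p))         # 0.3456: three passes, i.e. two of five failing
    print(n * p, sqrt(n * p * (1 - p)))  # formulas B and C: mean 3.0, std dev about 1.095
    print(poisson_pmf(2, 5))             # about 0.0842: two calls in Illustration 5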
Annexure 1.3 - Some commonly used formulas for determining confidence limits

A. Estimating μ (the population mean) when σ (the population standard
   deviation) is known:

   Confidence limits = x̄ ± z σ/√n

B. When σ (the population standard deviation) is not known and n (the
   sample size) is larger than 30, s (the sample standard deviation)
   estimates σ:

   Confidence limits = x̄ ± z s/√n

C. When n (the sample size) is 30 or less and the population is normal
   or approximately normal (t distribution, n − 1 degrees of freedom):

   Confidence limits = x̄ ± t s/√n

   When the population is finite, multiply the standard error in A, B
   or C by the correction factor √((N − n)/(N − 1)).

D. Estimating p (the population proportion) when n (the sample size) is
   larger than 30:

   Confidence limits = p̂ ± z σp, where σp = √(p̂q̂/n)

E. When the population is finite:

   Confidence limits = p̂ ± z √(p̂q̂/n) √((N − n)/(N − 1))

Standard deviation of a normal population:

   Confidence limits = s √((n − 1)/χ²α/2) , s √((n − 1)/χ²1−α/2)

   where χ² = chi-square distribution coefficient with n − 1 degrees of
   freedom and 1 − α = confidence level

F. Difference between the means of two normal populations (standard
   deviations σ1 and σ2 known):

   Confidence limits = (x̄1 − x̄2) ± zα/2 √(σ1²/n1 + σ2²/n2)

G. Difference between the means of two normal populations (σ1 = σ2 but
   each value not known):

   Confidence limits = (x̄1 − x̄2) ± tα/2 sp √(1/n1 + 1/n2)

   where sp = √{ [Σ(x − x̄1)² + Σ(x − x̄2)²] / (n1 + n2 − 2) }
   (t distribution with n1 + n2 − 2 degrees of freedom)
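
A hedged sketch of formula A in Python, using the figures of Illustration 9 later in
this chapter (σ = 6 months, n = 100, sample mean 21 months) as a check:

    from math import sqrt
    from statistics import NormalDist

    def mean_confidence_limits(xbar, sigma, n, conf=0.95):
        """Formula A: x̄ ± z σ/√n for a given confidence level."""
        z = NormalDist().inv_cdf((1 + conf) / 2)  # 1.96 for 95%
        half_width = z * sigma / sqrt(n)
        return xbar - half_width, xbar + half_width

    print(mean_confidence_limits(21, 6, 100))  # about (19.82, 22.18)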
Annexure 1.4 - Some commonly used formulas in hypothesis testing

A. H: μ = μ0 (the mean of a normal population is equal to a specified
   value μ0; σ is known)

   z = (x̄ − μ0) / (σ/√n)      (Normal distribution)

B. H: μ = μ0 (the mean of a normal population is equal to a specified
   value μ0; σ is estimated by s)

   t = (x̄ − μ0) / (s/√n)      (t distribution with n − 1 degrees of freedom)

C. H: μ1 = μ2 (the mean of population 1 is equal to the mean of
   population 2; assumed that σ1 = σ2 and both populations are normal)

   t = (x̄1 − x̄2) / √{ (1/n1 + 1/n2) [(n1 − 1)s1² + (n2 − 1)s2²] / (n1 + n2 − 2) }

   (t distribution with n1 + n2 − 2 degrees of freedom)

D. H: σ = σ0 (the standard deviation of a normal population is equal to
   a specified value σ0)

   χ² = (n − 1)s² / σ0²      (chi-square distribution with n − 1 degrees of freedom)

E. H: p = p0 (the fraction with respect to a variable in a population is
   equal to a specified value p0; assume that np0 ≥ 5)

   z = (p̂ − p0) / √( p0(1 − p0)/n )      (Normal distribution)

F. H: p1 = p2 (the fraction with respect to a variable in population 1 is
   equal to the fraction in population 2; assume that n1p1, n2p2 ≥ 5)

   z = (p̂1 − p̂2) / √( p̄(1 − p̄)(1/n1 + 1/n2) )

   where p̂1 = x1/n1, p̂2 = x2/n2 and p̄ = (x1 + x2)/(n1 + n2)
   (Normal distribution)

G. H: σ1 = σ2 (the standard deviation of population 1 is equal to that of
   population 2; both populations assumed normal)

   F = s1² / s2²      (F distribution with DF1 = n1 − 1 and DF2 = n2 − 1)
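
Formulas A and B translate directly into Python; as a sketch, the example reuses the
figures of Illustration 11 later in this chapter as a check:

    from math import sqrt

    def z_stat(xbar, mu0, sigma, n):
        """Formula A: z = (x̄ − μ0) / (σ/√n), with σ known."""
        return (xbar - mu0) / (sigma / sqrt(n))

    def t_stat(xbar, mu0, s, n):
        """Formula B: t = (x̄ − μ0) / (s/√n); n − 1 degrees of freedom."""
        return (xbar - mu0) / (s / sqrt(n))

    # Illustration 11: hypothesised mean 90, sample mean 84, s = 11, n = 20
    print(round(t_stat(84, 90, 11, 20), 2))  # about -2.44, beyond the critical 1.729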
Annexure 1.5 - Simple Illustrations in Probability & Probability Distributions

Illustration 1
Assume we have a box of 10 balls consisting of the following:
3 are red and dotted
1 is red and striped
2 are green and dotted
4 are green and striped
If we draw a red ball from the box, what is the probability that it is striped?

Probability = 1 / (3 + 1) = 0.25
Illustration 2
Well prepared students pass examinations 85% of the time, but ill prepared students pass
only 35% of the time. Past experience indicates that students are well prepared 75% of
the time. In three successive examinations, a student passes. What is the revised
probability that the student was well prepared?

Probability of passing three examinations if well prepared
= (0.85) (0.85) (0.85) = 0.6141

Probability of passing three examinations if ill prepared
= (0.35) (0.35) (0.35) = 0.0429

Probability that the student is well prepared and passes three times
= (0.75) (0.6141) = 0.4606

Probability that the student is ill prepared and passes three times
= (0.25) (0.0429) = 0.0107

So, the revised probability that the student was well prepared
= 0.4606 / (0.4606 + 0.0107) = 0.4606 / 0.4713 = 0.9773

Thus the probability of being prepared has been revised from 0.75 to 0.9773, given the
information that the student passed three examinations in a row.
Illustration 3
Suppose in Illustration 2, the student writes the examination 5 times, passes 4 times and
fails once. What is the revised probability of having been prepared for the examination?

Probability of being well prepared and passing 4 times and failing once
= (0.75) (0.85) (0.85) (0.85) (0.85) (0.15) = 0.05873

Probability of being ill prepared and passing 4 times and failing once
= (0.25) (0.35) (0.35) (0.35) (0.35) (0.65) = 0.00244

So, the revised probability of the student being prepared
= 0.05873 / (0.05873 + 0.00244) = 0.05873 / 0.06117 = 0.9601
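
The Bayesian revision used in Illustrations 2 and 3 generalises to any number of passes
and failures. A minimal Python sketch:

    def posterior_prepared(prior, p_pass_prepared, p_pass_ill, passes, fails=0):
        """Bayes' rule: revised probability of being well prepared."""
        joint_prep = prior * p_pass_prepared**passes * (1 - p_pass_prepared)**fails
        joint_ill = (1 - prior) * p_pass_ill**passes * (1 - p_pass_ill)**fails
        return joint_prep / (joint_prep + joint_ill)

    print(round(posterior_prepared(0.75, 0.85, 0.35, 3), 4))     # 0.9773 (Illustration 2)
    print(round(posterior_prepared(0.75, 0.85, 0.35, 4, 1), 4))  # 0.9601 (Illustration 3)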

Illustration 4
The probability of a student passing an examination is 0.6 and of failing is 0.4. Among
five students, what is the probability of 0, 1, 2, 3, 4 or 5 student(s) failing
simultaneously?

Probability of no student failing     = 5C0 (0.6)^5         = 0.07776
Probability of one student failing    = 5C1 (0.4) (0.6)^4   = 0.2592
Probability of two students failing   = 5C2 (0.4)^2 (0.6)^3 = 0.3456
Probability of three students failing = 5C3 (0.4)^3 (0.6)^2 = 0.2304
Probability of four students failing  = 5C4 (0.4)^4 (0.6)   = 0.0768
Probability of five students failing  = 5C5 (0.4)^5         = 0.01024

Illustration 5
The number of telephone calls received through a switchboard in an office averages 5 per
minute. What is the probability of receiving no calls, 1 call, or 2 calls during a minute?

Assuming the Poisson distribution holds,

Probability P(x) = 5^x e^(−5) / x!, where x = number of calls.

Probability of no calls  = 5^0 e^(−5) / 0! = 0.00674
Probability of one call  = 5^1 e^(−5) / 1! = 0.03370
Probability of two calls = 5^2 e^(−5) / 2! = 0.08425

Illustration 6
The average monthly demand for a product is normally distributed with a mean of 500
units and a standard deviation of 100 units. What is the probability of demand being:
i) more than 500 but less than 650
ii) more than 700
iii) between 420 and 570

i) Z = (650 − 500)/100 = 1.5
   Referring to the normal distribution table, P(0 < Z < 1.5) = 0.4332

ii) Z = (700 − 500)/100 = 2.0
    P(0 < Z < 2) = 0.4772 from normal tables
    P(Z > 2) = 0.5 − 0.4772 = 0.0228

iii) For a demand of 420, Z = (420 − 500)/100 = −0.8
     P(−0.8 < Z < 0) = P(0 < Z < 0.8) = 0.2881
     (This is because the standard normal distribution is symmetrical.)
     For a demand of 570, Z = (570 − 500)/100 = 0.7
     P(0 < Z < 0.7) = 0.2580
     So, P(−0.8 < Z < 0.7) = 0.2881 + 0.2580 = 0.5461

Illustration 7
A portfolio has been yielding a mean return of 35% with a standard deviation of 10%.
What is the probability of the return being less than 25%?

Assuming a normal distribution,
Z = (0.25 − 0.35)/0.10 = −1
P(−1 < Z < 0) = P(0 < Z < 1) = 0.3413
So, required probability = 0.5 − 0.3413 = 0.1587
Illustration 8
A bank estimates that the amount of money withdrawn by customers from their savings
accounts is normally distributed with a mean of Rs. 2000 and a standard deviation of
Rs. 600. If the bank considers a random sample of 100 accounts, what is the probability
of the sample mean lying between Rs. 1900 and Rs. 2050?

The standard error of the mean is 600/√100 = 60.

Z1 = (1900 − 2000)/60 = −100/60 = −1.67
Corresponding area = 0.4525 (from normal tables)

Z2 = (2050 − 2000)/60 = 50/60 = 0.83
Corresponding area = 0.2967 (from normal tables)

So, total probability = 0.4525 + 0.2967 = 0.7492
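
Illustrations 6 to 8 can be verified with the standard normal distribution in Python's
standard library; a sketch, where small differences from the table values are rounding:

    from math import sqrt
    from statistics import NormalDist

    Z = NormalDist()  # standard normal distribution

    # Illustration 6: demand ~ N(500, 100)
    print(Z.cdf(1.5) - 0.5)           # P(500 < demand < 650), about 0.4332
    print(1 - Z.cdf(2.0))             # P(demand > 700), about 0.0228
    print(Z.cdf(0.7) - Z.cdf(-0.8))   # P(420 < demand < 570), about 0.5461

    # Illustration 8: sample mean of n = 100 accounts, sigma = 600
    se = 600 / sqrt(100)
    print(Z.cdf(50 / se) - Z.cdf(-100 / se))  # about 0.75 (tables give 0.7492)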


Illustration 9
The standard deviation of the life of a product has been estimated on the basis of past
data to be 6 months. When a sample of 100 products is drawn, it is found to have a mean
of 21 months. Estimate the range within which the average life of the population will lie
at a 95% confidence level.

A 95% confidence level means 47.5% of the area on either side of the mean of the normal
distribution. This corresponds to a value of Z = 1.96.

Standard error of the mean = 6/√100 = 0.6

Upper confidence limit = 21 + (1.96) (0.6) = 22.18 months
Lower confidence limit = 21 − (1.96) (0.6) = 19.82 months

So, the mean life of the product will lie within the range 19.82 to 22.18 months at a 95%
confidence level.
Illustration 10
A sample of 25 rods has a mean length of 54.62 cm and a standard deviation of 5.34 cm.
Find the 95% confidence limits on the mean.

We use the t distribution, as the population standard deviation is unknown and the sample
size is less than 30. For a two-tailed 95% interval,
t(0.975, 24) = 2.064

Upper confidence limit = 54.62 + (2.064) (5.34/√25) = 56.82
Lower confidence limit = 54.62 − (2.064) (5.34/√25) = 52.42

Illustration 11
We project an average score of 90 (out of 200) in an examination which will be taken by
thousands of students all over the country. A sample of 20 indicates an average score of
84, with a standard deviation of 11. Test the hypothesis at a 0.10 level of significance.

The null hypothesis is μ = 90.
We estimate the population standard deviation with the sample standard deviation,
i.e., 11.

Standard error of the mean = 11/√20 = 2.46

For a two-tailed significance level of 0.10 and 19 degrees of freedom, t = 1.729.

Upper confidence limit = 90 + (2.46) (1.729) = 94.25
Lower confidence limit = 90 − (2.46) (1.729) = 85.75

Now, 84 lies outside these limits. So, the null hypothesis μ = 90 is rejected, and the
alternative hypothesis μ ≠ 90 is accepted.
Illustration 12
A plant manager wants to estimate the daily consumption of a raw material. He studies
consumption for 10 days and finds the average consumption is 11,400 kg while the
standard deviation is 700 kg. Find the 95% confidence interval for the mean daily
consumption of the raw material.

We use the t distribution, since the sample size is less than 30 and the population
standard deviation is not known.

Standard error = 700/√10 = 221.38 kg
t(0.05, 9) = 2.262

Upper confidence limit = 11,400 + (2.262) (221.38) = 11,900 kg
Lower confidence limit = 11,400 − (2.262) (221.38) = 10,899 kg
Illustration 13
In a company there are 100,000 employees. A sample of 75 indicates that 40% of them
are in favour of the company's deferred compensation plan while 60% are not. The
management wants to estimate, at a 99% confidence level, the range in which the
proportion of employees in favour of the plan lies.

A 99% confidence level means 49.5% of the area on either side of the mean of the normal
distribution. This corresponds to Z = 2.58.

Standard error of the proportion = √(pq/n)

where p = proportion in favour
      q = proportion not in favour
      n = number of people surveyed

Standard error of the proportion = √((0.4)(0.6)/75) = 0.057

Upper confidence limit = 0.4 + (2.58) (0.057) = 0.547
Lower confidence limit = 0.4 − (2.58) (0.057) = 0.253

So, the proportion of employees in favour of the deferred compensation plan will lie
between 0.253 and 0.547 at a 99% confidence level.
Illustration 14
The mean hourly earnings in two industries are Rs. 6.95 and Rs. 7.10 respectively. The
standard deviations of the two samples, of sizes 200 and 175, are Re. 0.40 and Re. 0.60
respectively. Test the hypothesis that there is no difference in hourly earnings between
the two industries at a 0.05 level of significance.

The null hypothesis is: μ1 = μ2

Since the population standard deviations are unknown, we estimate them using the
sample standard deviations:
σ1 ≈ 0.40, σ2 ≈ 0.60

Standard error of the difference between the two means
= √(σ1²/n1 + σ2²/n2) = √(0.4²/200 + 0.6²/175) = 0.053

Using the normal distribution, Z for an area of 0.475 is 1.96.

Upper confidence limit = 0 + (1.96) (0.053) = 0.1039
Lower confidence limit = 0 − (1.96) (0.053) = −0.1039

x̄1 − x̄2 = 6.95 − 7.10 = −0.15

This lies outside the acceptance region. So, the null hypothesis that there is no difference
in mean hourly earnings between the two industries is rejected.
Illustration 15
A chemical company is evaluating an investment project. There is uncertainty associated
with the annual net cash flow and the life of the project. The net present value (NPV) of
the project can be calculated using the formula

NPV = Σ (t = 1 to n) [ CFt / (1 + i)^t ] − I

where CFt is the annual cash flow, I is the initial investment at time t = 0, and i is the
discount rate. Use Monte Carlo simulation to estimate NPV, given that the cost of capital
is 10% and the initial investment is Rs. 5,000,000. The distributions of the annual cash
flow and the project life are given below.
Annual cash flow                      Project life
Value (Rs.)     Probability           Value (years)    Probability
1,000,000       0.02                  3                0.05
1,500,000       0.03                  4                0.10
2,000,000       0.15                  5                0.30
2,500,000       0.15                  6                0.25
3,000,000       0.30                  7                0.15
3,500,000       0.20                  8                0.10
4,000,000       0.15                  9                0.03
                                      10               0.02

The firm wants to perform 5 manual simulation runs for this project. We have to
generate values, at random, for the two exogenous variables: annual cash flow and
project life. We set up the correspondence between the values of the exogenous variables
and two-digit random numbers, and then draw random numbers. Table I shows the
correspondence between the values of the variables and the random numbers. Table II
presents a table of random digits.
Table I

Annual cash flow
Value (Rs.)     Cumulative probability    Two-digit random numbers
1,000,000       0.02                      00 to 01
1,500,000       0.05                      02 to 04
2,000,000       0.20                      05 to 19
2,500,000       0.35                      20 to 34
3,000,000       0.65                      35 to 64
3,500,000       0.85                      65 to 84
4,000,000       1.00                      85 to 99

Project life
Value (years)   Cumulative probability    Two-digit random numbers
3               0.05                      00 to 04
4               0.15                      05 to 14
5               0.45                      15 to 44
6               0.70                      45 to 69
7               0.85                      70 to 84
8               0.95                      85 to 94
9               0.98                      95 to 97
10              1.00                      98 to 99

We draw random numbers starting from the top left corner of Table II, moving
down the columns, and select every fourth number: first for the annual cash flow and
then for the project life. The computed values of NPV are tabulated in Table III.
Table II
Random Numbers

53479   81115   98036   12217   59526
97344   70328   58116   91964   26240
66023   38277   74523   71118   84892
99776   75723   03172   43112   83086
30176   48979   92153   38416   42436
81874   83339   14988   99937   13213
19829   90630   71863   95053   55532
09337   33435   53869   52769   18801
31151   58295   40823   41330   21093
67619   52515   03037   81699   17106

Table III
Simulation Results

        Annual cash flow                    Project life
Run     Random    Corresponding             Random    Corresponding    Net present
        number    annual cash flow (Rs.)    number    project life     value (Rs.)
1       53        3,000,000                 97        9                12,277,070
2       30        2,500,000                 81        7                 7,171,050
3       31        2,500,000                 67        6                 5,888,150
4       38        3,000,000                 75        7                 9,605,260
5       90        4,000,000                 33        5                10,163,150
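
The five manual runs above generalise naturally to thousands of machine runs. A
Python sketch of the same procedure; the run count of 10,000 is an arbitrary choice:

    import random

    # Discrete distributions from the tables above, as (value, probability) pairs.
    cash_flows = [(1_000_000, 0.02), (1_500_000, 0.03), (2_000_000, 0.15),
                  (2_500_000, 0.15), (3_000_000, 0.30), (3_500_000, 0.20),
                  (4_000_000, 0.15)]
    lives = [(3, 0.05), (4, 0.10), (5, 0.30), (6, 0.25),
             (7, 0.15), (8, 0.10), (9, 0.03), (10, 0.02)]

    def draw(dist):
        """Sample one value from a discrete (value, probability) distribution."""
        return random.choices([v for v, _ in dist], weights=[p for _, p in dist])[0]

    def npv(cash_flow, life, i=0.10, investment=5_000_000):
        """NPV = sum of discounted annual cash flows less the initial investment."""
        return sum(cash_flow / (1 + i)**t for t in range(1, life + 1)) - investment

    runs = [npv(draw(cash_flows), draw(lives)) for _ in range(10_000)]
    print(sum(runs) / len(runs))  # simulated expected NPV, about Rs. 7.4 million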

Illustration 16
Estimate the duration of a bond, given its YTM is 6%, coupon rate is 9%, time to
maturity is 4 years and face value is Rs. 1,000.

Year   Cash flow   Discount factor        PV of cash    PV of cash flows   (Fraction)
(t)    (Rs.)       1/(1+YTM)^t            flows         as fraction of P0  x (t)
1        90        1/(1.06)^1 = 0.9434       84.91      0.0769             0.0769
2        90        1/(1.06)^2 = 0.8900       80.10      0.0726             0.1452
3        90        1/(1.06)^3 = 0.8396       75.56      0.0684             0.2052
4      1090        1/(1.06)^4 = 0.7921      863.39      0.7821             3.1284
                                   P0 =   1,103.96      1.0000             3.5557

(Column 4 = column 2 x column 3; column 5 = column 4 / P0; column 6 = column 5 x
column 1.)

The duration of the 4-year bond is approximately 3.56 years.
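
The same computation in Python; a sketch of Macaulay duration, the PV-weighted
average time to each cash flow:

    def bond_duration(face, coupon_rate, ytm, years):
        """Macaulay duration: PV-weighted average time to the bond's cash flows."""
        flows = [face * coupon_rate] * (years - 1) + [face * (1 + coupon_rate)]
        pvs = [cf / (1 + ytm)**t for t, cf in enumerate(flows, start=1)]
        price = sum(pvs)  # P0
        return sum(t * pv for t, pv in enumerate(pvs, start=1)) / price

    print(round(bond_duration(1000, 0.09, 0.06, 4), 2))  # about 3.56 years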
References:
1. Peter F Drucker, Managing for Results, William Heinemann, 1964.
2. Tom M Apostol, Calculus, Volume II, 2nd edition, John Wiley & Sons, 1969.
3. F Milliken, "Three types of perceived uncertainty about the environment: State, effect
and response uncertainty," Academy of Management Review, 1987, Vol. 12,
pp. 133-143.
4. O E Williamson, "Transaction Cost Economics," Handbook of Industrial
Organization, 1989, Volume I, pp. 135-182.
5. Robert S Kaplan and David P Norton, "The Balanced Scorecard - Measures that
drive performance," Harvard Business Review, January-February 1992, pp. 71-79.
6. Kenneth A Froot, David S Scharfstein and Jeremy C Stein, "A framework for risk
management," Harvard Business Review, November-December 1994, pp. 91-102.
7. Joseph L Bower and Clayton M Christensen, "Disruptive Technologies: Catching the
Wave," Harvard Business Review, January-February 1995, pp. 27-37.
8. Mathew Bishop, "Corporate Risk Management Survey," The Economist, February 10,
1996.
9. James M Utterback, Mastering the Dynamics of Innovation, Harvard Business
School Press, 1996.
10. Bradley P Carlin and Thomas A Louis, Bayes and Empirical Bayes Methods for Data
Analysis, Chapman & Hall, 1996.
11. Heidi Deringer, Jennifer Wang and Debora Spar, "Note on Political Risk Analysis,"
Harvard Business School Case No. 9-798-022, September 17, 1997.
12. Hugh G Courtney, Jane Kirkland and S Patrick Viguerie, "Strategy under
uncertainty," Harvard Business Review, November-December 1997.
13. Mark L Sirower, The Synergy Trap, The Free Press, New York, 1997.
14. Clayton M Christensen, The Innovator's Dilemma, Harvard Business School Press,
1997.
15. Patric Wetzel and Oliver de Perregaux, "Must it always be risky business?" The
McKinsey Quarterly, 1998, Number 1.
16. Peter L Bernstein, Against the Gods, John Wiley & Sons, 1998.
17. Harold D Skipper, Jr., International Risk and Insurance: An Environmental-
Managerial Approach, Irwin McGraw-Hill, 1998.
18. Robert Simons, "How risky is your company?" Harvard Business Review, May-June
1999, pp. 85-94.
19. Clayton M Christensen and Michael Overdorf, "Meeting the challenge of disruptive
change," Harvard Business Review, March-April 2000, pp. 66-76.
20. Guido A Krickx, "The relationship between Uncertainty and Vertical Integration,"
International Journal of Organizational Analysis, Issue 3, 2000, pp. 309-329.
21. Forest L Reinhardt, Down to Earth, Harvard Business School Press, 2000.
22. D G Prasuna, "Scanning for De-risking," Chartered Financial Analyst, July 2001,
pp. 23-31.
23. Frederick F Reichheld, "Lead for Loyalty," Harvard Business Review, July-August
2001, pp. 76-84.
24. "The new enemy," The Economist, September 15, 2001, pp. 15-16.
25. "Taking stock," The Economist, September 22, 2001, p. 59.
26. Chris Lewin, "Refining the art of the probable," Financial Times Mastering Risk,
Volume I, 2001, pp. 35-41.
27. Kiriakos Vlahos, "Tooling up for risky decisions," Financial Times Mastering Risk,
Volume I, 2001, pp. 47-52.
28. Peter L Bernstein, "The enlightening struggle against uncertainty," Financial Times
Mastering Risk, Volume I, 2001, pp. 5-10.
29. Enterprise Risk Management - Implementing New Solutions, The Economist
Intelligence Unit Research Report, 2001.
30. Dan Borge, The Book of Risk, John Wiley & Sons, 2001.
31. N Johnson & S Kotz, Encyclopedia of Statistical Sciences, John Wiley & Sons,
New York, pp. 197-205.