Chapter - I Enterprise Risk Management: An Introduction
of identifying and managing risks effectively. Companies such as Procter & Gamble,
investment banks such as Barings and government organisations like Orange County
have all burnt their fingers due to faulty risk management practices. Closer home, we
have seen many Non Banking Finance Companies (NBFCs) such as CRB winding up
after taking risks totally inconsistent with their resources or capabilities. Quite clearly,
companies need to develop and apply an integrated risk management framework that can
inspire the confidence of shareholders by stabilising earnings and lowering the cost of
capital.
Organisations face various types of risk. Unfortunately, in many organisations,
much of the focus of risk management has been on fluctuations in financial parameters
such as interest rates and exchange rates. As Butterworth puts it: "A strong appreciation
of finance and accounting is useful, since all risk effects will have an impact on the profit
and loss account and the balance sheet. But this focus on finance as an important core
skill may have been overemphasized." Just as the field of knowledge management has
been dominated by software companies, risk management has been strongly associated
with treasury, forex and portfolio management. The risk management agenda has been
hijacked by investment bankers, corporate treasurers and insurance companies and
dominated by the use of financial derivatives and insurance cover. This is clearly not the
way it should be. As Bernstein puts it, "Risk management guides us over a vast range of
decision-making, from allocating wealth to safeguarding public health, from waging war
to planning a family, from paying insurance premiums to wearing a seat belt, from
planting corn to marketing cornflakes."
Risk is all about vulnerability and taking steps to reduce it. Several factors
contribute to this vulnerability, not just fluctuations in financial parameters. As the
Economist has put it: "Top managers often fail to understand properly the firm's
sensitiveness to different types of risk. This is because the technology for identifying risk
exposures in non-financial firms is as yet fairly primitive, but more fundamentally
because managers and boards too often regard risk management as a matter for financial
experts in the corporate treasury department rather than as an integral part of corporate
strategy."
Many organisations make the mistake of dealing with risk in a piecemeal fashion.
Within the same company, the finance, treasury, human resources and legal departments
cover risks independently. An organisation-wide view of risk management can greatly
improve efficiencies and generate synergies. That is why many companies are taking a
serious look at Enterprise Risk Management (ERM), which addresses some fundamental
questions:
What are the various risks faced by the company?
What is the magnitude of each of these risks?
What is the frequency of each of these risks?
What is the relationship between the different risks?
How can the risks be managed to maximise shareholders' wealth?
Prudent risk management ensures that the firm's cash flows are healthy so that its
immediate obligations and future investment needs are both adequately taken care of.
Firms typically run into cash flow problems because they fail to anticipate or handle risks
efficiently. These risks include huge R&D investments which do not pay off, excessive
premium paid for an acquisition, costly litigation (especially class action law suits) by
aggrieved stakeholders, excessive dependence on a single or a few customers and
vulnerability to interest rate, stock index and exchange rate movements. In 1993,
Metallgesellschaft, which tried to cover the risk associated with its long-term contracts
through oil futures, ended up losing a huge amount. In the same year, Philip Morris had to
cut prices of Marlboro sharply due to unexpectedly stiff competition from cheaper,
private labels. Nick Leeson, the rogue trader, drove Barings to bankruptcy when Japan's
Nikkei Index collapsed in early 1995. In 1997, the chemicals giant Hoechst incurred
substantial expenses due to a product recall. The star-studded team at the hedge fund Long-
Term Capital Management could do little as unexpected interest rate and currency
movements brought the fund to the edge of bankruptcy in 1998. Coca Cola faced a big
crisis when its bottles in Europe were found to be contaminated and had to be recalled in
the middle of 1999.
Read Borge's book, The Book of Risk, which is written in a very simple, narrative style.
Risk management is all about making choices and tradeoffs. These choices and
tradeoffs are closely related to a company's assumptions about the external environment.
The word risk has its origins in the Italian word, risicare, which means to dare. So, risk
is about making choices rather than waiting passively for events to unfold. Consider two
leading global pharmaceutical companies, Merck and Pfizer. Merck is betting on a
scenario in which HMOs rather than doctors will dominate the drug-buying process.
Hence its acquisition of the drug distribution company Medco. On the other hand, Pfizer
has invested heavily in its sales force on the assumption that doctors will continue to play
an important role. Each company is working out its strategies on the basis of an
assumption and consequently, taking a risk. Similarly, a company which bets on a new
technology could be diverting a lot of resources from its existing business. If the new
technology fails to take off, it may become a severe drain on the company's resources.
But, if the firm decides not to invest in the new technology and it does prove successful,
the very existence of the company is threatened. So, not taking a risk may turn out to be
a risky strategy in many cases.
All risks cannot be attributed to external factors. Many of the risks which
organizations assume have more to do with their own strategies, internal processes,
systems and culture than any external developments. For example, the collapse of
Barings Bank had more to do with poor management control systems than unfavourable
developments in the external environment.
Effect Uncertainty: This is the uncertainty about the impact of external events on
the organization.
Prioritising Risks
[Figure: risks are plotted on a grid by likelihood (low to high); the indicated responses are
to create contingency plans, take immediate action, conduct periodic review or conduct
ongoing review.]
tremendous potential for generating sustainable competitive advantages in the long run.
So, the dividing line between risk management and value creation is much thinner than
we imagine. Indeed, the ultimate objective of Enterprise Risk Management is to maximise
shareholders' wealth.
Table I: The Enterprise Risk Management process
technology risk. These are only general guidelines. Ultimately, whether to retain a risk
or to transfer it should be decided on a case-by-case basis.
Enterprise Risk Management at Infosys Technologies:
An interview with Nandan M Nilekani, CEO
Infosys Technologies is one of India's most admired companies. The company has been a trendsetter in risk
management. CEO Nandan Nilekani explains how Infosys handles risk.
On the mechanisms to manage risk at a strategic level.
The following mechanisms need to be in place to manage risks at the strategic level:
(i) The Board of Directors of the company needs to take ultimate bottom-line responsibility for Risk
Management, thus ensuring that Risk Management is part of the charter for the company.
(ii) The business portfolio of a company needs to be diverse so that vagaries in one segment do not
affect the company's business performance adversely. This is done by putting in place prudential
norms of restricting business exposure, especially in business segments where there is high
volatility.
(iii) Management Control Systems that ensure timely aggregation of inputs from the external and internal
environment, enabling quick top management decision making on Risk Management, are required.
These mechanisms should cascade to the level of line managers so that the company can
implement these decisions quickly.
On the ideal business model
There is no one-size-fits-all business model. The specific aspects of the derisking model for each
company depend on the nature of the business the company is in, its capability in different areas, etc.
The Infosys business model rests on four pillars, Predictability, Sustainability, Profitability and De-risking
(PSPD model). This model helps management evaluate risk-return trade-offs and make effective strategic
choices. This leads to a predictable and sustainable revenue stream for the company. Infosys' pioneering
global delivery model has helped the company to consistently be among the most profitable IT services
companies in the world. Derisking provides the company with the strength and stability to effectively
handle variations in the business environment.
On enterprise risk management in India.
In the past, the software industry in India has grown exponentially. There are risks inherent in this kind of
growth, and managing them requires strong risk management practices. Since the software sector in India has
had to compete with global companies, its exposure to global best practices is significant. The
visionary managements of some software companies in India have implemented these global best practices
in their company. One area in which global best practices have been implemented is enterprise-wide risk
management.
On short-term focus of risk management
Any successful derisking model should be balanced keeping in mind long-term as well as short-term,
financial as well as non-financial aspects. Focusing only on short-term financial impact can lead to suboptimal solutions, which may be counter-productive.
On globalization and increase in risks
Globalization means that the war for talent no longer respects geographical boundaries. Hence, the risk of
attrition of highly talented employees is an important factor that companies need to manage. Further,
companies are faced with the challenge of ensuring that their knowledge base, technology and processes
are robust enough to meet changing global market requirements. Risks associated with the international
political environment also have a bearing on the company's performance.
On the Infosys model of derisking
We ensure that we do not become overly dependent on any single segment of our business. For example,
we had put a cap of 25% on our Y2K revenues. We try to diversify our risk by operating in multiple
technologies and multiple market segments. We make sure that no one customer provides more than 10% of
our business. We ensure that we operate in a variety of vertical domains. The whole idea is that one should
not become overly dependent on any one segment and that we broad-base our operations so as to de-risk
the company.
Expansion into under-penetrated markets is part of the derisking strategy at Infosys. Infosys has
already entered markets in Europe and Asia-Pacific by opening marketing offices in Paris, Frankfurt,
Brussels, Stockholm, Tokyo, Hong Kong, Sharjah, Sydney and Melbourne. Our aim is to have multiple
development centers across the globe to respond instantly to our customers' needs and to take advantage of
the talent pools available in cost-competitive economies. This strategy also reduces the risk to our
operations due to changes in geo-political equations.
Source: Chartered Financial Analyst, July 2000, Reprinted with permission.
Figure II
Sources of typical Business Risks
[Figure: external, non-recurrent risks include consumer boycotts, technology and patent
infringement; internal risks include failed or delayed product launches, product recall or
default, and plant safety.]
Types of risk
What are the various risks a company can face? The Economist Intelligence Unit divides
risks into four broad categories.
Hazard risk is related to natural hazards, accidents, fire, etc. that can be insured.
Financial risk has to do with volatility in interest rates and exchange rates,
defaults on loans, asset-liability mismatch, etc.
Operational risk is associated with systems, processes and people and deals with
succession planning, human resources, information technology, control systems
and compliance with regulations.
Strategic risk stems from an inability to adjust to changes in the environment such
as changes in customer priorities, competitive conditions and geopolitical
developments.
The method of classifying risks is not as important as understanding and
analysing them. Indeed, the very nature of uncertainty implies that it is difficult to
identify all risks, let alone classify them. Moreover, the objective of this book is not to
provide prescriptive solutions but to encourage and motivate companies to think more
deeply, clearly and consistently about the risks they face. Each company should carefully
examine its value chain and come up with its own way of categorising the uncertainties
associated with its important value adding activities. Then, it can quantify these
uncertainties to the extent possible and decide which risks to hold and which to transfer.
In this book, we have categorised risks as follows:
Capacity expansion risks.
Vertical integration risks.
Diversification risks.
Technology risks.
Mergers & Acquisitions risks.
Environmental risks.
Political risks.
Ethical, Legal & Reputation risks.
Financial risks.
Marketing risks.
Human Resources risks.
Technology risk has become important in this age of rapid innovation. Companies
which do not have a strategy to cope with changing technology will find themselves at a
severe disadvantage. The key decision in technology risk management is whether to
move early or to wait and see the impact of a new technology as it emerges.
Many companies today look at mergers and acquisitions as a way of generating
fast growth by gaining quick access to resources such as people, products, technology
and facilities. But, mergers and acquisitions have to be planned and executed carefully, to
ensure that the integration of the pre-merger entities takes place smoothly and the
projected synergies are realised. Otherwise, they may prove to be a severe drain on the
existing resources and even ruin a company in some cases.
Another type of risk is environmental risk. Companies which do not take steps to
protect the natural environment, face the risk of resistance and hostility from society. In
some cases, poor environmental performance may even threaten the very existence of the
company, as illustrated by the example of Union Carbide in Bhopal. In other cases, such
as the Exxon Valdez oil spill in Alaska, the reputation of the company can be severely
damaged.
Political risks also need to be managed carefully. Governments may suddenly
change their policies or may interfere with the company's operations. Understanding the
nature of political instability and anticipating problems is important, especially for
multinational corporations operating in emerging markets. Dabhol Power Corporation is
a good example.
In recent times, legal risks have also become important. Product liability suits and class
action suits by employees or shareholders can pose grave problems. Similarly, anti-trust
proceedings by the government can take a company's attention away from its core
business. A significant proportion of senior management's time at Microsoft has been
consumed by the anti-trust suit, which is only now reaching the settlement stage.
In the modern business world, companies are expected to maintain high standards
of ethics and corporate governance. Unethical practices and low standards of corporate
governance can severely erode not only the reputation of a company but also its market
capitalisation. A good example of a company that has seen a severe decline in its
business owing to unethical and illegal disclosure practices is the famous insurance
company, Lloyd's of London. In India, Shaw Wallace has faced similar problems.
The most commonly discussed form of risk is financial risk. When interest or
foreign exchange rates fluctuate, there is an impact on cash flows and profits. Risk also
increases as the debt component in the capital structure increases. This is because debt
involves mandatory cash outflows, while equity holders can be paid dividends at the
discretion of the company. Today, sophisticated hedging tools like derivatives are
available to manage financial risk.
There are various marketing risks which companies have to deal with carefully. A
careful understanding of the marketing activities and the associated risks is extremely
important. Branding, pricing, distribution and product development have all become very
complicated in the contemporary business environment. Unless the marketing mix is
carefully managed, customers may switch over to competitors.
Risks associated with human resources too need to be managed effectively.
Succession planning is probably the most strategic of these risks. Even well-known
companies like Coca-Cola and Procter & Gamble have struggled in the recent past due to
succession-related problems.
Concluding Notes
In their seminal paper, "The Balanced Scorecard: Measures that Drive Performance",
Robert Kaplan and David Norton have emphasised the need for evaluating the
performance of an organisation from four different angles: the customer perspective,
internal perspective, innovation and learning perspective and shareholder perspective.
The Balanced Score Card considers financial measures that represent the outcome of past
actions. At the same time, it incorporates operational measures relating to customer
satisfaction, internal processes and attempts at innovation and improvement, all of which
drive future financial performance. Similarly, when we talk of risk management, the
various business risks which organisations face must be considered along with the
financial risks. Ultimately, financial risks are the outcome of business strategy. The role
of financial risk management is to minimise uncertainty regarding cash flows; but the
very source of these cash flows is the type of business which the company runs and the
type of strategic decisions it makes.
In today's competitive and complex environment, events are unfolding with a
degree of uncertainty and speed never seen before. The magnitude and nature of risks
faced by companies are constantly changing. Enterprise Risk Management (ERM) has
become more critical than ever before. It is all about changing the way decisions are
made, by systematically collecting and processing information. ERM is not a purely
defensive tool as many believe and does not imply excessive caution. Rather, it is about
creating conditions which encourage managers to achieve the right balance between
minimising risks and exploiting new opportunities. Indeed, the ultimate aim of ERM is to
make available a steady stream of cash flows that can be utilised to maximise
shareholders' wealth.
Each chapter in this book discusses a particular type of risk, closely examining the
key issues involved. Discussing management problems is of little use, unless an attempt
is made to develop practical solutions. So, wherever possible, live examples have been
provided to illustrate the concepts. A sufficient number of box items and small cases are
provided in each chapter. Ultimately, there is no better way to understand risk
management than by learning from the successful and not-so-successful experiences of
various companies. Some of the useful models developed by eminent scholars and
industry experts and the best practices of organisations have also been included.
In this book, we shall from time to time look at some of the broader philosophical
issues that risk management raises. We can use past data to understand and draw
inferences but to what extent can these inferences be applied to the future? Do numbers at
all make sense in an uncertain world? To what extent should we depend on intuition to
deal with risks which are difficult to quantify? Is risk management an art or a science?
Many feel that in an attempt to master risk, man has become a slave to
mathematical tools, techniques and models. As Bernstein puts it: "Our lives teem with
numbers but we sometimes forget that numbers are only tools. They have no soul; they
may indeed become fetishes. Many of our most critical decisions are made by computers,
contraptions that devour numbers like voracious monsters and insist on being nourished
with ever greater quantities of digits to crunch, digest and spew back." Of course, a total
reliance on intuition may not be advisable. In this book, we shall try to understand how
organisations can strike the right balance between intuitive thinking and quantitative tools
while managing risk.
Note: The quotes in this section are drawn from Peter Bernstein's book, Against the Gods, unless
otherwise mentioned.
John Graunt analysed the mortality records of London and explained the importance of demographic data. Graunt's work gradually led
to concepts such as sampling, averages and the notion of what is normal. These
concepts later formed the basis of statistical analysis. The line of analysis pursued by
Graunt is today known as statistical inference, i.e., inferring an estimate about a
population from a sample.
In 1692, John Arbuthnot's translation of Huygens' work became the first
publication on probability in the English language. The book had a long title: Of the Laws
of Chance, or a Method of Calculation of the Hazards of Game, Plainly Demonstrated and
Applied to Games at Present Most in Use.
Edmund Halley, the famous British astronomer, also made a significant
contribution. He developed tables that facilitated the calculation of annuities. These tables
were published in a work called Transactions in 1693. Halley's work became the basis for
the modern life insurance business.
A coffee house which Edward Lloyd opened in London in 1687 was the birthplace
of Lloyd's, the famous insurance company. In 1696, he prepared the Lloyd's List,
which provided details about the arrival and departure of ships and conditions at sea. Ship
captains frequented the coffee shop and compared notes on the hazards associated with
different sea routes.
The Lloyd's List was subsequently expanded to provide daily news on stock
prices, foreign markets and high water times at London Bridge. The London insurance
industry grew rapidly, fuelled by various innovations. Underwriters wrote policies to
cover various types of risk. In 1771, 79 underwriters came together to set up the Society
of Lloyd's. The members of the society came to be known as the Names. An insurance
industry also began to emerge in the American colonies. Benjamin Franklin set up a fire
insurance company called First American in 1752. In 1759, the Presbyterian Ministers'
Fund wrote the first life insurance policy.
As trade expanded, judgments about consumer needs, pricing and cost of
financing became important. For these adventurous traders, business forecasting became
important. Indeed, business forecasting was a major innovation of the late 17th century.
Till then, the principles of probability had been applied to applications like gambling, far
removed from business.
In 1713, Jacob Bernoulli's law of large numbers showed how probabilities and
statistical significance could be inferred from limited information. Suppose we toss
a coin. The law states that the ratio of the number of heads to the total number of tosses
will tend towards 0.5 as the number of tosses becomes large. In statistical terms,
increasing the number of tosses will increase the probability that the ratio of heads to the
total number of tosses will deviate from 0.5 by less than some stated amount.
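To see the law of large numbers at work, the short Python sketch below simulates repeated coin tosses and tracks how the proportion of heads settles towards 0.5 as the number of tosses grows. The sample sizes are chosen purely for illustration.

import random

def proportion_of_heads(num_tosses):
    """Toss a fair coin num_tosses times and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(num_tosses))
    return heads / num_tosses

# As the number of tosses grows, the observed proportion settles near 0.5,
# which is what Bernoulli's law of large numbers asserts.
for n in (10, 100, 1000, 10000, 100000):
    print(n, "tosses: proportion of heads =", round(proportion_of_heads(n), 4))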
In 1738, Daniel Bernoulli published a paper that covered both the subject of risk
as well as human behavior. Bernoulli introduced a very important idea. The utility
resulting from any small increase in wealth will be inversely proportional to the quantity
of wealth previously possessed. For example, all people want to become rich, but the
intensity of this desire declines as they become richer. While probability theory set up
the choices, Bernoulli considered the motivations of the person who did the choosing.
Utility varies across individuals. This has profound implications for the field of risk
management. Rational decision makers attempt to maximise expected utility, not
expected value. As Bernstein puts it so well, "If everyone valued every risk in precisely
the same way, many risky opportunities would be passed up. Venturesome people place
high utility on the small probability of huge gains and low utility on the larger probability
of loss. Others place little utility on the probability of gain because their paramount goal
is to preserve their capital. Where one sees sunshine, the other finds a thunderstorm.
Without the venturesome, the world would turn a lot more slowly... We are indeed
fortunate that human beings differ in their appetite for risk." Bernoulli's theory of utility
later led to the laws of demand and supply.
A French mathematician, Abraham de Moivre, also made impressive contributions
to the field of risk management. These are documented in his book, The Doctrine of
Chances, first published in 1718. De Moivre demonstrated how observations distribute
themselves around their average value. This led to the normal distribution. De Moivre
also developed the concept of standard deviation, which made it possible to evaluate the
probability that a given number of observations would fall within some specified bound.
De Moivre also showed the normal distribution to be an approximate form of the binomial
distribution.
In the 1760s, an Englishman, Richard Price, did some pioneering work in the
construction of mortality tables. Based on the work of Halley and de Moivre, Price
published two articles on the subject. In 1771, he published a book titled Observations
on Reversionary Payments. For this work, Price is generally acknowledged as the
founding father of actuarial science. Price's work, however, had some errors. He
overestimated mortality rates at younger ages and underestimated them at later ages. He
also underestimated life expectancies. Consequently, life insurance premia were much
higher than they needed to be.
Thomas Bayes, an Englishman born in 1701, worked on determining the
probability of the occurrence of an event given that it had already occurred a certain
number of times and not occurred a certain number of times. In other words, Bayes
focussed attention on using new information to revise probabilities based on old
information. In a dynamic environment, characterised by a high degree of uncertainty,
this can be a very useful tool. As more and more information becomes available, earlier
probabilities can be revised. Bayes' best known paper was An Essay towards Solving a
Problem in the Doctrine of Chances. The Bayes theorem of conditional probability was
first published in 1763.
Carl Friedrich Gauss published Disquisitiones Arithmeticae in 1801, which dealt
with the theory of numbers. Gauss rapidly emerged as one of the leading mathematicians
in the world. One of his early attempts to deal with probability was in the book Theoria
Motus (Theory of Motion) published in 1809. In this book, Gauss made attempts to
estimate the orbit of heavenly bodies based on the path that appeared most frequently
over many separate observations. Gauss was also involved in geodesic measurements, the
use of the curvature of the earth to improve the accuracy of geographic measurements.
These measurements involved making estimates based on sample distances within the
area being studied. Gauss noticed that the observations tended to distribute themselves
symmetrically around the mean.
In 1810, Pierre Laplace spotted the weakness in Gauss's work. Before
Laplace, probability theory was concerned with games of chance. Laplace applied it to
many scientific and practical problems. In 1809, Laplace also framed the Central Limit
Theorem. It states that the sampling distribution of the mean approaches normal as the
sample size increases. In 1812, Laplace published his book, Théorie analytique des
probabilités.
Siméon Denis Poisson came up in 1837 with what came to be known as the
Poisson distribution. It is quite useful in situations where a discrete random variable takes
on an integer value. The distribution can be used to estimate the probability of a certain
number of occurrences in situations such as the number of telephone calls going through
a switchboard system per minute, the number of patients coming for a check up at a
hospital on a given day or the number of accidents at a traffic intersection during a week.
In 1867, Pafnuty Chebyshev developed another important theorem. He established
that no matter what the shape of the distribution, at least 75% of the values will fall
within (plus minus) two standard deviations from the mean of the distribution and at least
89% of the values will lie within (plus minus) three standard deviations from the mean.
Francis Galton tried to build on the foundation provided by Gauss and others. In
1885, his work led to the formulation of a general principle that has come to be known as
regression or reversion to the mean. Galton's analysis later led to the concept of
correlation. Using normal distribution and regression to the mean, Galton worked on
problems such as estimating the rate at which tall parents produced children who were
tall relative to their peers but shorter relative to their parents. Galton also computed the
average diameter of 100 seeds produced by different sweet pea plants. He found that the
smallest pea seeds had larger offspring and the largest seeds had smaller offspring.
Similarly, in another study he found that if parents were short, the children were slightly
taller and vice versa. These two experiments led Galton to develop the term regression,
the process of returning to the mean.
Bernstein has explained the importance of Galton's work: "Regression to the
mean motivates almost every kind of risk taking and forecasting. It's at the root of
homilies like 'what goes up must come down', 'pride goeth before a fall', and 'from
shirtsleeves to shirtsleeves in three generations'. Probably Joseph had this in mind when
he predicted to Pharaoh that seven years of famine would follow seven years of plenty."
In stock markets, regression to the mean is applied when we talk of overvaluation and
undervaluation of stocks. We imply that a stock's price is certain to return to its intrinsic
value. According to Bernstein, Galton "transformed the notion of probability from a static
concept based on randomness and the Law of Large Numbers into a dynamic process in
which the successors to the outliers are predestined to join the crowd at the centre."
In the late 19th century, the importance of statistics was recognised by the
scientific world. Many advances were made in statistical techniques including the
standard deviation, correlation coefficient and the chi square test. In 1893, Karl Pearson
introduced the concept of standard deviation. In 1897, he developed the concept of
correlation coefficient. In 1900, Karl Pearson presented the idea of the chi-square
distribution, useful for understanding the similarity of different populations. Consider a
politician campaigning for elections. His manager has found out that 30, 40 and 50
percent of the voters surveyed in each region recognize his name. The chi-square
distribution enables him to determine whether the differences in these proportions are
significant. That will be a crucial piece of information in understanding the impact his
speech will make on a particular region. Similarly, marketers would find the test useful in
determining whether the preference for a certain product differs from state to state or
region to region. If a population is classified into several categories with respect to two
attributes, the chi-square test can be used to determine if the two attributes are
independent of each other. For very small numbers of degrees of freedom, the chi-square
distribution is severely skewed to the right. But as this number increases, the curve
rapidly becomes more symmetrical. When the number reaches large values, the
distribution can be approximated by the normal. A large value of chi-square indicates a
substantial difference between observed and expected values. On the other hand, if chi-square is zero, it means observed values exactly match expected values.
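As a concrete illustration of the chi-square test of independence described above, the sketch below checks whether product preference is independent of region using scipy. The contingency table is invented purely for illustration and does not come from the text.

from scipy.stats import chi2_contingency

# Hypothetical counts of customers preferring product A versus product B in three regions.
observed = [
    [30, 70],   # Region 1
    [40, 60],   # Region 2
    [50, 50],   # Region 3
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print("chi-square =", round(chi2, 2), "degrees of freedom =", dof, "p-value =", round(p_value, 4))

# A small p-value (say, below 0.05) suggests that preference and region are not
# independent, i.e. the differences in the proportions are significant.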
In 1908, William Gosset presented his work on the t distribution. It is useful for
estimation whenever the sample size is less than 30 and the population standard deviation
is not known. While using the t distribution, we assume that the population is normal or
approximately normal. A t distribution is lower at the mean and higher at the tails than a
normal distribution. From 1915, another period of development of statistical theory
began, led by people like R A Fisher. They worked on sampling theory, development of
distributions of many sample statistics, principles of hypothesis testing and analysis of
variance. Analysis of variance is a technique to test the equality of three or more sample
means and thus make inferences as to whether the samples come from populations having
the same mean. It is useful in applications such as comparing the scholastic performance
of graduating students from different schools or comparing the effectiveness of different
training methods. Essentially, in this technique, the means of more than two samples are
compared. In 1925, Fisher published his book, Statistical Methods for Research
Workers, the first textbook presentation of the analysis of variance.
Yet another period of development of statistical theory began in 1928. Led by
Jerzy Neyman and Egon Pearson, the work of this period included concepts such as Type
II error15, power of a test and confidence intervals. Statistical quality control techniques
were also developed during this period.
In 1939, Abraham Wald developed statistical decision theory. This is useful in
situations where the decision maker wants to reach an objective, there are several courses
of action each having a certain value, events are beyond the control of the decision maker
and there is uncertainty regarding which outcome or state of nature will happen.
Essentially, managers decide among alternatives by taking into account the financial
implications of their actions. Use of statistical techniques accelerated in the 1940s with
vast increases in computing power which allowed large sets of data to be processed.
Before the first world war, researchers had concentrated on the inputs that went
into decision making. Later, they realised that a decision was only the beginning of a
chain of events. Gradually, they recognised the need to examine the consequences of their
decisions.
Frank Knight, an economist at the University of Chicago published a book Risk,
Uncertainty and Profit, in 1921, probably the first work to deal with decision making
under uncertainty. Knight attempted to draw a distinction between uncertainty and risk:
"Uncertainty must be taken in a sense radically distinct from the familiar notion of risk,
from which it has never been properly separated... It will appear that a measurable
uncertainty, or risk proper, is so far different from an unmeasurable one that it is not in
effect an uncertainty at all." Knight argued that it was difficult and not always appropriate
to apply mathematical techniques for forecasting the future. He was also doubtful
15. The assumption being tested is called the null hypothesis. Rejecting a null hypothesis when it is
true is called a Type I error and accepting it when it is false is called a Type II error.
whether the frequency of past outcomes could be any guide to the future. As Knight put
it: "(Any given) instance is so entirely unique that there are no others or not a sufficient
number to make it possible to tabulate enough like it to form a basis for any inference of
value about any real probability in the case we are interested in."
In 1921, John Maynard Keynes published a book, A Treatise on Probability.
Keynes differentiated what was definable from what was undefinable when thinking
about the future. Like Knight, Keynes was also not in favour of taking decisions based on
the frequency of past occurrences. He felt that there was no certainty an event would
occur in the future just because a similar event had been observed repeatedly in the past.
Keynes preferred the term proposition to events as it more accurately reflected degrees of
belief about future events.
In the decades that followed, understanding of risk and uncertainty advanced in
the form of game theory. The utility theory of Daniel Bernoulli had assumed that
individuals made choices in isolation. Game theory, on the other hand, accepted that
many people might try to maximise their utility simultaneously. The true source of
uncertainty lay in the intentions of others. Decisions were made through a series of
negotiations in which people tried to minimise uncertainty by trading off what others
wanted with what they themselves wanted. Since the potentially most profitable
alternative often led to very strong retaliation by competitors, compromises made sense.
Von Neumann, who invented game theory, first presented a paper on the subject in
1926. Later, Von Neumann teamed up with the German-born economist Oskar Morgenstern
and published a book, Theory of Games and Economic Behaviour. They advocated the
use of mathematics in economic decision-making and argued that human and
psychological elements of economics did not stand in the way of mathematical analysis.
A monograph developed by the Russian mathematician Andrei Kolmogorov in 1933 became
the basis for modern probability theory.
In 1952, Harry Markowitz published an article called Portfolio Selection in the
Journal of Finance. It brought Markowitz the Nobel Prize in 1990. Markowitz's key
insight was the important role of diversification. The return on a diversified portfolio of
stocks is equal to the average of the rates of return on individual holdings but its volatility
is less than the average volatility of its individual holdings. In a way, Markowitz was
describing a type of game theory in which an individual was playing against the stock
market. So, instead of going for a killing by investing in a single stock, an investor could
decrease his risk by diversifying. Markowitz's work elevated risk to the same level of
importance as expected return. Markowitz used the term efficient to describe portfolios
that offered the best returns for a given risk. Each efficient portfolio gives the highest
expected return for any given level of risk or the lowest level of risk for a given expected
return. Rational investors can choose the portfolio that best suits their appetite for risk.
Later, Sharpe developed the Capital Asset Pricing Model, which explained how financial
assets would be valued if investors religiously followed Markowitz's instructions for
building portfolios.
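To make the diversification argument concrete, the sketch below compares the volatility of an equally weighted two-stock portfolio with the average volatility of its holdings. The return figures are hypothetical and are not drawn from Markowitz's paper.

import statistics

# Hypothetical annual returns (in per cent) for two stocks.
stock_a = [12.0, -4.0, 18.0, 6.0, 9.0]
stock_b = [-2.0, 15.0, 3.0, 11.0, -1.0]

# A portfolio holding half of each stock earns the average of the two returns each year.
portfolio = [(a + b) / 2 for a, b in zip(stock_a, stock_b)]

average_volatility = (statistics.stdev(stock_a) + statistics.stdev(stock_b)) / 2
portfolio_volatility = statistics.stdev(portfolio)

print("Average volatility of the two stocks:", round(average_volatility, 2))
print("Volatility of the 50/50 portfolio:", round(portfolio_volatility, 2))
# Unless the two stocks move in perfect lockstep, the portfolio's volatility is
# lower than the average volatility of its holdings.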
Two Israeli psychologists, Daniel Kahneman and Amos Tversky, conducted in-depth research into how people managed risk and uncertainty. Their Prospect Theory,
which evolved in the mid-1960s, discovered behavioural patterns that had not been
recognised by proponents of rational decision making. Kahneman and Tversky argued
that human emotions and the inability of people to understand fully what they were
dealing with, stood in the way of rational decision making. One of the most important
insights from the prospect theory was the asymmetry between decision-making involving
gains and that involving losses. Where significant sums were involved, most people
rejected a fair gamble in favour of a certain gain.
When Kahneman and Tversky offered a choice between an 80% chance of losing
$4,000 (with a 20% chance of losing nothing) and a 100% chance of losing $3,000, 92% of the
respondents chose the gamble, even though the expected loss, at $3,200, was higher. But
when they had to choose between an 80% chance of winning $4,000 (with a 20% chance of
winning nothing) and a 100% chance of winning $3,000, 80% of the respondents preferred
the certain outcome.
According to Tversky: "Probably the most significant and pervasive characteristic
of the human pleasure machine is that people are much more sensitive to negative than to
positive stimuli... Think about how well you feel today and then try to imagine how
much better you could feel... There are a few things that would make you feel better, but
the number of things that would make you feel worse is unbounded."
Kahneman and Tversky coined the term failure of invariance to describe
inconsistent choices when the same problem is expressed in different ways. The failure of
invariance is an important insight and has far greater applicability than is commonly
perceived. For example, the way a question is framed in an advertisement may persuade
people to buy something with negative consequences. Later work by some psychologists
revealed that there were circumstances in which additional information got in the way
and distorted decisions, leading to failures of invariance. In a 1992 paper summarising
the advances in Prospect Theory, Kahneman and Tversky commented: "Theories of
choice are at best approximate and incomplete... Choice is a constructive and contingent
process. When faced with a complex problem, people use computational shortcuts and
editing operations."
Even as efforts continued to develop a better understanding of risk and new risk
management techniques, new uncertainties were faced in the 1970s and 1980s. Financial
deregulation, inflation, volatility in interest and exchange rates and commodity prices all
combined to create an environment for which the conventional forms of risk management
were ill-equipped. US dollar long-term interest rates, which had been in the range of 2-5%
since the Depression, rose to 10% by the end of 1979 and to more than 14% by the
autumn of 1981. Economic and financial uncertainty also had an impact on commodity
prices. Fortunately, the growing sophistication of information technology enabled
managers to manipulate huge quantities of data and execute complex strategies. The term
Risk Management became more commonly used in the 1970s. The first educational
qualifications in risk management were provided in the US in 1973. The US Professional
Insurance Buyers Association changed its name to the Risk and Insurance Management
Society (RIMS) in 1975. Non-financial companies began to use derivatives in the early
1970s. The corporate treasury function also began to develop in the 1970s.
d1 = [ln (S/K) + (r + σ²/2) T] / (σ√T)
d2 = d1 - σ√T
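The d1 and d2 terms above are those of the Black-Scholes option pricing formula. A minimal Python sketch follows; the stock price S, exercise price K, risk-free rate r, volatility σ and time to maturity T used here are illustrative assumptions, not values taken from the text.

from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, r, sigma, T):
    """Value of a European call option under the Black-Scholes model."""
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf   # standard normal cumulative distribution function
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative inputs only: S = 100, K = 95, r = 8%, sigma = 20%, T = half a year.
print(round(black_scholes_call(100, 95, 0.08, 0.20, 0.5), 2))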
Binomial distribution: P(r) = nCr p^r q^(n-r),
where r = number of successes, n = number of trials, p = probability of success and q = 1 - p.
Mean of the binomial distribution = np; standard deviation = √(npq).
Standard error of the mean = σ/√n,
where σ = standard deviation of the population and n = sample size.
When sampling without replacement from a finite population, the standard error of the mean becomes
(σ/√n) √[(N - n)/(N - 1)],
where N = size of population and n = size of sample. When σ is not known, the sample standard
deviation s is used as an estimate of the population standard deviation.
Confidence limits
For the population mean (σ known, or when the sample size is larger than 30): x̄ ± z σ/√n
For the population mean (σ not known, small sample): x̄ ± t s/√n
For the population proportion: p̂ ± z √(p̂ q̂ / n)
(When sampling without replacement from a finite population, each standard error above is
multiplied by the finite population correction √[(N - n)/(N - 1)].)
For the population variance: from (n - 1)s²/χ²a/2 to (n - 1)s²/χ²1-a/2
Difference between the means of two Normal Populations (σ1 = σ2 but each value not known):
Confidence limits = (x̄1 - x̄2) ± ta/2 sp √(1/n1 + 1/n2),
where sp = √{[Σ(x - x̄1)² + Σ(x - x̄2)²] / (n1 + n2 - 2)}

Test statistics
For the population mean (σ known): z = (x̄ - μ0)/(σ/√n) (Normal distribution)
For the population mean (σ not known): t = (x̄ - μ0)/(s/√n)
For the difference between two means:
t = (x̄1 - x̄2) / √{(1/n1 + 1/n2) [(n1 - 1)s1² + (n2 - 1)s2²] / (n1 + n2 - 2)}
For the difference between two proportions (Normal distribution):
z = (p̂1 - p̂2) / √[p̄(1 - p̄)(1/n1 + 1/n2)], where p̄ = (x1 + x2)/(n1 + n2)
For the equality of two variances: F = s1²/s2²
Illustration 2
Well prepared students will pass examinations 85% of the time but ill prepared students
will pass only 35% of the time. Past experience indicates that students are well prepared
75% of the time. In three successive examinations, a student passes. What is the revised
probability that the student was well prepared?
Probability of being well prepared and passing three examinations
= (0.75) (0.85) (0.85) (0.85) = (0.75) (0.6141) = 0.4606
Probability of being ill prepared and passing three examinations
= (0.25) (0.35) (0.35) (0.35) = (0.25) (0.0429) = 0.0107
So, the revised probability that the student was well prepared
= 0.4606 / (0.4606 + 0.0107) = 0.4606 / 0.4713 = 0.9773
Thus the probability of being prepared has been revised from 0.75 to 0.9773 after being
given the information that the student passes three examinations in a row.
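The same Bayesian revision can be written as a short Python function; the numbers below are those of Illustration 2.

def revise_probability(prior, p_pass_prepared, p_pass_unprepared, passes):
    """Bayes' theorem: posterior probability that the student was well prepared,
    given a run of consecutive passes."""
    joint_prepared = prior * p_pass_prepared ** passes
    joint_unprepared = (1 - prior) * p_pass_unprepared ** passes
    return joint_prepared / (joint_prepared + joint_unprepared)

# Prior 0.75, pass rates 0.85 and 0.35, three passes in a row: roughly 0.9773.
print(round(revise_probability(0.75, 0.85, 0.35, 3), 4))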
Illustration 3
Suppose in Illustration 2, the student writes the examination 5 times, passes 4 times and
fails once. What is the revised probability of having been prepared for the examination?
Probability of having prepared for the examination and passing 4 times and failing once
= (0.75) (0.85) (0.85) (0.85) (0.85) (0.15) = 0.05873
Probability of not having prepared and passing 4 times and failing once
= (0.25) (0.35) (0.35) (0.35) (0.35) (0.65) = 0.00244
So, the revised probability of having been prepared
= 0.05873 / (0.05873 + 0.00244) = 0.05873 / 0.06117 = 0.9601
Illustration 4
The probability of a student passing an examination is 0.6 and of failing is 0.4. What is
the probability of 0, 1, 2, 3, 4 or 5 out of five students failing simultaneously?
Probability of no student failing = (0.6) (0.6) (0.6) (0.6) (0.6) = 0.07776
Probability of one student failing = 5C1 (0.4) (0.6) (0.6) (0.6) (0.6) = 0.2592
Probability of two students failing = 5C2 (0.4) (0.4) (0.6) (0.6) (0.6) = 0.3456
Probability of three students failing = 5C3 (0.4) (0.4) (0.4) (0.6) (0.6) = 0.2304
Probability of four students failing = 5C4 (0.4) (0.4) (0.4) (0.4) (0.6) = 0.0768
Probability of five students failing = (0.4) (0.4) (0.4) (0.4) (0.4) = 0.01024
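The same binomial probabilities can be generated with scipy, as sketched below using the pass and fail probabilities of Illustration 4.

from scipy.stats import binom

n, p_fail = 5, 0.4   # five students, each failing with probability 0.4

# Probability of exactly k failures among the five students.
for k in range(n + 1):
    print(k, "failures:", round(binom.pmf(k, n, p_fail), 5))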
Illustration 5
The number of telephone calls received through a switchboard in an office normally
averages 5 per minute. What is the probability of receiving no calls, 1 call, 2 calls during
a minute?
Assuming the Poisson distribution holds,
Probability of x calls in a minute = 5^x e^-5 / x!
Probability of no calls = 5^0 e^-5 / 0! = 0.00674
Probability of 1 call = 5^1 e^-5 / 1! = 0.03370
Probability of 2 calls = 5^2 e^-5 / 2! = 0.08425
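A quick check of these Poisson figures in Python, using the rate of 5 calls per minute from Illustration 5:

from math import exp, factorial

rate = 5   # average number of calls per minute

def poisson_probability(x, lam):
    """Probability of exactly x events when the mean rate is lam."""
    return lam ** x * exp(-lam) / factorial(x)

for x in range(3):
    print(x, "calls:", round(poisson_probability(x, rate), 5))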
Illustration 6
The average monthly demand for a product is normally distributed with a mean
of 500 units and a standard deviation of 100 units. What is the probability of demand
being at various levels?
(i) For a demand of 650: Z = (650 - 500)/100 = 1.5
(ii) For a demand of 700: Z = (700 - 500)/100 = 2.0; the required probability is P (Z > 2).
(iii) For a demand of 420: Z = (420 - 500)/100 = -0.8
P (Z = -0.8) = P (Z = 0.8) = 0.2881
(This is because the standard normal distribution is symmetrical.)
For a demand of 570: Z = (570 - 500)/100 = 0.7; the corresponding probability is 0.2580.
So, the probability of demand lying between 420 and 570 = 0.2881 + 0.2580 = 0.5461
Illustration 7
A portfolio has been yielding a mean return of 35% with a standard deviation of 10%.
What is the probability of the return being less than 25%?
Assuming a normal distribution,
Z = (0.25 - 0.35)/0.10 = -1
P (Z = -1) = P (Z = 1) = 0.3413
So, required probability = 0.5 - 0.3413 = 0.1587
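The probability in Illustration 7 can also be read off directly from Python's standard library:

from statistics import NormalDist

returns = NormalDist(mu=0.35, sigma=0.10)   # mean return 35%, standard deviation 10%

# Probability that the return falls below 25%: roughly 0.1587.
print(round(returns.cdf(0.25), 4))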
Illustration 8
A bank estimates that the amount of money withdrawn by customers from their savings
accounts is normally distributed with a mean of Rs. 2000 and a standard deviation of
Rs. 600. If the bank considers a random sample of 100 accounts, what is the probability
of the sample mean lying between Rs. 1900 and Rs. 2050?
Standard error of the mean = 600/√100 = 60
Z1 = (1900 - 2000)/60 = -1.67; the corresponding probability (from normal tables) is 0.4525
Z2 = (2050 - 2000)/60 = 0.83; the corresponding probability is 0.2967
So, the probability of the sample mean lying between Rs. 1900 and Rs. 2050
= 0.4525 + 0.2967 = 0.7492
Illustration 9
Standard error of the mean = 6/√100 = 0.6
Upper confidence limit = 21 + (1.96) (0.6) = 22.18 months
Lower confidence limit = 21 - (1.96) (0.6) = 19.82 months
So, the mean life of the product will lie within the range of 19.82 to 22.18 months at a 95%
confidence level.
Illustration 10
A sample of 25 rods has a mean length of 54.62 cm and a standard deviation of 5.34 cm.
Find the 95% confidence limits on the mean.
We use the t distribution as population standard deviation is unknown and sample size is
less than 30.
This is a two-tailed interval, so t0.975,24 = 2.064
Upper confidence limit = 54.62 + (2.064) (5.34/√25) = 56.82 cm
Lower confidence limit = 54.62 - (2.064) (5.34/√25) = 52.42 cm
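The t-based interval of Illustration 10 can be reproduced with scipy, as in the sketch below:

from math import sqrt
from scipy.stats import t

n, mean, sd = 25, 54.62, 5.34
standard_error = sd / sqrt(n)

# Two-tailed 95% interval: 2.5% in each tail, 24 degrees of freedom.
t_value = t.ppf(0.975, df=n - 1)            # roughly 2.064
lower = mean - t_value * standard_error
upper = mean + t_value * standard_error
print("95% confidence limits:", round(lower, 2), "to", round(upper, 2))   # about 52.42 to 56.82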
Illustration 11
We project an average score of 90 (out of 200) in an examination which will be taken by
thousands of students all over the country. A sample of 20 indicates an average score of
84, with a standard deviation of 11. Test the hypothesis at a 0.10 level of significance.
The Null Hypothesis is μ = 90.
We estimate the population standard deviation with the sample standard deviation, i.e., 11.
Standard error of the mean = 11/√20 = 2.46
t = (84 - 90)/2.46 = -2.44
For 19 degrees of freedom, the critical value of t at the 0.10 level of significance (two-tailed) is 1.729.
Since the computed value of t lies outside the range -1.729 to +1.729, we reject the null
hypothesis that the average score is 90.
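A sketch of the same one-sample test in Python (the two-tailed 0.10 significance level is the one stated in Illustration 11):

from math import sqrt
from scipy.stats import t

n, sample_mean, sample_sd, mu0, alpha = 20, 84, 11, 90, 0.10

standard_error = sample_sd / sqrt(n)
t_statistic = (sample_mean - mu0) / standard_error      # roughly -2.44
t_critical = t.ppf(1 - alpha / 2, df=n - 1)             # roughly 1.729

print("t statistic:", round(t_statistic, 2), "critical value:", round(t_critical, 3))
print("Reject the null hypothesis" if abs(t_statistic) > t_critical else "Do not reject the null hypothesis")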
Illustration 12
A plant manager wants to estimate the daily consumption of a raw material. He studies
consumption for 10 days and finds the average consumption is 11,400 kg while the
standard deviation is 700 kg. Find the 95% confidence interval for the mean daily
consumption of the raw material.
Standard error = 700/√10 = 221.38 kg
We use the t distribution since the sample size is less than 30 and the population
standard deviation is not known.
t0.975,9 = 2.262
Upper confidence limit = 11,400 + (2.262) (221.38) = 11,900 kg
Lower confidence limit = 11,400 - (2.262) (221.38) = 10,899 kg
Illustration 13
In a company there are 100,000 employees. A sample of 75 indicates that 40% of them
are in favour of the company's deferred compensation plan while 60% are not. The
management wants to estimate at a 99% confidence level, the range in which the
proportion of employees in favour of the plan lies.
99% confidence level means 49.5% of the area on either side of the normal distribution.
This corresponds to Z = 2.58
Standard error of the proportion = √(pq/n) = √((0.4)(0.6)/75) = 0.057
So, at a 99% confidence level, the proportion of employees in favour of the plan lies between
0.4 - (2.58)(0.057) = 0.253 and 0.4 + (2.58)(0.057) = 0.547.
Illustration 14
Standard error of the difference between two sample means
= √(σ1²/n1 + σ2²/n2) = √((0.4)²/200 + (0.6)²/175) = 0.053
Illustration 15
NPV = Σ [CFt / (1 + i)^t] - I, summed from t = 1 to n,
where CFt is the annual cash flow, I the initial investment at time t = 0, i the discount rate
and n the project life. Use Monte Carlo simulation to estimate the NPV, given that the cost of
capital is 10% and the initial investment is Rs. 5,000,000.
Annual cash flow
Value (Rs.)     Probability
1,000,000       0.02
1,500,000       0.03
2,000,000       0.15
2,500,000       0.15
3,000,000       0.30
3,500,000       0.20
4,000,000       0.15

Project life
Value (Years)   Probability
3               0.05
4               0.10
5               0.30
6               0.25
7               0.15
8               0.10
9               0.03
10              0.02
The firm wants to perform 5 manual simulation runs for this project. We have to
generate values, at random, for the two exogenous variables: annual cash flow and
project life.
We set up the correspondence between the values of exogenous variables and
random numbers, and then choose some random number. Table I shows the
correspondence between various values of the variables and the random numbers. Table
II presents a table of random digits.
Table I
Annual cash flow
Value (Rs.)     Cumulative probability   Two-digit random numbers
1,000,000       0.02                     00 to 01
1,500,000       0.05                     02 to 04
2,000,000       0.20                     05 to 19
2,500,000       0.35                     20 to 34
3,000,000       0.65                     35 to 64
3,500,000       0.85                     65 to 84
4,000,000       1.00                     85 to 99

Project life
Value (Years)   Cumulative probability   Two-digit random numbers
3               0.05                     00 to 04
4               0.15                     05 to 14
5               0.45                     15 to 44
6               0.70                     45 to 69
7               0.85                     70 to 84
8               0.95                     85 to 94
9               0.98                     95 to 97
10              1.00                     98 to 99
We draw random numbers starting from the top left corner and select every fourth
number. We first do this for cash flow and then for project life. In Table III, the computed
values of NPV are tabulated.
Table II
Random Numbers
53479  97344  66023  99776  30176
81874  19829  09337  31151  67619
81115  70328  38277  75723  48979
83339  90630  33435  58295  52515
98036  58116  74523  03172  92153
14988  71863  53869  40823  03037
12217  91964  71118  43112  38416
99937  95053  52769  41330  81699
59526  26240  84892  83086  42436
13213  55532  18801  21093  17106
Table III
Simulation Results
        Annual cash flow                          Project life
Run     Random     Corresponding value of         Random     Corresponding     Net present
        number     annual cash flow (Rs.)         number     project life      value (Rs.)
1       53         3,000,000                      97         9                 12,277,070
2       30         2,500,000                      81         7                 7,171,050
3       31         2,500,000                      67         6                 5,888,150
4       38         3,000,000                      75         7                 9,605,260
5       90         4,000,000                      33         5                 10,163,150
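The five manual runs above can be automated. The sketch below repeats the same sampling logic in Python with a much larger number of runs, under the assumptions of the illustration: a 10% cost of capital, an initial investment of Rs. 5,000,000 and the two probability distributions given earlier.

import random

# Probability distributions from the illustration: (value, probability).
cash_flows = [(1000000, 0.02), (1500000, 0.03), (2000000, 0.15), (2500000, 0.15),
              (3000000, 0.30), (3500000, 0.20), (4000000, 0.15)]
lives = [(3, 0.05), (4, 0.10), (5, 0.30), (6, 0.25),
         (7, 0.15), (8, 0.10), (9, 0.03), (10, 0.02)]

DISCOUNT_RATE = 0.10
INITIAL_INVESTMENT = 5000000

def npv(cash_flow, life):
    """NPV of a constant annual cash flow received for `life` years, less the initial outlay."""
    present_value = sum(cash_flow / (1 + DISCOUNT_RATE) ** t for t in range(1, life + 1))
    return present_value - INITIAL_INVESTMENT

def simulate(runs=10000):
    """Average NPV over a number of Monte Carlo trials."""
    total = 0.0
    for _ in range(runs):
        cf = random.choices([v for v, p in cash_flows], weights=[p for v, p in cash_flows])[0]
        life = random.choices([v for v, p in lives], weights=[p for v, p in lives])[0]
        total += npv(cf, life)
    return total / runs

print("Estimated expected NPV: Rs.", round(simulate()))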
Illustration 16
Estimate the duration of a bond, given its YTM is 6%, Coupon Rate is 9%, Time to
Maturity is 4 years and Face Value is Rs. 1,000.
Year (t)   Cash Flow (Rs.)   1/(1+YTM)^t           PV of cash flows   PV as fraction of P0   (t) x fraction
1          90                1/(1.06)^1 = 0.9434   84.91              0.0769                 0.0769
2          90                1/(1.06)^2 = 0.8900   80.10              0.0726                 0.1451
3          90                1/(1.06)^3 = 0.8396   75.56              0.0684                 0.2053
4          1090              1/(1.06)^4 = 0.7921   863.38             0.7821                 3.1283
                                                   P0 = 1103.95       1.0000                 Duration = 3.56 years
(PV of cash flows = cash flow x discount factor; the fraction column = PV / P0; the final column = fraction x t.)
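For completeness, here is a small Python sketch that reproduces the duration calculation of Illustration 16 (9% annual coupon, 6% yield to maturity, 4 years, Rs. 1,000 face value):

def macaulay_duration(face, coupon_rate, ytm, years):
    """Macaulay duration: the present-value-weighted average time to receipt of the bond's cash flows."""
    cash_flows = [face * coupon_rate] * (years - 1) + [face * (1 + coupon_rate)]
    present_values = [cf / (1 + ytm) ** t for t, cf in enumerate(cash_flows, start=1)]
    price = sum(present_values)
    return sum(t * pv for t, pv in enumerate(present_values, start=1)) / price

print(round(macaulay_duration(1000, 0.09, 0.06, 4), 2))   # about 3.56 years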
References:
1. Peter F Drucker, Managing for results, William Heinemann, 1964.
2. Tom M Apostol, Calculus, Volume II, 2nd edition, John Wiley & Sons, 1969.
3. F Milliken, Three types of perceived uncertainty about the environment: State, effect
and response uncertainty, Academy of Management Review, 1987, Vol. 12,
pp. 133-143.
4. O E Williamson, Transaction Cost Economics, Handbook of Industrial
Organization, 1989, Volume I, pp. 135-182.
5. Robert S Kaplan and David P Norton, The Balanced Scorecard: Measures that
Drive Performance, Harvard Business Review, January-February, 1992,
pp. 71-79.
6. Kenneth A Froot, David S Scharfstein and Jeremy C Stein, A framework for risk
management, Harvard Business Review, November-December, 1994,
pp. 91102.
7. Joseph L Bower and Clayton M Christensen, Disruptive Technologies: Catching the
Wave, Harvard Business Review, January-February, 1995, pp. 27-37.
8. Mathew Bishop, Corporate Risk Management Survey, The Economist, February 10,
1996.
9. James M Utterback, Mastering the Dynamics of Innovation, Harvard Business
School Press, 1996.
10. P Bradley and T Louis, Bayes and Empirical Bayes Methods for data analysis,
Chapman & Hall, 1996.
11. Heidi Deringer, Jennifer Wang and Debora Spar, Note on Political Risk Analysis,
Harvard Business School Case No. 9-798-022, September 17, 1997.
12. Hugh G Courtney, Jane Kirkland and S Patrick Viguerie, Strategy under
uncertainty, Harvard Business Review, November-December, 1997.
13. Mark L Sirower, The Synergy Trap, The Free Press, New York, 1997.
14. Clayton M Christensen, The Innovator's Dilemma, Harvard Business School Press,
1997.
15. Patric Wetzel and Oliver de Perregaux, Must it always be risky business? The
McKinsey Quarterly, 1998, Number 1.
16. Peter L Bernstein, Against the Gods, John Wiley & Sons, 1998.
17. Harold D Skipper, Jr., International Risk and Insurance: An Environmental-Managerial Approach, Irwin McGraw-Hill, 1998.
18. Robert Simons, How risky is your company? Harvard Business Review, May-June
1999, pp. 85-94.
19. Clayton M Christensen and Michael Overdorf, Meeting the challenge of disruptive
change, Harvard Business Review, March-April, 2000, pp. 66-76.
20. Guido A Krickx, The relationship between Uncertainty and Vertical Integration,
International Journal of Organizational Analysis, Issue 3, 2000, pp. 309-329.
21. Forest L Reinhardt, Down to Earth, Harvard Business School Press, 2000.
22. D G Prasuna, Scanning for De-risking, Chartered Financial Analyst, July 2001,
pp. 23-31.
23. Frederick F Reichheld, Lead for Loyalty, Harvard Business Review, July-August
2001, pp. 76-84.
24. The new enemy, The Economist, September 15, 2001, pp. 15-16.