HUM Quiz 1 Note


Ethics and Professionalism

Engineers must prioritize public safety, and the profession, including bodies like
ABET, should learn from its mistakes. Technology brings both risks and benefits,
and ethics involves appreciating engineering's positive impact as well as guarding
against harm. This chapter outlines engineering ethics, stresses moral
responsibility, and relates professionalism to business goals in the corporate
setting.

1.1 SCOPE OF ENGINEERING ETHICS

1.1.1 Overview of Themes


This book discusses seven key themes in engineering ethics:
1. Engineers create possibilities and risks; they must prioritize benefits, prevent
harm, and identify dangers.
2. Moral values are essential in technological development, aligning with ethics
and excellence in engineering.
3. Personal values matter in engineering ethics, along with principles stated in
codes for all engineers.
4. Encouraging responsible conduct matters even more than punishing wrongdoing.
5. Engineering faces ethical dilemmas due to conflicting moral values.
6. Engineering ethics should address both micro and macro issues, often
interconnected.
7. Cautious optimism is needed in technological development: hope for its benefits
tempered by realism about its dangers.
Each of these themes is discussed in more detail below.

(1) Engineering as Social Experimentation: In 2003, the Columbia space shuttle
tragedy hit hard. I remember the fear of a terrorist attack, but it turned out
a piece of insulating foam caused the disaster by hitting the left wing. That wing,
made of reinforced carbon-carbon, was supposed to handle the high temperatures
of re-entry. The investigation looked not only into the immediate cause but also
into why past incidents of insulation breaking off weren't scrutinized enough,
and into new hazards like faulty "bolt catchers." It
makes you wonder about NASA's safety culture and whether it's as improved as
we thought since the Challenger disaster in 1986. This whole thing shows how
technology is a double-edged sword. It brings new possibilities but also new
dangers. In Chapter 4, we talk about "engineering as social experimentation." It's
about engineers taking responsibility, being careful, foreseeing issues, and
alerting others to risks. We need to find that balance between the good and bad
in engineering. It's not just about fixing problems; it's about preventing them in
the first place.
(2) Ethics and Excellence: Moral Values are embedded in Engineering: In
engineering, moral values are not just added on; they're part of even the simplest
projects. Let me tell you about a college assignment. Students were asked to
design a chicken coop in Guatemala. They had to use local materials, keep it easy
for villagers to maintain, and ensure the chickens were treated well. The goal was
to double egg and chicken production while respecting the community's traditions.
The project involved choosing building materials, deciding on the coop's
structure, and considering safety for both people and chickens. They had to be
mindful of the environment and recycle chicken droppings as fertilizers. It wasn't
just about making something that worked; it was about doing it right. The students
even went to Guatemala in 1997 to build the coop with the help of the villagers.
In engineering, moral values come into play at different stages. Safety, efficiency,
and the way engineering communities work are all influenced by ethical
considerations. Engineers, as leaders in technology, also carry a responsibility for
moral values. The ancient Greeks understood this connection between excellence
and ethics: their word arete means both "excellence" and "virtue." So, in
engineering, doing things excellently also means doing them ethically.

(3) Personal Commitment and Meaning: Imagine engineers redesigning an
artificial lung. They're busy, stressed, and mostly focused on technical stuff. One
smart idea changes everything: invite users and their families to share how the
artificial lung impacts their lives. Families talk about children breathing freely
and enjoying life. The engineers feel a real sense of purpose, boosting morale.
Engineers, like everyone, have different reasons for their work—meaningful jobs,
making a living, caring for others. Their responsibilities go beyond just products;
they have moral connections with customers and others. All engineers must
follow a code of ethics, setting high standards. But personal commitments matter
too. Some are shared, like those in the code, while others are more personal, like
beliefs about religion, the environment, or family. "Personal commitments" mean
both shared and individual responsibilities influencing professional efforts.

(4) Promoting Responsible Conduct and Preventing Wrong: In the early 2000s,
corporate scandals, such as Enron's bankruptcy, raised concerns about ethical practices.
Compliance issues are all about ensuring individuals adhere to professional standards,
preventing fraudulent and unethical behaviours. Laws and regulations play a crucial
role in deterring immoral acts, including fraud, theft, and bribery. It's essential to delve
into the reasons why some engineers might participate in wrongdoing rather than
reporting it. Yet, the primary emphasis should be on "preventive ethics"—taking actions
to stop unethical practices before they occur. Most engineers and corporations are
morally committed, and the connection between ethics and excellence should be value-
driven rather than solely reliant on compliance-based procedures, as discussed in
management theory. The main goal is to encourage and support responsible conduct,
fostering a strong ethical foundation in both individuals and corporations.
(5) Myriad Moral Reasons generate Ethical Dilemmas: Picture a chemical
engineer at a computer company. She discovers the company might be releasing
too much lead and arsenic into the city sewer, which becomes fertilizer for local
farmers. The engineer thinks stricter pollution controls are needed, but her
manager disagrees, saying it's too costly. She's torn between her duty to the
company, the community, her family, and her career. What should she do? These
tough situations are called ethical dilemmas, where moral reasons clash. In
engineering, moral values can conflict, and this book explores ways to understand
and solve such dilemmas. It emphasizes that ethical dilemmas don't always mean
something has gone wrong; they reveal a moral complexity that would remain even
if all preventable problems, like corporate scandals, were eliminated. (Here the
term "corporation" includes companies that may not be officially incorporated. A
corporation is a legal structure that lets people pool money for large projects
without bearing full individual liability for the outcomes, which raises questions
about accountability and shared responsibility.)

(6) Micro and Macro Issues: Think of ethical issues in engineering like a big
picture and a small picture. The small picture, or micro issues, is about choices
made by individuals and companies. The big picture, or macro issues, involves
global matters like the direction of technology, laws that should be made, and
responsibilities of groups like engineering societies and consumer organizations.
Both are crucial in engineering ethics, and they often connect. For example,
consider debates about SUVs. Micro issues involve problems with specific SUVs
like the Ford Explorer and Bridgestone/Firestone tires. Macro issues look at the
bigger impact of all SUVs on the road, focusing on harm from rollovers, accidents,
reduced visibility for other drivers, fuel consumption, and pollution. Should the SUV
issue be tackled by the entire engineering community, or should individuals
handle it? In a democratic society, should engineers stay out of it, letting
consumer groups and lawmakers deal with it? Bigger macro issues surround
public transportation as our population grows, and our resources shrink. It's like
deciding if we should fix a small problem in a car or look at the whole traffic
system. Both matter in engineering ethics.

(7) Cautious Optimism about Technology: Think of technology like a big puzzle. The
big picture, or macro issues, looks at all of technology—how it helps us and what
problems it might bring. Some people worry a lot about technology, saying it causes
pollution, uses up too many natural resources, and even leads to dangerous things like
wars and weapons. On the bright side, technology has made our lives way better. Just
think about electricity, cars, planes, computers, and many more things we use every day.
As authors, we're hopeful about technology, but we also want to be careful. We know
how important technology is for making life better. But, just like in a science experiment,
we need to be realistic about possible problems and dangers that might come with it.
It's like saying, "Let's use technology to make life awesome, but let's also be smart and
careful about it."
1.2 ACCEPTING AND SHARING RESPONSIBILITY
Think about Herbert Hoover, who was a mining engineer before becoming the
president. He thought being an engineer was awesome. It's like creating
something amazing from an idea, turning it into real stuff that helps people—like
building homes and bringing jobs. Engineers make life better. But here's the catch:
when engineers make things, everyone sees them. If something goes wrong, they
can't hide it like doctors or blame others like politicians. Hoover said engineers
can't deny their mistakes—if something they build doesn't work, they're in trouble.
In Hoover's time, individual engineers had more control over projects. If a bridge
fell, you knew who was responsible. Nowadays, even though we have more
engineers, they're not as visible. Mistakes get a lot of attention, but people blame
companies or government, not the actual engineers. The public often sees the top
managers of a company, not the hardworking engineers behind the scenes. This
makes it tough for engineers to connect with the public. Even though engineers
might feel "invisible," those who take responsibility for their work can make a
big difference. That's what Hoover believed, and it still holds true today.

1.2.1 Saving Citicorp Tower


Imagine a challenge faced by structural engineer Bill LeMessurier and architect
Hugh Stubbins while planning New York's fifth-highest skyscraper, Citicorp
Center. St. Peter's Lutheran Church stood on the same lot where the tower needed
to go. So, they agreed: the tower would rise on tall stilts at the center of each side,
and a new St. Peter's church would stand underneath one of the corners. In 1977,
the Citicorp Center was completed, and you can see the new church under the
lower left corner of the raised tower in the picture. It was a clever solution to build
both a modern skyscraper and preserve a historic church.

Fig. 1-1: Citicorp Center, with the new St. Peter's church beneath one corner of the raised tower.
Imagine a unique skyscraper, the Citicorp Center, designed by engineer Bill
LeMessurier. It stands out because the big stilts aren't at the corners but in the
middle of each side. Half its weight and all the wind force are handled by a special
frame on the outside, not the corners. To keep it steady, they even added
something called a tuned mass damper. Later, a student's questions made
LeMessurier check things. He wondered if the building could handle strong
diagonal winds. The design was only tested for straight-on winds. To his surprise,
he found out that the strong steel joints meant to hold the building together were
never welded, only bolted. The New York office approved it, not realizing how
crucial it was for handling diagonal winds. It was a big oversight that needed
fixing. Engineer Bill LeMessurier faces a tough choice. On one side, he has a
duty to keep his building safe for people. On the other, there are financial
pressures and personal interests telling him to stay quiet. What should he do? He
escapes to his summer house by a lake in Maine. There, in the calm, he goes over
all the numbers again. Suddenly, he realizes he's the only one who can prevent a
disaster by taking action. Once he decides, he acts fast. He meets with insurers,
lawyers, the bank, and city officials to explain the problem. They agree on a plan:
strengthen the wind braces by welding two-inch-thick steel plates over 200 bolted
joints. Journalists are curious, but when a strike shuts down newspapers, they
vanish. Lawyers seek advice from disaster expert Leslie Robertson. LeMessurier
ensures the area around the building can be evacuated if needed. He also sets up
sensors on the structure and insists on an emergency generator for uninterrupted
damper operation. Hurricane Ella looms, causing concern. Luckily, work on the
critical joints is nearly done. The hurricane changes course, so no evacuation is
needed. Still, they're ready for a storm that comes only once every 200 years.
They settle out of court, with Stubbins not blamed. LeMessurier and partners pay
$2 million, covered by insurance. The whole fix costs over $12.5 million.
LeMessurier not only saves lives and keeps his integrity but also boosts his
reputation through the whole ordeal.

1.2.2 Meanings of ‘Responsibility’


When we talk about LeMessurier being responsible, it means different things: he
fulfilled his duties (obligations), he was answerable (accountable) for his actions,
he behaved responsibly (conscientiously), and he deserves praise (admirable).
Let's explore these aspects, starting with obligations - the central concept that
influences the rest.

1. Obligations: Obligations are morally mandatory actions. Some are universal
duties, like being honest, fair, and decent. Others are role responsibilities tied to
specific roles, such as parents, employees, or professionals. For example, a safety
engineer is obligated to conduct regular inspections at a construction site, while
an operations engineer has responsibilities to assess potential benefits and risks
in different systems.

2. Accountable: Being responsible equals being accountable. It involves having
the capacity for moral agency, understanding and acting on moral reasons. Being
answerable for meeting specific obligations means being liable to be held
accountable by others or those in authority. We may need to explain our actions,
providing justifications or reasonable excuses. Holding ourselves accountable
can evoke feelings of self-respect, pride, guilt for harming others, or shame for
falling short of our ideals. Wrongdoing occurs when we knowingly engage in
voluntary actions that are wrong, either through recklessness (flagrant disregard
of known risks and responsibilities) or weakness of will (yielding to temptation
or not trying hard enough). On the other hand, negligence happens when we
unintentionally fail to exercise due care in meeting responsibilities, often due to
incompetence, as seen in shoddy engineering.

3. Conscientious: Engineers like LeMessurier, who are morally admirable, fulfill
their obligations conscientiously, striving to do what is right even in challenging
situations. While no one is perfect, individuals may be conscientious in certain
aspects of life, like their profession, while being less so in other areas, such as
personal responsibilities like raising a child.

4. Blameworthy/Praiseworthy: In situations where accountability for
wrongdoing is the focus, "responsible" can mean blameworthy. In situations
centered on correct conduct, "responsible" becomes a synonym for praiseworthy.
For example, asking "Who is responsible for designing the antenna tower?" could
refer to identifying someone to blame for its collapse or to credit for its success
in withstanding a severe storm.

1.2.3 Dimensions of Engineering

Let's explore the complexities of sharing responsibility within corporations,
unveiling a broader spectrum of moral issues in engineering. As a product evolves
from a concept to completion, engineers grapple with moral and technical
challenges related to materials, work quality, time constraints, market pressures,
and corporate hierarchies. The process, depicted in Figure 1-2, spans from
conceptualization to design, manufacture, sale, use, and disposal.
Products vary from household appliances to entire systems or industrial
complexes, with manufacturing occurring in diverse settings. Engineers may be
employees, entrepreneurs, or consultants. Tasks encompass creating new
concepts, improving existing products, detailed design, and manufacturing based
on specifications. The process involves conceptual design, performance analysis,
detailed analysis, and the creation of specifications and shop drawings.
Manufacturing involves material procurement, part fabrication, assembly, and
product testing. Subsequent stages include selling, delivery, installation, training,
maintenance, repair, and recycling or disposal. However, the process is seldom
linear; modifications and backtracking occur due to errors, performance
improvements, or cost and time constraints. As Herbert Simon noted, design
problem-solving is often ill-structured, with goals and alternatives emerging
throughout the process.
Fig. 1-2: The engineering design process, from conceptualization through design, manufacture, sale, use, and disposal.
The process is like a loop, as indicated by the arrows in Figure 1-2. Engineers
often pause during an initial solution attempt if they face challenges or think of
improvements. They then go back to an earlier stage with changes in mind. This
revisiting doesn't always follow the same path in subsequent passes through
design, manufacture, and implementation. It depends on the latest findings from
ongoing experiments, influenced by earlier iterations and experience with similar
designs.
1.3 RESPONSIBLE PROFESSIONALS AND ETHICAL CORPORATIONS
Engineering, as a profession, has been closely tied to corporations due to its goal
of creating safe and cost-effective products. Historically, two main stages shaped
engineering's professional development: first, the construction boom of the 19th
century led to large-scale projects, and second, from 1880 to 1920, the demand
for engineers surged, linking engineering more closely to corporations. This shift
brought ethical dilemmas, notably a conflict between professional independence
and corporate loyalty. While corporate influence is not unique to engineering, it
presents challenges that we'll explore alongside broader professional and business
ethics.

1.3.1 What are professions?


In simple terms, a profession, as discussed here, refers to occupations requiring
advanced expertise, self-regulation, and dedicated service to the public good.

1. Advanced expertise: Professions demand advanced skills and theoretical
knowledge. To qualify, individuals usually need extensive formal education,
including technical studies and broader learning in liberal arts. Continuous
education is often necessary for staying updated.

2. Self-regulation: Professional societies, recognized by the public, play a key
role in the profession. They set admission standards, create codes of ethics,
enforce conduct standards, and represent the profession to the public and
government. This autonomy allows individual professionals to exercise
independent judgment in their work.

3. Public Good: The profession serves a crucial public good by making a
collective effort to uphold high ethical standards. For instance, medicine aims to
promote health, law strives to protect legal rights, and engineering focuses on
technological solutions for public well-being, safety, and health. Professional
codes of ethics outline specific goals and guidelines to ensure the commitment to
the public good is consistently maintained across the entire profession.
Critics argue that highlighting the positive aspects of "profession" is elitist,
since various forms of work, like hair cutting and garbage collection, contribute
to the public good without advanced expertise. In response, we affirm the value
of such work but maintain that professionalism, with its high ethical standards,
skill, and autonomy, deserves the recognition traditionally given to it. This is
taken up further in the discussion questions.

1.3.2 Morally Committed Corporations


In 2001, Enron's corporate scandal, the largest bankruptcy in U.S. history, shook
American confidence. Enron's rapid growth using fraudulent accounting practices
and off-balance sheet partnerships led to its collapse. While most companies
prioritize ethical procedures, Quickie Designs, a wheelchair manufacturer, is an
example of "caring capitalism." Founded by Marilyn Hamilton, paralyzed in a
hang-gliding accident, the company creates innovative products for people with
disabilities and supports charitable initiatives like Winners on Wheels. Larger
corporations, facing intense competition and profit pressures, also strive to
maintain an ethical climate.

1.3.3 Social Responsibility Movement


Since the 1960s, the "social responsibility movement" has focused on product
quality, worker well-being, community, and the environment. This aligns with
"stakeholder theory," emphasizing corporations' responsibilities to groups like
employees, customers, dealers, suppliers, communities, and the public.

The broader question of how a corporation's product is ultimately used and by
whom is often overlooked because its effects are not immediately evident. Take
used dry-cell batteries, for example, with nearly three billion ending up in the U.S.
municipal waste stream annually. While some corporations genuinely care about
a product's post-factory life, others use excuses, citing a lack of control over how
it's used or discarded.
Socially responsible corporations actively seek solutions, often requiring
industry-wide and government efforts. One notable example is Cardiac
Pacemakers Inc., which invites heart patients to the plant, fostering customer
relationships and employee awareness of their responsibilities.
Critics argue that corporations should focus solely on maximizing profits for
stockholders, claiming no additional responsibilities to society. Nobel Laureate
economist Milton Friedman, in his essay "The Social Responsibility of Business
is to Increase Its Profits," asserts that management's primary responsibility is to
satisfy stockholders' desires for maximum returns. He opposes management
pursuing social goals or philanthropy.
However, Friedman's narrow view invites recognition of wider corporate
responsibilities. Society expects corporations to contribute to the public good and
protect the environment, becoming moral customs. While Friedman's extreme
view could be self-defeating, today's CEOs rarely publicly declare a sole devotion
to maximum profits, recognizing the importance of ethical commitments to long-
term success.
To ensure the alignment of good engineering, good business, and good ethics, it's
crucial for engineering and corporations to be "morally aligned." This alignment, as
argued by Howard Gardner, results in "good work" that is excellent in quality and
socially responsible when professional aims, corporate goals, clients' needs, and
public expectations are congruent. However, certain marketplace forces can
challenge professional standards in engineering, with some corporations prioritizing
quality, safety, and ethics more than others, even under normal economic conditions.
1.3.4 Senses of Corporate Responsibility
When we talk about corporate responsibility, it's crucial to understand that the
term carries various meanings, akin to its application to individuals:

1. Obligations of Corporations: Like individuals, corporations have obligations.
They are structured groups of individuals within legal frameworks. Assigning
responsibilities through policies results in collective corporate actions. For
instance, when Intel establishes a subsidiary, individuals with authority take
specific steps.

2. Accountability of Corporations: Corporations, similar to individuals, are
accountable to the public, employees, customers, and stockholders. They can be
morally responsible entities, with actions carried out by individuals or subgroups
following specified authorities.

3. Virtue of Responsibility in Corporations: Corporations, like individuals,
demonstrate responsibility by consistently meeting obligations. Virtues such as
honesty and fairness can be attributed to corporations based on their conduct.

4. Responsibility in Wrongdoing and Right Conduct: In instances of wrongdoing,
"responsible" means blameworthy; in contexts involving right conduct, it means
praiseworthy. This applies to corporations just as it does to individuals.

These moral senses of responsibility differ from causal responsibility (simply
being a cause of an outcome) and legal responsibility (what the law requires).
Legal responsibility may not align with moral responsibility; a firm can be
legally responsible without being morally blameworthy. Conversely, a moral
responsibility can arise from the special relationship created by an incident,
requiring the corporation to address the harm caused by a defective product.
COMMITMENT TO SAFETY

"A ship in harbor is safe, - but that is not what ships are built for." John A. Shedd
Pilot Dan Gellert flew an Eastern Airlines Lockheed L-1011 at 10,000 feet. He
inadvertently dropped his flight plan, causing a sudden dive. Despite shaking
passengers, Gellert recovered control. A colleague experienced a simulated crash
due to autopilot failure. Soon after, an Eastern Airlines L-1011 crashed in Miami
during autopilot engagement, killing 103. Investigating landing gear issues, the
crew engaged autopilot at 2000 feet, leading to a distracted crash. A year later,
Gellert faced another autopilot issue, requiring full takeoff power to land safely
at 200 feet.
These incidents underscore the vulnerability of complex systems, emphasizing
the need for proper human-machine interactions. In this chapter, we delve into
safety from both public and engineering perspectives.
Various groups are involved in safety issues, each with distinct interests. Within
each group, differing opinions on what is safe create ambiguity. We explore the
elusive nature of "safety" and "risk" and discuss safety and risk assessment, along
with methods to reduce risk. Examining nuclear accidents at Three Mile Island
and Chernobyl, we consider the challenges posed by increasing system
complexity and the imperative for safe exits.

5.1 SAFETY AND RISK


We want products and services to be safe, but safety comes at a cost. What's
considered safe can vary from person to person based on perceptions and
vulnerabilities. A power saw is safer in an adult's hands than a child's. A sick
person is more affected by air pollution than a healthy one. Achieving absolute
safety, either as completely risk-free or universally satisfying, is impossible and
costly. Yet, it's crucial to establish a shared understanding of safety.

5.1.2 Risks

We call something unsafe if it puts us at an unacceptable risk. Risk refers to the
potential harm from unwanted events. William D. Rowe talks about the "potential
for the realization of unwanted consequences from impending events," implying
a future possibility.
Risk is a broad concept covering various types of harm, including bodily harm,
economic loss, or environmental degradation related to technology. This can
result from delayed job completion, faulty products, or environmentally harmful
solutions.
Safety has always been a concern in engineering. With technology's increasing
impact on society, public worry about technological risks has also risen.
Measurable hazards from products and production processes are joined by less
obvious effects now gaining public attention, often labeled as new risks. These
might be newly identified or have changed perception due to education, media
coverage, or reduced exposure to other dominant risks.
Natural hazards still pose threats, and technology both mitigates and increases
vulnerability. Our concentrated populations and intricate technological networks
amplify risks from natural disasters. Waste disposal services and public
awareness of potential hazards are also significant concerns.

5.1.3 Acceptability of Risks

In exploring safety, we've modified Lowrance's definition, focusing on
acceptability. William D. Rowe notes a risk is acceptable when those affected are
no longer apprehensive about it. Apprehensiveness depends on factors like (1)
voluntary acceptance, (2) knowledge impact on perceived probabilities, (3) job-
related risks or pressures affecting awareness, (4) immediacy of noticeable effects,
and (5) identifiability of potential victims. Let's illustrate these with examples.

1. Voluntarism And Control: John, Ann, and their children enjoy riding
motorcycles for fun, embracing voluntary risks in this thrilling activity. They
understand that dirt bikes, unlike daily commuting cars, don't need extensive safety
features. All-terrain vehicles, on the other hand, pose greater hazards and were
responsible for many injuries and deaths, especially among children, before being
prohibited in the United States. Connected to voluntarism is the element of control;
the Smiths carefully choose when and where to ride, emphasizing their perception
of control over the situation. Despite acknowledging accident statistics, they
maintain a common tendency to believe that hazards are under their control.
Hazardous sports, including motorbiking, skiing, and others, are often pursued with
a sense of assumed control by enthusiasts, who worry less about their risks compared
to concerns like air pollution or airline safety. Additionally, these sports seldom
harm innocent bystanders.

2. Effect of Information on Risk Assessments: The way information is presented
plays a crucial role in shaping our perception of risks. Consider the example of the
Smiths, who may not use seat belts because they perceive the likelihood of an
accident as extremely low. However, if presented with data indicating a 1 in 3 chance
of sustaining a disabling injury over 50 years of driving at 800 trips per year, their
attitude might change.
Research demonstrates that altering how information is framed can lead to a
significant shift in preferences. In an experiment involving disease control strategies,
participants were asked to choose between Program A (saving 200 people) and
Program B (a 1/3 chance of saving 600 people and a 2/3 chance of saving none).
When presented this way, 72% chose Program A. Yet, when the options were
worded differently—Program C (400 people will die) and Program D (a 1/3 chance
of no deaths and a 2/3 chance of 600 deaths)—only 22% chose Program C, and 78%
opted for Program D.
This experiment suggests that people tend to prefer options perceived as yielding
certain gains over those seen as risky or only probable gains. Moreover,
individuals are more inclined to take risks to avoid perceived certain losses than
they are to secure only possible gains. The framing of information, therefore,
significantly influences our decision-making processes.
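
The two pairs of programs are numerically equivalent; a minimal expected-value check (a sketch added here, not part of the original text, assuming the 600-person population stated above) makes that explicit:

```python
# Expected lives saved under each framing of the disease-control problem
# (a hypothetical check, assuming a population of 600 at risk as in the text).

def expected_saved(outcomes):
    """outcomes: list of (probability, lives_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

program_a = [(1.0, 200)]            # 200 people saved for certain
program_b = [(1/3, 600), (2/3, 0)]  # 1/3 chance all 600 saved, 2/3 chance none
program_c = [(1.0, 600 - 400)]      # "400 people will die" means 200 are saved
program_d = [(1/3, 600), (2/3, 0)]  # 1/3 chance no deaths, 2/3 chance 600 die

for name, prog in [("A", program_a), ("B", program_b),
                   ("C", program_c), ("D", program_d)]:
    print(f"Program {name}: expected lives saved = {expected_saved(prog):.0f}")

# All four print 200: the reversal in preferences comes from framing, not the numbers.
```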

3. Job-related Risks: John Smith, working in a shipyard, has been exposed to
asbestos, a concern shared by many coworkers. Despite initial disregard for safety
concerns, he now realizes the prevalence of asbestosis cases among colleagues.
Ann, in a clerical position at the shipyard, exhibits symptoms due to handling her
husband's asbestos-exposed clothes.
John, like many workers, initially dismissed safety precautions, relying on
occasional masks and company assurances. Workers often accept job-related
risks, viewing them as part of the job. While technically voluntary, job-related
risks are sometimes unavoidable due to limited job options. Employees may lack
information about unseen dangers like toxic substances.
Unions and workplace safety rules address the most severe issues, but standards
for workplace conditions, such as air quality, lag behind the environmental
standards set for the general public. The public includes many people with low
tolerance for health hazards, and they demand a clean environment. Factory
workers, who are not rigorously screened for such vulnerabilities, still face
safety risks, and unions may resist change. Engineers designing workstations must
take into account workers' lax attitudes toward safety, especially in piecework
settings. Complaints about
unsafe conditions, even if isolated, deserve attention. Some worker afflictions,
like carpal tunnel syndrome, lack proper compensation, highlighting gaps in
safety regulations. OSHA's insufficient ergonomic rules for musculoskeletal
disorders underline the need for vigilant attention from engineers.

4. Magnitude and Proximity: Our response to risk is influenced by the fear of
potential harm to individuals close to us, like a child or someone we know, which
affects us more than distant, anonymous threats. The impact of a single significant
event, such as a plane crash, is felt more strongly than ongoing, but less personal,
occurrences like highway accidents. We are more sensitive to risks affecting a
small group of intimate friends than a larger group of strangers.
Misperceptions of numbers can lead to overlooking significant losses. For
example, the collapse of the Quebec Bridge in 1907 affected not only the 75 men
reported but had profound consequences for the Mohawk Indians among them,
altering their community.
Public perceptions of safety pose challenges for engineers. Some view familiar,
controllable things optimistically, while others fear accidents that harm many or
people they know, despite low statistical probabilities. Industry leaders
dismissing concerns about pollution, toxic waste, or nuclear power as emotional
or irrational miss recognizing legitimate public concerns. Engineers must
acknowledge and consider these widely held risk perceptions in their designs.
5.2 ASSESSING AND REDUCING RISK

Improving safety in engineered products often leads to increased product costs.
Conversely, unsafe products result in additional costs for manufacturers,
including warranty expenses, loss of customer goodwill, potential litigation, and
possible disruptions in manufacturing. Both manufacturers and users need to
comprehend the associated risks of a product and evaluate the costs involved in
mitigating or not mitigating those risks.

Products, whether low- or high-risk, involve costs. Primary costs include
production and safety measures, while secondary costs include warranties, lost
customer goodwill, litigation, and downtime. The least-cost point (M) is the risk
level at which the sum of these costs is smallest. If the risk at M exceeds the
highest acceptable risk (H), then H becomes the chosen design or operating point.
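
A minimal numerical sketch of this trade-off (with invented cost functions and an assumed value for H, purely for illustration):

```python
# Minimal sketch of the least-cost point idea, using made-up cost curves.
# Primary costs (production plus safety measures) rise as risk is driven down;
# secondary costs (warranties, litigation, lost goodwill, downtime) rise with risk.

risks = [r / 100 for r in range(1, 100)]   # candidate risk levels, 0.01 .. 0.99

def primary_cost(risk):
    return 10.0 / risk      # pushing risk toward zero gets expensive

def secondary_cost(risk):
    return 50.0 * risk      # more risk means more failures and claims later

total_cost = {r: primary_cost(r) + secondary_cost(r) for r in risks}
m = min(total_cost, key=total_cost.get)    # least-cost point M

H = 0.30                                   # highest acceptable risk (assumed value)
design_point = m if m <= H else H          # H caps the design when M is too risky

print(f"M (least-cost risk) = {m:.2f}, design/operating point = {design_point:.2f}")
```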

5.2.1 Uncertainties in Design

Experience and historical data are crucial for gauging product safety, yet gaps
persist because industries withhold information, legal settlements impose
nondisclosure, and technologies keep evolving. The purpose of a design also shapes
decisions and economic outcomes. In the aviation industry, for example, a design
aimed at maximizing total profit differs from one aimed at optimizing return on
investment: investing $100 million in a large jet for a maximum profit of $20
million is a 20 percent return, while spending $48 million on a smaller jet for a
$12 million profit returns 25 percent.
Risk emerges from the uncertainties engineers face in design, manufacturing, and
application. Despite efforts to minimize risk, unforeseen challenges arise, and
new applications or material substitutions can alter safety dynamics. This
unpredictability underscores how complex it is to ensure safety in engineered
products.
Designs that perform well under static loads might fail under dynamic loading.
For instance, a historical wooden bridge collapsed when Napoleon's army
marched in step, and Robert Stephenson's steel bridge vibrated violently under
British troops. Wind-induced vibrations led to incidents like the collapse of
"Galloping Gertie" (the Tacoma Narrows Bridge) and a power line in Turkey whose
oscillation caused arcing and damage below.
Uncertainties also exist in material selection and design skill. Economic changes
or extreme temperatures may impact product design. In 1981, a new bridge in
Wisconsin had to close due to brittle steel in critical components. Variations in
material quality are common, requiring engineers to consider statistical averages
and introduce a "factor of safety" to account for uncertainties in loads, capabilities,
and operating conditions. This factor safeguards against unexpected deviations in
anticipated stresses and the product's designed ability to withstand them.
A product is safe when its capability exceeds its duty. However, the stresses
actually realized differ from those calculated because of component tolerances.
An assembly's capability is expressed as a probability density, depicted in a
"capability" curve (Figure 5-2): the curve gives the probability that the
capability equals a particular strength value. A similar curve for duty captures
variations in stress due to loads, environmental conditions, and usage. Nominal
values (C and D) guide design thinking, but variations in material properties and
loads undermine certainty. The probability density curves can also take broader
shapes (Figure 5-2b), so that duty and capability overlap in the shaded region at
worrisome probability values. Edward B. Haugen warns that the safety-factor
concept ignores this variability and so misstates reliability.

Fig. 5-2: Probability density curves for duty (stress) and capability (strength): (a) little overlap, relatively safe; (b) broad curves that overlap, less safe.

A better measure of safety is the "margin of safety," illustrated in Figure 5-2a,
though computing it is challenging. The probability density curves for stress in
an engineered system show the variability: case (a) is relatively safe, while (b)
depicts lower safety because the stress distributions for ordinary daily loads
overlap. Additional complexities arise when the loads change repeatedly.
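
A minimal Monte Carlo sketch of this point (assumed normal distributions and made-up numbers, not from the text): two designs with the same nominal safety factor can have very different failure probabilities once variability is included.

```python
# Monte Carlo sketch of the duty/capability overlap in Figure 5-2 (illustrative numbers).
# "Duty" is the stress a part must carry; "capability" is the strength it actually has.
import random

random.seed(0)
N = 100_000

def failure_probability(duty_mean, duty_sd, cap_mean, cap_sd):
    """Estimate P(duty > capability) when both vary randomly (assumed normal)."""
    failures = 0
    for _ in range(N):
        duty = random.gauss(duty_mean, duty_sd)
        capability = random.gauss(cap_mean, cap_sd)
        if duty > capability:
            failures += 1
    return failures / N

# Both cases have the same nominal safety factor (150 / 100 = 1.5), but the wider
# scatter in the second case produces overlap and a much higher failure probability.
narrow = failure_probability(100, 5, 150, 8)     # tight tolerances, like Fig. 5-2(a)
wide = failure_probability(100, 20, 150, 25)     # broad curves, like Fig. 5-2(b)

print(f"Failure probability with narrow scatter: {narrow:.5f}")
print(f"Failure probability with wide scatter:   {wide:.5f}")
```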

5.2.2 Risk Benefit Analyses

Projects, especially public ones, often undergo risk-benefit analysis to determine
if the associated risks are justified by the benefits. This involves assessing
whether the benefits outweigh the risks and if the product, system, or activity is
worth pursuing. When risks and benefits share a common unit (e.g., lives or
dollars), the analysis is straightforward. For instance, an inoculation program may
pose some risks but is deemed worthwhile if it prevents a larger loss of lives
during an epidemic.
However, examining risk-benefit analyses uncovers conceptual challenges. Both
risks and benefits are future-oriented, introducing uncertainty. To address this,
expected values—obtained by multiplying potential loss or gain magnitude by
the probability of occurrence—are considered. Determining these values poses
questions: Who sets them, and how? When benefits materialize soon, but risks
are distant, discounting the future becomes crucial. Additionally, when benefits
go to one party and risks to another, assessing the overall worth becomes more
complex.
Analysing risks and benefits becomes challenging when faced with delayed
effects, especially in times of high interest rates. Discounting the future heavily
distorts the true picture for future generations. Handling risks or benefits with
different units, such as health, aesthetics, and reliability, requires comparing
designs with specified constraints. Shifting from risk-benefit analysis to cost-
effectiveness analysis is subtle but important when evaluating various designs.
Engineers must be aware of these differences to avoid carrying assumptions
between considerations.
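
A minimal sketch (hypothetical numbers, assumed 30-year horizon) of how expected values and discount rates interact in such an analysis:

```python
# Minimal sketch of expected values and discounting in a risk-benefit comparison,
# using hypothetical figures (none of these numbers come from the text).

def expected_value(magnitude, probability):
    # Expected value = magnitude of the loss or gain times the probability it occurs.
    return magnitude * probability

def present_value(amount, years_from_now, discount_rate):
    # Discounting: a benefit or risk n years away counts for less today.
    return amount / ((1 + discount_rate) ** years_from_now)

benefit_now = expected_value(magnitude=10_000_000, probability=0.9)     # immediate benefit
risk_future = expected_value(magnitude=100_000_000, probability=0.2)    # harm 30 years out

for rate in (0.02, 0.10):
    discounted_risk = present_value(risk_future, years_from_now=30, discount_rate=rate)
    net = benefit_now - discounted_risk
    print(f"discount rate {rate:.0%}: benefit {benefit_now:,.0f}, "
          f"discounted risk {discounted_risk:,.0f}, net {net:,.0f}")

# At 2% the discounted future risk still outweighs the benefit; at 10% it almost
# vanishes, which is how heavy discounting can shortchange future generations.
```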
Despite challenges, there's a need for a transparent process to assess the
acceptability of risky projects in today's technological society. An ethical
question arises regarding when it's acceptable to impose risks for supposed
benefits. Consideration should extend beyond average risks and benefits to worst-
case scenarios where individuals face maximum risks with minimal benefits.
Ensuring rights aren't violated and providing safer alternatives becomes crucial.
Engineers impact real individuals, and this human element should be considered
as seriously as statistical risk studies.

5.2.3 Personal Risk

Individuals tend to accept voluntary risks more readily than involuntary ones,
even if the voluntary risks are 1000 times more likely to result in fatalities.
Assessing personal risks becomes more challenging, especially with involuntary
risks. For example, imagine John and Ann Smith's concerns about living near a
refinery. If the public supported building a new refinery in their area, should they,
and others in their situation, have the right to oppose its construction?
Willingness to accept voluntary risks compared to involuntary ones depends on
the benefits associated with those risks. Questions arise in cases where
individuals oppose a project but it proceeds. For example, in nuclear power plant
siting, compensation entitlement and adequate amounts become key concerns.
Quantifying personal safety and risk poses numerous challenges. Determining the
dollar value of an individual's life and whether the marketplace should decide are
complex issues. Factors like market manipulation and individual tolerance levels
further complicate the assessment of compensation adequacy.
Assessing personal risk is challenging, leading analysts to use available
quantitative measures. For voluntary activities, life insurance amounts or ransom
considerations may provide insights. Hazardous job evaluations may involve
studying increased wages demanded by workers. Given the complexity of
variables, an open procedure overseen by trained arbiters is suggested. Statistical
averages are more straightforward for population-wide assessments without
singling out individuals.

5.2.4 Public Risk and Public Acceptance

Assessing risks and benefits for the public is easier as individual differences even
out in larger populations. Figure 5-4 vividly illustrates the contrast between costs
of disability from a private versus societal standpoint. Technological safety
assessments can be conducted more easily on a macroscopic scale, with statistical
parameters gaining significance. NHTSA proposed a "blue-book value" for a human
life of $200,725 (in 1972 dollars), based on loss of future income and
accident-related costs. While convenient, recent court awards show the
complexity: in one case a mechanic was awarded $6 million for the loss of a
breadwinner, plus an additional $20 million in punitive damages.
Assessing the social cost of disability can draw on the National Safety Council's
equivalence of 6,000 disability days for a death and L. A. Sagan's 1972 rate of
$50 per day (from "Human Cost of Nuclear Power"), which yields a "death
equivalent" of 6,000 x $50 = $300,000 (in 1972 dollars) for societal analyses.
Shulamit Kahn suggests an $8 million labor-market value for a life and finds that
people value others' lives 115 to 230 percent more than this. Because $8 million
is already higher than what is typically used in policy analysis, analysts may be
undervaluing lives relative to public perception.

5.2.5 Examples of Improved Safety

This isn't a design guide, but here are examples showing safety doesn't require
complex features:
1. Magnetic Door Catch:
- Problem Addressed: Preventing child asphyxiation in refrigerators.
- Safety Feature: Modern, affordable magnetic catches allow easy opening from
inside.
2. Dead-Man Handle (Train Speed Control):
- Problem Addressed: Ensuring control if the engineer becomes incapacitated.
- Safety Feature: Train powers only when the handle is pressed; release halts it.
3. Highball Signaling System (Railroads):
- Problem Addressed: Signaling trains to proceed or stop safely.
- Safety Feature: Fail-safe ball-raising mechanism with a reliable cable system.
4. Motor-Reversing System (Figure 5-5):
- Problem Addressed: Avoiding battery short-circuits in motor-reversing systems.
- Safety Feature: Simple wire reconnection prevents damage after sticky contacts.
These examples show that safety can be achieved without complex features or added
costs; a small software sketch of the dead-man-handle idea follows below.
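
The same fail-safe default can be expressed in software. This is a hypothetical sketch of the dead-man-handle idea (the class name and timeout are invented for illustration), not how any real train control system is implemented:

```python
# Hypothetical software analogue of the dead-man handle: power is allowed only while
# a fresh "handle pressed" signal keeps arriving; a missing or stale signal fails safe.
import time

class DeadManController:
    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s      # how long silence is tolerated before cutting power
        self.last_press = None

    def handle_pressed(self):
        """Record that the operator's handle just reported pressure."""
        self.last_press = time.monotonic()

    def power_allowed(self):
        """True only if a press was seen recently; no signal means no power."""
        if self.last_press is None:
            return False
        return (time.monotonic() - self.last_press) < self.timeout_s

controller = DeadManController()
controller.handle_pressed()
print(controller.power_allowed())   # True: the operator is holding the handle
time.sleep(0.6)
print(controller.power_allowed())   # False: the signal went stale, so the system fails safe
```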

Fig. 5-5: Motor-reversing system wired so that stuck contacts cannot short-circuit the battery.

In the rush to launch a product, safety is often overlooked. If projects were treated
as experiments entering an active phase, akin to space flights, safety might
receive more attention. Mundane ventures, however, tend to neglect less obvious
dangers.
Consider the reversing switch for a permanent magnet motor, where a stuck
switch can lead to a battery short circuit. This example illustrates the importance
of safety features. In the broader context, if moral concerns don't drive engineers
to prioritize safety, recent shifts in product liability law should encourage more
vigilance.
