HUM Quiz 1 Note
Engineers must prioritize public safety and learn from mistakes, as ABET emphasizes.
Ethics involves recognizing both the risks and benefits of technology and appreciating
engineering's positive impact. This chapter outlines engineering ethics, stresses
moral responsibility, and aligns professionalism with business goals in the
corporate setting.
(4) Promoting Responsible Conduct and Preventing Wrong: In the early 2000s,
corporate scandals, such as Enron's bankruptcy, raised concerns about ethical practices.
Compliance is about ensuring that individuals adhere to professional standards and
preventing fraudulent and unethical behavior. Laws and regulations play a crucial
role in deterring immoral acts, including fraud, theft, and bribery. It's essential to delve
into the reasons why some engineers might participate in wrongdoing rather than
reporting it. Yet, the primary emphasis should be on "preventive ethics"—taking actions
to stop unethical practices before they occur. Most engineers and corporations are
morally committed, and the connection between ethics and excellence should be value-
driven rather than solely reliant on compliance-based procedures, as discussed in
management theory. The main goal is to encourage and support responsible conduct,
fostering a strong ethical foundation in both individuals and corporations.
(5) Myriad Moral Reasons generate Ethical Dilemmas: Picture a chemical
engineer at a computer company. She discovers the company might be releasing
too much lead and arsenic into the city sewer, which becomes fertilizer for local
farmers. The engineer thinks stricter pollution controls are needed, but her
manager disagrees, saying it's too costly. She's torn between her duty to the
company, the community, her family, and her career. What should she do? These
tough situations are called ethical dilemmas, where moral reasons clash. In
engineering, moral values can conflict, and this book explores ways to understand
and resolve such dilemmas. It emphasizes that ethical dilemmas don't always mean
something has gone wrong; they reveal a moral complexity that would remain even if
all preventable problems, like corporate scandals, were eliminated. The term "corporation" includes
companies that may not be officially incorporated. A corporation is a legal
structure that allows people to pool money for big projects without taking
individual responsibility for outcomes, raising questions about accountability and
shared responsibility.
(6) Micro and Macro Issues: Think of ethical issues in engineering like a big
picture and a small picture. The small picture, or micro issues, is about choices
made by individuals and companies. The big picture, or macro issues, involves
global matters like the direction of technology, laws that should be made, and
responsibilities of groups like engineering societies and consumer organizations.
Both are crucial in engineering ethics, and they often connect. For example,
consider debates about SUVs. Micro issues involve problems with specific SUVs
like the Ford Explorer and Bridgestone/Firestone tires. Macro issues look at the
bigger impact of all SUVs on the road, focusing on harm from rollovers,
accidents, blocked visibility for other drivers, fuel consumption, and pollution. Should the SUV
issue be tackled by the entire engineering community, or should individuals
handle it? In a democratic society, should engineers stay out of it, letting
consumer groups and lawmakers deal with it? Bigger macro issues surround
public transportation as our population grows, and our resources shrink. It's like
deciding if we should fix a small problem in a car or look at the whole traffic
system. Both matter in engineering ethics.
(7) Cautious Optimism about Technology: Think of technology like a big puzzle. The
big picture, or macro issues, looks at all of technology—how it helps us and what
problems it might bring. Some people worry a lot about technology, saying it causes
pollution, uses up too many natural resources, and even leads to dangerous things like
wars and weapons. On the bright side, technology has made our lives way better. Just
think about electricity, cars, planes, computers, and many more things we use every day.
As authors, we're hopeful about technology, but we also want to be careful. We know
how important technology is for making life better. But, just like in a science experiment,
we need to be realistic about possible problems and dangers that might come with it.
It's like saying, "Let's use technology to make life awesome, but let's also be smart and
careful about it."
1.2 ACCEPTING AND SHARING RESPONSIBILITY
Think about Herbert Hoover, who was a mining engineer before becoming the
president. He thought being an engineer was awesome. It's like creating
something amazing from an idea, turning it into real stuff that helps people—like
building homes and bringing jobs. Engineers make life better. But here's the catch:
when engineers make things, everyone sees them. If something goes wrong, they
can't hide it like doctors or blame others like politicians. Hoover said engineers
can't deny their mistakes—if something they build doesn't work, they're in trouble.
In Hoover's time, individual engineers had more control over projects. If a bridge
fell, you knew who was responsible. Nowadays, even though we have more
engineers, they're not as visible. Mistakes get a lot of attention, but people blame
companies or government, not the actual engineers. The public often sees the top
managers of a company, not the hardworking engineers behind the scenes. This
makes it tough for engineers to connect with the public. Even though engineers
might feel "invisible," those who take responsibility for their work can make a
big difference. That's what Hoover believed, and it still holds true today.
Fig: 1-1
Imagine a unique skyscraper, the Citicorp Center, designed by engineer Bill
LeMessurier. It stands out because the big stilts aren't at the corners but in the
middle of each side. Half its weight and all the wind force are handled by a special
frame on the outside, not the corners. To keep it steady, they even added
something called a tuned mass damper. Later, a student's questions made
LeMessurier check things. He wondered if the building could handle strong
diagonal winds. The design was only tested for straight-on winds. To his surprise,
he found out that the strong steel joints meant to hold the building together were
never welded, only bolted. The New York office approved it, not realizing how
crucial it was for handling diagonal winds. It was a big oversight that needed
fixing. Engineer Bill LeMessurier faces a tough choice. On one side, he has a
duty to keep his building safe for people. On the other, there are financial
pressures and personal interests telling him to stay quiet. What should he do? He
escapes to his summer house by a lake in Maine. There, in the calm, he goes over
all the numbers again. Suddenly, he realizes he's the only one who can prevent a
disaster by taking action. Once he decides, he acts fast. He meets with insurers,
lawyers, the bank, and city officials to explain the problem. They agree on a plan:
strengthen the wind braces by welding two-inch-thick steel plates over 200 bolted
joints. Journalists are curious, but when a strike shuts down newspapers, they
vanish. Lawyers seek advice from disaster expert Leslie Robertson. LeMessurier
ensures the area around the building can be evacuated if needed. He also sets up
sensors on the structure and insists on an emergency generator for uninterrupted
damper operation. Hurricane Ella looms, causing concern. Luckily, work on the
critical joints is nearly done. The hurricane changes course, so no evacuation is
needed. Still, they're ready for a storm that comes only once every 200 years.
They settle out of court, with architect Hugh Stubbins not blamed. LeMessurier and partners pay
$2 million, covered by insurance. The whole fix costs over $12.5 million.
LeMessurier not only saves lives and keeps his integrity but also boosts his
reputation through the whole ordeal.
Ethics in Engineering
The broader question of how a corporation's product is ultimately used and by
whom is often overlooked because its effects are not immediately evident. Take
used dry-cell batteries, for example, with nearly three billion ending up in the U.S.
municipal waste stream annually. While some corporations genuinely care about
a product's post-factory life, others use excuses, citing a lack of control over how
it's used or discarded.
Socially responsible corporations actively seek solutions, often requiring
industry-wide and government efforts. One notable example is Cardiac
Pacemakers Inc., which invites heart patients to the plant, fostering customer
relationships and employee awareness of their responsibilities.
Critics argue that corporations should focus solely on maximizing profits for
stockholders, claiming no additional responsibilities to society. Nobel Laureate
economist Milton Friedman, in his essay "The Social Responsibility of Business
is to Increase Its Profits," asserts that management's primary responsibility is to
satisfy stockholders' desires for maximum returns. He opposes management
pursuing social goals or philanthropy.
However, Friedman's narrow view invites recognition of wider corporate
responsibilities. Society expects corporations to contribute to the public good and
to protect the environment; these expectations have hardened into moral customs. While Friedman's extreme
view could be self-defeating, today's CEOs rarely publicly declare a sole devotion
to maximum profits, recognizing the importance of ethical commitments to long-
term success.
To ensure the alignment of good engineering, good business, and good ethics, it's
crucial for engineering and corporations to be "morally aligned." This alignment, as
argued by Howard Gardner, results in "good work" that is excellent in quality and
socially responsible when professional aims, corporate goals, clients' needs, and
public expectations are congruent. However, certain marketplace forces can
challenge professional standards in engineering, with some corporations prioritizing
quality, safety, and ethics more than others, even under normal economic conditions.
1.3.4 Senses of Corporate Responsibility
When we talk about corporate responsibility, it is crucial to understand that the
term carries several moral senses, akin to its application to individuals.
These moral senses differ from causal responsibility (simply being a cause of an
outcome) and legal responsibility (compliance with the law). Legal responsibility may not align
with moral responsibility, exemplified in cases where a firm is legally responsible
but not morally blameworthy. A moral responsibility arises from the special
relationship created by the incident, requiring the corporation to address the harm
caused by a defective product.
COMMITMENT TO SAFETY
"A ship in harbor is safe, - but that is not what ships are built for." John A. Shedd
Pilot Dan Gellert was flying an Eastern Airlines Lockheed L-1011 at 10,000 feet
when he inadvertently dropped his flight plan and, while retrieving it, bumped the
controls, disengaging the autopilot and sending the plane into a sudden dive.
Gellert recovered control, though the passengers were badly shaken. A colleague
later experienced a simulated crash in a trainer caused by the same autopilot
failure. Soon after, an Eastern Airlines L-1011 crashed near Miami, killing 103:
circling at 2,000 feet while investigating a landing-gear problem, the distracted
crew did not notice that the autopilot had been disengaged, and the plane descended
into the ground. A year later, Gellert faced another autopilot malfunction,
requiring full takeoff power at 200 feet to land safely.
These incidents underscore the vulnerability of complex systems, emphasizing
the need for proper human-machine interactions. In this chapter, we delve into
safety from both public and engineering perspectives.
Various groups are involved in safety issues, each with distinct interests. Within
each group, differing opinions on what is safe create ambiguity. We explore the
elusive nature of "safety" and "risk" and discuss safety and risk assessment, along
with methods to reduce risk. Examining nuclear accidents at Three Mile Island
and Chernobyl, we consider the challenges posed by increasing system
complexity and the imperative for safe exits.
5.1.2 Risks
1. Voluntarism And Control: John, Ann, and their children enjoy riding
motorcycles for fun, embracing voluntary risks in this thrilling activity. They
understand that dirt bikes, unlike daily commuting cars, don't need extensive safety
features. All-terrain vehicles, on the other hand, pose greater hazards and were
responsible for many injuries and deaths, especially among children, before
three-wheeled models were withdrawn from sale in the United States. Connected to voluntarism is the element of control;
the Smiths carefully choose when and where to ride, emphasizing their perception
of control over the situation. Despite acknowledging accident statistics, they
maintain a common tendency to believe that hazards are under their control.
Hazardous sports, including motorbiking, skiing, and others, are often pursued with
a sense of assumed control by enthusiasts, who worry less about their risks compared
to concerns like air pollution or airline safety. Additionally, these sports seldom
harm innocent bystanders.
Experience and historical data are crucial for gauging product safety, yet gaps
persist because industries withhold information, settlements are kept confidential,
and technologies keep evolving.
Risk emerges from the uncertainties engineers face in design, manufacturing, and
application. Despite efforts to minimize risk, unforeseen challenges arise, and
new applications or material substitutions can alter safety dynamics. This
unpredictability underscores the complex nature of ensuring safety in engineered
products. Economics adds its own pressures: in the aviation industry, whether a
design aims to maximize total profit or to optimize return on investment shapes
decisions and economic success. Investing $100 million in a large jet for a
maximum profit of $20 million yields a 20 percent return on investment, while
spending $48 million on a smaller jet for a $12 million return yields 25 percent.
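The return-on-investment comparison reduces to simple arithmetic; a minimal sketch (the figures are the illustrative ones from the text, not real aircraft economics):

```python
def roi(profit, investment):
    """Return on investment, as a fraction of the amount invested."""
    return profit / investment

# $20M profit on a $100M large jet vs. $12M profit on a $48M smaller jet
large_jet = roi(20, 100)   # 0.20, i.e. 20 percent
small_jet = roi(12, 48)    # 0.25, i.e. 25 percent
```

The smaller jet is the better investment per dollar even though the large jet earns more total profit, which is why the stated purpose of a design matters.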
Designs that perform well under static loads might fail under dynamic loading.
For instance, a historical wooden bridge collapsed when Napoleon's army
marched in step, and Robert Stephenson's steel bridge vibrated violently under
British troops. Wind-induced vibrations led to incidents like the collapse of
"Galloping Gertie" and a power line in Turkey causing arcing and damage below.
Uncertainties also exist in material selection and design skill. Economic changes
or extreme temperatures may impact product design. In 1981, a new bridge in
Wisconsin had to close due to brittle steel in critical components. Variations in
material quality are common, requiring engineers to consider statistical averages
and introduce a "factor of safety" to account for uncertainties in loads, capabilities,
and operating conditions. This factor safeguards against unexpected deviations in
anticipated stresses and the product's designed ability to withstand them.
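The factor-of-safety idea described above can be sketched in a few lines; the numbers are hypothetical, chosen only to illustrate the ratio:

```python
def factor_of_safety(capability, duty):
    """Nominal factor of safety: the ratio of the stress a product can
    withstand (capability) to the stress expected in service (duty)."""
    if duty <= 0:
        raise ValueError("duty must be positive")
    return capability / duty

# e.g. a member rated for 400 MPa carrying a 250 MPa design stress
fs = factor_of_safety(400, 250)   # 1.6
```

A factor above 1 leaves margin for the unexpected deviations in loads, material quality, and operating conditions that the text describes.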
In an ideal world, a product's capability (the stress it can withstand) would
exceed its duty (the stress imposed on it) by a known margin. In practice,
component tolerances make the realized stresses differ from the calculated ones.
Assembly capability is therefore expressed as a probability density, depicted in
a "capability" curve (Figure 5-2): the curve gives the probability that
capability equals a specific strength value. A similar curve for duty captures
stress variations due to loads, environmental conditions, and usage. Nominal
values (C and D) guide thinking, but variations in material properties and loads
undermine certainty. The probability density curves can take broader shapes
(Figure 5-2b), so the curves overlap in the shaded region at probability values
of real concern, where duty exceeds capability and the product fails. Edward B.
Haugen warns that the safety-factor concept ignores this variability, at the
cost of reliability.
Fig-5.2
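The overlap of the duty and capability curves can be estimated numerically. A minimal sketch, assuming (purely for illustration; the text does not specify distributions) that both are roughly normal:

```python
import random

def failure_probability(duty_mean, duty_sd, cap_mean, cap_sd,
                        n=100_000, seed=1):
    """Monte Carlo estimate of P(duty > capability): the chance that a
    randomly drawn service stress exceeds a randomly drawn strength."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(duty_mean, duty_sd) > rng.gauss(cap_mean, cap_sd)
        for _ in range(n)
    )
    return failures / n

# Same nominal factor of safety (400/250 = 1.6), but with wide scatter
# in both curves the overlap region is no longer negligible.
p = failure_probability(duty_mean=250, duty_sd=60, cap_mean=400, cap_sd=60)
```

This is Haugen's point in miniature: the nominal safety factor alone says nothing about the failure probability hidden in the overlapping tails.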
Individuals tend to accept voluntary risks more readily than involuntary ones,
even if the voluntary risks are 1000 times more likely to result in fatalities.
Assessing personal risks becomes more challenging, especially with involuntary
risks. For example, imagine John and Ann Smith's concerns about living near a
refinery. If the public supported building a new refinery in their area, should they,
and others in their situation, have the right to oppose its construction?
Willingness to accept voluntary risks compared to involuntary ones depends on
the benefits associated with those risks. Questions arise in cases where
individuals oppose a project but it proceeds. For example, in nuclear power plant
siting, compensation entitlement and adequate amounts become key concerns.
Quantifying personal safety and risk poses numerous challenges. Determining the
dollar value of an individual's life and whether the marketplace should decide are
complex issues. Factors like market manipulation and individual tolerance levels
further complicate the assessment of compensation adequacy.
Assessing personal risk is challenging, leading analysts to use available
quantitative measures. For voluntary activities, life insurance amounts or ransom
considerations may provide insights. Hazardous job evaluations may involve
studying increased wages demanded by workers. Given the complexity of
variables, an open procedure overseen by trained arbiters is suggested. Statistical
averages are more straightforward for population-wide assessments without
singling out individuals.
Assessing risks and benefits for the public is easier as individual differences even
out in larger populations. Figure 5-4 vividly illustrates the contrast between costs
of disability from a private versus societal standpoint. Technological safety
assessments can be conducted more easily on a macroscopic scale, with statistical
parameters gaining significance. NHTSA proposed a "blue-book value" for
human life at $200,725 (1972 dollars), considering loss of future income and
accident-associated costs. While this is a convenient measure, recent court cases
show the complexity: in one, the family of a mechanic was awarded $6 million for
the loss of a breadwinner, plus an additional $20 million in punitive damages.
Assessing the social costs of disability involves using the National Safety
Council's equivalent of 6,000 disability days for a death and L. A. Sagan's 1972
rate of $50 per day (from "Human Costs of Nuclear Power"). This yields a "death
equivalent" of $300,000 for societal value analysis. Shulamit Kahn suggests an
$8 million labor-market value for a life, and finds that people value others'
lives at 115 to 230 percent of this figure. The $8 million figure is higher than
values typically used in policy analysis, implying that analysts may undervalue
subjects' lives compared to public perception.
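Using the stated inputs, the death-equivalent arithmetic is just:

```python
disability_days = 6000   # National Safety Council equivalent for a death
daily_rate = 50          # Sagan's 1972 rate, dollars per disability day

death_equivalent = disability_days * daily_rate
print(death_equivalent)  # 300000, i.e. $300,000
```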
This isn't a design guide, but here are examples showing safety doesn't require
complex features:
1. Magnetic Door Catch:
- Problem Addressed: Preventing child asphyxiation in refrigerators.
- Safety Feature: Modern, affordable magnetic catches allow easy opening from
inside.
2. Dead-Man Handle (Train Speed Control):
- Problem Addressed: Ensuring control if the engineer becomes incapacitated.
- Safety Feature: Train powers only when the handle is pressed; release halts it.
3. Highball Signaling System (Railroads):
- Problem Addressed: Signaling trains to proceed or stop safely.
- Safety Feature: Fail-safe ball-raising mechanism with a reliable cable system.
4. Motor-Reversing System (Figure 5-5):
- Problem Addressed: Avoiding battery short-circuits in motor-reversing systems.
- Safety Feature: Simple wire reconnection prevents damage after sticky contacts.
These examples show safety can be effective without complex features or added
costs.
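The dead-man handle above is a classic fail-safe design: the dangerous state requires continuous deliberate action, and the safe state is the default. A minimal sketch of that logic (a hypothetical model, not any real train control system):

```python
class DeadManThrottle:
    """Fail-safe sketch: power flows only while the handle is actively
    pressed; releasing it (or an incapacitated operator) cuts power."""

    def __init__(self):
        self._pressed = False   # safe state is the default

    def press(self):
        self._pressed = True

    def release(self):
        self._pressed = False

    @property
    def power_on(self):
        # Power requires continuous, deliberate operator action;
        # any failure to hold the handle returns the system to safe.
        return self._pressed

handle = DeadManThrottle()
handle.press()     # train powers only while pressed
handle.release()   # releasing halts the train
```

The same inversion (make the hazardous condition the one that takes effort to maintain) underlies the highball signal's fail-safe ball-raising mechanism.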
Fig:5-5
In the rush to launch a product, safety is often overlooked. If projects were treated
as experiments entering an active phase, akin to space flights, safety might
receive more attention. Mundane ventures, however, tend to neglect less obvious
dangers.
Consider the reversing switch for a permanent magnet motor, where a stuck
switch can lead to a battery short circuit. This example illustrates the importance
of safety features. In the broader context, if moral concerns don't drive engineers
to prioritize safety, recent shifts in product liability law should encourage more
vigilance.