Measuring The Sustainability Performance of Software Projects

Felipe Albertao, Jing Xiao, Chunhua Tian
Shenyang Eco-city Research Institute (SERI)
IBM Research - China
Beijing, China
{albertao, xiaojcrl, chtian}@cn.ibm.com

Yu Lu, Kun Qiu Zhang, Cheng Liu
Software College
Northeastern University
Shenyang, China
[email protected], [email protected], [email protected]
The Sustainability Performance is measured and analyzed against a set of Quality Attributes[9], which, if improved, will bring economic, social and environmental benefits:

• Development-Related Properties: Properties that impact the development process.
  – Modifiability: The ability to introduce changes quickly and cost-effectively.
  – Reusability: The degree to which system components can be reused in other systems.
  – Portability: The ability of the system to run under different computing environments.
  – Supportability: The system's ability to be easily configured and maintained after deployment.
• Usage-Related Properties: Properties that impact the user at run-time.
  – Performance: The time required by the system to respond to user requests.
  – Dependability: The ability of a system to function as expected at any given time.
  – Usability: Features that enable a system to be user-friendly.
  – Accessibility: The system's ability to serve people regardless of location, experience, background, or the type of computer technology used.
• Process-Related Properties: Properties that impact project management.
  – Predictability: The team's ability to accurately estimate effort and cost upfront.
  – Efficiency: The overhead of production processes over the bottom-line value perceived by the customer.
  – Project's Footprint: Natural resources used and environmental impact caused during software development.

The economic, social and environmental benefits of each property are detailed below:

• Modifiability:
  – Economic Benefit: Minimizes development and support costs.
  – Social Benefit: Enables the system to be continuously adapted to meet societal demands.
  – Environmental Benefit: Minimizes environmental waste through less effort in producing and maintaining the existing system.
• Reusability:
  – Economic Benefit: Accelerates time-to-market.
  – Social Benefit: Enables the production of new products with less effort.
  – Environmental Benefit: Minimizes environmental impact through less effort in producing the system.
• Portability:
  – Economic Benefit: Increases potential market and the system's lifetime.
  – Social Benefit: Reduces the cost of technology adoption by minimizing the user's dependency on the latest technology.
  – Environmental Benefit: Minimizes e-waste by extending the lifetime of old hardware.
• Supportability:
  – Economic Benefit: Increased customer base due to reduced support costs.
  – Social Benefit: Independence from the vendor increases the product's usability and thus makes it accessible to a larger population.
  – Environmental Benefit: Indirect benefit: minimizes resources required to provide support (transportation, physical material, etc.).
• Performance:
  – Economic Benefit: Improves productivity.
  – Social Benefit: Minimizes dependency on the latest technology.
  – Environmental Benefit: Minimizes e-waste by extending hardware lifetime; minimizes energy consumption through less computer usage time.
• Dependability:
  – Economic Benefit: Minimizes support and maintenance costs.
  – Social Benefit: Increases societal productivity.
  – Environmental Benefit: Indirect benefit: minimizes energy waste.
• Usability:
  – Economic Benefit: Increases customer satisfaction; minimizes support costs.
  – Social Benefit: Contributes to digital inclusion by eliminating barriers (learning curve) and making the system accessible to a broader number of users.
  – Environmental Benefit: Less waste of resources used in training (books, training rooms, energy, etc.).
• Accessibility:
  – Economic Benefit: Increases potential market and/or audience.
  – Social Benefit: Makes technology available to minorities, the elderly, people with disabilities, non-English-speaking communities, and illiterate populations.
  – Environmental Benefit: Indirect benefit: increases multicultural awareness and provides equal opportunities.
• Predictability:
  – Economic Benefit: Minimizes the risk of budget overrun.
  – Social Benefit: Improves the team's working conditions (avoids long work hours).
  – Environmental Benefit: Optimizes the use of environmental resources.
• Efficiency:
  – Economic Benefit: Maximizes product value.
  – Social Benefit: Minimizes wasted effort.
  – Environmental Benefit: Optimizes the use of environmental resources.
• Project's Footprint:
  – Economic Benefit: Indirect benefit.
  – Social Benefit: Indirect benefit.
  – Environmental Benefit: Reduces fuel consumption and emissions, reduces office space utilization, and maximizes the use of shared resources.

The list below outlines the metrics used for each property:

• Modifiability: Distance From Main Sequence.
• Reusability: Abstractness, Instability.
• Portability: Estimated System Lifetime.
• Supportability: Support Rate.
• Performance: Relative Response Time.
• Dependability: Defect Density, Testing Efficiency, Testing Effectiveness.
• Usability: Learnability, Effectiveness, Error Rate.
• Accessibility: Support for motor-impaired users, visually-impaired users, blind users, users with language and cognitive disabilities, illiterate users, and internationalization and localization.
• Predictability: Estimation Quality Rate.
• Efficiency: Project Efficiency.
• Project's Footprint: Work-From-Home Days, Long-Haul Roundtrips.

IV. CASE STUDY

The metrics used to assess each one of the properties are described below, along with the Sustainability Performance findings for the Urban Water Management Platform (UWMP), a software project developed by IBM Research - China, Northeastern University and the Shenyang Eco-city Research Institute (SERI).

A. Modifiability and Reusability

Systems with a high number of interdependencies are hard to maintain because the impact of a given change is hard to assess. To address this problem, a common practice in object-oriented design is to measure the dependencies among the classes of a given system[10]. The first metric to be analyzed is Instability, which measures the potential impact of changes in a given package:

I = Ce / (Ca + Ce)

Where:
• Afferent Couplings (Ca): The number of classes outside a package that depend upon classes within the package.
• Efferent Couplings (Ce): The number of classes inside a package that depend upon classes outside the package.

Instability ranges from 0 to 1, where 0 means that the package is maximally stable and 1 means that the package is maximally unstable.

However, a maximally stable system is also unchangeable, and in fact some packages must be unstable enough to allow changes. Therefore it is also necessary to measure how much a package can withstand change, which in object-oriented design is accomplished by the use of abstract classes. This metric is called Abstractness:

A = Na / Nc

Where:
• Na: Number of abstract classes in a given package
• Nc: Total number of classes in a given package

Abstractness ranges from 0 to 1, where 0 means the package is completely concrete and 1 means it is completely abstract.

We can now measure the relationship between Instability and Abstractness. The two extremes (a maximally stable and concrete package, versus a maximally unstable and abstract package) are both undesirable: the ideal package is one with a "balanced" Instability and Abstractness. The final metric measures how far a package is from this idealized balance, or the Distance From Main Sequence:

D = |A + I − 1|

Distance ranges from 0 to 1, where 0 means the package has the ideal balance and 1 means the package requires redesign and refactoring.

The table below shows the metrics for the UWMP source code:

Package             Ca  Ce  I     A  D
Java: map.bean       8   2  0.20  0  0.80
Java: map.io         2   3  0.60  0  0.40
Java: common         1   0  0     0  1
Java: kpi.servlet    0   1  1     0  0
Java: map.business   0   4  1     0  0
Flex: comp           2  19  0.90  0  0.10
Flex: comp.key       1   1  0.50  0  0.50
Flex: kpi            1   1  0.50  0  0.50
Flex: beans         21   1  0.04  0  0.96
Flex: maps           1  18  0.95  0  0.05
Flex: business       1   1  0.50  0  0.50
Flex: events        20   1  0.05  0  0.95

The metrics indicate a high number of packages that require redesign (D close to 1). The fact that packages were organized by design pattern (beans, business, servlets, io) rather than by their actual functions is the most likely cause of such instability. The key conclusion is that the packages should be reorganized by system functionality.
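As an illustration, these three metrics are straightforward to compute from per-package coupling and class counts. The following is a minimal sketch (our own, not the tooling used in this study); the sample figures reproduce the map.bean and map.io rows of the table above, with hypothetical class totals, since A = 0 holds for any package with no abstract classes.

def instability(ca, ce):
    # I = Ce / (Ca + Ce): 0 = maximally stable, 1 = maximally unstable
    return ce / (ca + ce) if (ca + ce) > 0 else 0.0

def abstractness(num_abstract, num_total):
    # A = Na / Nc: 0 = completely concrete, 1 = completely abstract
    return num_abstract / num_total if num_total > 0 else 0.0

def distance_from_main_sequence(a, i):
    # D = |A + I - 1|: 0 = ideal balance, 1 = requires redesign
    return abs(a + i - 1)

# Sample values per package: (Ca, Ce, Na, Nc); class totals are hypothetical
packages = {
    "Java: map.bean": (8, 2, 0, 12),
    "Java: map.io": (2, 3, 0, 7),
}

for name, (ca, ce, na, nc) in packages.items():
    i = instability(ca, ce)
    a = abstractness(na, nc)
    d = distance_from_main_sequence(a, i)
    print(f"{name}: I={i:.2f} A={a:.2f} D={d:.2f}")
# Java: map.bean: I=0.20 A=0.00 D=0.80
# Java: map.io: I=0.60 A=0.00 D=0.40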
B. Portability

The goal is to maximize the hardware's lifetime through its actual physical durability, rather than forcing its obsolescence through software platform requirements. Therefore it is desirable to measure the Estimated System Lifetime, or the estimated number of years since the minimum hardware required by the system reached the market.
UWMP depends on the following platforms and components: Adobe Flash Player 10, Java 6 on Windows XP, IBM SPSS Statistics 18, Geoserver and PostgreSQL. By investigating its minimum hardware requirements, we have estimated that the system can run on hardware dating from as far back as 2003, giving it a lifetime of 7 years. We concluded that 7 years is an acceptable time frame based on the types of PCs used by UWMP's target users, and therefore the system is not going to require unnecessary hardware upgrades.
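A worked example of this calculation, assuming the study's 2010 timeframe (the function name is ours):

def estimated_system_lifetime(min_hardware_year, current_year=2010):
    # Years since the minimum hardware required by the system reached
    # the market; a longer lifetime means fewer forced upgrades
    return current_year - min_hardware_year

print(estimated_system_lifetime(2003))  # 7 years for UWMP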
C. Supportability

The metric Support Rate was used, which is calculated by dividing the number of user questions that required assistance by the number of minutes the system was used in a given session. Because UWMP is a new system with no support history, we used the results of the usability study to calculate the Support Rate: the usability test subject performed 4 tasks that required assistance over 8.3 minutes of use, giving 4 / 8.3 = 0.48.

UWMP is a web-based system and does not require installation on the user's desktop. Desktop-based systems can also use the Estimated Installation Time metric, which is the amount of time the user takes to install the product without assistance.
D. Performance

A measure of performance is relative to the objective of the system. Since our goal is to improve the user's productivity, we use the Relative Response Time metric, which is the number of tasks with an unacceptable response time divided by the total number of tasks tested. The usability study indicated one task with an unacceptable response time out of a total of 11 tasks, or a Relative Response Time of 1 / 11 = 0.09.

F. Usability

• Effectiveness: Number of tasks completed without assistance / Total number of tasks: 5 / 11 = 0.45
• Error Rate: Number of tasks which were completed but deviated from the normal course of action / Total number of tasks: 2 / 11 = 0.18
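The usage metrics of sections C, D and F all reduce to simple ratios over usability-study counts. A minimal sketch with the UWMP figures reported above (function names are ours, not part of the study's tooling):

def support_rate(assisted_tasks, minutes_of_use):
    # User questions/tasks requiring assistance per minute of use
    return assisted_tasks / minutes_of_use

def relative_response_time(slow_tasks, total_tasks):
    # Share of tested tasks with an unacceptable response time
    return slow_tasks / total_tasks

def effectiveness(unassisted_tasks, total_tasks):
    # Share of tasks completed without assistance
    return unassisted_tasks / total_tasks

def error_rate(deviating_tasks, total_tasks):
    # Share of completed tasks that deviated from the normal course of action
    return deviating_tasks / total_tasks

print(round(support_rate(4, 8.3), 2))           # 0.48
print(round(relative_response_time(1, 11), 2))  # 0.09
print(round(effectiveness(5, 11), 2))           # 0.45
print(round(error_rate(2, 11), 2))              # 0.18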
G. Accessibility

We reviewed accessibility based on a list of requirements[12], using the following scores: 0 = Non Existent, 1 = Not Adequate, 2 = Acceptable, 3 = Adequate.

• Support for motor-impaired users: 1 = Flash Player 10 supports motor control, but UWMP does not provide shortcuts.
• Support for visually-impaired users: 1 = Flash Player 10 supports color adjustments; however, it does not include features for low-vision users, such as zoom-in or font sizing.
• Support for blind users: 2 = Flash Player 10 supports screen readers.
• Support for users with language and cognitive disabilities: 0 = UWMP uses very technical language, and no investigation was carried out to find out how professionals with such disabilities in the water-management industry overcome such challenges.
• Support for illiterate users: 2 = UWMP has no support for illiterate users; however, it is extremely unlikely that the target user (water-management professionals) will be illiterate.
• Internationalization and localization support: 2 =
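A small sketch of how the checklist scores above could be recorded so that items scoring below Acceptable are automatically flagged for the next release (the score table mirrors this review; the code is illustrative):

ACCEPTABLE = 2  # 0=Non Existent, 1=Not Adequate, 2=Acceptable, 3=Adequate

scores = {
    "motor-impaired users": 1,
    "visually-impaired users": 1,
    "blind users": 2,
    "language and cognitive disabilities": 0,
    "illiterate users": 2,
    "internationalization and localization": 2,
}

# Flag every requirement scoring below Acceptable for follow-up
flagged = [req for req, score in scores.items() if score < ACCEPTABLE]
print(flagged)
# ['motor-impaired users', 'visually-impaired users',
#  'language and cognitive disabilities']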
H. Predictability

The Estimation Quality Rate is calculated by dividing the number of iterations where the difference between estimated and actual points exceeded +/- 20% by the total number of iterations in the project. The result was zero, given that no iteration's points deviated by more than 20% in either direction.
I. Efficiency

The Project Efficiency is measured as the effort towards deliverables that add direct value to the customer (such as coding, manuals, etc.) versus project-related overhead (such as infrastructure maintenance, project management, etc.). In the UWMP project, 38 points were spent on infrastructure-related tasks out of a total of 310 points (12.26%). Therefore the Project Efficiency is approximately 88%.
J. Project's Footprint

It is hard to accurately calculate the amount of natural resources used by the team, such as fuel or paper. However, the goal is not to extract an exact figure, but instead to allow the team to progressively improve the metrics over time. We have chosen two metrics that reflect resource-intensive activities, namely transportation to and from the office and long-haul trips:

• Work-From-Home Days: 2 days out of 165 total team-days (33 project days * 5 team members) = 1.21%
• Long-Haul Roundtrips: By airplane: 6; By train: 0.
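A minimal sketch consolidating the project-level metrics of sections H, I and J, using the UWMP figures reported above (names are ours, not the study's tooling):

def estimation_quality_rate(off_by_more_than_20pct, total_iterations):
    # Share of iterations whose actual points differed from the
    # estimate by more than +/- 20%; UWMP's result was 0
    return off_by_more_than_20pct / total_iterations

def project_efficiency(overhead_points, total_points):
    # Share of effort spent on deliverables with direct customer value
    return 1 - (overhead_points / total_points)

def work_from_home_rate(wfh_days, project_days, team_size):
    # Work-from-home days as a share of total team-days
    return wfh_days / (project_days * team_size)

print(f"{project_efficiency(38, 310):.0%}")    # 88%
print(f"{work_from_home_rate(2, 33, 5):.2%}")  # 1.21%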
V. RESULTS

Based on the analysis above, the following problems were identified:

1) Modifiability and Reusability: Packages organized by design pattern instead of functionality.
2) Portability: No action required.
3) Supportability: No action required.
4) Performance: Slow response time of the "Data Audit Analysis" function.
5) Dependability: No action required.
6) Usability: No action required.
7) Accessibility: Shortcuts not available; zoom-in and font sizing not available; Chinese "wan" not correctly translated.
8) Predictability: No action required.
9) Efficiency: No action required.
10) Project's Footprint: Team members rarely work from home; no trips by train.

The list above was prioritized based on three attributes (customer value, effort, and level of improvement), and the following specific Sustainability Improvement Goals were defined for the next release of the software:

• Goal #1: Improve the "Data Audit Analysis" response time by 50%.
• Goal #2: Redesign the package structure and improve the Distance metric on at least 5 packages.
• Goal #3: Prioritize train trips whenever possible.

VI. CONCLUSION AND FUTURE WORK

This study introduces a method to monitor the sustainability of software projects by measuring a set of metrics over several releases of a software product. The method outlined in this paper uses well-known existing software measurements and practices; however, they were selected and organized considering their environmental, social and economic benefits.

Further work is necessary in order to develop benchmarks for the metrics, by applying this methodology to several other projects of different sizes. Such an effort would be greatly facilitated if the collection of the metrics were automated.

REFERENCES

[1] World Commission on Environment and Development, Our Common Future, 1987.

[2] Silicon Valley Toxics Coalition, Poison PCs and Toxic TVs, 2001.

[3] Robert C. Seacord, et al., Measuring Software Sustainability, IEEE, 2003. http://www.computer.org/portal/web/csdl/doi/10.1109/ICSM.2003.1235455

[4] Kevin Tate, Sustainable Software Development: An Agile Perspective, 2005.

[5] Rogier van Bakel, Origin's Original, Wired Magazine, Issue 4.11, 1996. http://www.wired.com/wired/archive/4.11/es_wintzen.html

[6] IfPeople, ResponsibleIT Standard. http://www.ifpeople.net/learn/resources/sustainability/responsibleIT-ifpeople.pdf

[7] IBM Green Sigma Solution Brief. ftp://public.dhe.ibm.com/common/ssi/ecm/en/gvs03015usen/GVS03015USEN.PDF

[8] Felipe Albertao, Sustainable Software Engineering, Carnegie Mellon University, 2004. http://www.scribd.com/doc/5507536/Sustainable-Software-Engineering

[9] Len Bass, Paul Clements, and Rick Kazman, Software Architecture in Practice, CMU SEI Series, Carnegie Mellon University.

[10] Robert Martin, OO Design Quality Metrics: An Analysis of Dependencies, 1994. http://www.objectmentor.com/resources/articles/oodmetrc.pdf

[11] Jakob Nielsen, Usability Engineering, 1994.

[12] List partially adapted from the Web Accessibility Initiative, Ball State University. http://www.bsu.edu/web/bsuwai/use.htm