
Pega Express testing and validation

The goal of the Pega Express™ approach is to deliver a high-quality working system in the simplest, most efficient way
possible. Pega Platform™ provides tools and processes to streamline your team's application testing. Improved testing
ensures that the solution you create, build, and deliver is one that your client loves — a Minimum Lovable Product (MLP).

After completing this module, you should be able to:

Incorporate Pega Express best practices into your application testing
Identify Pega Platform native tools to accelerate testing cycles
Support best practice testing, including unit, functional, performance, security, and scenario tests
Validate your MLP release for lovability

Available in the following mission:

Pega Express Delivery

Topics

Lovability testing
Testing for lovability explained
The Pega Express™ testing strategy is designed and executed during the Build phase. Testing is important to validate that
the Pega Platform solution is not just functional but also lovable by the client.

The testing strategy does this through:

Shifting left – Start testing earlier in the project and make it a component of each sprint
User testing – Include business users earlier in the project to provide feedback, as well as professional testers; user
testing is different from UAT (user acceptance testing)
Planning – Create a test plan to ensure that the application meets business outcomes, not just functional requirements
Consistency – Agree and adhere to a definition of done (DoD)
Tools – Use built-in validation tools such as the Application Quality Dashboard or Pega Predictive Diagnostic Cloud
Automation – Build an automated test suite with Pega Platform or third-party tools

Note: Testing is integral to project success, ensuring that the Pega solution is both lovable and high quality.
Unlike traditional IT projects that test only at the end, the Pega Express strategy is to start testing activities
during the Prepare (or even Discover) phase and continue for the duration of the project.
View the short video for an overview of Pega Express testing.

Shifting left
Shifting left means that you begin testing activities earlier in the project life cycle. DevOps uses this practice in software
development to help teams focus on quality, work on problem prevention instead of detection, and begin testing earlier in the
project timeline. It plans for and validates your application quality from the first day.

As shown in the following image, the Pega Express delivery approach front-loads the project with various testing and
validation activities. This concept of testing early, often, and continuously is central to Pega Express and supported by a
formal process and set of built-in tools to promote quality checks within each sprint and throughout the entire project life
cycle.

Shifting left is straightforward with Pega Express. It is achieved through the practices in the following table. Note that, because of this approach, issue identification should be greatest during Sprint Testing and should reduce significantly as you move toward Release Testing:

Testing Focus | Testing Type | Shift Left Best Practice

Sprint Testing | Unit Testing | Use Pega Diagnostic Cloud to resolve issues; use guardrails; lean usability test on XD prototypes during Prepare; Pega's tools ensure you focus only on the MLP; focus only on what has changed; mandatory adherence to the Definition of Done; automate testing with PegaUnit

Sprint Testing | Scrum Testing | Test against real interfaces early to flush out issues; involve the business to ensure early validation; automate with Pega API and Pega scenario tests

Functional Testing | End-to-End Testing | Deploy with DevOps and use automated regression tests; use the Pega Express test plan to drive effort and focus

Functional Testing | UAT | Deploy with DevOps and use automated smoke tests

Release Testing | Performance and Security | Utilize Deployment Manager

Defects are easier and cheaper to fix the earlier they are found. The cheapest place to correct an issue is during the
Discover or Prepare phase. The most expensive place to find and fix defects is during a post-build QA phase or Acceptance
Testing as was traditionally done in waterfall projects. Therefore, Pega Express encourages validating and testing the
solution as early as possible.

Testing activities in Prepare


Your test strategy begins to take shape during the Prepare phase. Before any code is written, business architects (BAs) and
testers validate the business outcomes, scope, and design with the client. As BAs create the user stories, testers validate
that each user story is clear, concise, actionable, and contains acceptance criteria.

All user stories must have acceptance criteria


Acceptance criteria complement the user story narrative. They describe the conditions that must be fulfilled before the story
is done. The acceptance criteria enrich the story, make it testable, and ensure that the story can be validated, demonstrated,
or released to the users and other stakeholders.

Well-written acceptance criteria have the following characteristics:

Testable with clearly-defined expected results
Clear and concise, not ambiguous

Tip: A Pega Express best practice is to create three to five acceptance criteria for detailed stories.
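For example, a hypothetical acceptance criterion for an "Obtain Mortgage Loan" story might read: "Given an application with an income below the qualifying threshold, when the applicant submits the application, then the case is routed to manual underwriting and the applicant is notified of the additional review." The threshold and routing behavior are illustrative only; base your criteria on the client's actual business rules.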

Creation and adherence to a Definition of Done


In addition to creating individual acceptance criteria for each user story, the Scrum team creates a formal Definition of Done
(DoD), which identifies consistent acceptance criteria required across all user stories. A DoD ensures the quality of work. It is
a checklist of what must be done for user story development work to be considered complete.

The Definition of Done ensures everyone on the team knows exactly what is expected of every component the team delivers.
DoD ensures transparency and quality fit for the purpose of the product and organization.

The DoD typically includes items such as:

A feature is developed, unit tested, and meets the acceptance criteria
Unit test scripts are created in the PegaUnit application and are available for future regression testing
The item is ready for acceptance testing
All code goes through code review
The item is releasable
Technical debt is not increased

You can download a Definition of Done template from the Pega Express Delivery Resources page.

Microjourneys and business outcome focus


One question that always comes up is, "How do you organize testing so that it is effective, efficient, and validates that we are
delivering the client's business outcomes with a product they will love?"

Pega has designed a simple test plan that achieves these goals while providing a Microjourney™-centric approach. The plan
ensures comprehensive test coverage and enables reporting on the testing status in terms of business outcomes rather than
disparate functional elements and generic test coverage stats.

The test plan gives the client assurance that their desired Microjourneys work as intended and that the resulting product is
both lovable and fulfills their business outcomes.

Test plan creation


Creating a Pega Express test plan is very simple. To start:

1. Build a table with six columns to easily identify and fully test all of your Microjourneys.
2. In the first column, list your Microjourneys, such as "Obtain Mortgage Loan."
3. In the second column, list the subcases (the basic Pega building blocks) of each Microjourney.
4. Then, break it down further in the subsequent columns by stages, personas, channels, and interfaces for each
Microjourney.

Once completed, you have a table for each Microjourney, similar to the following example. Each row in the table
represents a path to test in the Microjourney, ensuring that all outcomes of the Microjourney are tested.
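As an illustration, the following sketch models a few rows of such a plan in Python and reports status per Microjourney. The Microjourney, subcase, stage, persona, channel, interface, and status values are made-up examples, not output from any real project; a spreadsheet works just as well in practice.

```python
# Minimal sketch of a Microjourney-centric test plan with hypothetical values.
# Each row is one path to test; status is updated as testing proceeds.
from collections import Counter

test_plan = [
    # (Microjourney, Subcase, Stage, Persona, Channel, Interface, Status)
    ("Obtain Mortgage Loan", "Credit Check", "Qualification", "Applicant", "Web", "Credit bureau API", "Passed"),
    ("Obtain Mortgage Loan", "Property Appraisal", "Underwriting", "Underwriter", "Desktop", "Appraisal service", "Failed"),
    ("Obtain Mortgage Loan", "Loan Offer", "Approval", "Loan Officer", "Mobile", "Document service", "Not run"),
]

def summarize(plan):
    """Group test-path status by Microjourney so progress reads in business terms."""
    summary = {}
    for journey, *_details, status in plan:
        summary.setdefault(journey, Counter())[status] += 1
    return summary

for journey, counts in summarize(test_plan).items():
    total = sum(counts.values())
    print(f"{journey}: {counts['Passed']}/{total} paths passed, "
          f"{counts['Failed']} failed, {counts['Not run']} not run")
```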

Note: One of the advantages of this approach is that it shows the application's health in terms that business
stakeholders understand. Rather than report on features, the plan reports on specific customer Microjourneys and
business outcomes. It shows how much of the journey has undergone testing, which paths passed testing, and
which areas have issues. The business stakeholders can see what is working and what is not because the reporting
is in a language and format that they understand.

Validation tools and automated testing


Automation is about driving quality and repeatability. The System Architect's work is not complete until they have unit
tested their work and created a PegaUnit test script.
Built-in validation tools
Pega Platform includes several built-in validation tools and dashboards that provide a holistic view of the health of the
application, such as the Application Quality dashboard, which focuses on application health, and the Pega Predictive
Diagnostic Cloud (PDC), which provides a view into the health of the cloud service.

Tip: Check the Application Quality dashboard and PDC at least once per sprint.

Build up your automated test pyramid using Pega's built-in tools or third-party tools

Pega Platform's native tools support components of the test pyramid. You can also use industry-standard tools. However,
when testing a Pega Platform application, Pega tools are usually the best option.

In the following image, click the + icons to learn more about the test pyramid.


Checklist for success

Shift left and involve testing resources from the start of the project
Ensure that all user stories have acceptance criteria
Create and adhere to a DoD
Create a test plan to focus on Microjourneys and business outcomes
Use built-in validation tools
Automate your testing

To learn more about unit testing and how to use PegaUnit, see the Pega Academy module Unit Testing Rules.


Application Quality Dashboard


Application quality
The Application Quality dashboard is a Pega Platform™ tool that lets you monitor your application's health as you build. The
dashboard identifies areas within your application that need improvement. It is a Pega Express™ best practice to check this
dashboard regularly. The Application Quality dashboard provides an easy and holistic way to validate your application and
identify potential issues before they become major problems.

Tip: Check the Application Quality dashboard often. It is one of the quickest ways to validate elements of your
application's quality.

Metrics available
The dashboard looks similar to the following image and provides metrics commonly used during application development.
You see numbers, percentages, and scores along with severity metrics. Use the dashboard to investigate further,
resolve violations, and improve the application quality.

You can view the following metrics:

Rule, case, and application includes the number of executable rules (functional rules that are supported by test
coverage) and the number of case types in the selected applications.
Guardrail compliance shows the compliance score, the number of guardrail violations for the included applications,
and a graph depicting compliance score changes over time.
Test coverage shows the percentage and number of rules covered by tests, the last generation date of the application-
level coverage report for the selected applications, and a graph of changes to application-level coverage over time.
Unit testing displays the percentage and number of Pega unit test cases that passed for the selected applications, over
the period selected on the Application Quality Settings landing page. The graph illustrates the changes to the test pass
rate over time.
Case types provides guardrail score, severe guardrail warnings, test coverage, unit test pass rate, and scenario test
pass rate for each case type in the applications.
Data types shows guardrail score, severe guardrail warnings, test coverage, and unit test pass rate for each data type.
Other rules displays guardrail score, test coverage, test pass rate, the number of warnings, a list of rules with warnings,
the number and list of uncovered rules, and the number and list of failed test cases for rules that are used in the
selected applications but that are not a part of any case type.

Note: You can learn more about testing and the Application Quality Dashboard on the Pega Community.

In the following image, click the + icons to view Application Quality dashboard metrics and drill down to identify individual
issues and problem areas.


Pega Platform analytical tools


Pega Platform tools to use in each sprint
Pega Platform™ includes several built-in validation tools and dashboards to provide a holistic view of the application's health.
These allow you to quickly analyze where improvements can be made, and ensure your application adheres to the Definition
of Done (DoD) defined during the Prepare phase.

Pega Predictive Diagnostic Cloud™ (PDC) – AI-powered technology that is used throughout the application life cycle
to assess the health of your application
Performance Profiler – Identifies which part of the process might be having performance issues
Database Trace – Validates the performance of your application's interactions with the Pega database or other
relational databases
Performance Analyzer (PAL) – Provides a view of all the performance statistics that Pega Platform captures

The tools pinpoint problem areas and, in some cases, identify ways to mitigate or fix the issues. Using these tools throughout
the project life cycle, you identify potential performance, security, and stability issues earlier in the project. Because
application issues are often tied to architecture and design, they must be found and rectified as soon as possible, when it is
significantly easier and less expensive to resolve.
Tip: Incorporate Pega tools into your in-sprint testing. Include these within your formal Definition of Done (DoD) to
ensure they are used.

Application performance analysis with PDC


Pega Predictive Diagnostic Cloud™ (PDC) is an AI-powered technology used throughout the application life cycle to
assess your application's health, notify you of critical issues, and resolve performance, security, and stability problems.

PDC gathers, aggregates, and analyzes alerts, system health pulses, and guardrail violations generated from Pega Platform
applications to produce trending dashboards. PDC empowers business stakeholders and IT administrators to take
preventative action by predicting potential system performance and business logic issues and providing remediation
suggestions.

PDC identifies and helps you resolve issues like:

Application logic – Issues related to the configuration used to incorporate the business logic in your application
Connectors – Issues related to the connectors in your application
Database – Issues that occur in your database configuration
Decisioning – Issues related to Decision Strategy Manager (DSM)
Exceptions – Issues related to errors in the code
Operations – Issues related to the run-time operation of your application
Pega Platform tuning – Issues related to your configuration
Event-related – Data points that PDC receives from the systems that it monitors

PDC events include alerts and exceptions. PDC:

Checks whether your system encountered any exceptions (unplanned events that interrupt the flow of a running
program)
Investigates events that preceded system failure
Evaluates the circumstances of an issue:
Investigates events that coincided with diminished performance to find areas for improvement
Assesses whether a node that has affected performance is running correctly

For more information about accessing and using PDC, see the Pega Academy topic Analyzing application performance with
PDC.

Application performance analysis with Performance Profiler


The Performance Profiler determines which part of the process might have performance issues, or identifies the particular
step of a data transform or activity that might have a performance issue. When you use the Performance Profiler, you first
record readings of the application's performance and analyze the readings to identify problems.

To learn more about the Performance Profiler landing page, see the Pega Academy topic Analyzing application performance
with Performance Profiler.

Application performance analysis with Database Trace


The Database Trace produces a text file containing the SQL statements, rule cache hit statistics, timings, and other data that
reflect the interactions of your requestor session with the Pega Platform™ database or other relational databases. Familiarity
with SQL is not required to interpret the output.
For more information about the Database Trace, see the Pega Academy topic Analyzing application performance with
Database Trace.

Application performance analysis with the Performance Analyzer


The Performance Analyzer (PAL) provides a view of all the performance statistics that Pega Platform captures. You can
use PAL to understand the system resources consumed by processing a single requestor session.

For more information about Performance Analyzer, see the Pega Academy topic Analyzing application performance with the
Performance Analyzer.


Usability testing
Usability testing defined
Usability testing is about validating concepts with users to evaluate lovability and make sure the solution design resonates
with them. It is the process employed to evaluate a prototype or concept application by testing it with representative users,
who perform specific tasks under realistic conditions.

A usability test:

Provides tangible evidence of what is and is not working
Lets actual users determine whether the assumptions made (such as in a Design Sprint) align with the goals
Is not limited to a Design Sprint and is used often throughout an engagement
Occurs every 1-2 sprints during the development cycle to ensure that application designs are on track
May also validate the ease of use of the user interface design, so that time and resources are not spent on developing a poor user interface

Note: Pega Express™ usability testing involves users early on. It is not user acceptance testing (UAT). UAT only
occurs after an application is built.

Benefits of usability testing


User testing is fast and provides immediate feedback from the kinds of people who work with and use the solution. It takes
about 45-60 minutes to conduct as testing focuses on a defined problem space.

User test benefits include:

Unbiased results. You work with and get insights directly from end users. Your team can translate those observations
directly into action. Business decisions are driven by validated evidence rather than assumptions and opinions.
Early issue identification. Issues can be found and addressed before a single line of code is written. You avoid
the long software development lead time and can address issues iteratively.
Time savings. Precious time and resources are not wasted rewriting code.
Pain avoided. You may discover and avoid user pain points that might otherwise have been overlooked.
Early positive feedback. You learn quickly how satisfied users are with the product.
Risk mitigation. You avoid issues and increase the team's and Product Owners' confidence that the released application
will be adopted.
Positive experience. User testing gives end users an early experience with the solution. That drives loyalty, which may
ultimately result in an increase in market share for the company.
Cost savings. You avoid costly research labs or having to wade through lengthy reports.

Frequency of usability testing


Conduct usability testing early, often, and continuously. Incorporate usability testing and feedback into the project cadence
and execute user test sessions whenever there is a need to validate assumptions. In a Pega Express project, testing is an
iterative process. Each iteration reduces risk, allowing your team to adapt to feedback.

At a minimum, execute usability testing:

On Day 5 of a Design Sprint
Every 1-2 Sprints during the Build phase of your project

Usability test planning and execution


Whether in-person or in a virtual setting, work together with your experience design team on the details of staging a
usability test.

To design an effective user test:

1. Start with a research goal: Define the problem that you want to solve and the questions you want to answer before
testing.
2. Select a scenario: Determine where you want the user to begin and set the stage for them. Doing so provides context
for the user and gives the interviewer a few opening lines to continue with the rest of their questions.
3. Write an interview script: Prepare a series of open-ended questions to observe the user’s reactions to the tasks you
want them to perform during the test. Line up the questions with the tasks you want the user to complete.
4. Do a dry run: After drafting an interview script, make sure it syncs with the prototype or application. Then, conduct a dry
run with the team. Elect one person to go through the entire prototype, narrating as they go, while everyone observes.
5. Run the test: Welcome participants with a brief introduction to help the end users (testers) feel at ease.
6. Debrief: After running the test, ask users to share additional thoughts, comments, and reactions. Document
your observations and capture their feedback to distill the test results into final findings and the next steps.

Tip: Think about what level of fidelity your project needs. Do you need to get a single idea across? Can a (low
fidelity) paper prototype work? Or do you need a more robust version with a (high fidelity) clickable prototype as
close to the end state as possible?

Final considerations
User research is a team sport. Consider who is best suited to take on the various roles of:

Writing the interview script
Interviewing
Observing
Documenting observations and findings

You also want to consider:

Whether the user test is held in-person or in a virtual setting
What kind of experience design support you need
Who on the client stakeholder team should be involved
Who to invite to participate in and audit the tests (work with the team leads to come up with a list)

Once each user test event is done, analyze the results, and share them with your project team. Then decide on the next
actions together. Consider presenting any surprising results, insights, and facts to the wider community and senior
management.

Tip: Make the findings report interesting to consume: use direct quotes, keep it visual, and keep it short. Think of the
report as a highlight reel.

Crucial testing approaches
Testing with Pega Express
The Pega Express™ delivery approach is designed to ensure that your team delivers an application that works. There are
multiple kinds of testing to verify that your MLP delivers on the expected business outcomes and works fast enough not to
frustrate end users. Four kinds of testing are used during the Build phase:

Unit testing
Scenario testing
Security testing
Performance testing

In the image below, you see what each kind of testing accomplishes.

Unit testing: The least resource-intensive kind of testing. Test each unit when it is configured.

Scenario testing: A resource-intensive process, because it requires hands-on testing by end users.

Security testing: Performed in partnership with the client IT team using a checklist.

Performance testing: Measures how long it takes to complete transactions; a speed test.

Unit testing
A System Architect or developer performs unit testing on their work or a peer's work. It is the least expensive form of testing.
Unit testing lets you verify that the application is configured correctly by testing the smallest units of functionality. In a Pega
Platform™ application, the smallest unit is an individual rule. The purpose of unit testing is to verify that each element of the
application, such as a decision table or a report definition, works as expected.

Unit testing reduces the risk of a configuration error in one rule affecting others and delaying case processing. Use unit
testing to find and eliminate configuration errors. By unit testing the individual rules when you configure them, you know that
each rule works as expected.
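
In Pega Platform, unit tests are recorded as PegaUnit test cases against individual rules rather than written by hand. Purely as an illustration of the underlying idea, the following pytest-style sketch asserts the expected output of a hypothetical decision-table-like function; the function, thresholds, and outcomes are invented for the example.

```python
# Generic illustration of unit testing the smallest unit of logic.
# The loan_route function and its thresholds are hypothetical; in a Pega
# application the equivalent check is recorded as a PegaUnit test case
# against an individual rule, such as a decision table.

def loan_route(credit_score: int, amount: float) -> str:
    """Toy stand-in for a decision table that routes a loan application."""
    if credit_score >= 700 and amount <= 500_000:
        return "AutoApprove"
    if credit_score >= 620:
        return "ManualReview"
    return "Decline"

def test_auto_approve_for_strong_applicant():
    assert loan_route(720, 250_000) == "AutoApprove"

def test_manual_review_for_borderline_score():
    assert loan_route(650, 250_000) == "ManualReview"

def test_decline_for_low_score():
    assert loan_route(600, 100_000) == "Decline"
```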

You can avoid common testing mistakes by reviewing documentation on the Pega Express Delivery Resources page, and
you can find detailed instructions within the Pega Academy module Unit Testing rules.

Scenario testing

Pega Platform provides user-interface (UI) based scenario testing to confirm that the end-to-end cases work as expected. As
part of the Pega Express delivery approach, it is vital to validate that your target end users can use the application features
as designed and that the application behaves as expected.

For detailed instructions on how to set up and structure scenario tests, see the Pega Academy topic Scenario testing.

In the image below, you see how scenario testing works.

Creating scenario tests: Record a set of interactions for a case type or portal in scenario tests. You can run these tests to
verify and improve the quality of your application.

Opening a scenario test case: You can view a list of the scenario test cases that have been created for your application and
select the one that you want to open.

Updating scenario tests: When the user interface or process flow changes, update the existing scenario test. You can save
time and effort by keeping existing tests functional instead of creating new ones.

Group scenario tests into suites: Group related scenario tests into test suites to run multiple scenario test cases in a specified
order.

Application - Scenario testing landing page: The scenario testing landing page provides a graphical test creation tool to
increase test coverage without writing complex code.
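
Pega's scenario tests are recorded through the platform UI, so no hand-written code is required. For teams that also automate end-to-end checks with industry-standard tools, a browser-automation sketch along the following lines illustrates the same idea; the URL, element locators, credentials, and case type name are placeholders, not real Pega selectors.

```python
# Rough sketch of an end-to-end UI check using a third-party tool (Selenium).
# All locators, the URL, and the credentials below are placeholder assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example-pega-app.invalid/prweb")  # placeholder URL

    # Log in as a test user (placeholder field IDs and credentials).
    driver.find_element(By.ID, "txtUserID").send_keys("test.user")
    driver.find_element(By.ID, "txtPassword").send_keys("secret")
    driver.find_element(By.ID, "sub").click()

    # Start the case type under test and walk the happy path.
    wait = WebDriverWait(driver, 20)
    wait.until(EC.element_to_be_clickable((By.LINK_TEXT, "Obtain Mortgage Loan"))).click()

    # Assert that the case reached the expected stage.
    confirmation = wait.until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, ".stage-qualification"))
    )
    assert "Qualification" in confirmation.text
finally:
    driver.quit()
```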

Security testing using a Security Checklist


Security is often set up by user role. Security testing validates that your application allows authorized users to access the
system and keeps unauthorized users out. You can test the security configuration of your MLP before release by using the
security checklist.
Pega Platform provides a Security Checklist directly in the application, as shown above. In addition, your client’s IT support
team might want to verify that the application meets their specific security compliance requirements.

Performance testing
Your team conducts performance testing to verify the speed of the application as perceived by end users. Pega Platform
applications provide a full suite of tools to monitor and address performance issues. Your testing team uses these tools
during the Build phase to ensure that the application configuration to support your MLP release performs as it should.

You can learn more about the performance tools in the help topic Understanding system resources. Use the performance
tools to collect performance statistics. Performance statistics can help you distinguish between performance issues that arise
in the Pega Platform server, the database, or external systems that are called. In all cases, the statistics can help you
determine how to improve performance.
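
The built-in tools above are the primary source of performance statistics for a Pega application. As a tool-agnostic illustration of what a basic response-time check measures, the following sketch times repeated requests against a placeholder endpoint; the URL, sample size, and threshold are assumptions for the example.

```python
# Tool-agnostic sketch of a simple response-time probe. The endpoint URL,
# sample count, and threshold are placeholder assumptions; Pega's own tools
# (PAL, Database Trace, PDC) remain the primary source of performance data.
import statistics
import time
import urllib.request

URL = "https://example-pega-app.invalid/prweb/api/v1/cases"  # placeholder
SAMPLES = 20
THRESHOLD_SECONDS = 2.0

timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read()
    timings.append(time.perf_counter() - start)

p95 = statistics.quantiles(timings, n=20)[-1]  # approximate 95th percentile
print(f"median={statistics.median(timings):.2f}s p95={p95:.2f}s")
assert p95 <= THRESHOLD_SECONDS, "95th percentile response time exceeds target"
```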

You can find downloadable testing documents (such as PDFs that describe how unit testing and performance testing
work) on the Pega Express Delivery Resources page.


Security Checklist
Security is critical
Inadequate security can prevent your application from being deployed.

Most clients require approval from an IT security team that reviews the application security before the project team is
allowed to move the application to production.
Deployment Manager blocks applications from being deployed in production if the Security Checklist is not completed.

Click each image to learn more about each security area.



Security goals
Pega takes application and system security seriously. Security is a shared responsibility between Pega and our clients. This
shared responsibility supports the AIC triad: the availability, integrity, and confidentiality of your application.

Unauthorized individuals cannot access or modify the application or the data it creates and stores. Authorized individuals, in
turn, only have access to those application functions and data that are necessary to perform their jobs.

In the following image, click the + icons to learn more about each security goal.



Security Checklist
The Security Checklist is a key feature of Pega Platform that assists clients in hardening their applications and systems. To
assist in tracking the completion of the tasks in the checklist, Pega Platform automatically installs an application guideline
rule instance that includes the tasks in the Security Checklist for each version of your application. For more information,
see Assessing your application using the Security Checklist.

In the following image, click the + icons to learn more about how the Security Checklist helps you secure your application.


Guardrail compliance
The most important security requirement for any Pega Platform application is to maintain guardrail compliance. Pega
Platform security features are not always successfully enforced when using custom code.

To protect your application, use the built-in security configuration features in Pega Platform. Do not rely on custom code built
by developers who are not security experts.

Security Checklist tasks


The Security Checklist tasks are organized by when each task is performed and the key security area involved. Key areas
include monitoring, authentication, authorization, auditing, and production testing. As you review the Security Checklist core
tasks, it is important to understand the nature of the application, what Pega Platform features are used, and how and to whom
the application will be deployed.

Not all security tasks are required for all applications or releases. The tasks you use depend on many factors, including the
Pega Platform features your application uses, how much you customize a Pega application, and the amount of sensitive data
created and stored within the application.
Note: Some steps for securing the deployment environment are performed automatically for Pega Cloud clients as
part of their Service Agreement and may be skipped.

Security is one of several special considerations for public-facing applications. For more information, see Basic requirements
for deploying public-facing applications.
