Manual Test Plan - v1


Contract #

Client Name
Revision/Date:

Test Plan

Document Revision 0.1

XX-XX-2017

Prepared By:
Delivered On:
Submitted By:
Document History
Revision | Revision Date | Summary of Changes | Author/Reviewer
0.1      |               | Issued             |

Contents
1. Introduction..............................................................................................................................4
1.1. Purpose..............................................................................................................................4
1.2. Scope.....................................................................................................................................4
1.3. References.............................................................................................................................5
2. System Overview.....................................................................................................................5
3. Test Items.................................................................................................................................6
4. Features to be Tested................................................................................................................6
5. Suggested Features.................................................................................................................7
6. Features not to be Tested..........................................................................................................8
7. Test Approach..........................................................................................................................8
7.1. Testing Levels.......................................................................................................................9
7.2. Testing Types........................................................................................................................9
7.2.1. Functional......................................................................................................................9
7.2.2. User Interface..............................................................................................................10
7.2.3. Data Validation............................................................................................................10
7.2.4. Security........................................................................................................................10
7.2.5. Performance.................................................................................................................10
7.2.6. Compatibility...............................................................................................................10
7.2.7. API testing...................................................................................................................10
7.2.8. Exploratory testing......................................................................................................10
8. Test Process............................................................................................................................10
8.1. Pass/Fail Criteria.............................................................................................................10
8.2. Deliverables....................................................................................................................11
8.2.1. Test Plan......................................................................................................................11
8.2.2. Test Cases....................................................................................................................11
8.2.3. Defect List...................................................................................................................11
8.2.4. Test Reports.................................................................................................................11
8.3. Testing Tasks..................................................................................................................11
8.4. Responsibilities...............................................................................................................12
8.5. Resources........................................................................................................................13
8.6. Schedule..........................................................................................................................13

9. Test Environment...................................................................................................................14
9.1. Hardware.........................................................................................................................14
9.2. Software..........................................................................................................................14
9.3. Testing Tools...................................................................................................................14
10. Risks and Assumptions.......................................................................................................14
11. Change Management Procedures.......................................................................................14

1. Introduction
1.1. Purpose
This is the Test Plan for the XXX project. It is intended to give all stakeholders a
general, structured, and clear view of the test process and to describe all aspects of
testing and related activities.
The main aspects are scope, approach and process, criteria, resources, environments and
tools, risks, and test deliverables. The plan identifies the items to be tested, the features
to be tested, the types of testing to be applied, the resources responsible for testing, and
the risks associated with the plan. The document defines test responsibilities and activities
and describes the tests to be conducted. The primary focus of this plan is to ensure the
quality of the XXX web application, which provides functionality to generate and visualize
reports for users.

1.2. Scope
The project entails a dual testing approach: both manual and automated testing and
quality assurance of XXX XXX.
XXX XXX is a business analytics accelerator platform that delivers a combination of web
analytics, user experience, advanced portal functionality, and integrated capabilities
across multiple analytic platforms, apps, content, and communities.
The main object of testing will be the XXX component. XXX is an interface feature where
users can create personalized dashboard views and storyboards that meet their unique
requirements by performing data discovery, blending, authoring, and assembly in one
intuitive process.
The testing will cover the following features/functionality of XXX:
- “Dataset manager” functionality within the XXX component, such as:
  - Dataset preparation (uploading, profiling, blending)
  - Dataset sharing
  - Dataset deleting
- Visualization and Storyboard authoring
- Storyboard consumption and interaction
- Storyboard sharing
- UI presentation of XXX
- Usability of XXX.

1.3. References
Document Title | Location
Meeting recording | TBD
XXX presentation video | https://www.XXXcorp.com/tutorials/XXX%204%20-%20XXX.mp4
XXX: Introduction meeting video recording | https://meetings.webex.com/collabs/#/meetings/detail?uuid=

2. System Overview
XXX is a web-based reporting tool for generating flexible dashboards and reports for
different kinds of business users.
Reports are created from a user's pre-defined dataset file. XXX supports many file types
for this purpose: Excel, Access, Cognos, and other data can be imported into the XXX
system, so the user only needs to have the data structured in one of these formats.
Everything else is done by XXX to produce a good presentation.
After the data from a dataset file is imported, the user configures which data from the
file will be used for visualizations. Any type of relationship between data tables can be
specified here. The user can also add calculations that are not included in the original
dataset file, which allows flexible data processing and statistical analysis based on the
data. The user can then configure visualizations based on the created dataset. Many visual
types are available for showing the processed data; the presentation can also be split by
geographical location to show the data on a map, and external data such as Twitter,
Yammer, weather, or stock information can be added to the created visualizations. In this
way, XXX lets users build a picture of their business processes based on their needs.
XXX is an interface feature where users can create personalized dashboard views and
storyboards that meet their unique requirements by performing data discovery, blending,
authoring, and assembly in one intuitive process.

3. Test Items
The XXX smart analytics platform is the system under test (SUT). The following items of XXX
will be tested:
- Top bar menu
- Visualizations pane
- Pages pane
- Storyboards
- Storyboard Settings
- Datasets manager

4. Features to be Tested
The following features of the XXX component will be tested:
- Datasets manager:
  - Data import:
    - xlsx, xls, csv, tsv
    - Google Sheets
    - Databases: MS Access, MySQL, MS SQL, Oracle
    - Cognos reports
    - JSON
    - SPSS files
    - XML
    - Google Analytics
    - Other - TBD

5. Suggested Features
- Implement SQL window functions
  Perform statistical analysis within a defined window.
  This can resolve tasks such as:
  - Selecting “the best” stores within every state in one report
  - TBD

6. Features not to be Tested

Parts of the system other than XXX will not be covered by testing. All testing is
performed from the user's perspective, which means that all interaction with the system
under test happens via the graphical user interface. The only exception is API testing,
where the interaction with the SUT is performed via the web service (REST or SOAP)
underlying the system.

7. Test Approach
The test approach covers the following main activities:
1. Dataset preparation
2. Visualization and Storyboard authoring
3. Storyboard consumption and interaction
4. Dataset and Storyboard sharing

Import data from some data source to XXX app


Basic test case for this step:
1. Prepare input data for test (possible types for input data: Excel spreadsheets, Database
tables, Cognos reports, etc.)
2. Specify data preparation conditions on XXX app (selected columns, aggregation rules,
etc.)
3. Generate expected result for performed import based on input data and data preparation
conditions
4. Perform import
5. Compare results of import with expected results.
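The five steps above can be sketched as an automated check. The input rows, the preparation conditions, and the apply_import() model below are illustrative stand-ins, not the real XXX import logic.

```python
# Sketch of the import-verification step. The dataset, the selected
# columns, and the aggregation rule are all illustrative examples.

def apply_import(rows, selected_columns, aggregate_by=None):
    """Model of the import: keep the selected columns and optionally
    sum the 'amount' column grouped by a key column."""
    projected = [{c: r[c] for c in selected_columns} for r in rows]
    if aggregate_by is None:
        return projected
    totals = {}
    for r in projected:
        key = r[aggregate_by]
        totals[key] = totals.get(key, 0) + r["amount"]
    return [{aggregate_by: k, "amount": v} for k, v in sorted(totals.items())]

# 1. Prepare input data (stand-in for an Excel sheet or database table).
input_rows = [
    {"store": "A", "state": "NY", "amount": 10},
    {"store": "B", "state": "NY", "amount": 5},
    {"store": "C", "state": "CA", "amount": 7},
]
# 2-3. Preparation conditions and the expected result derived from them.
expected = [{"state": "CA", "amount": 7}, {"state": "NY", "amount": 15}]
# 4-5. Perform the "import" and compare with the expected result.
actual = apply_import(input_rows, ["state", "amount"], aggregate_by="state")
assert actual == expected
```

In the real test case, apply_import() would be replaced by the actual import performed through the XXX UI or API, while the expected-result generation stays in the test harness.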

Configure visualization for imported data in XXX


Basic test case for this step:
1. Specify visualization settings on XXX app (graph type, colors, shapes, effects, additions,
etc.)
2. Generate the expected result for the created visualization based on the specified customization
3. Create visualization
4. Compare created visualization with expected results

Configure visualization for page in XXX


Basic test case for this step:
1. Create a report set (page)
2. Add several report visualizations connected to the same dataset to the page
3. Filter the data on one report
4. Verify that the changes are reflected in the rest of the reports on the report set (page)
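The cross-report filtering expectation above can be modeled as a small sketch: all reports on a page share one dataset, so a filter applied from any report must affect every report. The class and field names are illustrative, not the real XXX internals.

```python
# Minimal model of a page whose reports share one dataset: filtering
# from one report updates a shared filter state seen by all reports.

class Page:
    def __init__(self, dataset):
        self.dataset = dataset   # rows shared by every report on the page
        self.filters = {}        # page-level filter state
        self.reports = []        # report names on this page

    def add_report(self, name):
        self.reports.append(name)

    def filter_on(self, report, column, value):
        # The originating report does not matter: the filter is shared.
        self.filters[column] = value

    def visible_rows(self, report):
        return [r for r in self.dataset
                if all(r.get(c) == v for c, v in self.filters.items())]

dataset = [{"region": "East", "sales": 3}, {"region": "West", "sales": 8}]
page = Page(dataset)
page.add_report("bar_chart")
page.add_report("pie_chart")
page.filter_on("bar_chart", "region", "West")
# Every report on the page now shows only the filtered rows.
assert all(page.visible_rows(r) == [{"region": "West", "sales": 8}]
           for r in page.reports)
```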

Report sharing in XXX


Basic test case for this step:
1. Generate reports
2. Share them with users or groups
3. Verify that the target users can see the shared visualized reports
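The sharing check can be sketched as follows: a report pushed to users or groups must become visible to exactly those targets and no one else. The sharing model here is an assumption for illustration, not the real XXX permission system.

```python
# Minimal model of report sharing and the visibility check.

class ReportStore:
    def __init__(self):
        self.shares = {}   # report -> set of users it is shared with

    def share(self, report, users):
        self.shares.setdefault(report, set()).update(users)

    def can_see(self, user, report):
        return user in self.shares.get(report, set())

store = ReportStore()
store.share("q1_dashboard", {"alice", "bob"})
assert store.can_see("alice", "q1_dashboard")      # target user sees it
assert not store.can_see("eve", "q1_dashboard")    # non-target user does not
```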

7.1. Testing Levels


Testing will be performed at the system and acceptance levels.

7.2. Testing Types


The following testing types will be applied during test execution:
- Functional
- UI
- Data Validation
- Security (sharing functionality)
- Performance
- Compatibility (cross-browser, cross-platform, different screen resolutions)
- API testing
- Exploratory testing
7.2.1. Functional
Functional testing is conducted to evaluate the compliance of a system or component with
specified functional requirements. Functional testing should include both positive and
negative testing:
- Positive testing uses valid input and verifies that the system behaves as required: no
error is shown when none is expected, and the expected result appears.
- Negative testing uses invalid input and verifies that the system rejects it gracefully:
the proper error is shown when it should be, and no spurious errors appear.
Functional testing is conducted using test design techniques, which help prepare the
optimal documents and perform the optimal steps during testing.
The techniques are the following:
- Decision table (DT): used to validate the mapping between inputs and outputs;
- State transition: used to write test scenarios in order with respect to the
functionality's states;
- Error guessing: used to anticipate defects in the software before testing starts;
- Error handling: used to verify the correctness of error messages.
Deliverables: Test Cases.
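As an illustration of the decision-table technique, the sketch below maps every combination of input conditions to an expected outcome for a hypothetical "share dataset" action. The business rule itself is an assumption for the example, not taken from the XXX requirements.

```python
# Decision table: each combination of conditions (is the caller the
# owner? does the share target exist?) maps to one expected outcome,
# so every combination yields a test case.

from itertools import product

def expected_outcome(is_owner, target_exists):
    # Hypothetical rule: only an owner can share, and only with an
    # existing user or group; a missing target is reported first.
    if not target_exists:
        return "error: unknown target"
    if is_owner:
        return "shared"
    return "error: permission denied"

decision_table = {
    (True, True): "shared",
    (True, False): "error: unknown target",
    (False, True): "error: permission denied",
    (False, False): "error: unknown target",
}

# Validate the mapping between inputs and outputs for all combinations.
for combo in product([True, False], repeat=2):
    assert expected_outcome(*combo) == decision_table[combo]
```

In practice the table rows become the test-case catalogue, and the loop becomes one parameterized test per row.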

7.2.2. User Interface


UI testing is non-functional testing conducted to evaluate the product's visual
representation, the accuracy of its graphical elements, and its visual attractiveness to
users under specified conditions.
XXX has its own unique user interface components, specifically designed for the product to
improve the user experience. Given that, the UI has to be thoroughly tested.

7.2.3. Data Validation


Data comparison testing validates that the same data presented in different places is identical.

7.2.4. Security
Security testing validates that important and sensitive user data is properly secured.
Security testing is especially important for a system in which data leaks or unauthorized
modifications are unacceptable.
In XXX, users deal with datasets, so it is very important that users are not able to
access, modify, or delete other users' data without the proper permissions.

7.2.5. Performance
Performance testing measures the system's performance under various load scenarios.
As XXX may be used by many users simultaneously, and large datasets may be uploaded into
the system, it is important to ensure that performance does not prevent users from using
XXX.

7.2.6. Compatibility
Compatibility testing is a type of testing that evaluates the interoperability of the system with
different environments: operating systems, browsers and versions, screen resolutions, etc.
Users should be able to run XXX from within different browsers (Chrome, Firefox, Edge,
Internet Explorer, Safari), on an OS of their choice, and using screens of different resolutions.
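The compatibility matrix can be enumerated explicitly so that no browser/OS/resolution combination is silently skipped. The entries below are the ones named in this section; the platform constraints encoded in valid() are general knowledge, not project requirements, and should be adjusted as the project dictates.

```python
# Enumerate the compatibility configurations from the section above,
# filtering out pairs that cannot exist on real platforms.

from itertools import product

browsers = ["Chrome", "Firefox", "Edge", "Internet Explorer", "Safari"]
systems = ["Windows", "macOS", "Linux"]
resolutions = [(1366, 768), (1920, 1080), (2560, 1440)]

def valid(browser, os, _resolution):
    # Safari only ships on macOS; IE and legacy Edge only on Windows.
    if browser == "Safari":
        return os == "macOS"
    if browser in ("Edge", "Internet Explorer"):
        return os == "Windows"
    return True

matrix = [c for c in product(browsers, systems, resolutions) if valid(*c)]
# 2 cross-platform browsers x 3 OSes x 3 resolutions, plus
# 3 single-platform browsers x 1 OS x 3 resolutions = 27 configurations.
assert len(matrix) == 27
```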

7.2.7. API testing


API testing verifies the system's underlying API (web services, REST/SOAP endpoints).
API testing is performed with the help of an HTTP client library and is fully automated.
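An API-level check typically validates the structure of the payload an endpoint returns. The endpoint path and JSON fields below are hypothetical; a real test would fetch the payload with an HTTP client library instead of using the canned sample shown here.

```python
# Structural check for the response of a hypothetical
# GET /api/datasets/<id> endpoint. In a real run, the payload would
# come from an HTTP client; here a canned sample stands in for it.

def check_dataset_payload(payload):
    """Assert the (assumed) structural contract of a dataset response."""
    assert payload["status"] == "ready"
    assert isinstance(payload["columns"], list) and payload["columns"]
    assert payload["rowCount"] >= 0

sample = {  # canned stand-in for an HTTP response body
    "status": "ready",
    "columns": ["state", "amount"],
    "rowCount": 1500,
}
check_dataset_payload(sample)
```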
7.2.8. Exploratory testing
Exploratory testing is all about discovery, investigation, and learning; its focus is on
testing as a "thinking" activity.

8. Test Process
8.1. Pass/Fail Criteria
A build is considered successful when no serious cosmetic or functional bugs are found.
Cosmetic issues that cause a software version to fail:
- Some windows/dialogs cannot be displayed;
- UI elements are not aligned or are missing;
- Messages do not contain correct information.
Functional issues that cause a software version to fail:
- A system component fails to work, or does not work in accordance with the requirements;
- The expected result does not appear after some action on the system.
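The pass/fail rule can be stated as a single predicate over the list of open defects. Treating Major and above as build-failing is an assumption consistent with the "serious bugs" wording above and with the priority scale used for the defect list; the threshold is a project decision.

```python
# Sketch of the pass/fail criterion: a build passes only when no open
# defect is serious enough to fail the version. The Major-and-above
# threshold is an assumption, not a stated requirement.

FAILING = {"Blocker", "Critical", "Major"}

def build_passes(open_defects):
    """open_defects: list of (summary, priority) tuples."""
    return not any(priority in FAILING for _, priority in open_defects)

assert build_passes([("typo in tooltip", "Trivial")])
assert not build_passes([("share dialog does not open", "Blocker")])
```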
8.2. Deliverables
The following will be delivered once the testing is completed:
- Test Plan
- Test Cases
- Defect List
- Test Report
8.2.1. Test Plan
The Test Plan should be updated according to changes in the requirements. Daily activities
on the project are monitored via progress control, which aims to show whether the project
is on schedule. It also allows the team to react quickly to identified issues and resolve
them, ensuring successful completion of the work by the milestone dates.

8.2.2. Test Cases


Test cases should be developed and used during testing. This document is intended to guide
QA engineers in creating good test cases, which are an important part of any quality
assurance process. A test case should be concise, clear, and concrete, and easy to follow
for a test executor or automation engineer. Taken together, a set of good test cases will
cover the target functionality of the system.

8.2.3. Defect List


All found issues should be entered into Jira, the issue-tracking system.
Each issue has one of the following priorities:
- Blocker: the most critical issues; the system cannot be used while such an issue is
open;
- Critical: the issue is causing a problem and requires urgent attention;
- Major: issues that are important and should be fixed, but that do not stop the rest of
the system from functioning;
- Minor: issues with a relatively minor impact on the product, but which still need to be
fixed;
- Trivial: issues with the least impact on the product.
8.2.4. Test Reports
The Test Report should be updated after each tested version.
8.3. Testing Tasks
There are the following main testing tasks:
- Test Analysis
- Documentation Development:
  - Test Plan
  - Test Cases
- Test Design Execution
- Test Results Analysis
- Test Design Updating
- Test Result Summarizing

8.4. Responsibilities

Role: BA Consultant
- Interaction with the customers whenever required.
- Selection of the right test tools after interacting with the customers.
- Carrying out continuous test process improvement with the help of metrics.
- Checking the quality of requirements and how well they are defined.
- Tracing test procedures with the help of a test traceability matrix.
- Relating metrics to all goals.
- Understanding the business strategy.
- Developing and maintaining the BI strategy.
- Conducting accessibility assessments, design reviews, technical reviews, and quality
assurance reviews of design document submissions.

Role: Team lead
- Defining the testing activities for subordinates (testers or test engineers).
- All responsibilities of test planning.
- Checking that the team has all the necessary resources to execute the testing
activities.
- Checking that testing goes hand in hand with software development in all phases.
- Preparing the status report of testing activities.
- Required interactions with customers.
- Regularly updating the project manager about the progress of testing activities.

Role: Test engineers
- Reading all the documents and understanding what needs to be tested.
- Based on that information, deciding how it is to be tested.
- Informing the test lead about the resources that will be required for software testing.
- Developing test cases and prioritizing testing activities.
- Executing all the test cases and reporting defects, defining severity and priority for
each defect.
- Carrying out regression testing every time changes are made to the code to fix defects.

8.5. Resources
Role | Person
BA Consultant |
Team lead |
Test engineer |

8.6. Schedule

Component: System investigation
Duration: 2 weeks
Description: During this phase, the whole team will plan the scope of the project and
evaluate the complexity of the problem and the effort required to complete it. The system
investigation includes requirements analysis and exploratory testing. Requirements
analysis in XXX encompasses the tasks that go into determining the needs or conditions to
meet for new or altered functionality of the system.

Component: Dataset preparation
Duration: 1 week
Description: Manual testing will cover several types of data creation: personal files
(.csv, .tsv, .txt, .xlsx, .xls, .sav), enterprise data sources (Cognos data source,
Cognos BI package), and the Web. The PageObject module will grow and encapsulate more UI
parts of the system.

Component: Visualization for imported data in XXX
Duration: 1 week
Description: At this stage, the first test cases will specify visualization settings
(graph type, colors, shapes, effects, additions, etc.).

Component: Storyboard consumption and interaction
Duration: 2 weeks
Description: Important components of XXX will be encapsulated by the framework. We will
start with simple functionality test cases such as creating, updating (renaming), and
deleting storyboards.

Component: Top menu bar
Duration: 2 weeks
Description: There are many menus (buttons or icons) on the top bar, which in turn open
and hide different panes (panels), e.g. the Visualizations pane and the Pages pane.

Component: Data sharing
Duration: >1 month
Description: Manual testing will consist of report sharing and sharing datasets between
users; verify that target users can see the shared visualized reports.

9. Test Environment
TBD
9.1. Hardware
Provide the project hardware characteristics.
Device name | Device Type | Processor | CPU | Memory
            | Laptop / Virtual Machine / Mobile, etc. | | |

9.2. Software
Provide the project software characteristics.
Device name | OS | Additional software

9.3. Testing Tools


Provide the testing tools.
Tool Name | Tool Purpose

10. Risks and Assumptions
Provide the Risk Management Plan information. Identify the risks to the project’s success,
including the probability of their occurrence, level of impact, and a brief description of the
Mitigation and/or Contingency Strategy for each.
TBD

11. Change Management Procedures


Describe the change control process. Ideally, this process is a company standard that is
repeatable and applied on most or all projects when a change is necessary. Changes to any
project should be carefully considered, and the impact of each change should be clear, in
order to make any approval decisions. This section therefore identifies who has approval
authority for changes to the project, who submits changes, and how they are tracked and
monitored.
TBD
