Health Ontology Mapper
uETL Test Plan

 

 

Document Change History

Version Number | Date             | Contributor      | Description
V1.0           | 22 December 2023 | Hillari V. Allen | Draft


Table of Contents

 

1 Introduction

1.1 Scope

1.2 Quality Objective

1.3 Roles and Responsibilities

1.4 Assumptions for Test Execution

1.5 Constraints for Test Execution

1.6 Definitions

2 Test Methodology

2.1 Purpose

2.2 Test Levels

2.3 Bug Regression

2.4 Bug Triage

2.5 Test Completeness

3 Test Deliverables

3.1 Deliverables Matrix

3.2 Documents

3.3 Defect Tracking & Debugging

3.4 Reports

3.5 Testing Tools

3.6 Test Environment

3.7 Bug Severity and Priority Definition

3.8 Bug Reporting

4 Terms/Acronyms

 

 

 

 

 

 


1        Introduction

This test approach document describes the strategies, processes, workflows and methodologies used to plan, organize, execute and manage testing of the uETL project. 

 

1.1    Scope

The scope of testing is to assure that the uETL data extraction process from original sources is accurate and complete.

1.1.1 In Scope

The uETL Test Plan defines the unit, integration, system, regression, and user acceptance testing approach.  The test scope includes the following:

  •     Testing of all functional, application performance, security and use case requirements listed in the Use Case document
  •     Quality requirements
  •     End-to-end testing and testing of interfaces of all systems that interact with uETL

1.1.2 Out of Scope

The following are considered out of scope for the uETL Test Plan and testing effort:

  •     Functional requirements testing for systems outside uETL
  •     Testing of disaster recovery and business continuity plan

1.2    Quality Objective

1.2.1 Primary Objective

A primary objective of testing uETL is to assure that the system meets the full requirements, including quality requirements and fit metrics for each quality requirement, satisfies the use case scenarios, and maintains the quality of uETL.  At the end of the uETL project development cycle, the BA/End-User should find that uETL has met or exceeded all of their expectations as detailed in the requirements. 

 

1.2.2 Secondary Objective

The secondary objective of testing uETL is to identify and expose all issues and associated risks, communicate all known issues to the uETL team, and ensure that all issues are addressed in an appropriate manner before release.  This requires careful and methodical testing of the application to first ensure all areas of the system are scrutinized and, consequently, that all issues (bugs) found are dealt with appropriately. 

1.3    Roles and Responsibilities

uETL roles are defined as follows:

1.3.1 Developer

(a) Develop/maintain the system/application

(b) Develop use cases and requirements in collaboration with the Business Analyst

(c) Conduct alpha unit, system, regression, and integration testing

(d) Support user acceptance testing

   (e) Consists of members of the Infrastructure Team

1.3.2 Tester

Manages the entire testing process, workflow and quality with activities and responsibilities to:

 

(a) Conduct beta unit, system, regression, and integration testing

   (b) Manage testing integrity

   (c) Support development activities

(d) Document testing activities

1.3.3 Business Analyst

Undertakes formal End-User acceptance and End-User communications for solutions adopted by Developers.  Responsible to:
 

(a) Contribute to use case, requirement development through review

(b) Contribute to development and execution of the development test scripts through review

(c) Conduct full user acceptance, regression, and end-to-end testing; this includes identifying testing scenarios and providing feedback to Developers

 

1.4    Assumptions for Test Execution

  •     For user acceptance testing, the Development/Testing teams have completed unit, system and integration testing and met all the requirements based on the Requirements Traceability Matrix.
  •     User acceptance testing will be conducted by the Business Analyst
  •     Test results will be reported and maintained in uETL Testing.  Failed scripts and defects will be reported with evidence and sent to the Development team directly
  •     Use cases have been developed by the Business Analyst for user acceptance testing and reviewed by the Development team
  •     Test scripts are developed and approved by the Business Analyst
  •     The Test Team will support and provide appropriate guidance to the Business Analyst and Developers to conduct testing
  •     Major dependencies should be reported immediately after the testing kickoff meeting. 

1.5    Constraints for Test Execution

The following constraints apply to test execution:

 

  •     Developers will support ongoing testing activities based on priorities
  •     Test scripts must be approved by the Business Analyst prior to test execution
  •     Test scripts, the test environment and any dependencies should be addressed during the testing kickoff meeting in the presence of the Infrastructure Team, and the request list should be submitted in due time to allow preparation
  •     The Developer cannot execute user acceptance testing.  After alpha testing, the developer will make known any development issues they encounter to resolve, but reporting of the issues/incidents is not required.

1.6    Definitions

Bug - Any error or defect that causes uETL (software or hardware) to malfunction, or behavior that is included in the requirements but does not meet the required workflow, process or function point.

 

Enhancement - Any alteration or modification to the existing system that improves workflow or process.  Anything NOT included in the requirements can be categorized as an enhancement. 

2        Test Methodology

2.1    Purpose

2.1.1 Overview

The list below is not intended to limit the extent of this test plan and can be modified as necessary.

 

The purpose of the test plan is to achieve the following:

  • Define testing for each area and sub-area to include all the functional and quality requirements
  • Divide the Design Spec into testable areas and sub-areas.  Be sure to identify areas that are to be omitted (not tested) as well
  • Identify testing risks
  • Share test findings

2.1.2 Usability Testing

The purpose of usability testing is to ensure that the new components and features will function in a manner that is acceptable to the BA/End-User. 

 

Development will create a non-functioning prototype of the UI components to evaluate the proposed design.  Usability testing can be coordinated by the testing team, but the actual testing must be performed by a non-tester (in this case, the Business Analyst).  Testing will review the findings and provide the development team with its evaluation of the impact these changes will have on the testing process and on uETL as a whole.

2.1.3 Unit Testing

Unit Testing is conducted by the Developer during the code development process to ensure that proper functionality and code coverage have been achieved by each developer, both during coding and in preparation for acceptance into functional and quality testing. 

 

The following are the areas of the uETL that must be unit-tested and signed-off before being passed on to regression testing:

  • Databases
  • Stored Procedures
  • Triggers
  • Tables
  • Indexes
  • Services
  • Database conversion
  • Other binary-formatted executables

2.1.4 Iteration/Regression Testing

During the repeated cycles of identifying bugs and taking receipt of new builds (containing bug fix code changes), there are several processes which are common to this phase across all projects.  These include the various types of tests: functionality, performance, stress, configuration, etc. 

 

During testing, ad-hoc debriefings will be held.  All identified bugs will be communicated and addressed.  At a minimum, all priority bugs should be resolved prior to entering the beta phase.

 

Important deliverables required for acceptance into Final Release testing include:

  • Installation/Configuration procedures
  • Any user instructions
  • All test scripts, requirements, training docs, etc.

2.1.5 Final Release Testing

The Testing team, with the Business Analyst, participates in this milestone process by providing confirmation feedback on any new issues uncovered and input based on identical or similar issues detected earlier.  The intention is to verify that uETL is ready for distribution, is acceptable to the customer, and to iron out potential operational issues. 

 

Assuming critical bugs are resolved during the previous iterations of testing, the testing team will continue verifying the stability of the application through regression testing (existing known bugs, as well as existing test cases). 

 

The milestone target is to establish that the application in Test phase has reached a level of stability, appropriate for its usage and that it can be released to the uETL community (end users).

2.1.6 Testing Completeness Criteria

Release to Production can occur only after the successful completion of the application under test throughout all of the phases and milestones discussed above. 

 

The milestone target is to place the tested build into Production after it has been shown that the app has reached a level of stability that meets or exceeds expectations as defined in the Requirements and Functional Specifications.

2.2    Test Levels

Testing uETL can be broken down into three primary categories and several sub-levels.  The three primary categories include tests conducted every build (Build Tests), tests conducted every major milestone (Milestone Tests), and tests conducted at least once every project release cycle (Release Tests). The test categories and test levels are defined below:

2.2.1 Build Tests

2.2.1.1                  Level 1 - Build Acceptance Tests

These test cases simply ensure that the application can be built and installed successfully.

This verifies that the Infrastructure is ready.

 

This is performed by the Tester in the push from DEV to TEST in the life cycle, with basic testing to assure the app is functioning properly in that environment.

 

2.2.1.2                  Level 2 - Smoke Tests

These test cases verify the major functionality at a high level.

The objective is to determine if further testing is possible.  These test cases should emphasize breadth more than depth.  All components should be touched, and every major feature should be tested briefly by the Smoke Test.  If any Level 2 test case fails, the build is returned to developers untested.

 

This is performed by the Tester with the Developer (for custom apps) after the push to TEST.  The Tester is to document and script the planned process as the application process is run end-to-end.  In this case, the Epic table data to be extracted are:

 

PAT_ENC

ORDER_MED

PATIENT

PATIENT_RACE

CLAIM_INFO

CLAIM_INFO2

 

2.2.1.3                  Level 2a - Bug Regression Testing

Every bug that was noted during the previous build, but that Developers have marked “Fixed”, will need to be regressed (re-tested).

2.2.2 Milestone Tests

2.2.2.1                  Level 3 - Critical Path Tests

Critical Path test cases target the features and functionality that the user will see and use every day.

The Critical Path test cases must all be executed at least once during the Iteration cycle, and once during the Final Release cycle.

 

This is performed by the Tester in TEST, via automated scripts using various scenarios, prior to the push to PROD.  The same is performed in PROD once the Tester has notified Management and the Development Team that the application has passed standard functionality testing.

 

2.2.3 Release Tests

2.2.3.1                  Level 4 - Standard Tests

Test cases that need to be run at least once during the entire test cycle.  These cases are run once, not repeated as are the test cases in previous levels.  They include Functional Testing and Detailed Design Testing (Functional Spec and Design Spec test cases, respectively).  These can be tested multiple times for each Milestone Test Cycle (Iteration, Final Release, etc.). 

Standard test cases usually include Installation, Data, GUI, and other test areas.

 

This test phase will include data item comparison counts (for source > proxy > IDR) and validation of large data sets via random data selection of a minimum of 1 per 1,000 data points.
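The 1-per-1,000 random selection can be sketched as follows.  This is an illustrative helper, not part of uETL: the function name and seeding are assumptions, and each sampled item would then be compared by hand (or script) across source, proxy, and IDR.

```python
import random

# Illustrative sketch of the "1 per 1,000 data points" random validation
# described above.  The helper name and seeding are assumptions; each
# sampled item would then be compared across source, proxy, and IDR.
def sample_for_validation(rows, rate=1000, seed=None):
    """Pick at least one row per `rate` rows for manual comparison."""
    rng = random.Random(seed)
    k = max(1, len(rows) // rate)
    return rng.sample(rows, k)
```

For a 5,000-row extract this yields 5 sampled rows; very small extracts still get at least one sample.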

 

2.2.3.2                  Level 5 - Suggested Test

These are Test Cases that would be nice to execute, but may be omitted due to time constraints.

 

2.3    Bug Regression

Bug Regression will be a central tenet throughout all testing phases.  

All bugs that are resolved as “Fixed, Needs Re-Testing” will be regressed when the Tester is notified of the new build containing the fixes.  When a bug passes regression, it will be considered “Closed, Fixed”.  If a bug fails regression, the Tester will notify the development team immediately.  When a Severity 1 bug fails regression, the Tester should also send an immediate email to Development.  The Tester will be responsible for tracking and reporting to development and Project Management the status of regression testing.
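The regression status flow above can be sketched as a small transition table.  The “Fixed, Needs Re-Testing” and “Closed, Fixed” statuses come from this plan; the failure status name below is an assumption for illustration.

```python
# Illustrative sketch of the regression status flow described above.
# The "Fixed, Needs Re-Testing" and "Closed, Fixed" statuses come from
# the plan; the failure status name is an assumption.
TRANSITIONS = {
    ("Fixed, Needs Re-Testing", "pass"): "Closed, Fixed",
    ("Fixed, Needs Re-Testing", "fail"): "Open, Failed Regression",
}

def regress(status, result):
    """Return the next status after a regression run ('pass' or 'fail')."""
    key = (status, result)
    if key not in TRANSITIONS:
        raise ValueError(f"no transition for {status!r} with result {result!r}")
    return TRANSITIONS[key]
```

Keeping the allowed transitions in one table makes it easy for the Tester to report exactly which bugs moved to “Closed, Fixed” per build.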

 

2.4    Bug Triage

Bug Triages will be held throughout all phases of the development cycle. 

 

The Tester will provide feedback and reports on bugs to the Project Team.  The purpose of the triage is to determine the type of resolution for each bug and to prioritize and determine a schedule for all “To Be Fixed” bugs.  Development will then assign the bugs to the appropriate person for fixing and report the resolution of each bug back to the Tester.  The Tester will be responsible for tracking and reporting on the status of all bug resolutions.  The Developer is responsible for providing configuration documentation for all code changes.

 

2.5    Test Completeness

Testing will be considered complete when the following conditions have been met:

2.5.1 Standard Conditions:

  • The Testing Team and Developers agree that testing is complete, the app is stable, and the application meets functional requirements
  • Script execution of all test cases in all areas has passed
  • All priority 1 and 2 bugs have been resolved and closed
  • The Business Analyst/Tester approves the test completion

3        Test Deliverables

Testing will provide specific deliverables during uETL.  These deliverables fall into three basic categories: Documents, Test Cases / Bug Write-ups, and Reports.  Here is a diagram indicating the dependencies of the various deliverables:

 

 

As the diagram above shows, there is a progression from one deliverable to the next.  Each deliverable has its own dependencies, without which it is not possible to fully complete the deliverable.

 

The following matrix depicts all of the deliverables that Testing will use.

 

 

3.1    Deliverables Matrix

Below is the list of artifacts that are process-driven and should be produced during the testing lifecycle.

 

This matrix should be updated routinely throughout the uETL development cycle in your project-specific Test Plan.

 

Deliverable

Documents
  Test Approach
  Test Plan
  Test Specifications

Test Case / Bug Write-Ups
  Test Cases / Results
  <Tool> Bug tracker for bug reporting

Reports
  Test results report
  Test Final Report - Sign-Off

 

 

3.2    Documents

3.2.1 Test Plan

The purpose of the Test Plan document is to:

  • Specify the approach that Testing will use to test uETL
  • Break uETL down into distinct areas and identify features of uETL that are to be tested.
  • Specify the procedures to be used for testing sign-off and release
  • Indicate the tools used to test uETL
  • Identify risks and contingency plans that may impact the testing of uETL
  • Specify bug management procedures for uETL
  • Specify criteria for acceptance of development drops to testing (of builds).

3.2.2 Test Specifications

A Test Specification document is derived from the Test Plan as well as the Requirements, Functional Spec., and Design Spec documents.  It provides specifications for the construction of Test Cases and includes list(s) of test case areas and test objectives for each of the components to be tested as identified in uETL’s Test Plan.

 

3.2.3 Requirements Traceability Matrix

A Requirements Traceability Matrix (RTM) is used to link the test scenarios to the requirements and use cases.  Requirements traceability is defined as the ability to describe and follow the life of a requirement in both a forward and backward direction (i.e. from its origins, through its development and specification, to its subsequent deployment and use, and through periods of ongoing refinement and iteration in any of these phases).

 

3.3    Defect Tracking & Debugging

3.3.1 Testing Workflow

DIAGRAM

3.3.2 Defect reporting

All defects should be logged so they can be addressed and debugged.  Business Analysts are also requested to send a report of bugs to the developer.  Developers will update the defect list and notify the requestor after the defect has been resolved.

Debugging should be based on Priority (High > Medium > Low).  These priorities are set by the Business Analyst and are based on how critical the test script is in terms of dependency, and mainly on the use case scenario.

All High priority defects should be addressed within one (1) day of the request and resolved/closed within two (2) days of the initial request.

Examples may include:  incorrect table mapping, no or limited data extracted, proxies not generated properly.

All Medium priority defects should be addressed within two (2) days of the request and resolved/closed within four (4) days of the initial request.

Examples may include:  infrastructure issues that cause performance delays.

All Low priority defects should be resolved/closed no later than five (5) days after the initial request.

Examples may include:  misspellings in the GUI.
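The response targets above can be expressed as a small lookup for computing due dates.  This is a sketch, not a tool this plan mandates: the day counts come from the text, while the helper names are assumptions, and Low priority (which only specifies a resolve/close target) has its "address" value set equal to it for illustration.

```python
from datetime import date, timedelta

# Illustrative sketch of the response targets above.  The day counts come
# from the plan; for Low priority only a resolve/close target is given,
# so its "address" value here is an assumption set equal to it.
SLA_DAYS = {
    "High":   {"address": 1, "resolve": 2},
    "Medium": {"address": 2, "resolve": 4},
    "Low":    {"address": 5, "resolve": 5},
}

def due_dates(priority, reported):
    """Return (address_by, resolve_by) dates for a reported defect."""
    sla = SLA_DAYS[priority]
    return (reported + timedelta(days=sla["address"]),
            reported + timedelta(days=sla["resolve"]))
```

For a High defect reported on 22 December 2023, this gives an address-by date of 23 December and a resolve-by date of 24 December.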

3.4    Reports

The Tester will be responsible for writing and disseminating the following reports to appropriate project personnel as required.

3.4.1 Testing status reports

A status report will be forwarded by the Tester to Development and filed in the uETL Plan.  This report will summarize weekly testing activities, issues, risks, bug counts, test case coverage, and other relevant metrics.

3.4.2 Phase Completion Reports

When each phase of testing is completed, the Tester will file a Testing Report with the Project Manager and Development Team for review and sign-off.

 

The document must contain the following metrics:

  • Total Test Cases, Number Executed, Number Passes / Fails, Number Yet to Execute
  • Number of Bugs Found to Date, Number Resolved, and Number still Open
  • Breakdown of Bugs by Severity / Priority Matrix
  • Discussion of Unresolved Risks
  • Discussion of Schedule Progress (are we where we are supposed to be?)

3.4.3 Test Sign-Off

Certifies the extent to which testing has actually been completed (a test case coverage report is suggested) and provides an assessment of uETL’s readiness for Release to Production.

3.5    Testing Tools

3.5.1 Tracking Tools

The TestComplete/OnTime bug tracker is used by uETL to enter and track all bugs and project issues.  The Tester is responsible for maintaining the TestComplete and OnTime databases. 

 

3.5.1.1                  Configuration Management

 

3.6    uETL User Guide for DASI_DEV schema

 

The sequence of steps to generate PatientDimension, ObservationFact, VisitDimension, ConceptDimension for DASI_PATIENTS is outlined below.

Step 1. Enter the schema as DASI_HQDEV

Step 2. Select the Database as CHESTNUT-HQ-INSTANCE from the drop-down.

The list of tables gets populated in the list box below the Database selection.

If the list needs to be populated with view names instead of table names, the modification should be made in the code (the DBOracle.java file).  The application needs to be relaunched for the changes to take effect.

 

 

Step 3. Make sure that the correct path is specified in the ‘Obs fact file name’ path field.

The file path is set to C:\\temp\\ETL_Data for the current case.

 

Step 4. Select the table DASI_PATIENTS from the table names list. Click on ‘Get Tables’ button.

The selected table gets populated in the list box, next to the table names list box.

 

 

Step 5. Select DASI_PATIENTS from the second list box and click on ‘Get Columns’ button.

All the columns in the DASI_PATIENTS table are displayed in the list box located at the top-right of the panel.  Select any column name which is not required from the selected list and click the ‘Remove’ button.  This limits the column names in the list box to the ones we require.  The ‘ID’ column is removed from the list box, as seen in the screenshot below.

 

Please note that the ‘Encounter count’ and ‘Patient count’ numbers are initialized to ‘1’.  These values will be used as the starting id numbers for ProxyEncounter and ProxyMRN respectively.  The final values of the incremented proxy encounter number and proxy MRN number are written to a file, ‘EncounterPatientNum.txt’.  When the uETL app is relaunched for subsequent processing to generate concept and observation_fact files for other tables, the numbers stored in the ‘EncounterPatientNum.txt’ file will be used to evaluate the next starting id number for the proxy encounter and proxy MRN numbers.
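The counter hand-off between runs can be sketched as follows.  The file name EncounterPatientNum.txt comes from the text above; the single-line "encounter mrn" layout inside it is an assumption, as uETL's actual file format is not documented here.

```python
from pathlib import Path

# Illustrative sketch of the counter hand-off described above.  The file
# name EncounterPatientNum.txt comes from the text; the single-line
# "encounter mrn" layout inside it is an assumption.
COUNTER_FILE = Path("EncounterPatientNum.txt")

def load_counters(default=1):
    """Return the (next proxy encounter, next proxy MRN) start ids."""
    if COUNTER_FILE.exists():
        enc, mrn = COUNTER_FILE.read_text().split()
        return int(enc), int(mrn)
    return default, default

def save_counters(next_encounter, next_mrn):
    """Persist the final incremented counters for the next uETL run."""
    COUNTER_FILE.write_text(f"{next_encounter} {next_mrn}\n")
```

On the first run (no file yet) both counters start at 1; each subsequent run picks up where the previous one left off.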

 

Step 6. At this point, for DASI_PATIENTS table which holds PatientDimension data, the ProxyMRNs_DASI_PATIENTS_0.txt and PatientDim_DASI_PATIENTS_0.txt need to be generated.

To achieve this, click on the ‘PatientDim’ tab located at the top left of the panel.  A new panel titled ‘PatientDim’ will open.  Notice that the Schema value is pre-selected to DASI_HQDEV and the columns from the ‘DASI_PATIENTS’ table are pre-populated.

 

 

 

Step 7. Enter ‘1’ for Start ProxyMRN which will be taken as input for the starting id for proxyMRNs.

 

Step 8. Enter the ‘File Path’ value as ‘C:\\temp\\ETL_Data’

 

Step 9. Select each of the column names from the ‘Columns’ list box which we want as a value in the observation_fact table and add it to the ‘Selected Columns’ list box, by clicking on the ‘Add’ button.

 

 

Step 10. Map each of the columns from ‘Selected Columns’ list box to the respective parameters from the ‘Demographics’ list box. This is done by selecting column name such as ‘MRN’ from ‘Selected Columns’ list box and selecting ‘MRN’ from ‘Demographics’ list box and by clicking ‘Apply Mapping’ button. Each column from ‘Selected Columns’ list box is paired with a relevant column from ‘Demographics’.

 

 

Step 11. Click on the ‘Generate Query’ button after mapping the respective columns.  The query ‘SELECT "MRN", "SEX", "CURRENT_AGE", "RACE", "DOB" FROM DASI_PATIENTS’ will be generated.  For our scenario, we want to limit the number of patients selected from the DASI_PATIENTS table to 3 patients with MRN numbers 123456, 98765 and 7654321.  These MRN numbers represent demo patients with reasonable data for diagnoses and labs in the dev database.  To limit the patient MRNs, the query is modified to ‘SELECT "MRN", "SEX", "CURRENT_AGE", "RACE", "DOB" FROM DASI_PATIENTS where MRN in (123456,98765,7654321)’.

 

 

Step 12. Click on the ‘Execute Query’ button.  This will generate the ProxyMRNs_DASI_PATIENTS_0.txt and PatientDim_DASI_PATIENTS_0.txt files at location ‘C:\temp\ETL_Data’.  The data from these files should be loaded into the Proxy_MRN and Proxy_MRN_Patient_Dimension database tables respectively.  The scripts to create Proxy_MRN and Proxy_MRN_Patient_Dimension are given below.

 

The following tables are required for ProxyMRNs and ProxyEncounters:

 

Table PROXY_MRN

 

CREATE TABLE [dbo].[PROXY_MRN] (
    [MRN] [bigint] NULL,
    [ProxyMRN] [bigint] NULL,
    [SourceSystem] [varchar](30) NULL,
    [CreatedDate] [datetime] NULL
) ON [PRIMARY]

 

 

Table PROXY_ENCOUNTER

 

CREATE TABLE [dbo].[PROXY_ENCOUNTER] (
    [Encounter] [bigint] NULL,
    [ProxyEncounter] [bigint] NULL,
    [MRN] [bigint] NULL,
    [ProxyMRN] [bigint] NULL,
    [SourceSystem] [varchar](30) NULL,
    [CreatedDate] [datetime] NULL
) ON [PRIMARY]

 

 

Table PROXY_MRN_PATIENT_DIMENSION

 

CREATE TABLE [dbo].[PROXY_MRN_PATIENT_DIMENSION] (
    [PATIENT_NUM] [bigint] NULL,
    [VITAL_STATUS_CD] [varchar](10) NULL,
    [BIRTH_DATE] [datetime] NULL,
    [DEATH_DATE] [datetime] NULL,
    [SEX_CD] [varchar](1) NULL,
    [AGE_IN_YEARS] [int] NULL,
    [LANGUAGE_CD] [varchar](20) NULL,
    [RACE_CD] [varchar](100) NULL,
    [MARITAL_STATUS_CD] [varchar](20) NULL,
    [RELIGION_CD] [varchar](20) NULL,
    [ZIP_CD] [varchar](20) NULL,
    [STATECITYZIP_PATH] [varchar](12) NULL,
    [PATIENT_BLOB] [varchar](100) NULL,
    [UPDATE_DATE] [datetime] NULL,
    [DOWNLOAD_DATE] [datetime] NULL,
    [IMPORT_DATE] [datetime] NULL,
    [SOURCESYSTEM_CD] [varchar](20) NULL,
    [UPLOAD_ID] [int] NULL
) ON [PRIMARY]

 

Step 13. After the data from the files is loaded to the database, close the PatientDim panel.

 

Step 14. Select all the columns from the top-right list box and click the ‘Select Columns’ button.  Doing this will populate the columns in the ‘Selected Columns’ list box.

 

 

Step 15. Select ‘MRN’ column from the ‘Selected Columns’ list box and also select ‘ProxyMRN’ parameter from the ‘Filters’ box. Map these two selections by clicking on ‘Apply Filters’ button. By performing this operation, all the MRNs retrieved from DASI_PATIENTS table will be replaced with Proxy Numbers in the background during processing.
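The background substitution this filter performs can be sketched as follows.  This is illustrative only: the mapping would normally come from the PROXY_MRN table populated in Step 12, and the in-memory dict and helper name here are hypothetical stand-ins.

```python
# Illustrative sketch of the MRN -> ProxyMRN substitution that the
# 'Apply Filters' step performs in the background.  The mapping would
# normally come from the PROXY_MRN table; this in-memory dict is a
# hypothetical stand-in.
proxy_mrn = {123456: 1, 98765: 2, 7654321: 3}

def apply_proxy_filter(rows, mrn_index=0):
    """Replace the MRN column of each extracted row with its proxy number."""
    out = []
    for row in rows:
        row = list(row)
        row[mrn_index] = proxy_mrn[row[mrn_index]]
        out.append(tuple(row))
    return out
```

After the filter runs, no real MRN from DASI_PATIENTS appears in the generated files; only proxy numbers do.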

 

Step 16. Select PatientDim from ‘i2b2 Dimension’ box and click on ‘Apply I2B2Dimension’ button.

 

 

Step 17. Click on the ‘Generate SQL Query’ button.  This will generate the query in the ‘SQL Query’ box.  The query can be customized in the ‘Custom Query’ box.  For our use case, as we want to limit the MRN numbers to 123456, 98765 and 7654321, we append a where clause to the query as follows.

 

SELECT "MRN", "SEX", "CURRENT_AGE", "RACE", "DOB" FROM DASI_HQDEV.DASI_PATIENTS where mrn in (123456,98765,7654321)

 

 

Step 18. Click on the ‘Observation Fact’ button to generate the files

Obsfact_DASI_PATIENTS_0.txt

ProxyEncounter_DASI_PATIENTS_Fake_0.txt

VisitDim_DASI_PATIENTS_Fake_0.txt

 

Data from Obsfact_DASI_PATIENTS_0.txt is populated into the Observation_fact table, and data from VisitDim_DASI_PATIENTS_Fake_0.txt is populated into the Visit_dimension table.

 

Step 19. Clicking the ‘Concept Dimension’ button generates the files

Concept_Dim_DASI_PATIENTS.txt

I2B2Metadata_DASI_PATIENTS.txt

I2B2Metadata_TableAccess_DASI_HQDEV.txt

 

The Concept_dimension, i2b2 and Table_access tables hold the data from the above three files.

SQL loader scripts were used to insert the data from the files into the database.

This concludes the process for the DASI_PATIENTS import from our schema to the i2b2 schema.

 

The same sequence of steps 1 to 19 needs to be repeated to generate patient Encounters.  The only difference is that steps 6-13 need to use the VisitDim panel instead of the PatientDim panel.  The data from the ProxyEncounter file should be populated into the Proxy_Encounter table.

 

Relevant files for Procedures and Diagnoses can be generated using the same steps outlined above.  Steps 6-13 are skipped for procedures and diagnoses; only the ICD9, ProxyMRN and ProxyEncounter filters need to be applied before file creation.

 

Concerns:

 

If data from the diagnoses and procedures is going to use the same concepts as the sample ontology for the demo data, then care must be taken to ensure that i2b2 is the main head-level node with a level value of ‘1’, and that this level is a unique row with no duplicates.

 

If new concepts are created that have a different head node, then the required entries for the new head node must be made in the table_access table, and a new metadata table needs to be created.

 

A direct dump of the data from the files generated by uETL may cause certain issues, such as duplicate rows of data at the same level in the tree structure.  Care must be taken to eliminate the duplicates.  Some of the files have additional columns beyond those expected by the respective table structure.  The invalid columns must be deleted, and data cleanup may be required in certain files.
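The cleanup described above (trimming extra columns, dropping duplicate rows before load) can be sketched as follows.  This is an illustrative pre-load pass, not a uETL feature; the expected column count for a given file is an assumption that would come from the target table definition.

```python
# Illustrative cleanup sketch for the issues described above: trimming
# extra columns and dropping duplicate rows before loading a generated
# file.  The expected column count for a given file is an assumption
# taken from the target table definition.
def clean_rows(rows, expected_cols):
    """Trim rows to the expected column count and drop exact duplicates."""
    seen, out = set(), []
    for row in rows:
        trimmed = tuple(row[:expected_cols])
        if trimmed not in seen:
            seen.add(trimmed)
            out.append(trimmed)
    return out
```

Rows that differ only in the surplus columns collapse to one row after trimming, which addresses the duplicate-at-same-level problem before the SQL loader runs.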

 

If the c_tooltip column of the i2b2 table is null, it generates an error in the i2b2 client.  The value of this column must be set to a valid concept path.

 

3.7.1 Hardware

Include the minimum hardware requirements that will be used to test uETL.

Testing will have access to one or more application/database servers separate from any used by non-test members of the uETL team.  Testing will also have access to an adequate number of variously configured PC workstations, covering the range from the minimum to the recommended client hardware configurations listed in uETL’s Requirements, Functional Specification and Design Specification documents.

3.7.2 Software

In addition to the application and any other customer specified software, the following list of software should be considered a minimum:

  • Mozilla Firefox
  • Internet Explorer
  • Safari
  • TestComplete
  • LoadComplete
  • OnTime

3.8    Bug Severity and Priority Definition

Bug Severity and Priority fields are both very important for categorizing bugs and prioritizing if and when the bugs will be fixed.  The bug Severity and Priority levels are defined in the tables below.  Testing will assign a severity level to all bugs.  The Tester will be responsible for seeing that a correct severity level is assigned to each bug.

 

3.8.1 Severity List

The tester reporting a bug is also responsible for indicating the bug Severity. 

 

Severity ID | Severity Level | Severity Description
1 | Critical | uETL crashes or the bug causes non-recoverable conditions.  System crashes, GP faults, database or file corruption, potential data loss, and program hangs requiring a reboot are all examples of a Sev. 1 bug.
2 | High | A major system component is unusable due to failure or incorrect functionality.  Queries return incorrect data.  Sev. 2 bugs cause serious problems such as a lack of functionality, or insufficient or unclear error messages that can have a major impact on the user, prevent other areas of the app from being tested, etc.  Sev. 2 bugs can have a workaround, but the workaround is inconvenient or difficult.
3 | Medium | Incorrect functionality of a component or process.  There is a simple workaround for the bug if it is Sev. 3.
4 | Minor | Documentation errors or signed-off severity 3 bugs.

3.8.2 Priority List

Priority ID | Priority Level | Priority Description
5 | Must Fix | This bug must be fixed immediately; uETL will not pass testing with this bug.
4 | Should Fix | These are important problems that should be fixed as soon as possible.
3 | Fix When Have Time | The problem should be fixed within the time available.  If the bug does not delay an important date, then fix it.
2 | Low Priority | It is not important (at this time) that these bugs be addressed.  Fix these bugs after all other bugs have been fixed.
1 | Trivial | Enhancements/good-to-have features that are out of the current scope.

 

The Testing team recognizes that the bug reporting process is a critical communication tool within the testing process.  Without effective communication of bug information and other issues, the development and release process will be negatively impacted.

 

The Tester will be responsible for managing the bug reporting process.  Standard bug reporting tools and processes will be used. 

4        Terms/Acronyms

The terms below are used as examples; please add or remove any terms relevant to the document.

 

TERM/ACRONYM | DEFINITION
UAT | User Acceptance Testing
End-to-End Testing | Tests user scenarios and various path conditions by verifying that the system runs and performs tasks accurately with the same set of data from beginning to end, as intended.
QA | Quality Assurance
RTM | Requirements Traceability Matrix
SME | Subject Matter Expert