- 2.5.14 Quality Assurance
- 2.5.14.1 Program Scope and Objectives
- 2.5.14.1.1 Background
- 2.5.14.1.2 Authority
- 2.5.14.1.3 Roles and Responsibilities
- 2.5.14.1.4 Program Management and Review
- 2.5.14.1.5 Program Controls
- 2.5.14.1.6 Terms/Definitions/Acronyms
- 2.5.14.1.7 Related Resources
- 2.5.14.2 Introduction
- 2.5.14.3 Quality Assurance Reviewing Process
- 2.5.14.3.1 Reviewing Process Roles and Skills
- 2.5.14.3.2 Review Process Control
- 2.5.14.3.3 Reviewing Process Tasks and Flow
- 2.5.14.3.3.1 Plan to Review
- 2.5.14.3.3.2 Perform the Review
- 2.5.14.3.3.3 Manage Documents
- 2.5.14.3.3.4 Monitor and Control Review Findings
- 2.5.14.3.4 Review Process Management
- 2.5.14.3.5 Review Process Review
Part 2. Information Technology
Chapter 5. Systems Development
Section 14. Quality Assurance
2.5.14 Quality Assurance
Manual Transmittal
May 05, 2025
Purpose
(1) This transmits revised IRM 2.5.14, Systems Development, Quality Assurance (QA).
Material Changes
(1) The following revisions were made throughout the IRM to remove the word “Audit” from the IRM and replace it with the word “Review”.
Effect on Other Documents
IRM 2.5.14, dated February 22, 2023, is superseded.
Audience
The audience for this IRM is all Applications Development (AD) personnel responsible for the development and maintenance of Agency software systems identified in the Enterprise Architecture. This IRM applies to all QA activities conducted on projects, including contractor QA functions. It establishes the policy for conducting QA activities and the responsibilities and authority for performing QA across the Applications Development organization.
Effective Date
(05-05-2025)
Rajiv Uppal
Chief Information Officer
-
Purpose
The purpose of this IRM is to provide the framework for conducting Quality Assurance Review activities within Applications Development (AD). It establishes a standard context for Project Teams, including contractors working for the Applications Development organization, to participate in the QA Review process.
-
Audience
The audience for this IRM is all Applications Development personnel responsible for the development and maintenance of Agency software systems identified in the Enterprise Architecture. This IRM applies to all QA activities conducted on projects, including contractor QA functions. It establishes the policy for conducting QA activities and the responsibilities and authority for performing QA across the Applications Development organization.
-
Policy Owner
The Associate Chief Information Officer (ACIO), Applications Development, establishes all Information Technology (IT) internal controls for this IRM.
-
Program Owner
The Applications Development Director, Delivery Management and Quality Assurance, is the program owner.
-
Primary Stakeholders
This policy applies to all IT projects and programs including contractors.
-
Program Goals
The objective of this IRM is to establish the overall approach to Quality Assurance, the applicable Quality Assurance standards, and the reporting and control requirements for the QA program, as outlined by the AD QA Directive and related processes and procedures.
-
Below is a list of QA Roles and Responsibilities:
-
QA Program Manager - Manages QA Program
-
QA Program Staff - Serves as liaison from QA Program to project/program personnel and Process Owners
-
Project/Program Manager - Manages Project quality
-
Project/Program Team - Produces quality software products, systems, and documentation
-
Process Owner - Owner of Specific Process Area (e.g., Configuration Management)
-
Enterprise Life Cycle (ELC) Coach - Provides assistance with ELC matters to Project/Program Teams
-
-
Program Reports
-
Review Report
-
Non-Compliance Summary Report
-
Corrective Action Plan (CAP)
-
Heat Chart
-
DMQA Weekly Status Report
-
Review Team Daily Status Report
-
-
Program Effectiveness
Measurements show how well products and processes conform to organizational and industry standards; they also indicate how well projects follow documented processes. The Quality Assurance Program reports on process metrics derived from the number of non-compliances identified during process reviews and the number of projects reviewed. To keep senior management abreast of project status and process performance, the Quality Assurance Program Office will generate, retain, and report process metrics in order to demonstrate Program effectiveness.
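As a purely hypothetical illustration of the process metrics described above (non-compliances found per item checked across reviewed projects), a compliance rate might be computed along these lines; the function, names, and data are invented and are not actual QA Program tooling:

```python
# Hypothetical sketch of a process metric: a compliance rate derived from
# non-compliances identified per item checked during a review.
# All names and figures are illustrative only.

def compliance_rate(non_compliances: int, items_checked: int) -> float:
    """Fraction of checked items found compliant."""
    if items_checked == 0:
        return 1.0  # nothing checked, nothing non-compliant
    return 1.0 - (non_compliances / items_checked)

# Example review results for two hypothetical projects
reviews = {
    "Project A": {"non_compliances": 3, "items_checked": 40},
    "Project B": {"non_compliances": 0, "items_checked": 25},
}

for name, r in reviews.items():
    rate = compliance_rate(r["non_compliances"], r["items_checked"])
    print(f"{name}: {rate:.0%} compliant")
```

Rolled up over a review cycle, such per-project rates are the kind of data a heat chart or weekly status report could summarize.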
-
The following list represents the IRS controls and mandates applicable to AD projects:
-
IRM 2.5.1 Systems Development
-
IRM 2.5.14 Quality Assurance (QA)
-
IRM 2.16.1 Enterprise Life Cycle (ELC)
-
IRM 2.21 Introduction to Shopping Cart Processing for IT
-
IRM 2.22 Business Planning and Risk Management
-
IRM 2.25 Managed Service for IRS
-
IRM 2.100 Integrated Process Management
-
IRM 2.110 Requirements Engineering
-
IRM 2.120 Engineering
-
IRM 2.125 Change Management
-
IRM 2.126 Enterprise Organizational Readiness
-
IRM 2.127 Testing Standards and Procedures
-
IRM 2.144 Capacity Management
-
IRM 2.149 IT Asset Management
-
IRM 2.150 Configuration Management
-
IRM 2.152 Data Engineering
-
IRM 10.8.1 Information Technology Security Policy and Guidance
-
Federal Information Security Management Act (FISMA)
-
Privacy Act of 1974
-
Section 508 of the Rehabilitation Act of 1973
-
Risk, Issue, and Action Item Management Directive
-
Other internal standards are located on the Process owners' SharePoint sites
-
-
The following tables list the terms and acronyms used throughout this IRM section.
Defined Terms
Applications Development - An organization that supports IT components of a system that utilizes IT resources to store, process, retrieve, or transmit data or information using IT hardware and software.
Clinger-Cohen Act - The Clinger-Cohen Act of 1996 (40 U.S.C. 1401(3)), also known as the Information Technology Management Reform Act, was intended to "reform acquisition laws and information technology management of the Federal Government."
Configuration Management - Establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration reviews.
Corrective Action Plan - Changes made to bring expected future performance of the project in line with the project plan.
Daily Metrics - Daily status of Quality Assurance Program Staff.
Enterprise Life Cycle - A framework that provides a workflow for projects to follow to move an IT solution from concept to production while making sure that they are in compliance with IRS guidelines and are compatible with the overall goals of the IRS.
Heat Chart - Visual representation of compliance data that uses colors to illustrate project and Domain compliance.
Internal Revenue Manual - Official communications that designate authorities and/or disseminate instructions to staff for IRS officials and employees.
Lessons Learned - Practices for evaluating past performance of activities.
Non-Compliance - Findings and/or weaknesses usually found in a quality review.
Peer Review Defect and Resolution Report - Used to document Peer Reviews of project artifacts.
Acronyms
AD - Applications Development
ACIO - Associate Chief Information Officer
CAP - Corrective Action Plan
CM - Configuration Management
DID - Data Item Descriptions
ELC - Enterprise Life Cycle
ESC - Executive Steering Committee
FISMA - Federal Information Security Management Act
IRM - Internal Revenue Manual
PRIV - Privacy
PRM - Process Management
PP - Project Planning
PMC - Project Monitoring and Control
QA - Quality Assurance
RSKM - Risk Management
RQEN - Requirements Engineering
SEC - Security
SWDEV - Software Development
SME - Subject Matter Expert
SM - Supplier Management
TEST - Testing
U.S.C. - United States Code
-
QA evaluation of the project processes throughout the life cycle is based on the processes defined by the following supporting documents:
-
AD Quality Assurance Directive
-
AD Quality Management Plan
-
AD Quality Assurance Plan
-
AD Quality Assurance Program (processes, procedures, etc.)
-
Internal Revenue Manual (IRM) 2.16, Enterprise Life Cycle
-
IRM 2.5 Systems Development
-
-
This IRM provides the framework for conducting Quality Assurance Review activities within Applications Development. It establishes a standard context for Project Teams, including contractors working for the Applications Development organization, to participate in the QA review process. This IRM establishes:
-
the overall approach to Quality Assurance
-
the applicable Quality Assurance standards
-
the reporting and control requirements for the QA program as outlined by the AD QA Directive and related processes and procedures.
-
-
The Quality Assurance (QA) Program Office supports the delivery of high-quality products and services by ensuring that projects implement a coordinated set of activities that conform to organizational policies, processes and procedures.
-
Quality Assurance is a systematic, planned set of activities necessary to provide adequate confidence that the product conforms to stated customer requirements. The activities are designed to evaluate the processes (e.g., Project Planning, Project Monitoring and Control, Requirements Management) by which products are developed.
-
QA Review Process and Procedures are used to objectively and independently evaluate adherence of the process and work products to applicable directives, processes, standards, procedures, and guidelines. The objectives of the Review process are to:
-
identify and track noncompliance instances
-
communicate and facilitate the resolution of noncompliance issues
-
identify and communicate, to senior management, best practices and opportunities for improvement
-
document Quality Assurance activities; and
-
report quality issues to relevant stakeholders
-
-
Benefits of the Review process are realized through:
-
consistency in assessing use of organizational processes
-
facilitation of improvements
-
enhanced planning and resource allocation capability
-
-
To meet the objectives and realize the benefits of QA reviewing, the following roles and skill sets are needed to perform QA reviewing activities.
QA Program Manager - Manages QA Program
-
Develops, updates, and maintains quality standards and procedures (e.g., plans, processes, activities, templates, checklists, and guidelines)
-
Approves or rejects changes to quality documents
-
Assigns tasks to QA Program Staff
-
Manages Review Schedule
-
Evaluates reviewer feedback and workload
-
Provides QA Program Staff with Review Checklist
-
Reviews, resolves, or escalates issues to senior program management, if necessary
-
Responsible for the generation of Daily Metrics, Weekly Metrics, and Heat Charts
-
Coordinates quality training based on feedback
QA Program Staff - Serves as liaison from QA Program to project/program personnel and Process Owners
-
Functions as the Subject Matter Expert (SME) on QA program objectives and procedures
-
Updates quality documents as assigned by QA Program Manager
-
Attends necessary internal training
-
Tailors Review Checklist based on current standards and project specifics
-
Ensures compliance with applicable IRS directives, processes, and organizational standards by conducting reviews
-
Reports on quality assessments and the review findings to QA Program Manager
-
Coordinates with other reviewers to establish the review baseline
-
Coordinates with Project/Program Team on non-compliance items and corrective actions; escalates to QA Program Manager, if necessary
-
Receives and acts upon Daily Metrics
-
Schedules a QA Services Team meeting
-
Provides coaching, training, and mentoring to projects/programs
-
Participates in Lessons Learned and continuous improvement activities
Project/Program Manager - Manages Project quality
-
Commits to compliance with applicable IRS directives, processes, procedures, and organizational standards and their deliverables
-
Has working knowledge of the QA Program objectives and standards; and required deliverables
-
Reviews current QA Program standards and objectives to incorporate into Project Plan
-
Performs project-level self-assessments
-
Supports and participates in QA reviews
-
Receives review results from the QA Program and is responsible for follow-up action, as appropriate
-
Develops strategy for resolution and ensures timely responses to deficiencies with a Corrective Action Plan (CAP)
-
Ensures that quality management activities are scheduled, documented, and performed in accordance with Project Plan.
-
Supports Peer Review activities by allocating personnel, facilities, and time for Peer Reviews
-
Identifies trends in peer reviews and is responsible for follow-up action, as appropriate
-
Tracks project progress
-
Attends QA Services Team Meeting to discuss non-compliance and resolution
Project/Program Team - Produces quality software products, systems, and documentation
-
Has working knowledge of QA objectives and standards
-
Submits request to change existing Quality documents
-
Executes quality related activities and produces deliverables
-
Utilizes external Quality documents for support
-
Participates and performs peer reviews, technical reviews, and quality assurance activities
-
Performs independent/self-assessments of project/program artifacts and work products
-
Conducts ELC and quality reviews within various stages of the project lifecycle
-
Supports and participates in reviews
-
Reconciles Non-Compliance issues resulting from independent/self-assessments
-
Participates in Lessons Learned and continuous improvement activities
-
Attends quality related training and provides feedback
Process Owner - Owner of Specific Process Area (e.g., Configuration Management)
-
Participates in Lessons Learned and continuous improvement activities
-
Supports and participates in reviews
ELC Coach - Provides assistance with ELC matters to Project/Program Teams
-
Provides assistance and feedback to projects/programs on organizational standards and actions
-
Collaborates with QA Program Staff to confirm non-compliance items
-
Attends QA Services Team Meeting to discuss non-compliance and resolution
-
-
The review process is controlled and driven by the Review Checklists. The checklists are questionnaires used to gather data for the processes and products being reviewed and to evaluate the project’s level of compliance. The checklists are tailored based on the project or type of review conducted. The Quality Assurance Program conducts reviews on organizational process areas as outlined in the AD Process Framework contained in the IT Process list below.
-
The process areas are:
-
Project Planning (PP)
-
Project Monitoring and Control (PMC)
-
Risk Management (RSKM)
-
Requirements Engineering (RQEN)
-
Software Development (SWDEV)
-
Testing (TEST)
-
Privacy (PRIV)
-
Section 508 (508)
-
Security (SEC)
-
Configuration Management (CM)
-
Supplier Management (SM)
-
Quality Assurance (QA)
-
Process Management (PRM)
The Quality Assurance Program conducts three types of reviews. The review types are:
-
Process Area Review - Evaluates adherence to the standards and procedures of a process area (e.g., Configuration Management) or a group of process areas (e.g., Engineering) within a program or project.
-
Work Product Review - Reviews work products for conformance to the Enterprise Life Cycle (ELC), Data Item Descriptions (DID), and templates. Work Products can include any work products and deliverables, as well as the standards and/or procedures used to produce them.
-
Process Owner Review - Evaluates Process Owner policies and procedures based on standards and requirements. This type of review is performed upon request from the Process Owner or their management.
-
-
The following tasks constitute the flow of the QA Reviewing process:
-
Plan to Review
-
Perform the Review
-
Manage Documents
-
Monitor and Control Review Findings
-
-
The QA Program Manager and QA Program Staff perform Review Planning activities in conjunction with the overall planning activities. Project/Program Managers and Project/Program Teams support Review Planning activities. The following steps describe the review planning activities:
-
Establish QA Program goals using inputs from previous Reviews
-
Plan and develop annual QA Review Plan including review calendar
-
Plan and develop QA Review Schedule for each review
-
Assign projects/programs to reviewers
-
Tailor Review Checklist based on type of review
-
Coordinate among reviewers to ensure consistency in reviewing
-
-
This task is performed in accordance with the Review Schedule. The following steps occur during this task:
-
Projects and programs are notified of the review and provide access to the project/program repository (if required).
-
The reviewer reviews the Health Assessment and ReadMe File and makes a preliminary assessment of the quality status of the project/program.
-
The Review Opening Meeting is conducted to communicate the scope and the objectives of the review to the projects and programs.
-
The reviewer, guided by the appropriate Review Checklist, will evaluate the level of compliance with standards and processes as well as evaluate associated work products by reviewing instructions and procedures, checking records and through observation.
-
If necessary, project/program staff will be consulted to address questions that arise during the checks and observations.
-
The reviewers coordinate their efforts to ensure consistency in Reviewing.
-
The reviewer documents the findings and confirms accuracy with other reviewers and (if necessary) ELC Coaches.
-
The Review Closing Meeting is conducted to communicate high level review results, global findings, and next steps to the projects and programs.
-
The reviewer issues a final report of findings to the reviewee and stores it in the appropriate repository based on the type of review.
-
The reviewer conducts a QA Services Team meeting with the reviewee to discuss review findings and recommend potential corrective actions. A QA Services Team meeting is not conducted if no deficiencies are found by the reviewer.
-
When review findings require corrective action, a CAP must be submitted by the project/program.
-
-
This task occurs after a review is conducted, or may occur anytime a document is established or content/information is gathered relating to the review process. The following activities occur during this task:
-
All documents generated and/or received as a result of a review are collected and stored in soft and/or hard copy in the appropriate repository.
-
Changes to QA process assets (i.e., processes, procedures, templates) shall be controlled in accordance with the Document Management Procedure located in the QA Program’s shared repository.
-
-
As part of this task, when new QA documents are developed and/or requests have been received to modify existing documents, the following activities are executed:
-
The QA Program Office reviews requested changes internally.
-
The QA Program Office implements requested changes, updates repositories, and notifies users of changes to document (if approved).
-
If the change is rejected, the QA Program Office informs the requestor that there will be no change.
-
-
The reviewer will use the information from the final report and any subsequent actions to update the QA Program database. Using this data, the reviewer tracks review findings to closure. If corrective actions are not completed by the resolution date as outlined in the approved CAP, the reviewer escalates the unresolved findings to the QA Program Manager for resolution. This is not required for Process Owner reviews.
-
The reviewer’s responsibilities are to:
-
Verify resolution of the corrective action.
-
Review updates to the QA Program Database to indicate the status of the review’s corrective actions.
-
Identify and escalate corrective actions that remain unresolved five days after planned resolution date.
-
Perform trend analysis activities.
-
Prepare review data and status reports for review activities.
-
Place reports in the appropriate repository.
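The five-day escalation rule above can be expressed as a small check. The following is a hypothetical Python sketch, not an actual QA Program tool; the function name and grace-period constant are invented for illustration:

```python
# Hypothetical sketch of the escalation rule: a corrective action still
# unresolved five days after its planned resolution date is flagged for
# escalation to the QA Program Manager. Illustrative only.
from datetime import date, timedelta

ESCALATION_GRACE = timedelta(days=5)  # five days past planned resolution

def needs_escalation(planned_resolution: date, resolved: bool,
                     today: date) -> bool:
    """True if an open corrective action is at or past the grace period."""
    return (not resolved) and today >= planned_resolution + ESCALATION_GRACE

# Example: a CAP item planned for June 1, still open on June 9
print(needs_escalation(date(2025, 6, 1), False, date(2025, 6, 9)))  # True
```

Run daily against the QA Program database, a check like this would surface exactly the items the reviewer must escalate rather than verify and close.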
-
-
When all findings are resolved and verified, the reviewer updates the data repositories (i.e., shared drive, QA Program database).
-
The AD Quality Assurance Program Office will regularly maintain measurements on the status and progress of Quality Assurance tasks for the AD portfolio. Process trends shall be analyzed for efficiency and effectiveness.
-
Data will be compiled to develop and report trends in performance and compliance. The reporting will occur through performance trend metrics and compliance metrics. Performance and compliance trend metrics will be reported:
-
At the AD portfolio, domain, and project levels; and
-
For all process areas (by AD portfolio, domain, project levels).
-
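The roll-up described above, reporting at the project, domain, and AD portfolio levels, can be sketched as a simple aggregation. The records and field names below are hypothetical, not an actual reporting format:

```python
# Hypothetical sketch of rolling up non-compliance counts at the project,
# domain, and AD portfolio levels for trend reporting. The sample records
# and field names are illustrative only.
from collections import defaultdict

findings = [
    {"domain": "Domain 1", "project": "Proj A", "non_compliances": 2},
    {"domain": "Domain 1", "project": "Proj B", "non_compliances": 0},
    {"domain": "Domain 2", "project": "Proj C", "non_compliances": 1},
]

by_project = defaultdict(int)
by_domain = defaultdict(int)
portfolio_total = 0

for f in findings:
    by_project[f["project"]] += f["non_compliances"]
    by_domain[f["domain"]] += f["non_compliances"]
    portfolio_total += f["non_compliances"]

print(dict(by_project))   # per-project totals
print(dict(by_domain))    # per-domain totals
print(portfolio_total)    # AD portfolio total
```

Extending each record with a process area field would support the second reporting dimension (all process areas by portfolio, domain, and project level) with the same pattern.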