
Full Process Walkthrough

System Testing

A complete system testing case study for an ITSM platform upgrade (v3.2). From test planning to production sign-off, every phase shows the real data, what was found, and why acting on each finding prevented costly post-release failures.

248 test cases · 25 defects closed · 0 defects at go-live · 100% UAT sign-off
01📝

Test Planning

Test Plan Doc · JIRA · Confluence

Before any testing began, a structured test plan was prepared covering scope, objectives, test types, resource allocation, schedule, and risk mitigation. The plan was agreed and signed off by the project lead and client representative.

Test Plan Summary — ITSM Platform Upgrade v3.2

Section | Detail
Project | ITSM Platform Upgrade v3.2
Test Scope | Incident module, SLA engine, reporting dashboards, user access
Test Types | Functional, Integration, Regression, UAT
Total Test Cases | 248
Test Period | 2024-02-05 to 2024-02-23 (3 weeks)
Environments | SIT (System Integration), UAT (User Acceptance)
Sign-off Required From | Project Manager, Client Lead, QA Lead
02✍️

Test Case Design

Excel Test Cases · Confluence · Boundary Analysis

248 test cases were designed across 6 functional areas using equivalence partitioning and boundary value analysis. Each test case included preconditions, test steps, expected results, and links to requirements. Cases were reviewed and approved before execution began.

Test Case Distribution by Module

Module | Test Cases | Priority | Type
Incident Creation & Routing | 52 | High | Functional
SLA Timer & Breach Logic | 48 | Critical | Functional + Integration
Escalation Workflow | 34 | High | Functional
Reporting & Dashboard Output | 41 | High | Functional + UAT
User Access & Permissions | 28 | Medium | Security
System Integration (API) | 45 | Critical | Integration + Regression
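Boundary value analysis, as used in the case design above, can be sketched for one of the SLA test cases. The 240-minute P1 threshold, function name, and exact-on-threshold behaviour below are illustrative assumptions, not the project's actual rules:

```python
# Boundary value analysis sketch for an SLA breach threshold.
# Assumes a hypothetical 240-minute (4-hour) SLA window for P1 incidents.

SLA_MINUTES = 240

def is_breached(elapsed_minutes: int) -> bool:
    """Assumed rule: a ticket breaches its SLA only once elapsed
    time strictly exceeds the threshold."""
    return elapsed_minutes > SLA_MINUTES

# Boundary values: the lower bound of the valid partition, just below,
# exactly on, and just above the threshold.
cases = [
    (0, False),      # lower bound of valid range
    (239, False),    # just below threshold
    (240, False),    # exactly on threshold: not yet a breach (assumption)
    (241, True),     # just above threshold
]

for elapsed, expected in cases:
    assert is_breached(elapsed) == expected, (elapsed, expected)
```

The value of the technique is that the four cases around the threshold catch off-by-one and off-by-N errors that mid-range values never would.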
03🧪

Test Execution — SIT

JIRA · Excel Tracker · ServiceNow SIT env

System Integration Testing ran over 10 days, during which all 248 test cases were executed and 19 defects were uncovered. Critical defects in the SLA breach logic and API integration were escalated immediately. 16 of the 19 defects were resolved and re-tested within the SIT window.

SIT Execution Results — ITSM Platform v3.2

Test Case ID | Module | Priority | Result | Defect Raised | Status
TC-SIT-001 | Incident Creation | High | Pass | None | Closed
TC-SIT-012 | SLA Timer Logic | Critical | FAIL | DEF-007 | Resolved
TC-SIT-013 | SLA Breach Alert | Critical | FAIL | DEF-008 | Resolved
TC-SIT-045 | Escalation Workflow | High | Pass | None | Closed
TC-SIT-089 | API Integration | Critical | FAIL | DEF-015 | Open
TC-SIT-112 | Dashboard Output | High | Pass | None | Closed
TC-SIT-198 | User Permissions | Medium | FAIL | DEF-019 | Resolved
TC-SIT-248 | Regression — Core | High | Pass | None | Closed

🔍 Finding

19 defects were raised, 3 of them critical: the SLA timer miscalculated breach thresholds by 15 minutes, breach alert emails failed to trigger, and the API payload dropped custom fields on handoff. Left unfixed, these would have caused incorrect SLA reporting in production.
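A threshold miscalculation like the one behind DEF-007 is typically pinned down with a regression test at the exact boundary. A minimal sketch, assuming a hypothetical 4-hour SLA window; the stray 15-minute grace period shown illustrates the class of bug, not the confirmed root cause:

```python
from datetime import datetime, timedelta

SLA_WINDOW = timedelta(hours=4)  # hypothetical P1 SLA window

def breach_deadline(opened_at: datetime) -> datetime:
    """Correct calculation: deadline is exactly opened_at + SLA window."""
    return opened_at + SLA_WINDOW

def breach_deadline_buggy(opened_at: datetime) -> datetime:
    """Illustrative bug: an unintended 15-minute shift moves the deadline,
    so breaches are reported 15 minutes late."""
    return opened_at + SLA_WINDOW + timedelta(minutes=15)

opened = datetime(2024, 2, 7, 9, 0)
assert breach_deadline(opened) == datetime(2024, 2, 7, 13, 0)
# The buggy variant drifts by exactly the 15 minutes observed in SIT:
assert breach_deadline_buggy(opened) - breach_deadline(opened) == timedelta(minutes=15)
```

Once a test like this exists, the defect cannot silently reappear in a later build.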

✅ Why the Team Must Act

Critical defects DEF-007 and DEF-008 were resolved by the dev team within 48 hours and re-verified. DEF-015 (API) remained open and was carried into UAT as a tracked known risk; the dev lead confirmed a fix in the next build cycle.

04

UAT Coordination

UAT Test Scripts · Zoom Sessions · Defect Log

UAT was conducted with 8 end users across 3 departments over 5 days. I coordinated sessions, distributed test scripts, walked users through scenarios, documented their feedback, and tracked all raised issues in a defect log. 6 new issues were raised by users, 4 of which were resolved before sign-off.

UAT Session Log — 5 Days, 8 Users, 3 Departments

Day | Dept | Users | Scenarios Tested | Issues Raised | Resolved | Sign-off
Day 1 | IT Operations | 3 | Incident creation, routing, escalation | 2 | 2 | Yes
Day 2 | Service Desk | 2 | SLA compliance view, breach alerts | 2 | 1 | Partial
Day 3 | Management | 2 | Dashboard, reporting, KPI views | 1 | 1 | Yes
Day 4 | IT Operations | 1 | Re-test Day 2 open issue + regression | 1 | 1 | Yes
Day 5 | All | 8 | Full end-to-end walkthrough | 0 | 0 | Yes — Full

🔍 Finding

Service Desk users flagged that the SLA breach alert email did not display the ticket priority in the subject line — a usability issue not caught in SIT. This caused confusion in triaging urgent tickets from email alone.

✅ Why the Team Must Act

The email template was updated by the dev team to include priority and ticket category in the subject line before Day 4. This directly improved the team's ability to triage from the inbox without opening the portal, saving an estimated 3-4 minutes per P1 alert.
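A fix like this is usually locked in with a small unit test so the subject format cannot silently regress. A sketch, where the `alert_subject` helper and its exact format are hypothetical stand-ins for the real template:

```python
# Regression check for the breach-alert subject line (post-DEF-021 behaviour):
# priority and category must be readable from the subject alone.

def alert_subject(ticket_id: str, priority: str, category: str) -> str:
    """Hypothetical template: priority and category lead the subject
    so tickets can be triaged straight from the inbox."""
    return f"[{priority}][{category}] SLA breach: {ticket_id}"

subject = alert_subject("INC-10482", "P1", "Network")
assert subject.startswith("[P1]")   # priority visible without opening the email
assert "Network" in subject         # category present for routing
assert "INC-10482" in subject       # ticket reference preserved
```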

05🐛

Defect Lifecycle

JIRA Defect Board · Severity Matrix · Re-test Tracking

All defects were logged with severity, steps to reproduce, expected vs actual results, screenshots, and developer assignment. A defect board in JIRA tracked status from New → In Progress → Fixed → Re-test → Closed. The full lifecycle for all 25 raised defects was managed and closed before UAT sign-off.
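The New → In Progress → Fixed → Re-test → Closed workflow can be expressed as an explicit transition table, so illegal jumps (e.g. New straight to Closed without a fix and re-test) are rejected. A minimal sketch; the reopen-on-failed-retest transition is an assumption, not documented project behaviour:

```python
# Defect lifecycle as an explicit state machine, mirroring the JIRA board.
ALLOWED = {
    "New":         {"In Progress"},
    "In Progress": {"Fixed"},
    "Fixed":       {"Re-test"},
    "Re-test":     {"Closed", "In Progress"},  # assumed: failed re-test reopens work
    "Closed":      set(),                      # terminal state
}

def advance(current: str, new: str) -> str:
    """Move a defect to a new status, rejecting illegal transitions."""
    if new not in ALLOWED[current]:
        raise ValueError(f"Illegal transition {current} -> {new}")
    return new

# A defect must pass through every stage to reach Closed.
status = "New"
for step in ("In Progress", "Fixed", "Re-test", "Closed"):
    status = advance(status, step)
assert status == "Closed"
```

Enforcing the workflow this way is what guarantees "Re-verified: Yes" in the table below: nothing can reach Closed without passing through Re-test.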

Defect Summary — Full Lifecycle (SIT + UAT Combined)

Defect ID | Module | Severity | Status | Dev Fix Time | Re-verified
DEF-007 | SLA Timer Logic | Critical | Closed | 24 hrs | Yes
DEF-008 | Breach Email Alert | Critical | Closed | 48 hrs | Yes
DEF-015 | API Integration | Critical | Closed | 72 hrs | Yes
DEF-019 | User Permissions | High | Closed | 36 hrs | Yes
DEF-021 | Email Subject Line | Medium | Closed | 8 hrs | Yes
DEF-024 | Report Date Filter | Low | Closed | 12 hrs | Yes
DEF-025 | Dashboard Load Time | Low | Closed | 16 hrs | Yes

🔍 Finding

All 25 defects were closed before final UAT sign-off. 3 critical defects were resolved within 72 hours of being raised. Zero defects were carried into production — a full clean release.

✅ Why the Team Must Act

The defect-free production release was achieved because every critical defect was escalated immediately with clear reproduction steps and business impact documented. This allowed the dev team to prioritise correctly and avoid costly post-production fixes.

06📄

Sign-off & Documentation

Test Summary Report · Sign-off Form · Confluence

A full Test Summary Report was produced documenting test coverage, execution results, defect summary, open risks, and sign-off recommendation. The report was presented to the project manager and client lead. Formal sign-off was obtained, clearing the build for production deployment.

Test Summary Report — Final Metrics

Metric | Value
Total Test Cases Designed | 248
Total Test Cases Executed | 248 (100%)
Pass on First Run | 229 (92.3%)
Defects Raised (SIT + UAT) | 25
Critical Defects | 3 (all closed)
Defects Open at Sign-off | 0
UAT Sign-off | Approved — all 8 users
Production Release Status | Cleared for deployment
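The headline first-run pass rate follows directly from the table figures; a one-line check:

```python
# Verify the first-run pass rate reported in the summary table.
designed = 248
first_run_pass = 229
pass_rate = round(100 * first_run_pass / designed, 1)
assert pass_rate == 92.3  # matches the reported 92.3%
```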

🔍 Finding

100% test execution coverage was achieved with a 92.3% first-run pass rate. All 25 defects were resolved and closed. The project delivered a defect-free production release on schedule.

✅ Why the Team Must Act

The structured approach — test plan agreed upfront, critical defects escalated immediately, UAT coordinated with real end users — is what enabled zero defects at go-live. Skipping any of these steps would have created rework and production risk.