
Project Soul Test Plan
Revision History
Date: April 30, 2010
| Ver No. | Date of Release | Prepared By | Reviewed / Approved By | List of Changes from Previous Version |
| 0.1 | | Jonathan Jacinto | | Draft version |
| 0.2 | | Jonathan Jacinto | | – Added Assumptions – Added Novare resource for Defect Manager and Technical Lead – Added test environment diagrams – Added timeline – Added test execution flow |
| 0.3 | | Jonathan Jacinto | | – Modified version released for Project Team's review |
| 0.4 | | Jonathan Jacinto | | – Modified version based on the Project Status meeting |
- Introduction
- Purpose
This document provides a detailed list of the work packages and documentation required to carry out the User Acceptance Testing (UAT) of Project Soul. It defines how UAT should be conducted and outlines the objectives, test scope, test strategy, process, and standards.
- Document Scope
The scope of this document covers only the plan for the end-to-end testing of Project Soul's functional deliverables.
- Audience
Intended audience for this document includes:
- Project Management Team
- UAT Team
- Technical Team
- Novare
- Definition of Terms
The following terms will be used in this document and are defined as follows:
| Term | Definition |
| GT | Globe Telecom; refers to the project team of Globe |
| Novare | Third-party vendor contracted to develop the New Store System of Globe Stores |
| SLA | Service Level Agreement |
| Cycle | One round of testing covering all the test cases |
| UAT | User Acceptance Testing |
| QA | Quality Assurance |
| NSS | New Store System |
| DFS | Detailed Functional Specifications |
| KPI | Key Performance Indicator |
| Banig | Consolidated List of Test Scripts |
| FMO | Future Mode of Operations |
- Background
- Project Background
In line with the Stores Strategy and Development Roadmap for 2010, Project Soul will address the identified gaps in the IT systems and needs of the Stores.
In so doing, the Project Soul implementation aims to:
- Provide IT solution to support the new thematic style of Globe Stores
- Equip Globe stores with adequate tools to serve customers better and faster by developing a front-end system that would allow ease of use and reliable access to currently used systems and webtools
- Provide reliable sales and inventory updates to stores
- Enable Stores to handle several supplier arrangements simultaneously
- Provide useful reports that would measure the SLAs and KPIs of Stores
The following have been identified as the primary functional deliverables of Project Soul:
- One Frontliner View – A unified stores portal which shall provide store frontliners and other staff with a single interface to systems critical to store operations.
- Self Help Kiosk – allows customers to perform simple self-service after-sales transactions, thereby improving customer experience and reducing store queues. For the June 1 deliverable, only the SHK welcome screen landing on Esprit will be tested.
- Inventory and Partner Management – allows store users to manage and track inventory at the store level. For the June 1 deliverable, Reservation, Request for Replenishment, Receiving, Repair, and Reports will be delivered.
- Paperless Transactions – allows direct encoding by store frontliners of applications and image capture of relevant documents. For the June 1 deliverable, the New Application for consumers, scanning of required documents, and digital signatures will be tested.
- Executive Information System (Reports) – provides stores with access to reports critical to store operations.
- User Experience
Please refer to the DFS for details on the User experience.
- Objectives
The objective of the UAT is to simulate and test the functionalities in accordance with the specifications in the DFS. It aims to test the capability and performance of the system. It is also an objective of the UAT to identify, document, resolve, and manage system/application and process defects. The following specific objectives also apply:
- To simulate the different user specified scenarios.
- To ensure alignment of process to the New Store System.
- To ensure readiness of the system prior to the commercial launch date.
- Scope of UAT
- UAT Scope
The scope of UAT is limited only to the following modules and functionalities that are necessary for the June 1 deliverables:
| Module | Functionality |
| One Frontliner View | – Sign on – Sign off – Link to Paperless Transaction – Link to Executive Information System (EIS) – Link to Inventory System – Link to Inquiry Tool |
| Paperless Transaction | – Digitized Application Form: save data in the Digital Application Form (Consumer) – Scan required documents – Link scanned document to the customer record in the Digital Application Form – Access to scanned documents – Sign using the electronic signature pad and save – Capture electronic signature – Attach scanned file and signature to customer accounts in the Digital Application Form – Search and view records in the Digitized Application Form |
| Inventory | – Online Stock Replenishment Request: Create, Edit, View, Submit – Reservation (Stocks): Create, Edit, View, Search – Receiving: Search, Update Remarks/Status, Submit – Repair: Create WO, Update WO Status, Create Repair Info Form. Note: the above UAT scope will be confirmed after the Inventory meeting. |
| Executive Information System | – Retail Inquiry Count – Service Transaction – Channel Count – Product Interactivity – Traffic Analysis – Customer Traffic – Customer Served per Group – Shop Offers – Profit & Loss – Revenue Sources – Transactional – Co-location – Commission Income (Call cards, Pkits, Smpacks, Smcards, Tattoo, GHP Postpaid, Pseudo Revenues, Broadband, Landline) – Expense Sources – Activations – End of Day (After Sales) – KPI: Line Wait, Transaction Time. The following reports will be tested during the Inventory UAT as scheduled: – Retail Transactions – On Hand Status – Defective Stock Status – Transfer Status – Replacement Status – Return Status – Issuance Status – Reservation Status – Sales Trend Analysis and Inventory Level – Retail Sales Inventory – Volume – Sales Issuance (per SKU) – Sales Issuance (per Store) – Sales Run Rate (per SKU) – Sales Run Rate (per Store) |
| Esprit Kiosk | – SHK landing page connected to Esprit |
Detailed test scenarios will be consolidated in a Test Banig.
- Exclusions / Out of Scope
The following transactions are not part of the UAT:
- End-to-end testing of all legacy systems for SSO.
- Functionalities to be delivered after June 1.
- Test scenarios not part of DFS and signed Test Cases and Script.
- For Paperless, FMO will be tested except for the actual activation of accounts for new applications.
- Validation of fields in the Paperless Transaction
- Assumptions, Dependencies, Risks and Mitigation
| ASSUMPTION | DEPENDENCY | RISK | MITIGATION |
| Prior to UAT, a test environment should be available | Availability of a test environment | Quality of system capability and functionality testing is compromised if done directly in production. Should defects and errors occur, system performance is greatly impacted. | Ensure availability of a separate test environment apart from production. In the absence of a UAT environment, the SAT environment may be reused. |
| Prior to UAT, test data should be available | Availability of test data | Accuracy and completeness of reports will be uncertain; trash data may be expected | Ensure that existing reporting sources provide complete and accurate records |
| All functional and system interface test cases should have been executed and passed in the SAT. | Full coverage and acceptance of SAT | The probability of encountering defects in uncovered scenarios during UAT is high; consequently, the UAT timeline will be affected. Another unscheduled cycle of testing might be needed. Missed requirements are possible. | Ensure completeness of SAT coverage. Provide a realistic timeline in the UAT schedule. Verify that the correct system release is deployed. Consider all possible scenarios in the requirements on the 2nd pass. |
| Non-functional testing such as performance, security, volume, business continuity, and operations acceptance will be conducted during this phase after office hours. | Functional UAT to finish on time based on the set schedule | If testing runs concurrently with the functional UAT, test environment performance might be affected and UAT completion might be delayed | Ensure that a different test environment is used for non-functional testing, or at least that performance testing is done separately as planned. |
- Team Organization
- Roles and Responsibilities
| Role | Responsibilities |
| Project Manager | – Overall responsible for the project – Leads the UAT strategy planning – Recommends prioritization of requirements to the project sponsor – Assists in the preparation of UAT logistics |
| UAT Lead | – Manages the UAT Team – Ensures UAT is performed according to plan – Responsible for the on-time performance of UAT – Reports issues and concerns to the Project Manager – Provides a regular progress report to the Project Manager – Ensures all logistical requirements of the UAT team are ready prior to UAT proper |
| GT Defects Manager | – Tracks and monitors the status of defects – Classifies/validates the severity of raised defects together with Novare |
| Functional Testers | – Perform the test cases relevant to Globe internal processes – Raise defects to the Defects Manager if inconsistencies between actual and expected results are encountered |
| ISG Technical Lead | – Provides ISG technical support (e.g. network connectivity of the PCs to be used for UAT, etc.) |
| Novare Technical Lead | – Overall responsible for the technical resolution of defects/issues |
| Novare Defects Manager | – Overall responsible for the technical assessment of defects/issues – Identifies accountability for defect resolution (e.g. Globe or Novare) – Tracks and monitors the status of defects raised by GT – Updates GT when defects are fixed and ready for retest |
- Resource Assignment
| Role | Resource |
| UAT Lead | Novare: Jeanette Cruz; co-led by James Manalo |
| GT Defects Manager | James Manalo |
| NOVARE Defects Manager | Melissa Cordon |
| ISG Technical Lead | Imee Gonzales |
| Novare Technical Lead | Joel Realubit |
| Functional Testers | Retail Specialist – Lerry Villamin / Archie Devera / Pam Buenaventura; Custodian – Jason Ibanez / Nina Rasiles; RM/ARM – Mavic Leonardo / Susan Flores / Issa Cabiluna; RAH – Victor San Pedro / Winston Dalida / Mario Gozun; Globe Store Inventory and Planning – Jo Canizares; Service Delivery Support – Emma Raynera / Cristeta Garcia; Stores Support Head or Stores Support Ops Head – Jacqui Simbajon; SSD Head or Sr. Store Strategy Specialist – Monica Mercado; CatMan/CM/SSD – Asi Ruiz; SSD/Admin – TBD; Inventory Accounting – Jas Balaoro; DCM – Melai Marcelo |
- Test Strategy
- Test Approach / Methodology
A total of six (6) working days is allotted for the UAT. This covers three passes (two functional passes and one clean pass) for all functional deliverables of the project (see the timeline for the breakdown of the test schedule). To ensure service and process readiness, the team will cover all test scenarios stated in the Banig. One clean pass shall ensure consistency of favorable results, which will be sufficient to declare RFS.
UAT will commence only once all entry criteria are met. Actual testing is scheduled from 9am to 6pm on working days, on the assumption that the test representatives are dedicated solely to the project. Extended working hours may be required if found necessary. A technical support representative should be present to address issues that may need clarification and/or immediate resolution. The Program Manager may also require weekend testing should the need arise.
The objective of the UAT team is to complete all test cases in a cycle within the planned schedule, regardless of whether defects (major/minor) or errors are encountered during the activity. If a tester encounters an error or an unexpected result, the tester registers the result in the test log and informs the Defects Manager about the defect. The tester then proceeds with the succeeding test scenarios until the end of the cycle. At the end of each testing day, the tester updates the Test Banig with the corresponding statuses:
Passed – Test scenario was covered with a Pass result
Failed – Test scenario was covered with a Failed result
Not tested – Test scenario not yet tested; pending execution
Cannot be tested – Test scenario is dependent on the resolution of a defect encountered by another test scenario
On the next cycle, the Failed, Not tested, and Cannot be tested scenarios will be covered. As soon as the defects have been resolved and all scenarios have been covered, a final clean pass will be made to ensure consistency of results.
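To illustrate the tagging scheme, the end-of-day statuses and the roll-over into the next cycle can be pictured as follows (a minimal Python sketch; the scenario IDs and the banig structure are hypothetical, not a prescribed tool):

```python
from enum import Enum

class Status(Enum):
    PASSED = "Passed"
    FAILED = "Failed"
    NOT_TESTED = "Not tested"
    CANNOT_BE_TESTED = "Cannot be tested"

def next_cycle(banig):
    """Scenarios to re-run in the next cycle: everything that did not
    end the previous cycle with a Pass."""
    return [s for s, st in banig.items() if st is not Status.PASSED]

# End-of-day snapshot of a (hypothetical) Test Banig
banig = {
    "OFV-001 Sign on": Status.PASSED,
    "PLT-004 Scan required documents": Status.FAILED,
    "PLT-005 Attach signature": Status.CANNOT_BE_TESTED,  # blocked by PLT-004
    "EIS-010 Retail Inquiry Count": Status.NOT_TESTED,
}
print(next_cycle(banig))  # the three non-Passed scenarios roll over
```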
UAT will be considered complete and successful if all functionalities are working as expected: there must be no defects classified as Severity 1 or 2, and all reports must be accurate and complete. Lastly, the UAT results will be documented, accepted, and signed off by all support groups to affirm readiness of the service functionality.
A shallow test will be conducted to ensure that the service is working consistently in production. Systems testing will be integrated into the full end-to-end process rehearsal.
- Test Criteria
- Entry Criteria
Listed below are the pre-requisites before a cycle begins:
| Requirement | Responsible |
| SAT (Functional) Certificate | Novare |
| SAT Logs | Novare |
| SAT Defects Register | Novare |
| UAT Setup Checklist (e.g. UAT test data and environment) | Novare c/o Rico Salgado / ISG Tech Lead / PM |
| Test tools | Novare/Project Manager |
| Signed UAT Strategy, Test Cases, Test Script and Test Banig | Project Manager |
- Acceptance/Exit Criteria
The UAT will be accepted if the following are achieved (a mechanical reading of these criteria is sketched after the list):
- 100% completion of all test cases.
- All defects encountered should be recorded in the defects log.
- One (1) clean pass of the UAT should be accomplished for each functional deliverable.
- No defects classified as Critical (Severity 1) or Major (Severity 2) are outstanding. Workarounds should be available if Severity 1 and 2 defects are still unresolved prior to the User Acceptance Certificate. All defects classified as Minor or Trivial are fully documented for acceptance by the Project Manager.
- Action plan for minor defects should be documented.
- Zero defects on functional testing and reporting.
- Signed User Acceptance Certificate.
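For clarity, the acceptance decision can be expressed mechanically as follows (a minimal Python sketch under the assumption that the Test Banig and defect log are kept as simple structures; all names are illustrative):

```python
def uat_exit_ok(banig, open_defect_severities, clean_pass_done):
    """Mechanical reading of the exit criteria above.

    banig: {scenario_id: status string} per the tagging scheme in the
    Test Approach section; open_defect_severities: severity levels (1-4)
    of defects still outstanding. All names are illustrative.
    """
    all_passed = all(status == "Passed" for status in banig.values())
    no_blockers = not any(sev in (1, 2) for sev in open_defect_severities)
    return all_passed and no_blockers and clean_pass_done

# Example: one open Severity 3 defect with a documented action plan
# does not block acceptance, but an open Severity 2 defect does.
print(uat_exit_ok({"OFV-001": "Passed"}, [3], clean_pass_done=True))  # True
print(uat_exit_ok({"OFV-001": "Passed"}, [2], clean_pass_done=True))  # False
```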
- Test Plan
- Test Case
Test Cases will be consolidated in the Appendix.
- Test Tools and Resources
The following test tools and resources are identified as prerequisites for the UAT.
| Test Tool | Responsible |
| Pre-UAT Setup | |
| UAT Room | GT PM |
| UAT Desktops | GT PM |
| Test Environment | Novare |
| UAT Proper | |
| Test Cases | Novare |
| Test Scripts | Novare |
| Test Banig | Novare |
| Traceability Matrix | Novare |
| Defects Register | Novare |
| Other UAT Tools | |
| Printer | SSD |
| Dummy Data for Inventory and Reports | ISG |
| Test Account for New Application | |
| Sample Documents for Scanning | |
| Signature Pad | SSD |
| Scanner | SSD |
| Switch | SSD |
| Others | |
| Budget | GT PM |
- Test Environment
The project has a total of three (3) environments deployed:
- Development and SAT Environment: used by NOVARE for development and resolution of defects; no connectivity to other network elements

- UAT Environment: used by UAT team and setup by Novare.

- Production Environment: upon cutover, all Soul elements will be deployed to the Globe live network.
- Testing Site
To ensure immediate reporting and resolution of issues and concerns, the UAT team will be co-located in CE Break Out Room 4, 5th Floor, GT II (and the B1 UAT Room, GT I, if space is not enough), while the Novare Technical Team is stationed in the 5th Floor GT II SIA Area.
- Test Execution
- Test Execution Flow
Prior to UAT, a walkthrough will be conducted with the UAT Team to set the testers' expectations and to discuss the cycles of testing, the tagging of test results, issue reporting, and what and how to test. The schedule for the walkthrough is as follows:
| System | Schedule |
| EIS | May 17, 9am-12nn |
| Inventory | May 17, 1pm-5pm |
| OFV/SHK/PLT | May 18, 9am-12nn |
After the walkthrough, the OFV Administrator will set up the user profiles of the testers; this will take place on May 18 from 1pm onwards.
The OFV Administrator will have to define the following (an illustrative profile sketch follows the list):
- Create Store
- Create Group
- Create Sub-tabs/Legacy Systems
- Create Roles
- Set Roles Status
- Update User
  - Roles
  - Tab
  - Store
- Manage Product Set
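For illustration, a single tester's resulting OFV profile might take the following shape (a Python sketch; all field names and values are hypothetical, and the actual OFV administration screens define the real structure):

```python
# Illustrative shape of one tester's OFV profile after the administrator
# setup above; all field names and values are hypothetical.
profile = {
    "user": "tester01",
    "store": "Globe Store - GT II",
    "group": "UAT Team",
    "roles": ["Retail Specialist"],
    "role_status": "Active",
    "tabs": ["Paperless Transaction", "EIS", "Inventory", "Inquiry Tool"],
    "legacy_systems": ["Esprit"],
    "product_set": "Default",
}
```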
On the next day, the UAT proper will commence with the approved test cases and test scripts. The following schedule will be followed:
| Modules | UAT Proper | Location | Schedule |
| OFV, PLT, EIS | 1st cycle | UAT Rm CE Break Out Rm 4 5th GT II | May 20-21 |
| | 2nd cycle | | May 22 |
| | 3rd cycle* | | May 24 |
| | Clean Pass | | May 25 |
| INVENTORY | 1st cycle | | May 27 |
| | 2nd cycle | | May 28 |
| | 3rd cycle* | | May 29 |
| | Clean Pass | | May 31 |
* The 3rd cycle is conditional and will take place only if there are Failed, Not tested, or Cannot be tested scenarios after the 2nd cycle.
Note: the actual day allotment excludes weekends; however, if the need arises, the Program Manager may request weekend testing. Extended working hours may also be required if found necessary.
Testing of all modules will start at the same time and will proceed concurrently. If one module finishes earlier than expected, its testers may proceed with the next pass, on the assumption that there are no dependencies on the ongoing testing of other modules.
The above-mentioned UAT schedule covers only SSO, Paperless, SHK, and EIS. For Inventory, the UAT schedule is TBD.
The systems will be tested according to the following process flows:
One Frontliner View (User)
| Login |
| Default Tab Arrangement |
| Tab Management |
| Access Legacy Systems |
| Single Sign-on |
| Basic Functionality Test (Legacy Systems) |
| Sales Tagging during Inquiry |
Paperless Transaction
| Login to OFV |
| CSR to open the Paperless Transaction – CSR to create a new application – CSR to interview the consumer and enter the information |
| CSR to save the new application – Consumer to view the Digital Application Form and verify that all the information is correct |
| Consumer to Sign on the signature pad |
| CSR to scan required documents |
| CSR to print a copy of the Application |
| Sales Tagging during inquiry |
Reports
| Retail Inquiry Count |
| Service Transaction – Traffic Analysis – Customer Traffic – Customer Served per Group – P&L – Revenue Sources – Activations – End of Day (After Sales) – KPI: Line Wait – KPI: Transaction Time – KPI: ICS Resolution – Overall Transaction – Single View – On Hand Status – Defective Stock Status – Transfer Status – Replacement Status – Return Status – Issuance Status – Reservation Status – Sales Trend Analysis and Inventory Level – Volume – Sales Issuance (per SKU) – Sales Issuance (per Store) – Sales Run Rate (per SKU) – Sales Run Rate (per Store) – Channel Count – Product Interactivity – Shop Offers / Deals |
At the end of each UAT cycle, all issues will be consolidated by the Defects Manager and assessed by the Novare and Globe teams. If an issue is valid, Novare will fix it, and the affected test case shall pass another round of UAT with the assigned tester.
- Timeline
| No. | Activities | PIC | Date |
| 1 | UAT Walkthrough | Globe, Novare | May 17-18 |
| 2 | UAT Proper | Globe, Novare | May 19-26 |
| | 1st cycle | UAT Team | May 19-21 |
| | 2nd cycle | UAT Team | May 24-25 |
| | Clean Pass | UAT Team | May 26 |
| 3 | UAT Signoff | Novare, Globe | May 27-28 |
| 4 | Go Live | Globe, Novare | June 1 |
- Defect Management
This section explains how defects should be managed and resolved. It also establishes clear accountability for work packages across project team members, to facilitate issue reporting, escalation, and resolution.
The UAT Manager shall probe into the errors received from the testers to verify their validity. All defects noted during testing will be consolidated by the Defects Manager and submitted to the Technical Lead for resolution. All findings and resolutions should be documented in the Defects Register. Should defects remain open beyond the set resolution TAT, they will be escalated to the Program Manager for necessary action.
Defect investigation and fixes should be carried out by Novare in a UAT environment. This is done immediately as defects are reported to them by the Defects Manager.
Major defects may temporarily halt testing to allow full resolution; minor defects, on the other hand, will be dealt with accordingly without stopping the progress of testing. The set TAT for resolution will be strictly monitored. Retests will be performed to ensure that defects are indeed resolved. Defect forms will be used to document all defects.
At the end of each cycle, Novare and/or ISG will be given time to apply the fixes and test them in the SAT environment. Fixes must be in place in the UAT environment before the next UAT cycle proceeds.
- Defects Reporting
The Defects Manager is responsible for documenting defects as soon as they are reported by the testers. The severity level will be assessed by the Defects Manager, with the concurrence of the Technical Lead and the user. Severity levels are classified based on the following criteria:
| Severity Level | Criteria |
| Severity 1: Critical Defect | – Defect impacts the main functionality of Project Soul – Defect completely prevents further execution, regardless of the environment, and work cannot reasonably continue (e.g. server downtime) |
| Severity 2: Major Defect | – Functionality defect generates erroneous output with no workaround; however, execution can continue in a restricted fashion – Actual result is different from the expected result |
| Severity 3: Minor Defect | – Defect is not related to, or has no impact on, any functionality or module and requires no core changes in the system – Defect has minimal impact on the system; the system may continue processing transactions without correcting these errors |
| Severity 4: Trivial Defect | – These defects are merely cosmetic in nature; testers may give recommendations to improve the user-friendliness of the service (e.g. font, background color, etc.) |
The Defects Manager will send the Defects Register to the Technical Lead for further investigation and assessment. The report should include the following fields (an illustrative record structure follows the list):
- Module / Functionality being Tested
- Date of transaction
- Time of transaction
- Expected result
- Actual result / error
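The following Python sketch mirrors these fields as one Defects Register entry; the field names are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class DefectReportRow:
    """One entry in the Defects Register, mirroring the fields listed
    above; the field names are a sketch, not a prescribed schema."""
    module: str            # module / functionality being tested
    transaction_date: str  # date of transaction
    transaction_time: str  # time of transaction
    expected_result: str
    actual_result: str     # actual result / error
    severity: int          # 1 = Critical .. 4 = Trivial, per the table above
```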
For monitoring purposes, the Defects Manager updates the Defects monitoring log accordingly.
- Defects Resolution
Resolution of defects can be done by the Novare team while UAT is still in progress, on the assumption that the defect does not completely prevent further execution of the other test scenarios (i.e. is not a Severity 1 defect). If it does, the defect must be completely resolved and cleared prior to the resumption of UAT. To validate that defects are indeed fixed, Novare will have to simulate all affected functionalities, or at least the particular test scenario, in at least the SAT environment.
A turnaround time (TAT) for resolution has also been set to ensure that defects are acknowledged and resolved on time. Please see below:
| SEVERITY LEVEL | BUSINESS IMPACT | CORRECTION PRIORITY | TURNAROUND TIME OF FIXING |
| Level 1 – Critical Defect | Very High | Highest | Response time: 1 hr Resolution: 2 hours |
| Level 2 – Major Defect | Moderately High | Moderately High | Response time: 2 hrs Resolution: 4 hrs |
| Level 3 – Minor Defect | Moderately Low | Moderately Low | Response time: Half day after receipt of defect Resolution: 1 day |
| Level 4 – Trivial Defect | Very Low | Very Low | Response time: 1 day Resolution: Half day after receipt of defect |
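As an illustration, the TAT table can be applied mechanically to compute response and resolution deadlines (a Python sketch that assumes "half day" = 4 working hours and "1 day" = 8, and ignores the 9am-6pm working window for simplicity):

```python
from datetime import datetime, timedelta

# Hypothetical encoding of the TAT table above as (response, resolution)
# offsets: "half day" is taken as 4 working hours, "1 day" as 8, and the
# 9am-6pm working window is ignored for simplicity.
TAT = {
    1: (timedelta(hours=1), timedelta(hours=2)),  # Critical
    2: (timedelta(hours=2), timedelta(hours=4)),  # Major
    3: (timedelta(hours=4), timedelta(hours=8)),  # Minor
    4: (timedelta(hours=8), timedelta(hours=4)),  # Trivial
}

def deadlines(severity, reported_at):
    """Return (response_due, resolution_due) for a reported defect."""
    response, resolution = TAT[severity]
    return reported_at + response, reported_at + resolution

resp, fix = deadlines(2, datetime(2010, 5, 20, 9, 0))
print(resp, fix)  # 2010-05-20 11:00:00 2010-05-20 13:00:00
```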
- Communication Plan
| Type | Owner | Audience | Frequency | Medium |
| Defect Report | Defects Manager | ISG Technical Lead, Novare Defects Manager Copy: Program Manager | As encountered | |
| Defect Assessment | Defects Manager, Novare Defects Manager, ISG Technical Lead, Functional User | Defects Manager, Novare Defects Manager, ISG Technical Lead, Functional User Copy: Program Manager | As encountered | Email, Face to face, Call |
| UAT Progress Report | UAT Lead | Project Manager, Defects Manager, ISG Technical Lead, Novare Defects Manager | Daily EOD | |
- Progress Reporting
The UAT Lead issues a regular progress report to the Project Manager and the team (a tallying sketch follows the list below). The update contains the following information:
- Total number of Test Cases and percentage of:
- Test cases covered
- Passed
- Failed
- Not tested
- Cannot be tested
- Percentage per Module
- Test cases covered
- Total number of defects encountered
- Defect number
- Defect description
- Severity level
- Call-outs, e.g. slow response of system, recurring defects, etc.
- Issues, e.g. communication problems, unavailability of support, etc.
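As a sketch, the percentages above can be tallied directly from the Test Banig (a minimal Python example; the "MOD-###" scenario-ID convention used to derive the module is a hypothetical assumption, not part of this plan):

```python
from collections import Counter, defaultdict

def progress_snapshot(banig):
    """Tally a Test Banig {scenario_id: status} into the overall and
    per-module percentages listed above; the 'MOD-###' scenario-ID
    convention used to derive the module is hypothetical."""
    total = len(banig)
    overall = {status: round(100 * n / total, 1)
               for status, n in Counter(banig.values()).items()}
    per_module = defaultdict(Counter)
    for scenario, status in banig.items():
        per_module[scenario.split("-")[0]][status] += 1
    return overall, dict(per_module)

overall, per_module = progress_snapshot({
    "OFV-001": "Passed", "OFV-002": "Failed", "EIS-001": "Not tested",
})
print(overall)     # {'Passed': 33.3, 'Failed': 33.3, 'Not tested': 33.3}
print(per_module)  # {'OFV': Counter({'Passed': 1, 'Failed': 1}), 'EIS': ...}
```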