Test Strategy Document
1. Introduction
The purpose of this Test Strategy document is to outline the approach and scope of the testing activities for the NetSuite project. It ensures that all functionalities are thoroughly tested, meet business requirements, and support a successful implementation.
2. Objectives
- Validate that NetSuite configurations, customizations, and integrations function as intended.
- Ensure data accuracy across NetSuite records under various conditions.
- Minimize risk by identifying and resolving defects before deployment through comprehensive testing.
- Confirm compliance with business and regulatory requirements.
3. Scope of Testing
3.1 In-Scope
- Standard Modules: Financials, CRM, Inventory, Procurement, Sales Orders, etc.
- Items included:
  - Inventory Items
  - Non-Inventory Items
  - Kit/Package Items
  - Item Groups
  - Other Charge Items
  - Service Items
  - Discount Items
  - Markup Items
- Customizations: auto-generated numbering for Invoices and Purchase Orders, SuiteFlows, custom fields, and custom forms.
- Workflows: PO Approval, Vendor Payment Approval, Invoice Approval, Payment Receipt Approval, and Journal Entry Approval.
- Automated email to customers with overdue payments, sent weekly after the due date.
- Customized PDF templates for:
  - Open invoices sent to customers
  - Payment advice sent to vendors upon payment, with invoice and TDS data
  - Purchase Orders
  - Invoices
  - Credit Notes
  - Debit Notes
  - Delivery Challans
- Financial charges for overdue payments
- Integrations: SuiteTax, Cleartax
- Data Migration: Validate data completeness and integrity post-migration (see the reconciliation sketch after this list).
- User Roles and Permissions: Verify access control based on roles.
- Reports and Dashboards: Ensure reports reflect accurate data and comply with business formats.
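Data migration validation can be scripted outside NetSuite by reconciling a legacy-system export against a NetSuite saved-search export. The Python sketch below is a minimal illustration only; the file names, the shared "External ID" key column, and the CSV format are assumptions for the example, not project specifics.

    # Minimal migration-reconciliation sketch (file and column names are assumed).
    # Compares a legacy export against a NetSuite saved-search export by record
    # count and by a shared external ID key.
    import csv

    def load_keys(path: str, key_column: str) -> set:
        """Return the set of values found in the key column of a CSV export."""
        with open(path, newline="", encoding="utf-8") as f:
            return {row[key_column].strip() for row in csv.DictReader(f)}

    legacy = load_keys("legacy_customers.csv", "External ID")      # assumed export
    netsuite = load_keys("netsuite_customers.csv", "External ID")  # assumed export

    print("Legacy records:  ", len(legacy))
    print("NetSuite records:", len(netsuite))
    print("Missing in NetSuite:   ", sorted(legacy - netsuite)[:20])
    print("Unexpected in NetSuite:", sorted(netsuite - legacy)[:20])

The same pattern extends to field-level comparisons (amounts, balances, dates) once record counts and keys match.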
3.2 Out-of-Scope
- Phases or modules not included in this implementation.
- Consolidated Exchange Rates
- Purchase requisition
- Proforma Invoice
- Approval for Vendor Return Authorization
- External systems not part of the integration in this phase.
- Testing scenarios outside project boundaries unless otherwise specified.
4. Testing Types
- Unit Testing: Validate individual components (e.g., scripts, workflows); performed by developers.
- System Integration Testing (SIT): Ensure seamless integration between NetSuite and third-party systems.
- Functional Testing: Validate that end-to-end business processes align with requirements.
- Data Migration Testing: Verify data accuracy, completeness, and integrity post-migration.
- Regression Testing: Ensure new changes do not affect existing functionalities.
- User Acceptance Testing (UAT): End users validate the system against real-world scenarios and requirements.
5. Test Environment
- Sandbox environment for QA: Mirrored setup of the production environment for all initial testing.
- Test Data: Use anonymized real-world data for accurate validation.
- UAT Environment: Used for end-user validation with near-live configurations.
- Production Environment: Final deployment environment for sanity testing.
Environment Setup Checklist
- Configured roles, permissions, and access control.
- Loaded test data resembling real-world data.
- Integration points set up with external systems, if applicable.
Test Data Management
- Use anonymized production-like data for realistic test scenarios (see the anonymization sketch after this list).
- Prepare test cases for positive, negative, and edge-case scenarios.
- Optimize and reuse test data across test scenarios wherever feasible.
- Note that test data and test execution results in the Sandbox environment will be lost after a Sandbox refresh.
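As a sketch of how production-like data might be anonymized before loading into the Sandbox, the Python example below masks a few assumed columns (Company Name, Email, Phone) in a CSV export; the file and column names are placeholders for illustration, not the project's actual templates.

    # Minimal anonymization sketch for production-like test data.
    # File and column names are assumptions for illustration only.
    import csv
    import hashlib

    def mask(value: str, prefix: str) -> str:
        """Replace a real value with a stable, non-reversible placeholder."""
        digest = hashlib.sha256(value.encode("utf-8")).hexdigest()[:8]
        return prefix + "_" + digest

    with open("customers_prod.csv", newline="", encoding="utf-8") as src, \
         open("customers_test.csv", "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row["Company Name"] = mask(row["Company Name"], "CUST")     # assumed column
            row["Email"] = mask(row["Email"], "user") + "@example.com"  # assumed column
            row["Phone"] = "000-000-0000"                               # assumed column
            writer.writerow(row)

Hashing keeps the masked values stable across loads, so related records remain linkable after a Sandbox refresh.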
6. Test Roles & Responsibilities
- Test Manager: Define strategy, manage timelines, and track progress.
- Test Lead: Plan and design test cases and review team members' test cases.
- Test Analyst: Execute test cases, log defects, and verify fixes.
- Developers: Conduct unit tests and fix identified defects.
- Business Users: Participate in UAT and validate real-world scenarios.
7. Test Deliverables
7.1 Documentation
- Test Cases
- Test Scripts (automated, if applicable): NA
- Test Execution Reports
- Defect Reports
- UAT Sign-Off Document
7.2 Tools
- Test Management Tools: JIRA
- Automation Tools (if applicable), such as Selenium or SuiteCloud Testing Framework (SCTF): NA
- Data Validation Tools: Excel, SQL queries, native NetSuite Saved Searches, and other SuiteAnalytics options (see the validation sketch after this list).
- Defect Tracking: JIRA Project Management Tool
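A simple way to validate a saved-search export is to re-check its business rule in a script. The Python sketch below assumes an "overdue invoices" CSV export with Due Date, Amount Remaining, and Document Number columns and a DD/MM/YYYY date format; these names and the format are illustrative assumptions.

    # Minimal validation sketch for a saved-search export (assumed columns/format).
    # Flags rows that do not satisfy the overdue rule: due date in the past and
    # an open balance greater than zero.
    import csv
    from datetime import date, datetime

    issues = []
    with open("overdue_invoices.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            due = datetime.strptime(row["Due Date"], "%d/%m/%Y").date()  # assumed format
            open_amount = float(row["Amount Remaining"])                 # assumed column
            if due >= date.today() or open_amount <= 0:
                issues.append(row["Document Number"])                    # assumed column

    print("Rows failing the overdue rule:", len(issues))
    print(issues[:20])

The same check can support functional testing of the weekly overdue-payment reminder emails, since the email recipients should correspond to rows that pass this rule.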
8. Testing Process
- Requirement Analysis: Understand functional and technical specifications.
- Test Case Design: Create detailed test cases for all scenarios specified in the requirement document.
- Environment Setup: Ensure data is loaded correctly in the Sandbox environment.
- Test Execution: Execute tests, log defects, and verify fixes.
- Regression Testing: Re-test impacted areas after defect fixes.
- UAT: Facilitate end-user testing and gather sign-offs.
- Production Validation: Conduct sanity checks post-deployment.
9. Entry and Exit Criteria
- Unit Testing
  - Entry: Development completed; test data prepared by the assigned developer.
  - Exit: All unit tests executed and identified defects fixed.
- System Integration Testing (SIT)
  - Entry: Unit tests completed; Sandbox environment set up as per requirements.
  - Exit: Test execution completed with all critical defects resolved. Non-critical defects or test cases may be deferred to the next release (as in Agile), subject to agreement with project stakeholders.
- UAT
  - Entry: System testing completed (per milestone/release); UAT data loaded; UAT user manual shared with the client.
  - Exit: End-user sign-off obtained for business-critical scenarios.
- Go-Live
  - Entry: UAT completed successfully; production environment ready.
  - Exit: Final sign-off received from stakeholders.
10. Risk Management
Risk and mitigation plan:
- Delays in test environment setup: Ensure early communication with the infrastructure team.
- Unavailability of test data: Prepare and validate test data during the planning phase.
- High number of defects in UAT: Allocate sufficient time for defect resolution and re-testing.
- Third-party system downtime during SIT: Conduct testing in phases and plan for mock integration testing where feasible.
- Sandbox refresh: Plan for the loss of test data and execution history in the Sandbox environment after each refresh.
11. Test Metrics and Reporting
- Test Coverage: Percentage of executed vs. planned test cases.
- Defect Status Report: Count of defects by current status (see the metrics sketch below).
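Both metrics can be computed from a JIRA or spreadsheet export of the test-execution log. The Python sketch below assumes an export with "Execution Status" and "Defect Status" columns; the file name, column names, and status values are assumptions for illustration.

    # Minimal metrics sketch from a test-execution export (assumed column names).
    # Computes test coverage (executed vs. planned) and defect counts by status.
    import csv
    from collections import Counter

    planned = 0
    executed = 0
    defect_status = Counter()

    with open("test_execution_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            planned += 1
            if row["Execution Status"] not in ("", "Not Run"):  # assumed values
                executed += 1
            if row.get("Defect Status"):                        # assumed column
                defect_status[row["Defect Status"]] += 1

    coverage = 100.0 * executed / planned if planned else 0.0
    print(f"Test coverage: {executed}/{planned} = {coverage:.1f}%")
    for status, count in defect_status.most_common():
        print(f"{status}: {count}")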
12. Approval
Sign-off is required from the following stakeholders to proceed with the next phase.
- Project Manager: Name ____________ Signature ____________ Date ____________
- Test Manager: Name ____________ Signature ____________ Date ____________
- Business Owner: Name ____________ Signature ____________ Date ____________