TEST LEAD
HCL
Job Summary
To plan, deploy, and manage the testing effort and work plans for the project [module] to meet client/internal requirements as per the prescribed quality standards.
Key Responsibilities
1. To participate in functional and technical discussions with the client/team to understand functional/design specifications, highlight performance concerns and inconsistencies, prepare automated test scripts, maintain the test case suite, and ensure the appropriate test environments and infrastructure are in place and kept up to date.
2. To develop and continuously improve automated tests as new system features/enhancements are developed, and accordingly create work plans, monitor and track the work schedule for on-time delivery as per the defined quality standards.
3. To create a reusable/scalable test automation framework, develop test strategies/plans, maintain test data, and submit status reports to minimize exposure and risks on the project or to close out escalations.
4. To develop, guide, and mentor QA engineers in the use of the testing framework, enhancing their technical capabilities and increasing productivity.
Additional requirement - Tech Stack: ETL testing with Informatica PowerCenter, dbt (Data Build Tool), SQL, Snowflake, Git, JIRA
Responsibilities
1. QA Strategy and Planning
- Define the QA strategy for a hybrid ETL stack involving Informatica and dbt for transformation/modeling.
- Analyze STTMs (Source-to-Target Mapping documents), business rules, and dbt model definitions to derive test scope.
- Plan and manage test phases across Program Increments (PIs) or releases: unit testing support, system testing, integration testing, and UAT support.
2. Informatica-Specific QA Responsibilities
- Review and validate Informatica mappings, workflows, sessions, and parameter files.
- Validate:
o Source data extraction (flat files, RDBMS, mainframes, etc.)
o Data transformations via mappings (joins, lookups, aggregations, filters, etc.)
o Load strategies to staging/ODS layers (insert/update strategies, truncate/load, CDC).
- Use SQL queries and data comparison tools to perform reconciliation and transformation validation (a sample reconciliation query follows this list).
- Validate job dependencies and control table mechanisms (if used).
- Monitor job runs via Informatica Monitor/Workflow logs, and ensure correct execution paths.
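The SQL-based reconciliation mentioned above can be illustrated with a minimal sketch. The query compares row counts and an amount total between a source extract and its staging target for one load date; all object and column names (SRC_DB.ORDERS, STG.ORDERS_STG, ORDER_AMT, LOAD_DATE) are hypothetical placeholders, not part of this job description.

    -- Reconcile a source extract against its staging load (hypothetical objects)
    SELECT 'SOURCE'  AS layer, COUNT(*) AS row_cnt, SUM(ORDER_AMT) AS amt_total
    FROM   SRC_DB.ORDERS
    WHERE  LOAD_DATE = CURRENT_DATE
    UNION ALL
    SELECT 'STAGING' AS layer, COUNT(*) AS row_cnt, SUM(ORDER_AMT) AS amt_total
    FROM   STG.ORDERS_STG
    WHERE  LOAD_DATE = CURRENT_DATE;
    -- The two result rows should match; a mismatch points to dropped, duplicated,
    -- or incorrectly transformed records in the Informatica load.

In practice such checks are typically extended with per-column aggregates or checksums and run for every batch window.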
3. dbt-Specific QA Responsibilities
- Validate dbt models, tests, and documentation in the analytics layer.
- Review and test:
o Model SQL logic for business rule accuracy and performance
o YAML files for correct metadata and testing configurations
o dbt tests (unique, not null, accepted values, relationships, custom tests); a sample custom test follows this list
- Execute dbt test suites using dbt test and investigate failures.
- Collaborate with analytics engineers to validate materialization strategies (table, view, incremental) and source freshness logic.
- Review generated SQL in target/compiled to ensure correct transformations.
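As one example of the custom dbt tests referenced above, a dbt singular test is simply a SQL file placed under the project's tests/ directory that selects the rows violating a rule; dbt test fails the test when any rows are returned. The model name stg_orders and the column order_amt below are hypothetical, used only to sketch the pattern.

    -- tests/assert_order_amounts_non_negative.sql (hypothetical singular test)
    -- dbt treats every returned row as a failure when `dbt test` runs.
    SELECT
        order_id,
        order_amt
    FROM {{ ref('stg_orders') }}   -- hypothetical staging model
    WHERE order_amt < 0            -- business rule: order amounts must not be negative

Generic tests (unique, not_null, accepted_values, relationships) are declared in the models' YAML files instead and executed by the same dbt test command.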
4. Data Validation/Reconciliation
- Perform full and sampled data validation across source → staging → ODS → analytics data layers.
- Write complex SQL queries to compare record counts, aggregates, and field-level data across layers (a field-level comparison sketch follows this list).
- Ensure data consistency and integrity using both manual validation and dbt tests.
- Validate PII masking and data obfuscation where applicable.
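A sketch of the field-level comparison described above: EXCEPT run in both directions surfaces rows that differ between two layers for a chosen column set. The table and column names (ODS.CUSTOMER, ANALYTICS.DIM_CUSTOMER, customer_id, email) are hypothetical placeholders.

    -- Rows present in ODS but missing or altered in the analytics layer
    SELECT customer_id, first_name, last_name, email
    FROM   ODS.CUSTOMER
    EXCEPT
    SELECT customer_id, first_name, last_name, email
    FROM   ANALYTICS.DIM_CUSTOMER;

    -- Reverse direction: rows that exist downstream but not in ODS
    SELECT customer_id, first_name, last_name, email
    FROM   ANALYTICS.DIM_CUSTOMER
    EXCEPT
    SELECT customer_id, first_name, last_name, email
    FROM   ODS.CUSTOMER;

Zero rows from both queries indicates the compared columns are consistent across the two layers; for PII-masked targets the masked columns would be excluded or validated against the masking rule instead.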
5. Automation and DevOps Integration
- Build and maintain reusable SQL-based test automation scripts for recurring validations (a parameterized sketch follows this list).
- Integrate QA checks into CI/CD pipelines using GitHub Actions.
- Monitor and validate Informatica batch schedules and dbt Cloud jobs (or CLI triggers).
- Contribute to QA dashboards and test reporting automation.
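To illustrate the reusable SQL-based automation scripts mentioned above, the sketch below is a parameterized count comparison written for Snowflake, assuming Snowflake session variables and the IDENTIFIER() function; the table names are placeholders and the script is an illustrative pattern, not a prescribed implementation.

    -- Reusable count reconciliation: re-run for any table pair by changing the variables
    SET src_table = 'STG.ORDERS_STG';   -- hypothetical staging table
    SET tgt_table = 'ODS.ORDERS';       -- hypothetical ODS table

    SELECT
        $src_table AS source_table,
        $tgt_table AS target_table,
        (SELECT COUNT(*) FROM IDENTIFIER($src_table)) AS src_rows,
        (SELECT COUNT(*) FROM IDENTIFIER($tgt_table)) AS tgt_rows,
        IFF((SELECT COUNT(*) FROM IDENTIFIER($src_table)) =
            (SELECT COUNT(*) FROM IDENTIFIER($tgt_table)), 'PASS', 'FAIL') AS status;

Scripts like this can be version-controlled in Git and invoked from a CI/CD job or a scheduled task so the same validation runs after every load.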
6. Defect Management/Reporting
- Log and track defects in JIRA, categorized by layer (Informatica/dbt) and severity.
- Facilitate daily defect triage calls with data engineering teams.
- Provide detailed test execution status reports and test coverage metrics by data domain.