Data Quality Engineer – Operational Platform

Company: Focus Financial Partners
Location: St. Louis, MO, USA
Salary: Not Provided
Type: Full-Time
Degrees: Bachelor’s
Experience Level: Junior

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience.
  • Strong understanding of data quality methodologies, tools, and processes.
  • Hands-on experience with SQL and data querying for validation and testing.
  • Familiarity with ETL processes, data pipelines, and data warehousing concepts.
  • Experience with scripting and test automation in languages such as Python or Java, or other relevant technologies.
  • Knowledge of CI/CD pipelines and tools such as Jenkins or GitLab.
  • Familiarity with tools like dbt and Great Expectations for data testing and validation.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) and their role in validating data outputs.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
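To illustrate the SQL-based validation the requirements describe, here is a minimal sketch of a source-to-target reconciliation check. The table and column names (`stg_accounts`, `dim_accounts`, `balance`) are hypothetical, not from this posting, and SQLite stands in for whatever warehouse the team actually uses.

```python
import sqlite3

# Hypothetical staging and target tables for a reconciliation check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_accounts (id INTEGER, balance REAL);
    CREATE TABLE dim_accounts (id INTEGER, balance REAL);
    INSERT INTO stg_accounts VALUES (1, 100.0), (2, 250.5), (3, NULL);
    INSERT INTO dim_accounts VALUES (1, 100.0), (2, 250.5), (3, NULL);
""")

def row_count(table):
    # Row counts should match between source and target after a load.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def null_count(table, column):
    # Count NULLs in a column the design spec may require to be populated.
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

source_rows = row_count("stg_accounts")
target_rows = row_count("dim_accounts")
missing_balances = null_count("dim_accounts", "balance")
```

Checks like these are typically wrapped in a test harness so that a count mismatch or unexpected NULLs fail the pipeline rather than surface downstream.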

Responsibilities

  • Collaborate with the Data Engineering team to understand requirements, design specifications, and technical implementations.
  • Develop and execute test plans, test cases, and test scripts for ETL processes, data pipelines, and data transformation workflows.
  • Perform end-to-end testing of data platforms to ensure accuracy, reliability, and scalability.
  • Validate data integrity across multiple systems and environments.
  • Identify, document, and track defects, working with the engineering team to resolve issues in a timely manner.
  • Automate testing processes for data validation and system performance to improve efficiency.
  • Implement testing and validation using tools such as dbt and Great Expectations.
  • Monitor data quality and identify opportunities for improving test coverage.
  • Validate APIs and RESTful services to ensure proper integration and data flow between systems.
  • Provide regular updates on testing progress, issues, and risks to the team and stakeholders.
  • Stay up-to-date with industry best practices and new tools for data testing and quality assurance.
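The automation responsibilities above can be sketched as small reusable rule functions. This is a hedged illustration only: the rule names and record layout are invented, and in practice tools such as dbt tests or Great Expectations suites would play this role.

```python
# Hypothetical data-quality rules: uniqueness and completeness over
# extracted rows, of the kind this role would script and automate.

def check_unique(rows, key):
    # Return any key values that appear more than once.
    seen, duplicates = set(), []
    for row in rows:
        value = row[key]
        if value in seen:
            duplicates.append(value)
        seen.add(value)
    return duplicates

def check_complete(rows, field):
    # Return rows whose required field is missing or None.
    return [row for row in rows if row.get(field) is None]

# Illustrative records with one duplicate id and one missing email.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@example.com"},
]

dup_ids = check_unique(records, "id")
missing_emails = check_complete(records, "email")
```

Wiring such checks into a CI/CD pipeline (e.g. as a Jenkins or GitLab job) is what turns one-off validation into the continuous monitoring the responsibilities call for.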

Preferred Qualifications

  • Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, GCP).
  • Knowledge of data governance and data security best practices.
  • Experience testing APIs and RESTful services for integration and functionality validation.
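As a sketch of the API validation mentioned above, the snippet below checks a JSON response body for required fields and types before the data flows onward. The field names and expected schema are hypothetical, and the response body is mocked rather than fetched from a live endpoint.

```python
import json

# Hypothetical schema for a REST response body; field names are illustrative.
EXPECTED_FIELDS = {"account_id": int, "status": str, "balance": float}

def validate_payload(payload):
    # Return a list of human-readable problems; an empty list means valid.
    problems = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

# A response body as it might come back from an endpoint (mocked here);
# note the balance arrives as a string instead of a number.
body = json.loads('{"account_id": 42, "status": "open", "balance": "oops"}')
issues = validate_payload(body)
```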