Data Engineer
Company | Aptive |
---|---|
Location | Provo, UT, USA |
Salary | Not Provided |
Type | Full-Time |
Degrees | Bachelor’s |
Experience Level | Mid Level, Senior |
Requirements
- 3+ years of experience in data engineering
- Strong proficiency with AWS services (Lambda, S3, Secrets Manager) and cloud architecture
- Advanced SQL skills with experience in query optimization
- Experience with Python, including type hints and functional programming principles
- Hands-on experience with Airflow (versions 1.10.15 and 2.9); a minimal typed DAG sketch follows this list
- Familiarity with dbt for data transformation and testing
- Experience with Git-based version control (GitHub) and CI/CD pipelines
- Understanding of data validation techniques and processes
- Experience using AI tools and LLMs to automate routine data tasks and improve workflow efficiency
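For illustration only (this sketch is not from the posting), here is the kind of typed Airflow 2.x TaskFlow pipeline the Python and Airflow requirements above describe; the DAG name, task names, and sample data are hypothetical placeholders:

```python
# A minimal sketch, not Aptive's actual pipeline: a typed Airflow 2.x TaskFlow
# DAG that extracts, normalizes, and validates rows. All names are illustrative.
from __future__ import annotations

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_migration_pipeline() -> None:
    @task
    def extract_rows() -> list[dict[str, str]]:
        # Stand-in for a real source-system read (e.g. via an AWS hook).
        return [{"account_id": "A-1", "status": " Active "}]

    @task
    def normalize(rows: list[dict[str, str]]) -> list[dict[str, str]]:
        # Standardize values so the post-migration structure is consistent.
        return [{k: v.strip().lower() for k, v in row.items()} for row in rows]

    @task
    def validate(rows: list[dict[str, str]]) -> None:
        # Fail loudly on discrepancies rather than silently loading bad data.
        if not rows:
            raise ValueError("validation failed: no rows extracted")

    validate(normalize(extract_rows()))


example_migration_pipeline()
```

Chaining tasks through TaskFlow return values keeps extraction, normalization, and validation as separately retryable steps.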
Responsibilities
- Design and implement end-to-end data migration pipelines for M&A integrations, ensuring data integrity through validation
- Develop and maintain Airflow data pipelines for extracting, transforming, and validating data across systems
- Build standardization and normalization scripts to ensure consistent data structure post-migration
- Create robust validation processes to identify and document data discrepancies between systems (see the validation sketch after this list)
- Collaborate with IT, M&A teams, and external vendors to coordinate data migrations
- Build observability tools to monitor data quality, completeness, and accuracy
- Write and maintain documentation for data engineering processes
- Optimize database performance through proper configuration of clustering keys and materialization strategies
- Troubleshoot and resolve data integration issues using systematic debugging approaches
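As a minimal sketch of the cross-system validation work described above (the data structures and counts here are hypothetical, not Aptive's tooling), a comparison of source and target row counts that reports discrepancies:

```python
# Hypothetical validation helper: flag source/target row-count mismatches
# so discrepancies can be documented rather than silently ignored.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class TableCounts:
    table: str
    source_rows: int
    target_rows: int


def find_discrepancies(counts: list[TableCounts]) -> list[str]:
    """Return human-readable descriptions of source/target mismatches."""
    issues: list[str] = []
    for c in counts:
        if c.source_rows != c.target_rows:
            issues.append(
                f"{c.table}: source={c.source_rows}, target={c.target_rows}, "
                f"diff={c.source_rows - c.target_rows}"
            )
    return issues


if __name__ == "__main__":
    report = find_discrepancies(
        [TableCounts("customers", 10_000, 9_998), TableCounts("orders", 500, 500)]
    )
    print("\n".join(report) or "no discrepancies found")
```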
Preferred Qualifications
- Experience with Snowflake and its optimization techniques (a clustering-key sketch follows this list)
- Experience with CRM data migrations or integrations
- Knowledge of M&A data integration processes
- Bachelor’s degree in Computer Science, Information Technology, or a related field
- Data engineering certifications (AWS, Snowflake, etc.)
- Experience with API integrations and external vendor collaborations
- Understanding of data governance and compliance requirements
- Experience with scripting for data normalization and standardization
- Advanced expertise in leveraging AI tools for code generation, documentation creation, and data pipeline optimization
- Ability to craft effective prompts for AI assistants to enhance productivity in data engineering workflows
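To illustrate the Snowflake optimization work mentioned above, a hedged sketch using the snowflake-connector-python client; the account, table, and column names are placeholders, not Aptive's:

```python
# A hypothetical sketch of Snowflake clustering-key maintenance from Python;
# all connection details and object names are placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # placeholder
    user="my_user",        # placeholder
    # In practice, fetch credentials from AWS Secrets Manager, per the posting.
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="transform_wh",  # placeholder
)
try:
    cur = conn.cursor()
    # Cluster a large table on the columns most queries filter by, so
    # Snowflake can prune micro-partitions during scans.
    cur.execute("ALTER TABLE analytics.orders CLUSTER BY (order_date, region)")
    # Report how well the table is currently clustered on those columns.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION("
        "'analytics.orders', '(order_date, region)')"
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```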