Python ETL Developer – Airflow & Data Mesh
Company | Jabil |
---|---|
Location | Lexington, KY, USA |
Salary | Not Provided |
Type | Full-Time |
Degrees | Bachelor’s |
Experience Level | Senior |
Requirements
- Proven experience as a Python ETL Developer with Apache Airflow
- Strong understanding of Data Mesh principles and best practices
- Proficiency with SQL and PostgreSQL, including complex query optimization
- Experience working with Kubernetes for container orchestration and deployment
- Solid Linux command-line and shell scripting skills
- Familiarity with CI/CD tools and modern data engineering practices
- Excellent problem-solving and troubleshooting abilities
- Strong communication and collaboration skills
- Bachelor’s degree in Engineering or a related field is required
- 5-8 years of related experience is required
Responsibilities
- Design, develop, and maintain ETL pipelines using Python and Apache Airflow (see the sketch after this list)
- Implement Data Mesh architecture to support decentralized data ownership and domain-driven design
- Automate data workflows, monitor data pipeline performance, and troubleshoot issues
- Optimize SQL queries and data transformation processes to improve performance and reliability
- Deploy, monitor, and manage data services on Kubernetes clusters
- Collaborate with data scientists, analysts, and other engineers to integrate data solutions
- Perform data quality checks and validation processes
- Document ETL processes and data flow diagrams
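To illustrate the kind of work described above, here is a minimal sketch of a daily ETL DAG using Airflow's TaskFlow API (Airflow 2.x), with a simple data quality gate before loading. The DAG ID, table names, connection IDs (`source_postgres`, `warehouse_postgres`), and column names are hypothetical and would differ in a real pipeline.

```python
# A minimal sketch of an extract-transform-load DAG with a data quality check.
# DAG ID, table names, connection IDs, and columns are hypothetical examples.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(
    dag_id="orders_daily_etl",        # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["etl", "sketch"],
)
def orders_daily_etl():
    @task
    def extract() -> list[dict]:
        # Pull the previous day's rows from the source database (connection ID assumed).
        hook = PostgresHook(postgres_conn_id="source_postgres")
        df = hook.get_pandas_df(
            "SELECT order_id, amount, created_at FROM orders "
            "WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'"
        )
        # Stringify timestamps so the rows stay JSON-serializable for XCom.
        df["created_at"] = df["created_at"].astype(str)
        return df.to_dict(orient="records")

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop invalid rows and add a derived column.
        df = pd.DataFrame(rows)
        df = df[df["amount"] > 0]
        df["amount_usd_cents"] = (df["amount"] * 100).astype(int)
        return df.to_dict(orient="records")

    @task
    def quality_check(rows: list[dict]) -> list[dict]:
        # Simple data quality gate: fail the run if nothing survived transformation.
        if not rows:
            raise ValueError("Quality check failed: no rows to load")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # Write transformed rows into the warehouse table (connection ID assumed).
        hook = PostgresHook(postgres_conn_id="warehouse_postgres")
        hook.insert_rows(
            table="analytics.orders_daily",
            rows=[
                (r["order_id"], r["amount_usd_cents"], r["created_at"])
                for r in rows
            ],
            target_fields=["order_id", "amount_usd_cents", "created_at"],
        )

    load(quality_check(transform(extract())))


orders_daily_etl()
```

In practice, the transformation logic, quality checks, and load targets would follow the owning domain's contracts under the Data Mesh model, and the DAG would be deployed and monitored alongside the other data services running on Kubernetes.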
Preferred Qualifications
- Experience with cloud platforms such as AWS, GCP, or Azure
- Familiarity with data warehousing solutions and big data technologies (e.g., Spark, Hadoop)
- Experience with monitoring and observability tools (e.g., Prometheus, Grafana)
- Familiarity with version control using Git and CI/CD practices