Associate Director – Data Engineering
| Company | Carrier Global |
|---|---|
| Location | Atlanta, GA, USA; Palm Beach Gardens, FL, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor’s, Master’s |
| Experience Level | Senior, Expert or higher |
Requirements
- Bachelor’s Degree.
- 8+ years of experience in data engineering or analytics engineering.
- 5+ years in a technical leadership or architect role.
- 5+ years of experience in core data engineering and cloud platforms: AWS (S3, Glue, Kinesis, DMS) and Snowflake.
- 5+ years of experience with AI/ML and discovery platforms: AWS SageMaker or JupyterHub.
- 3+ years of experience with transformation frameworks: dbt or SQLMesh.
- 5+ years of experience implementing and managing scalable data pipelines and platform capabilities in complex, enterprise environments.
Responsibilities
- Define and execute the data engineering roadmap in alignment with enterprise data and AI goals.
- Lead a team of data engineers responsible for developing and supporting real-time and batch data pipelines.
- Foster a culture of engineering excellence, continuous improvement, and delivery at scale.
- Oversee the development of modular, reusable pipelines using Spark, Airflow, AWS Glue, and Nexla to support batch, real-time, and CDC use cases.
- Champion scalable architecture patterns leveraging Amazon S3 and Apache Iceberg for decoupled storage and compute.
- Integrate systems using event-driven and streaming architectures (e.g., AWS Kinesis, Kafka, AWS DMS) to support timely, governed data delivery.
- Ensure metadata, data quality, lineage, and access controls are embedded in every pipeline.
- Partner with Data Scientists, Business Analysts, Product Owners, and Platform Engineers to ensure engineering solutions meet business needs.
- Work closely with Governance, Security, and FinOps teams to ensure data engineering aligns with policy, compliance, and cost-efficiency goals.
- Support and enable data discovery, analytics, and AI/ML through integrations with AWS SageMaker, AWS DataZone, and business-aligned data products.
Preferred Qualifications
- Master’s degree in Computer Science, Data Engineering, or related field.
- Experience with Apache Iceberg, Apache Spark, and Apache Airflow.
- Experience with data integration and automation tools: Nexla.
- Strong understanding of data architecture principles, observability, CI/CD, and cloud-native DevOps practices.
- Strong understanding of automation principles and pipeline delivery practices.
- Excellent communication, stakeholder engagement, and cross-functional leadership skills.
- Background in Agile/Lean project delivery.
- Strategic experience with cloud delivery models.
- Deep understanding of advanced security segmentation and controls.