Staff Data Engineer
| Company | Route |
| --- | --- |
| Location | Santa Monica, CA; San Francisco, CA; New York, NY; Lehi, UT (USA) |
| Salary | $176,000 – $198,090 |
| Type | Full-Time |
| Degrees | |
| Experience Level | Senior, Expert, or higher |
Requirements
- 5+ years of experience in data engineering or a similar field.
- Expertise in Snowflake, dbt, Databricks, and Spark for data processing and analytics.
- Strong programming skills in Python for data pipeline development and automation.
- Experience with SnapLogic, Airflow, or similar ETL orchestration tools.
- Proficiency in cloud infrastructure and data services within the AWS ecosystem.
- Hands-on experience with Terraform for infrastructure as code.
- Strong problem-solving skills with the ability to work independently in a fast-paced environment.
- Excellent communication skills and ability to collaborate across teams.
Responsibilities
- Own and maintain the reliability of existing data ingestion pipelines, ensuring uninterrupted data flow.
- Design, develop, and optimize new data ingestion and integration pipelines.
- Work cross-functionally with data analysts, product managers, back-end and front-end engineering teams, and business stakeholders to deliver robust data solutions.
- Leverage Snowflake, Databricks, dbt, and Spark to build scalable, high-performance data processing solutions.
- Implement and maintain SnapLogic, Airflow, and Terraform for ETL orchestration, workflow automation, and infrastructure as code.
- Ensure high availability, scalability, and security of data infrastructure using the AWS suite.
- Troubleshoot and optimize performance across the data pipeline stack.
- Stay ahead of industry trends and continuously improve data engineering best practices.
Preferred Qualifications
- Familiarity with real-time data processing architectures.
- Knowledge of data governance, security, and compliance best practices.
- Experience building or using AI agents that interact with databases.
- Experience in the e-commerce or post-purchase space is a plus.