Data Engineering Lead
Company | Mission Lane
---|---
Location | Toronto, ON, Canada
Salary | $157,000 – $191,000
Type | Full-Time
Degrees | Bachelor’s
Experience Level | Senior, Expert or higher
Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering.
- Expert-level SQL skills.
- Strong Python programming skills.
- Strong understanding of software engineering principles and best practices (e.g., version control, testing, CI/CD).
- Extensive experience with data warehousing technologies, preferably Snowflake.
- Experience with dbt (data build tool).
- Experience with cloud platforms, preferably GCP (Google Cloud Platform), including services such as Cloud Functions and GCS.
- Experience designing and implementing reliable and resilient ETL/ELT pipelines.
- Experience with data observability and monitoring tools.
- Excellent communication, collaboration, and problem-solving skills.
Responsibilities
- Design, develop, and maintain complex, high-performance data pipelines using Python, SQL, dbt, and Snowflake on GCP.
- Lead the effort to advance code quality, test coverage, and maintainability of our data pipelines.
- Champion and implement software engineering best practices, including code reviews, testing methodology, CI/CD, and documentation.
- Drive the adoption of data observability tools and practices (e.g., data lineage, automated alerting).
- Research, evaluate, and recommend new technologies and tools to improve our data platform.
- Contribute to the data architecture and design of our data warehouse.
- Mentor and guide other data engineers, providing technical expertise and fostering a culture of continuous learning.
- Collaborate effectively with software engineering teams to define data ownership, streamline ingestion processes, and ensure data consistency.
- Work closely with stakeholders (data scientists, analysts, business users) to understand their data needs and translate them into technical requirements.
- Partner with stakeholders to ensure that projects are well defined.
- Troubleshoot and resolve complex data pipeline issues, ensuring data quality and reliability.
- Contribute to the development and maintenance of our CI/CD pipelines for data infrastructure.
- Participate in on-call rotation to support critical data pipelines.
- Identify and address inefficiencies in our data engineering processes.
Preferred Qualifications
- Experience with data governance and data quality frameworks.
- Experience with data modeling techniques.
- Experience with Airflow.
- Experience with infrastructure-as-code tools (e.g., Terraform, Kubernetes Config Connector).
- Experience with Monte Carlo or similar data observability platforms.