Software Engineer III – Data Infrastructure

Company: Lead Bank
Location: San Francisco, CA, USA; Sunnyvale, CA, USA
Salary: $180,000 – $210,000
Type: Full-Time
Experience Level: Mid Level, Senior

Requirements

  • 5-7 years of experience in data engineering, with a proven track record of building and maintaining high-quality data pipelines.
  • Expertise with AWS services, including RDS (Aurora), AWS DMS, S3, Data Firehose, Kinesis, Lambda, DynamoDB, or other equivalent services.
  • Hands-on experience with AWS CDK for infrastructure as code, or an equivalent tool (see the sketch after this list).
  • Experience with distributed data processing frameworks (e.g., MapReduce, Hadoop, Spark).
  • Proficiency in at least one of Java, Go, or Python.
  • Knowledge of data modeling (e.g., star schemas, dimensional modeling, data mesh).
  • Attention to detail and a commitment to maintaining the highest standards of data accuracy and pipeline quality.
  • Ability to troubleshoot complex issues and implement robust, reliable solutions.
  • Strong communication skills to work with cross-functional teams.
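
As a rough illustration of the infrastructure-as-code requirement above, here is a minimal AWS CDK (v2) sketch in Python defining a small ingestion stack. The stack name, construct IDs, and resource choices are illustrative assumptions, not details from this posting.

```python
# Minimal AWS CDK (v2, Python) sketch of an ingestion stack.
# Construct IDs, the Lambda asset path, and resource settings are
# illustrative assumptions only.
from aws_cdk import App, Stack, Duration
from aws_cdk import aws_s3 as s3
from aws_cdk import aws_kinesis as kinesis
from aws_cdk import aws_lambda as lambda_
from constructs import Construct


class DataIngestStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Landing bucket for raw events before they are loaded downstream.
        raw_bucket = s3.Bucket(self, "RawEventsBucket", versioned=True)

        # Stream for near-real-time ingestion.
        ingest_stream = kinesis.Stream(
            self, "IngestStream", retention_period=Duration.hours(24)
        )

        # Lambda that transforms records; handler code lives in ./lambda.
        transform_fn = lambda_.Function(
            self,
            "TransformFn",
            runtime=lambda_.Runtime.PYTHON_3_12,
            handler="index.handler",
            code=lambda_.Code.from_asset("lambda"),
        )

        # Grant the function read access to the stream and write access to the bucket.
        ingest_stream.grant_read(transform_fn)
        raw_bucket.grant_write(transform_fn)


app = App()
DataIngestStack(app, "DataIngestStack")
app.synth()
```

An equivalent stack could be written in any CDK-supported language; the point is that the same pattern of declaring buckets, streams, and functions as code is what the role expects.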

Responsibilities

  • Design, develop, and maintain highly reliable data pipelines to ingest, transform, and load data into Snowflake with an uncompromising focus on accuracy and quality.
  • Leverage AWS services (e.g., RDS, DMS, S3, Data Firehose, Kinesis, Lambda, DynamoDB) to build scalable and fault-tolerant data workflows.
  • Define and maintain data infrastructure as code using AWS CDK.
  • Utilize open-source distributed data processing frameworks (e.g., Hadoop, Spark) to handle large-scale data transformations and batch processing, as sketched after this list.
  • Manage schema evolution and database migrations with tools like SchemaChange or Flyway.
  • Write and maintain code in Java, Go, or Python.
  • Monitor, troubleshoot, and optimize pipelines to ensure maximum uptime, performance, and data integrity, critical for banking operations.
  • Collaborate with data engineers, analysts, and other cross-functional teams.
  • Document pipeline designs, processes, and quality assurance measures to maintain transparency and auditability.
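
A hedged sketch of the kind of batch transformation the Spark responsibility above refers to, written with PySpark. The S3 paths, column names, and aggregation are hypothetical and stand in for whatever the actual pipeline would do.

```python
# Hedged PySpark sketch of a batch transformation step.
# Paths, column names, and partitioning choices are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-transactions-batch").getOrCreate()

# Read a day's worth of raw events from S3 (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/events/dt=2024-01-01/")

# Basic cleansing and aggregation: drop malformed rows, sum amounts per account.
daily_totals = (
    raw.where(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .groupBy("account_id", "event_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"))
)

# Write the curated output as Parquet, partitioned by date, ready for a
# downstream load into the warehouse.
daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/daily_totals/"
)

spark.stop()
```

The curated Parquet output would then typically be loaded into Snowflake (for example via an external stage and COPY INTO), in line with the pipeline responsibilities above.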

Preferred Qualifications

  • Experience in the banking or financial services industry, with an understanding of compliance and data accuracy requirements.
  • Experience with the Snowflake database.
  • Exposure to CI/CD pipelines for deploying data infrastructure in a controlled, auditable manner.
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).