Software Engineer
| Company | Charles Schwab |
| --- | --- |
| Location | Westlake, TX, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Mid Level |
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
- 2+ years of experience as a cloud data platform engineer in a data analytics ecosystem, with demonstrated progression in responsibilities and technical skills.
- Hands-on experience with Snowflake and Google Cloud Platform (GCP) services such as Cloud Storage, Cloud Run, Cloud Functions, Pub/Sub, Composer, and Cloud SQL.
- Proficiency with Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager, and a solid understanding of cloud resource automation.
- 3-5 years of experience building cloud-based data platforms on cloud-native architecture (GCP/AWS), including ETL/ELT and data integration.
- 3-5 years of development experience with cloud services (AWS, GCP, Azure) and supporting tools such as GCS, Dataproc, Cloud Dataflow, Airflow (Composer), and Cloud Pub/Sub; a minimal Composer DAG sketch follows this list.
- 3-5 years of experience developing reliable data pipelines that leverage cloud data warehouses (Snowflake, BigQuery) and distributed data processing frameworks (Apache Spark, Apache Beam, Apache Flink); a minimal pipeline sketch also follows this list.
- In-depth knowledge of NoSQL database technologies (e.g. MongoDB, BigTable, DynamoDB).
- Expertise with build and deployment tools (Visual Studio, PyCharm, Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus).
- Experience in database design techniques and philosophies (e.g. RDBMS, Document, Star Schema, Kimball Model).
- Experience with continuous integration/deployment (CI/CD) tools (e.g. Bamboo, Docker and containers, GitHub, GitHub Actions).
- Advanced understanding of software development and research tools.
- Attention to detail and a results-oriented mindset, with a strong customer focus.
- Ability to work as part of a team and independently.
- Analytical and problem-solving skills, paired with clear technical communication.
- Ability to prioritize workload to meet tight deadlines.
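
As an illustration of the orchestration work described above, the sketch below shows a minimal Cloud Composer (Airflow) DAG that loads files from Cloud Storage into BigQuery on a daily schedule. This is only a sketch; the bucket, dataset, and table names are placeholders, not references to any actual Schwab systems.

```python
# Minimal, illustrative Composer (Airflow) DAG: load CSV files from a Cloud Storage
# bucket into a BigQuery table once a day. All resource names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="example_daily_gcs_to_bq",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_raw_files = GCSToBigQueryOperator(
        task_id="load_raw_files",
        bucket="example-bucket",                                   # hypothetical bucket
        source_objects=["raw/events/*.csv"],
        destination_project_dataset_table="example_dataset.events_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )
```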
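
Similarly, the following sketch shows the general shape of the pipeline work listed above: a small PySpark job that reads raw files from Cloud Storage, applies a basic cleansing transformation, and writes the result to BigQuery via the spark-bigquery connector. All bucket, dataset, and column names are assumptions for illustration only.

```python
# Minimal, illustrative PySpark job: GCS -> transform -> BigQuery.
# Bucket, dataset, table, and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-gcs-to-bigquery").getOrCreate()

# Read raw CSV files from a hypothetical Cloud Storage location.
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/events/*.csv")

# Basic cleansing: parse dates, cast numeric fields, drop incomplete rows.
cleaned = (
    raw.withColumn("event_date", F.to_date("event_date"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["event_date", "amount"])
)

# Write to a hypothetical BigQuery table using the spark-bigquery connector.
(cleaned.write.format("bigquery")
    .option("table", "example_dataset.events_cleaned")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save())
```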
Responsibilities
No responsibilities provided.
Preferred Qualifications
- Strong experience in secure data management practices, including technical data governance and data quality management for data pipelines.
- Deep understanding of data architectures and engineering patterns of data pipelines and reporting environments.
- Analytical and troubleshooting skills to identify and resolve data and platform issues effectively.
- Ability to work collaboratively within a team environment, supporting cross-functional initiatives and contributing to shared goals.
- Strong documentation skills and the ability to communicate technical concepts clearly and effectively to both technical and non-technical stakeholders.