Business Intelligence Developer

Company: Texas Capital Bank
Location: Richardson, TX, USA
Salary: Not Provided
Type: Full-Time
Degrees: Bachelor’s, Master’s
Experience Level: Senior, Expert or higher

Requirements

  • Bachelor’s degree in Computer Science, Data Science, Data Engineering, Information Systems, Mathematics, or a related field (Master’s preferred)
  • Minimum of 7 years of experience in analytics engineering, data engineering, or a related technical field
  • Strong understanding of data warehouse concepts and modern enterprise data architectures
  • Advanced SQL skills with the ability to write, optimize, and troubleshoot complex queries in Snowflake
  • Hands-on experience with Coalesce, dbt, or similar tools for managing robust SQL-based transformations
  • Familiarity with building, scheduling, and monitoring ETL/ELT workflows using tools like Airflow (see the workflow sketch after this list)
  • Strong understanding of dimensional modeling concepts (e.g., star schema, snowflake schema)
  • Strong proficiency in Power BI for building and validating dashboards
  • Proficiency in Python for scripting and automation is highly preferred
  • Experience implementing data validation checks and quality assurance processes
  • Strong problem-solving skills and ability to debug complex workflows and/or pipelines
  • Excellent communication skills, including the ability to translate technical information for a non-technical audience
  • Ability to manage competing priorities on complex projects, initiatives, and deliverables
  • Experience working in Agile or Scrum environments
  • Familiarity with version control systems like Git and experience using collaboration tools such as Jira, Confluence, or Azure DevOps
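
To illustrate the kind of ETL/ELT orchestration described above, here is a minimal, hypothetical Airflow sketch. The DAG id, schedule, connection assumptions, and task callables are illustrative placeholders only, not an actual Texas Capital Bank pipeline.

```python
# Hypothetical sketch of a nightly ELT workflow; all names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_raw_to_staging(**context):
    """Placeholder: copy newly landed files into a Snowflake staging schema."""
    ...


def run_transformations(**context):
    """Placeholder: trigger the Coalesce/dbt job that builds the presentation-layer marts."""
    ...


with DAG(
    dag_id="nightly_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # daily at 02:00; Airflow < 2.4 uses schedule_interval instead
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    stage = PythonOperator(task_id="load_raw_to_staging", python_callable=load_raw_to_staging)
    transform = PythonOperator(task_id="run_transformations", python_callable=run_transformations)

    stage >> transform  # transformations run only after staging succeeds
```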

Responsibilities

  • Partner closely with data engineers to design and maintain dimensional data models (e.g., star schema, snowflake schema) in the presentation layer
  • Develop and implement robust data transformations using SQL and tools like Coalesce to deliver modular and scalable data marts
  • Develop efficient, reliable ETL/ELT workflows in Snowflake to automate data ingestion, transformation, and delivery
  • Advocate for and help implement modern practices like data observability, CI/CD for data pipelines, and testing frameworks (see the data quality test sketch after this list)
  • Contribute to the development and enhancement of dashboards and reports for critical use cases
  • Partner with data engineers to diagnose and resolve issues related to data pipelines, ETL/ELT processes, and data mart performance
  • Maintain thorough technical documentation for data pipelines, transformation logic, and data mart designs
  • Use version control systems (e.g., Git) to track changes and ensure consistency across deployments
  • Contribute to advanced analytics initiatives by preparing domain-oriented datasets and/or feature stores for AI/ML
  • Embed outputs from analytics models into BI dashboards or other operational reporting workflows
  • Actively participate in team knowledge-sharing sessions, code reviews, and retrospectives
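
As a purely illustrative example of the testing and data quality practices referenced above, the sketch below shows pytest-style checks against a hypothetical presentation-layer fact table; the connection settings, warehouse, and table/column names are assumptions.

```python
# Illustrative pytest-style data quality checks for a presentation-layer mart.
# Connection settings, warehouse, and table/column names (ANALYTICS.FACT_TRANSACTIONS,
# ANALYTICS.DIM_CUSTOMER) are hypothetical.
import os

import snowflake.connector


def fetch_scalar(query: str) -> int:
    """Run a query that returns a single value and return that value."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute(query)
        return cur.fetchone()[0]
    finally:
        conn.close()


def test_fact_transactions_has_no_duplicate_keys():
    # Surrogate keys in the fact table should be unique.
    duplicates = fetch_scalar(
        "SELECT COUNT(*) - COUNT(DISTINCT transaction_key) FROM ANALYTICS.FACT_TRANSACTIONS"
    )
    assert duplicates == 0


def test_fact_transactions_has_no_orphaned_customers():
    # Every fact row should join to a row in the customer dimension.
    orphans = fetch_scalar(
        """
        SELECT COUNT(*)
        FROM ANALYTICS.FACT_TRANSACTIONS f
        LEFT JOIN ANALYTICS.DIM_CUSTOMER d ON f.customer_key = d.customer_key
        WHERE d.customer_key IS NULL
        """
    )
    assert orphans == 0
```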

Preferred Qualifications

  • Familiarity with Snowflake-specific features, such as time travel, zero-copy cloning, data sharing, and materialized views (see the sketch after this list)
  • Knowledge of Data Vault 2.0 is a plus
  • Experience integrating predictive model outputs into analytics workflows and/or BI solutions
  • Strong knowledge of data governance principles and tooling; experience with Collibra is a plus
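
For reference, a minimal sketch of two of the Snowflake features named above, driven from Python: reading a table via time travel and creating a zero-copy clone. The connection details, database, table names, and the one-hour offset are hypothetical.

```python
# Illustrative use of Snowflake time travel and zero-copy cloning from Python.
# Connection details, database, table names, and the offset are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()

    # Time travel: read the table as it looked one hour ago (offset is in seconds).
    cur.execute("SELECT COUNT(*) FROM DIM_CUSTOMER AT(OFFSET => -3600)")
    print("row count one hour ago:", cur.fetchone()[0])

    # Zero-copy clone: create an instant copy for testing a transformation change.
    cur.execute("CREATE OR REPLACE TABLE DIM_CUSTOMER_DEV CLONE DIM_CUSTOMER")
finally:
    conn.close()
```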