Bridgeway is seeking a Senior Data Engineer to design, develop, and maintain our data warehouse infrastructure. This role involves working closely with analysts, engineers, and other stakeholders to shape our data architecture, ensuring secure and efficient data pipelines, and enabling advanced analytics across the organization. The ideal candidate will have a strong background in data engineering, data warehousing, and ELT processes, along with a passion for optimizing data systems.
This is a remote position, with preference given to East Coast candidates.
Key Responsibilities:
- Design, develop, and maintain a scalable data warehouse/lakehouse environment.
- Design and implement ELT pipelines to ingest, transform, and deliver high-quality data for analytics and reporting, incorporating current best practices, such as “pipelines as code”.
- Ensure data security and compliance, including role-based access controls, encryption, masking, and governance best practices for handling sensitive information.
- Optimize performance of data workflows and storage for cost efficiency and speed.
- Partner with engineers, analysts, and stakeholders to meet data needs; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.
- Define and implement robust testing frameworks, enforce data contracts, and establish observability practices including lineage tracking, SLAs/SLOs, and incident response runbooks to maintain data integrity and trustworthiness.
- Monitor, troubleshoot, and resolve data and automation issues.
- Collaborate within an Agile-Scrum framework and develop comprehensive technical design documentation to ensure efficient and successful delivery.
- Serve as a trusted expert on organizational data domains, processes, and best practices.
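As one illustration of the testing and data-contract responsibilities above, here is a minimal, stdlib-only Python sketch of enforcing a contract on ingested rows. The "orders" schema and field names are hypothetical; a production pipeline would typically express contracts with dbt tests, Great Expectations, or Databricks DLT expectations instead.

```python
# Minimal sketch of a data contract check. Schema and field names
# are invented for illustration only.
from dataclasses import dataclass

@dataclass(frozen=True)
class FieldRule:
    dtype: type
    nullable: bool = False

# Hypothetical contract for an "orders" feed.
ORDERS_CONTRACT = {
    "order_id": FieldRule(int),
    "customer_id": FieldRule(int),
    "amount_usd": FieldRule(float),
    "coupon_code": FieldRule(str, nullable=True),
}

def validate(row: dict, contract: dict) -> list[str]:
    """Return a list of contract violations for one row (empty if clean)."""
    errors = []
    for field, rule in contract.items():
        if field not in row:
            errors.append(f"missing field: {field}")
        elif row[field] is None:
            if not rule.nullable:
                errors.append(f"null in non-nullable field: {field}")
        elif not isinstance(row[field], rule.dtype):
            errors.append(f"wrong type for {field}: {type(row[field]).__name__}")
    return errors

good = {"order_id": 1, "customer_id": 7, "amount_usd": 19.99, "coupon_code": None}
bad = {"order_id": "1", "customer_id": 7, "amount_usd": 19.99}

print(validate(good, ORDERS_CONTRACT))  # []
print(validate(bad, ORDERS_CONTRACT))   # type and missing-field violations
```

In practice these checks run at pipeline boundaries, with violations routed to quarantine tables and surfaced through the observability and alerting stack the role is expected to establish.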
Requirements:
- 5+ years of experience in data engineering and ELT with a focus on large-scale data platforms
- 3+ years of experience with Databricks
- Advanced proficiency in analytical SQL, including ANSI SQL, T-SQL, and Spark SQL
- Strong Python skills for data engineering
- Expertise in data modeling
- Hands-on experience with data quality and observability practices (tests, contracts, lineage tracking, alerts)
- Practical knowledge of orchestration tools and CI/CD concepts for data workflows
- Excellent communication and a track record of technical leadership and mentoring
- Strong understanding of integrating data solutions with AI and machine learning models
- Strong problem-solving skills and attention to detail
- Experience with version control systems like Git preferred
- Strong understanding of data governance and best practices in data management, with hands-on experience using Unity Catalog
- Hands-on experience in designing and managing data pipelines using Delta Live Tables (DLT) on Databricks
- Experience with streaming and ingestion tools, such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran
- Knowledge of DAX, LookML, or dbt; Airflow, Dagster, or Prefect; Terraform; Azure DevOps; Power BI, Looker, or Tableau; and GitHub Copilot is a plus
- Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred
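The "pipelines as code" and orchestration requirements above can be sketched with a toy, stdlib-only executor: tasks and their dependencies are declared in code and run in dependency order. The task names are invented for illustration; a real deployment would use an orchestrator such as Airflow, Dagster, or Prefect.

```python
# Toy "pipeline as code": an ELT DAG declared as data, executed in
# topological order. Task names are hypothetical; a production
# workflow would use Airflow, Dagster, or Prefect.
from graphlib import TopologicalSorter

RAN: list[str] = []

def task(name: str):
    def run():
        RAN.append(name)  # stand-in for real extract/transform/load work
    return run

TASKS = {
    "extract_orders": task("extract_orders"),
    "extract_customers": task("extract_customers"),
    "transform_join": task("transform_join"),
    "load_warehouse": task("load_warehouse"),
}

# Each task maps to the set of upstream tasks it depends on.
DAG = {
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

for name in TopologicalSorter(DAG).static_order():
    TASKS[name]()

print(RAN)  # both extracts, then transform_join, then load_warehouse
```

Declaring the DAG as version-controlled code (rather than clicking pipelines together in a UI) is what enables code review, CI/CD, and reproducible deployments for data workflows.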