
LiveKit

Data Engineer

Posted 8 Days Ago
Remote
Hiring Remotely in U.S.
$195K-$245K Annually
Senior level

LiveKit is building the infrastructure layer for the voice-driven era of computing. Our platform gives developers everything they need to build, test, deploy, scale, and observe agents in production. Founded in 2021, LiveKit powers voice AI applications for OpenAI, xAI, Salesforce, Coursera, Spotify, and thousands of others, collectively facilitating billions of calls each year.

You'll thrive at LiveKit if you:
  • obsess over crafting code that is fast, reliable, and practical for the problem

  • are known as the go-to person for tackling tough technical problems

  • work hard and can build and ship fast

  • can clearly explain complex technical concepts to others

  • are a fast learner, frequently picking up new languages and tools

The best way to impress us is with thoughtful Issues and/or PRs on our GitHub repos 😊

About This Role:

As a Data Engineer at LiveKit, you'll own the analytics infrastructure that powers our business intelligence and data analysis capabilities. Working closely with the Head of Data and analytics peers, you'll design and implement scalable GCP-based data pipelines, from ingestion through transformation to delivery, leveraging the GCP ecosystem for cost-effective solutions while integrating additional services or homegrown tooling where appropriate. While analytics infrastructure is the core focus, you'll also engage with the broader application data infrastructure, contributing your data pipeline expertise to support product and engineering needs. This is a foundational IC role with significant ownership over the architecture and direction of our analytics stack as the team grows.
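The ingestion-through-transformation-to-delivery flow described above can be sketched as a minimal Python pipeline. Everything here is illustrative, not LiveKit's actual stack: in a real GCP deployment these stages would be backed by services such as Cloud Storage, Dataflow, and BigQuery, and the field names are invented.

```python
# Minimal sketch of an ingestion -> transformation -> delivery pipeline.
# Stage boundaries mirror the role description; names are hypothetical.

def ingest(raw_records):
    """Pull raw events in; drop records missing required fields."""
    return [r for r in raw_records if "user_id" in r and "duration_ms" in r]

def transform(records):
    """Normalize units and derive the fields analytics needs."""
    return [
        {"user_id": r["user_id"], "duration_s": r["duration_ms"] / 1000}
        for r in records
    ]

def deliver(rows, sink):
    """Append transformed rows to a downstream sink (e.g. a warehouse table)."""
    sink.extend(rows)
    return len(rows)

warehouse = []  # stand-in for a BigQuery table
raw = [
    {"user_id": "a", "duration_ms": 1500},
    {"user_id": "b"},  # malformed: filtered out at ingestion
]
loaded = deliver(transform(ingest(raw)), warehouse)
```

The point of the sketch is the separation of stages: each can be swapped for a managed service without touching the others, which is the tradeoff the role asks you to weigh.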

What You’ll Do:

Own the Analytics Infrastructure: You are the end-to-end owner of our GCP-based data infrastructure — including ingestion, movement, storage, security, and availability. You build and operate reliable, scalable pipelines that power analytics, and partner closely with the Analytics team on downstream transformation and BI.

Maximize the GCP Ecosystem: Build cost-effective solutions anchored in GCP-native services. Know when to extend with third-party tooling or homegrown solutions, and make pragmatic tradeoffs.

Contribute Across Data Infrastructure: While analytics is the primary focus, you'll bring broad data pipeline expertise to application data needs in collaboration with the product engineering team.

Managed Services First: Favor managed solutions over self-hosting. Evaluate build vs. buy with cost and operational burden in mind.

Engineering Standards: This role reports to the Head of Data within the Engineering org. Expect PR reviews, automated testing, proper change management, and production-grade standards.

AI-First Development: Work extensively with AI coding assistants and contribute to evolving our AI development workflows and infrastructure.

Startup Pace: Priorities shift quickly. Balance long-term architectural thinking with the tactical execution the moment requires.

Who You Are:
  • 8+ years of experience in data engineering with strong Python and SQL expertise

  • Deep expertise in GCP, with hands-on experience in BigQuery, Dataflow, Cloud Storage, and related analytics services

  • Proven ability to design and implement production-grade data pipelines and aggregation layers for BI and analysis

  • AI-first development mindset with hands-on experience building AI-driven workflows and effectively using AI coding assistants

  • Strong understanding of data modeling, transformation patterns, and working with dbt

  • Experience with data movement tools (Estuary, Airbyte, Fivetran, or similar)

  • Solid infrastructure and DevOps fundamentals: Terraform or similar IaC, CI/CD, Git workflows, and change management

  • Experience implementing observability and monitoring for data systems (DataDog, Grafana, or similar)

  • Strong communication skills and ability to work cross-functionally with engineering and business stakeholders

  • Self-directed and comfortable with ambiguity in a fast-paced startup environment

  • Located in the US or Canada
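To make the transformation and aggregation-layer expectations above concrete: a dbt model is ultimately a SQL SELECT that materializes a table for BI to query. The sketch below mimics that staging-to-mart pattern with sqlite3 standing in for BigQuery; the schema, table names, and metrics are invented for illustration.

```python
import sqlite3

# Illustrative staging -> mart aggregation, in the spirit of a dbt model.
# sqlite3 stands in for BigQuery; tables and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_calls (account_id TEXT, duration_s REAL);
    INSERT INTO stg_calls VALUES ('acme', 120.0), ('acme', 60.0), ('beta', 30.0);

    -- Aggregation layer a BI tool would query, analogous to a dbt mart model.
    CREATE TABLE mart_call_usage AS
    SELECT account_id,
           COUNT(*)        AS call_count,
           SUM(duration_s) AS total_duration_s
    FROM stg_calls
    GROUP BY account_id;
""")
rows = conn.execute(
    "SELECT account_id, call_count, total_duration_s "
    "FROM mart_call_usage ORDER BY account_id"
).fetchall()
```

In dbt the mart would be a versioned, tested model rather than an ad-hoc CREATE TABLE, but the shape of the work, raw staging data rolled up into a BI-facing table, is the same.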

Bonus:
  • Experience coordinating with dbt and analytics engineering teams

  • Background with AI workflow tools (n8n or similar)

  • Background with AI coding assistants

  • Prior experience as an early infrastructure hire building from the ground up

Our Commitment to You:
  • An opportunity to build something truly impactful to the world

  • Contribute to open source alongside world-class engineers

  • Competitive salary and equity package

  • Health, dental, and vision benefits

  • Flexible vacation policy

Top Skills

Airbyte
BigQuery
Cloud Storage
Datadog
Dataflow
dbt
Estuary
Fivetran
GCP
Grafana
Python
SQL
Terraform


