
Givzey

Data Engineer

Reposted 25 days ago
Remote
Hiring Remotely in United States
Junior

We’re looking for a Data Engineer to architect and scale the data backbone that powers our AI‑driven donor engagement platform. You’ll design and own modern, cloud‑native data pipelines and infrastructure that deliver clean, trusted, and timely data to our ML and product teams - fueling innovation that revolutionizes the nonprofit industry.

About Givzey:

Givzey is a Boston-based, rapidly growing digital fundraising solutions company, built by fundraisers for nonprofit organizations. 

Join a fast-growing, mission-driven team working across two innovative platforms: Givzey, the first donor commitment management platform revolutionizing nonprofit fundraising, and Version2.ai, a cutting-edge AI platform helping individuals and organizations create their most authentic, effective digital presence. As an engineer at the intersection of philanthropy and artificial intelligence, you'll build scalable, high-impact solutions that empower nonprofit fundraisers and redefine how people tell their stories online. We’re a collaborative, agile team that values curiosity, autonomy, and purpose. Whether you're refining AI-driven experiences or architecting tools for the future of giving, your work will help shape meaningful technology that makes a difference.

Responsibilities
  • Design & build data pipelines (batch and real‑time) that ingest, transform, and deliver high‑quality data from diverse internal and third‑party sources
  • Develop and maintain scalable data infrastructure (data lakes, warehouses, and lakehouses) in AWS, ensuring performance, reliability, and cost‑efficiency
  • Model data for analytics & ML: create well‑governed schemas, dimensional models, and feature stores that power dashboards, experimentation, and ML applications
  • Implement data quality & observability frameworks: automated testing, lineage tracking, data validation, and alerting
  • Collaborate cross‑functionally with ML engineers, backend engineers, and product teams to integrate data solutions into production systems
  • Automate infrastructure using IaC and CI/CD best practices for repeatable, auditable deployments
  • Stay current with emerging data technologies and advocate for continuous improvement across tooling, security, and best practices
Requirements
  • US Citizenship
  • Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field
  • 2+ years of hands-on experience building and maintaining modern data pipelines using Python-based ETL/ELT frameworks
  • Strong Python skills, including deep familiarity with pandas and comfort writing production-grade code for data transformation
  • Fluent in SQL, with a practical understanding of data modeling, query optimization, and warehouse performance trade-offs
  • Experience orchestrating data workflows using modern orchestration frameworks (e.g., Dagster, Airflow, or Prefect); a brief illustrative sketch follows this list
  • Cloud proficiency (AWS preferred): S3, Glue, Redshift or Snowflake, Lambda, Step Functions, or similar services on other clouds
  • Proven track record of building performant ETL/ELT pipelines from scratch and optimizing them for cost and scalability
  • Experience with distributed computing and containerized environments (Docker, ECS/EKS)
  • Solid data modeling and database design skills across SQL and NoSQL systems
  • Strong communication & collaboration abilities within cross‑functional, agile teams
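
To illustrate the kind of orchestration and pandas-based transformation work these requirements describe, here is a minimal sketch of a Dagster asset graph; the asset names, sample data, and local materialization are hypothetical examples, not a description of Givzey's actual pipelines.

    import pandas as pd
    from dagster import Definitions, asset, materialize

    @asset
    def raw_donations() -> pd.DataFrame:
        # Hypothetical ingestion step; a real pipeline would read from S3 or a third-party API.
        return pd.DataFrame({"donor_id": [1, 2, 2], "amount": [50.0, 120.0, 30.0]})

    @asset
    def donations_by_donor(raw_donations: pd.DataFrame) -> pd.DataFrame:
        # Transformation step: total gift amount per donor, ready to load into a warehouse.
        return raw_donations.groupby("donor_id", as_index=False)["amount"].sum()

    # Register the assets so Dagster can schedule and monitor them.
    defs = Definitions(assets=[raw_donations, donations_by_donor])

    if __name__ == "__main__":
        # Materialize both assets locally as a quick smoke test.
        materialize([raw_donations, donations_by_donor])

Airflow or Prefect would express the same dependency graph with their own operators and flows; the point is the declarative ingest-then-transform structure.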

Nice‑to‑Haves

  • Dagster experience for orchestrating complex, modular data pipelines
  • Pulumi experience for cloud infrastructure‑as‑code and automated deployments (a minimal sketch follows this list)
  • Hands‑on with dbt for analytics engineering and in-warehouse transformations
  • Familiarity with modern data ingestion tools like dlt, Sling, Fivetran, Airbyte, or Stitch
  • Apache Spark experience, especially useful for working with large-scale batch data or bridging into heavier data science workflows
  • Exposure to real-time/event-driven architectures, including Kafka, Kinesis, or similar stream-processing tools
  • AWS data & analytics certifications (e.g., AWS Certified Data Analytics - Specialty)
  • Exposure to serverless data stacks and cost‑optimization strategies
  • Knowledge of data privacy and security best practices (GDPR, SOC 2, HIPAA, etc.)
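
As a small example of the Pulumi infrastructure-as-code mentioned in the nice-to-haves, the sketch below provisions one S3 bucket with Pulumi's Python SDK; the resource name and tags are placeholders rather than actual Givzey infrastructure.

    import pulumi
    import pulumi_aws as aws

    # Hypothetical bucket for landing raw pipeline data; the name and tags are illustrative only.
    raw_bucket = aws.s3.Bucket(
        "raw-data",
        tags={"team": "data-engineering", "env": "dev"},
    )

    # Export the generated bucket name so other stacks or jobs can reference it.
    pulumi.export("raw_bucket_name", raw_bucket.id)

Running `pulumi up` against a program like this creates the bucket and records it in the stack's state, which is what keeps deployments repeatable and auditable.
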
What You’ll Do Day‑to‑Day
  • Be part of a world‑class team focused on inventing solutions that can transform philanthropy
  • Build & refine data pipelines that feed our Sense (AI) and Go (engagement) layers, ensuring tight feedback loops for continuous learning
  • Own the full stack of data work - from ingestion to transformation to serving - contributing daily to our codebase and infrastructure
  • Partner closely with customers, founders, and teammates to understand data pain points, prototype solutions, iterate rapidly, and deploy to production on regular cycles
  • Help craft a beautiful, intuitive product that delights nonprofits and elevates donor impact

Top Skills

Airbyte, Airflow, Spark, AWS, Dagster, dbt, dlt, Docker, ECS, EKS, Fivetran, Kafka, Kinesis, Prefect, Python, Sling, SQL, Stitch
