
Particle41

Data Engineer

Reposted 25 Days Ago
Easy Apply
Remote
Hiring Remotely in USA
Mid level

Data Engineer

Particle41 is seeking a talented and versatile Data Engineer to join our innovative team. As a Data Engineer, you will play a key role in designing, building, and maintaining robust data pipelines and infrastructure to support our clients' data needs. You will work on end-to-end data solutions, collaborating with cross-functional teams to ensure high-quality, scalable, and efficient data delivery. This is an exciting opportunity to contribute to impactful projects, solve complex data challenges, and grow your skills in a supportive and dynamic environment.

In This Role, You Will:

Software Development

  • Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources.
  • Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing.
  • Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis.
  • Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes.
  • Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows.
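By way of illustration only (the function names, field names, and in-memory "warehouse" below are hypothetical, not Particle41's actual stack), a minimal extract-transform-load pass over user records might be sketched in plain Python as:

```python
def extract(records):
    # Extract: pull raw rows from a source (an in-memory list stands in for a real feed)
    return list(records)

def transform(rows):
    # Transform: validate, cleanse, and deduplicate the raw rows
    seen, clean = set(), []
    for row in rows:
        if row.get("user_id") is None:  # validation: a required key must be present
            continue
        row = {**row, "email": row["email"].strip().lower()}  # cleansing: normalize emails
        if row["user_id"] in seen:  # consistency: keep one row per key
            continue
        seen.add(row["user_id"])
        clean.append(row)
    return clean

def load(rows, target):
    # Load: append the cleaned rows to the target store (a dict stands in for a warehouse table)
    target.setdefault("users", []).extend(rows)
    return len(rows)

raw = [
    {"user_id": 1, "email": " Alice@Example.com "},
    {"user_id": 1, "email": "alice@example.com"},     # duplicate key, dropped
    {"user_id": None, "email": "ghost@example.com"},  # missing key, dropped
]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
```

In practice the extract and load steps would target real sources and sinks (for example object storage, PostgreSQL, or a lakehouse platform such as Databricks), and the transform step would typically run in pandas or PySpark at scale.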

Requirements Gathering and Analysis

  • Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions.
  • Participate in requirement analysis sessions to understand business needs and user requirements.
  • Provide technical insights and recommendations during the requirements-gathering process.

Agile Development

  • Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews.
  • Work closely with Agile teams to deliver software solutions on time and within scope.
  • Adapt to changing priorities and requirements in a fast-paced Agile environment.

Testing and Debugging

  • Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications.
  • Write unit tests to validate the functionality of developed features and individual components.
  • Write integration tests to ensure the different components of an application work together as intended and meet requirements.
  • Identify and resolve software defects, code smells, and performance bottlenecks.
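As a hedged sketch (the helper under test and its behavior are hypothetical examples, not part of this role's actual codebase), unit tests in the pytest style might look like:

```python
# test_normalize.py -- pytest-style unit tests for a small data-cleansing helper

def normalize_email(raw):
    """Unit under test: trim whitespace and lowercase an email address."""
    if raw is None:
        raise ValueError("email is required")
    return raw.strip().lower()

def test_normalize_email_strips_and_lowercases():
    assert normalize_email("  Alice@Example.COM ") == "alice@example.com"

def test_normalize_email_rejects_missing_value():
    # With pytest installed this would typically be `with pytest.raises(ValueError): ...`
    try:
        normalize_email(None)
        assert False, "expected ValueError for a missing email"
    except ValueError:
        pass
```

Running `pytest test_normalize.py` would discover and execute both tests; integration tests follow the same pattern but exercise several components together (for example a transform step plus its target store).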

Continuous Learning and Innovation

  • Stay updated with the latest technologies and trends in data engineering.
  • Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications.
  • Continuously seek opportunities to optimize and refactor existing codebase for better efficiency.
  • Stay up-to-date with cloud platforms such as AWS, Azure, or Google Cloud Platform.

Collaboration

  • Collaborate effectively with cross-functional teams, including testers and product managers.
  • Foster a collaborative and inclusive work environment where ideas are shared and valued.

Skills and Experience We Value:

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • Proven experience as a Data Engineer, with a minimum of 3 years of experience.
  • Proficiency in Python programming language.
  • Experience with database technologies such as SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases.
  • Strong understanding of programming libraries, frameworks, and technologies such as Flask and other API frameworks, databases and ORMs, data analysis tools including Databricks, pandas, Spark, and PySpark, and machine learning libraries such as OpenCV and scikit-learn.
  • Experience with common utilities and tools: logging, requests, subprocess, regex, and pytest.
  • Familiarity with the ELK stack, Redis, and distributed task queues.
  • Strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts.
  • Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across various cloud providers.
  • Familiarity with version control systems like Git and collaborative development workflows.
  • Competence in working on Linux OS and creating shell scripts.
  • Solid understanding of software engineering principles, design patterns, and best practices.
  • Excellent problem-solving and analytical skills, with a keen attention to detail.
  • Effective communication skills, both written and verbal, and the ability to collaborate in a team environment.
  • Adaptability and willingness to learn new technologies and tools as needed.

About Particle41
Our core values of Empowering, Leadership, Innovation, Teamwork, and Excellence drive everything we do to achieve the best outcomes for our clients: Empowering Leadership for Innovation in Teamwork with Excellence (ELITE).

  • E - Empowering: Enabling individuals to reach their full potential.
  • L - Leadership: Taking initiative and guiding each other toward success.
  • I - Innovation: Embracing creativity and new ideas to stay ahead.
  • T - Teamwork: Collaborating with empathy to achieve common goals.
  • E - Excellence: Striving for the highest quality in everything we do.

We seek team members who embody these values and are committed to contributing to our mission. Particle41 welcomes individuals from all backgrounds who are committed to our mission and values. We provide equal employment opportunities to all employees and applicants, ensuring that hiring and employment decisions are based on merit and qualifications without discrimination based on race, color, religion, caste, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, local, or international laws. This policy applies to all aspects of employment and hiring. 

We appreciate your interest and encourage eligible candidates to apply. If you need any assistance during the application or interview process, please feel free to reach out to us at [email protected]. This position is open to candidates in the USA only.

Top Skills

AWS
Azure
Databricks
ELK Stack
Flask
GCP
Git
Linux
Machine Learning
MongoDB
MySQL
OpenCV
Pandas
Postgres
PySpark
Python
Redis
scikit-learn
Spark
SQL


