
Cybermedia Technologies (CTEC)

Data Engineer

Posted Yesterday
Remote
Hiring Remotely in United States
Senior level
The Data Engineer will design and maintain ETL pipelines in Azure Databricks, manage data migrations, and ensure data quality and governance.
The summary above was generated by AI

CTEC is a leading technology firm that provides modernization, digital transformation, and application development services to the U.S. Federal Government. Headquartered in McLean, VA, CTEC has over 300 team members working on mission-critical systems and projects for agencies such as the Department of Homeland Security, Internal Revenue Service, and the Office of Personnel Management. The work we do affects millions of U.S. citizens daily as they interact with the systems we build. Our best-in-class commercial solutions, tailored to our customers’ bespoke mission requirements, enable this work every day.

The Company has experienced rapid growth over the past 3 years and recently received a strategic investment from Main Street Capital Corporation (NYSE: MAIN). In addition to our recent growth in Federal Civilian agencies, we are seeking to expand our cloud development capabilities and our footprint in national-security-focused agencies within the Department of Defense and the U.S. Intelligence Community.


We are seeking to hire a Data Engineer to join our team!

Client:
CTEC develops and delivers innovative customer-centric technologies and solutions that support the Office of Personnel Management’s (OPM) Health and Insurance business unit and Office of the Chief Information Officer (OCIO).

Duties and Responsibilities:

  • Data Pipeline & ETL Development: Design, develop, and maintain scalable ETL pipelines and data workflows to ingest, transform, and integrate data from legacy systems and external sources into modern cloud-based data platforms.
  • Cloud Data Platform & Databricks Implementation: Build, optimize, and maintain data processing solutions using Azure Databricks and lakehouse architectures to support analytical, operational, and reporting use cases.
  • Data Migration Support: Support phased data migration from legacy databases and ETL tools to Azure Databricks environments, including transformation documentation and data mapping.
  • Lakehouse & Data Layer Design: Implement layered lakehouse data architectures (e.g., bronze, silver, gold layers) in Databricks to support data quality, performance, and downstream reporting needs.
  • Python & PySpark Development: Develop data processing notebooks, workflows, and distributed data transformations using Python and PySpark within Databricks environments.
  • Data Quality & Validation: Develop data validation, reconciliation, and testing processes to ensure data accuracy, completeness, and consistency across data domains.
  • Data Integration & Interoperability: Integrate Databricks data platforms with analytics and reporting tools to enable business intelligence and operational dashboards.
  • Data Governance & Security Implementation: Support data governance initiatives including metadata management, data catalog integration, encryption, access controls, and compliance with federal data protection requirements.
  • DevOps & CI/CD Support: Maintain source control and CI/CD pipelines for Databricks and data engineering workflows, supporting automated promotion across environments.
  • Technical Collaboration: Work closely with data architects, solution architects, business analysts, and reporting teams to implement approved data solutions.
  • Operational Support & Troubleshooting: Provide ongoing support for Databricks workflows, resolve pipeline failures, and troubleshoot complex data processing issues.
  • Mentorship & Knowledge Sharing: Provide guidance to junior data engineers and contribute to documentation and team enablement.
  • Works under minimal supervision with minor guidance from senior personnel.
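To make the layered lakehouse responsibility above concrete, here is a minimal, illustrative PySpark sketch of a bronze/silver/gold flow on Databricks, ending with a basic reconciliation check. All paths, schemas, table names, and columns (e.g., /mnt/raw/legacy_enrollments/, bronze.enrollments, member_id) are hypothetical placeholders, not details of the actual OPM environment.

```python
# Illustrative sketch only: a minimal bronze -> silver -> gold flow in Databricks.
# Every path, schema, table, and column name below is a hypothetical placeholder.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Bronze: land raw legacy extracts as-is, preserving source fidelity.
bronze_df = (
    spark.read.format("csv")
    .option("header", True)
    .load("/mnt/raw/legacy_enrollments/")  # hypothetical landing location
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze_df.write.format("delta").mode("append").saveAsTable("bronze.enrollments")

# Silver: type, clean, and deduplicate; drop rows that fail basic validation.
silver_df = (
    spark.table("bronze.enrollments")
    .withColumn("enrollment_date", F.to_date("enrollment_date", "yyyy-MM-dd"))
    .filter(F.col("member_id").isNotNull())
    .dropDuplicates(["member_id", "plan_code", "enrollment_date"])
)
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.enrollments")

# Gold: aggregate to the shape reporting tools such as Power BI consume.
gold_df = (
    spark.table("silver.enrollments")
    .groupBy("plan_code", F.year("enrollment_date").alias("enrollment_year"))
    .agg(F.countDistinct("member_id").alias("enrolled_members"))
)
gold_df.write.format("delta").mode("overwrite").saveAsTable("gold.enrollments_by_plan")

# Simple reconciliation check: the silver layer should never gain rows over bronze.
assert spark.table("silver.enrollments").count() <= spark.table("bronze.enrollments").count(), \
    "Silver layer contains more rows than bronze"
```

In practice, each layer would typically run as a separate task in a Databricks workflow, with Delta table constraints, catalog-based access controls, and CI/CD promotion across environments layered on top.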

Skills & Work Experience:

  • Professional Experience: Seven to nine or more years (7–9+) of experience in data engineering, ETL development, or large-scale data integration environments.
  • Strong experience designing and developing ETL pipelines and data transformations in Azure Databricks environments.
  • Strong proficiency in SQL and Python, with hands-on experience using PySpark for distributed data processing.
  • Experience working with cloud-based data platforms, data lakes, and lakehouse environments, preferably on Microsoft Azure.
  • Experience implementing layered lakehouse data architectures (bronze, silver, gold) for enterprise analytics.
  • Familiarity with Spark-based big data processing frameworks.
  • Experience supporting data migration from legacy databases and ETL tools to Databricks-based platforms.
  • Experience integrating Databricks platforms with business intelligence and reporting tools such as Power BI.
  • Familiarity with data governance, metadata management, and data security best practices.
  • Experience with source control and CI/CD pipelines for data engineering and Databricks workflows.
  • Working knowledge of SDLC and Agile delivery methodologies.
  • Excellent organizational, communication, and collaboration skills.

Preferred:

  • Hands-on experience with Azure Databricks workflows, Delta tables, and Databricks job orchestration.
  • Experience with Azure Data Lake, Azure Data Factory, or similar Azure data services.
  • Experience supporting federal IT modernization or large-scale enterprise data transformation initiatives.
  • Familiarity with healthcare, insurance, or benefits administration data environments.
  • Experience implementing data governance or data catalog platforms.

Education:

Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related technical discipline. Equivalent education or professional experience will be considered in lieu of a degree.

Clearance:
Must be a U.S. citizen and be able to obtain a Public Trust clearance.

If you are looking for a fun and challenging environment with talented, motivated people to work with, CTEC is the right place for you. In addition to employee salary, we offer an array of employee benefits including:

  • Paid vacation and sick leave
  • Health insurance coverage
  • Career training
  • Performance bonus programs
  • 401(k) contribution and employer match
  • 11 federal holidays

Top Skills

Azure Data Factory
Azure Data Lake
Azure Databricks
CI/CD
Data Governance
ETL
Power BI
PySpark
Python
SQL
