Capco

Senior DevOps Engineer (AWS) - Payments (She/He/They)

Reposted 3 Hours Ago
Remote or Hybrid
Hiring Remotely in Poland
Senior level

CAPCO POLAND

*We are looking for a Poland-based candidate.

At Capco Poland, we’re not just another consultancy — we’re the spark behind digital transformation in the financial world. As a global leader in technology and management consulting, we thrive on helping clients tackle the toughest challenges across banking, payments, capital markets, wealth, and asset management.

The Project:

Join a greenfield IT transformation in the fintech sector, working with Java 21/Spring Boot and a strong focus on software quality and craftsmanship practices. We are currently looking for a skilled, product-oriented DevOps Software Engineer to join one of our product-focused DevOps teams.

It’s a great opportunity to help businesses of all shapes and sizes accelerate their growth journey - quickly, simply, and securely. We are the innovators at the heart of the payments technology industry, shaping how the world pays and gets paid. Our technology powers the growth of millions of businesses across five continents. And just as we help our customers accelerate their business, we are committed to helping our people accelerate their careers. Together, we shape evolution.

Our Goal:

Delivering a fantastic payment experience that customers love is what drives us forward. Our cloud-native acquiring platform processes billions of card transactions, including authorization processing and clearing & settlement, ensuring trust and reliability for our customers.

Role Overview:

We are looking for a DevOps Engineer with over 5 years of hands-on experience designing, implementing, and operating AWS cloud infrastructures, automating processes, and building secure, scalable CI/CD pipelines, with a strong background in collaborating with development teams to enable DevOps and DevSecOps best practices in production environments.

Responsibilities include:

  • Active participation and contribution in all phases of the product lifecycle, including discovery, delivery, and operations
  • Delivering committed objectives, adapting as needed to reach the goals you and your team have set

Key tech stack: AWS, Terraform (IaC), EKS, CI/CD, Python

Role Requirements:

  • Cloud Platform - AWS
    • 5+ years of hands-on experience with AWS.
    • Strong knowledge of core AWS services, including:
      • ECS, EKS, Lambda, EC2, S3, EBS, VPC, IAM, CloudWatch, CloudTrail
      • Route 53, API Gateway, EventBridge, SNS, SQS
      • RDS, Aurora PostgreSQL, DynamoDB
    • Experience with AWS Cognito:
      • User pools, identity pools, app clients, and authentication flows.
    • Solid understanding of AWS security best practices:
      • Least-privilege IAM, KMS, Secrets Manager, ACM certificates.
    • Experience with Route 53:
      • Public and private hosted zones, DNS management, and service integration.
    • Familiarity with multi-account architectures, AWS Landing Zones, and permission boundaries.
    • Proven ability to design and maintain secure, scalable, production-grade AWS infrastructures.
    • AWS Solutions Architect (Associate/Professional) certification or equivalent experience.
  • Infrastructure as Code - Terraform
    • Advanced experience with Terraform:
      • Reusable and versioned module design.
      • Remote state management using S3 and DynamoDB.
      • Multi-environment (dev, qa, sta, prod) IaC architectures.
    • Experience managing Terraform state, handling drift, and refactoring IaC.
    • Familiarity with:
      • Checkov, TFLint, terraform-docs for security and code quality.
    • Terraform Associate certification (preferred).
  • Configuration Management & Automation
    • Solid experience with Ansible:
      • Development of roles, playbooks, inventories, and reusable automation patterns.
      • Integration with CI/CD pipelines and hybrid environments.
  • Operating Systems & Scripting
    • Strong knowledge of Linux, especially RHEL-based systems.
    • Experience working with bastion hosts and secure SSH access.
    • Proficiency in Python:
      • Automation scripts, AWS integrations (boto3), internal tooling.
    • Experience using Node.js for scripting and CI/CD-related tasks.
  • CI/CD & DevSecOps
    • Experience designing and maintaining end-to-end CI/CD pipelines.
    • Hands-on experience with pipelines for backend and frontend applications.
    • Tools and platforms:
      • Jenkins, GitLab CI, Azure DevOps, GitHub Actions
    • Integration of:
      • SonarQube for code quality.
      • Security scanning and policy-as-code tools (Checkov, Sentinel, etc.).
      • Automated testing and validation stages.
    • Experience with container registries:
      • Amazon ECR, JFrog Artifactory, Nexus.
  • Containers & Orchestration
    • Strong experience with Docker:
      • Image optimization, multi-stage builds, private registries.
    • Good working knowledge of Kubernetes, preferably EKS.
    • Familiarity with GitOps practices and tools:
      • ArgoCD, FluxCD (nice to have).
    • CKA / CKAD certifications are a plus.
  • Observability & Monitoring
    • Hands-on experience with Grafana and Prometheus for infrastructure and Kubernetes monitoring.
    • Experience with AWS CloudWatch (metrics, logs, alarms, dashboards).
    • Experience with Loki for centralized log aggregation.
    • Working experience with Datadog for infrastructure monitoring, APM, and alerting.
    • Experience with Splunk and Logz.io for log management and analysis.
    • Knowledge of alerting best practices, SLIs/SLOs, and incident troubleshooting.
  • Databases
    • Working knowledge of SQL and NoSQL databases:
      • PostgreSQL, MySQL, MongoDB.
    • Experience with AWS-managed databases:
      • RDS, Aurora PostgreSQL, DynamoDB.
    • Experience supporting database-backed applications in cloud environments.
  • Additional Skills
    • Proficiency with Git and common branching strategies.
    • Experience with monitoring, logging, and alerting solutions.
    • Understanding of immutable infrastructure and continuous delivery principles.
    • Strong troubleshooting skills across infrastructure, networking, and application layers.
    • Strong documentation skills and experience with knowledge transfer.
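As an illustration of the Terraform remote-state requirement above (state in S3, locking via DynamoDB), a minimal backend configuration for one environment might look like the sketch below. The bucket, table, key, and region names are placeholders invented for this example, not taken from the posting.

```hcl
terraform {
  backend "s3" {
    bucket         = "example-tf-state"              # assumed state bucket
    key            = "payments/dev/terraform.tfstate" # one key per environment
    region         = "eu-central-1"                  # assumed region
    dynamodb_table = "example-tf-locks"              # classic DynamoDB state locking
    encrypt        = true                            # server-side encryption of state
  }
}
```

In a multi-environment layout (dev, qa, sta, prod), each environment would typically get its own `key` (or its own state bucket), keeping state files isolated while reusing the same versioned modules.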
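As a hedged sketch of the Python internal tooling the requirements above describe (least-privilege IAM combined with Python automation), the function below generates a read-only IAM policy document scoped to a single S3 prefix. The bucket and prefix names are invented for the example; a real tool would likely feed the result into Terraform or boto3.

```python
import json


def least_privilege_policy(bucket: str, prefix: str) -> dict:
    """Build a minimal IAM policy granting read-only access to one S3 prefix.

    Listing is restricted to the prefix via a Condition; object reads are
    restricted to objects under that prefix. Names are illustrative only.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListPrefixOnly",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}/*"]}},
            },
            {
                "Sid": "ReadObjectsOnly",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}/*",
            },
        ],
    }


if __name__ == "__main__":
    # Example: a read-only policy for hypothetical settlement logs.
    print(json.dumps(least_privilege_policy("payments-logs", "settlement"), indent=2))
```

Generating policies from code like this keeps them reviewable and testable, which fits the "least-privilege IAM" and "internal tooling" points in the requirements.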

We offer a flexible collaboration model based on a B2B contract, with the opportunity to work on diverse projects.

ONLINE RECRUITMENT PROCESS STEPS

  • Screening call with Recruiter
  • Technical Interview
  • Client Interview 
  • Feedback / Offer

We have been informed of several recruitment scams targeting the public. We strongly advise you to verify identities before engaging in recruitment-related communication. All official Capco communication will be conducted via a Capco recruiter.

Top Skills

Ansible
Aurora PostgreSQL
AWS
AWS RDS
Azure DevOps
CI/CD
Docker
DynamoDB
EKS
GitHub Actions
GitLab CI
Grafana
Jenkins
Kubernetes
Prometheus
Python
Terraform

