Data Ops Engineer

Oct 19
Location requirement:


  • Delivering across a broad range of activities, from technical and solution architecture design to operations, prototyping, development, and testing.
  • Ability to architect and steer the DevSecOps Cloud journey for EDO, demonstrating key skills and the ability to influence without authority. This role interacts with senior strategy and architecture leaders, product engineering teams, and infrastructure and security architects.
  • Deep understanding of application, infrastructure, and security architecture, and of non-functional aspects such as performance, scalability, reliability, and availability.
  • Developing automation for deployment, cluster management, scaling activities and container capabilities, promoting serverless capabilities.
  • Work with various service teams in EDO to design and build an end-to-end DevOps pipeline using a mix of technologies, covering AWS and SaaS platforms and services, focused on data-pipeline capabilities.
  • Designing reusable architectures and services that can be leveraged by agile teams to improve development velocity.
  • Designing and engineering cloud automation best practices around chargeback models, developer authorization, security controls, and continuous integration/deployment (CI/CD) pipelines for code promotion.

Know how applications should be engineered: following fault-tolerant best practices, separation of duties, observability, and operator-friendliness.

Preferred skills and experience:

  • Familiarity with data ingestion, analytics, and AI technologies (preferably on AWS)
  • 5+ years of software development experience, applying best practices
  • 3+ years of developing and operating production workloads in AWS cloud infrastructure
  • Kubernetes, Docker, Terraform, CloudFormation, Helm
  • AWS: networking, IAM, and security knowledge in particular
  • CI/CD
  • Scheduling / Orchestration
  • Spark, Python/ Scala
  • Core engineering skills (versioning, CI/CD, quality, automated testing)
  • Should have: API / front end / blueprint for pipelines
  • Strong expertise in DevOps, Agile methodologies, serverless, containers, and CI/CD

We offer:

  • Opportunity to be part of a rapidly expanding global organization with an irreproachable reputation, working remotely or from the Gdańsk office.
  • Pleasant and inspiring working atmosphere.
  • Professional development and clear career path.
  • Training & development opportunities.
  • Competitive salary on a B2B contract


Tags: dataops aws kubernetes docker terraform helm poland europe

13 000 - 20 000 PLN
+ VAT per month

Apply for this position