Role overview

Data Engineer

Focus: Data Engineer
Seniority: Senior
Stack: Python, SQL
Location: 5 accepted countries

Requirements and responsibilities


About the role

We are looking for a Senior Data Engineer to build and maintain the data infrastructure that powers cost attribution, financial modeling, and cloud optimization initiatives. You will design scalable ELT/ETL pipelines and data models using Python, SQL, dbt, and Airflow, working across AWS services including Glue, Aurora, and Athena alongside Snowflake. The role sits at the intersection of engineering and finance, requiring close collaboration with Product, Engineering, and Analytics teams.

What you will do

  • Design, develop, and implement tooling and datasets that enable telemetry-driven cost attribution and performance-informed financial modeling;
  • Build and maintain scalable data pipelines and automation frameworks to support cost transparency and optimization initiatives;
  • Develop systems that surface cost-saving opportunities and support committed-use discount modeling and management;
  • Ensure data accuracy and reliability through end-to-end validation, auditing, and observability practices;
  • Collaborate cross-functionally with Product, Engineering, Finance, and Analytics stakeholders to deliver high-impact data solutions;
  • Improve and maintain the long-term scalability, performance, and reliability of ELT/ETL pipelines;
  • Support and enhance CI/CD processes and infrastructure automation.
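As a rough illustration of the telemetry-driven cost attribution described above, the sketch below rolls hypothetical usage records up to per-service cost totals. All names, records, and unit rates here are invented for illustration; a real pipeline would join billing rates from the cloud provider rather than hard-code them.

```python
from collections import defaultdict

# Hypothetical telemetry records: (service, resource_type, usage_units).
RECORDS = [
    ("checkout", "compute_hours", 120.0),
    ("checkout", "storage_gb", 500.0),
    ("search", "compute_hours", 300.0),
]

# Hypothetical unit rates in USD; real rates would come from billing data.
RATES = {"compute_hours": 0.10, "storage_gb": 0.02}

def attribute_costs(records, rates):
    """Roll raw usage telemetry up into a per-service cost total."""
    totals = defaultdict(float)
    for service, resource, units in records:
        totals[service] += units * rates[resource]
    return dict(totals)

print(attribute_costs(RECORDS, RATES))
```

In practice this aggregation step would typically live inside a dbt model or an Airflow task, with validation checks comparing attributed totals against the billed invoice amount.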

Must haves

  • 4+ years of experience in a Data Engineering role;
  • Strong hands-on experience with SQL and Python;
  • Experience with data transformation and workflow orchestration tools such as dbt, Airflow, or Dagster;
  • Demonstrated experience designing and maintaining end-to-end ELT/ETL data pipelines;
  • Strong background in data modeling and schema design;
  • Experience working with AWS services such as Glue, Aurora, and Athena, as well as Snowflake;
  • Familiarity with CI/CD pipelines and Infrastructure as Code using Terraform;
  • Experience building and consuming REST APIs;
  • Strong analytical skills with exceptional attention to detail, including data auditing and validation;
  • Upper-intermediate English level.

Nice to haves

  • FinOps Practitioner or FinOps Engineer certification;
  • Experience with Scala;
  • Hands-on experience with cloud platforms such as AWS and GCP, and observability tools such as Datadog;
  • Experience working with large-scale datasets and distributed systems;
  • Strong understanding of cloud cost optimization strategies;
  • Experience in highly data-driven, cross-functional environments.

Tech stack

Python, SQL, dbt, Airflow, AWS (Glue, Aurora, Athena), Snowflake, Terraform.

Location eligibility

This role is open to candidates in 5 accepted countries; apply only if your profile country is among them.

Hiring flow

Applications are saved in WithMira for review and follow-up.

1. Apply with your profile and resume snapshot.
2. Recruiter reviews your fit for this position.
3. Messages and referral status stay attached to this role.