Data Engineer

Come build a digital, sustainable infrastructure with us

We're looking for a data engineer who wants to build data-driven services that will help tens of thousands of companies deliver digital post and receipts to millions of users.

About Kivra 

We believe that digital postal services make life easier for both the sender and the recipient, while at the same time contributing to a more sustainable world. Over 1 billion envelopes are sent every year in Sweden alone. We think that there’s a better, faster, more secure and more environmentally friendly way to do this. That’s why we created Kivra.

What you'll do

You will join a small team of data engineers, scientists and analysts who are responsible for building and maintaining resilient and scalable data infrastructure. We use a combination of cloud and on-site infrastructure to store and process data. We are in the process of moving towards an event-driven architecture.

We use our data pipeline in the cloud, built on the Google Cloud Platform, to consume events and data from various internal systems. This data is used for, among other things, business intelligence, reporting and billing. Data is stored in BigQuery and visualised using Data Studio.
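To give a flavour of the kind of work involved, here is a minimal sketch of an event consumer that turns Pub/Sub messages into BigQuery rows. This is purely illustrative, not Kivra's actual code: the project, subscription and table names are hypothetical placeholders, and a production pipeline would typically use Dataflow rather than a hand-rolled subscriber.

```python
# Illustrative sketch only: consume JSON events from Pub/Sub and stream
# them into BigQuery. All resource names below are made-up placeholders.
import json
from datetime import datetime, timezone


def event_to_row(payload: bytes) -> dict:
    """Flatten a JSON event payload into a BigQuery-ready row."""
    event = json.loads(payload)
    return {
        "event_id": event["id"],
        "event_type": event["type"],
        "received_at": datetime.now(timezone.utc).isoformat(),
    }


def run(project="example-project", subscription="letter-events",
        table="analytics.events"):
    # Requires google-cloud-pubsub, google-cloud-bigquery and GCP credentials.
    from google.cloud import bigquery, pubsub_v1

    bq = bigquery.Client(project=project)
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(project, subscription)

    def callback(message):
        errors = bq.insert_rows_json(f"{project}.{table}",
                                     [event_to_row(message.data)])
        if not errors:
            message.ack()  # acknowledge only once the row is safely stored

    future = subscriber.subscribe(sub_path, callback=callback)
    future.result()  # block, processing events as they arrive
```

Acknowledging the message only after a successful insert gives at-least-once delivery into the warehouse, which is the usual trade-off in this kind of pipeline.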

Your role will be to help develop, build and maintain this data pipeline, and to help migrate use cases from the previous on-site data system.

Who you are

You have the relevant experience and qualifications. You want to join a young team, and work hands-on in a high-paced environment with fast decision making. You are analytical and like solving problems. 

Furthermore, we think that you:

  • Have experience working with data pipelines in cloud environments
  • Have experience in programming (Java, Scala and/or Python)
  • Are familiar with Kafka and the principles and concepts of event-driven architecture, and can help drive good practices within Kivra (event creation and domain models)
  • Have experience in data modelling and orchestration (Airflow/Oozie/Luigi)
  • Have an understanding of the implications of GDPR for data processing and storage


It's an added bonus if you:

  • Have experience working with data pipelines based on the Google Cloud Platform (Pub/Sub, Dataflow, BigQuery)
  • Have previous experience with Airflow, Terraform and/or Ansible
  • Have previous experience with Hadoop, Postgres databases and/or Erlang





Vattugatan 17
111 52 Stockholm


One of our guiding principles is teamovation. It means that we work together to create a sustainable and secure digital infrastructure. No matter which area you work in, everyone helps drive the product forward. That's teamovation to us.

