We're looking for a data engineer who wants to build data-driven services that help tens of thousands of companies deliver digital post and receipts to millions of users.
This position is 100% remote. Our offices are in Stockholm, but you can work from anywhere!
We believe that digital postal services make life easier for both the sender and the recipient, while at the same time contributing to a more sustainable world. Over 1 billion envelopes are sent every year in Sweden alone. We think there’s a better, faster, more secure and more environmentally friendly way to do this. That’s why we created Kivra.
What you'll do
You will join a small team of data engineers, scientists and analysts responsible for building and maintaining resilient, scalable data infrastructure.
We have a clean, modern and agile tech stack built on Google Cloud Platform (we use BigQuery heavily). We use Terraform for infrastructure-as-code and GitHub Actions for CI/CD. Data modelling is done in dbt, and visualisation in Tableau.
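To give a small taste of the stack, here is a minimal sketch of the kind of BigQuery access the team works with day-to-day. It uses the official google-cloud-bigquery Python client; the project, dataset, table and column names are invented for illustration.

```python
# Minimal sketch: query BigQuery with the official Python client.
# Credentials are picked up from the environment (e.g. a service account).
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table and columns, for illustration only.
query = """
    SELECT sender_id, COUNT(*) AS letters_sent
    FROM `example-project.analytics.letters`
    WHERE DATE(sent_at) = CURRENT_DATE()
    GROUP BY sender_id
    ORDER BY letters_sent DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(f"{row.sender_id}: {row.letters_sent}")
```

Day-to-day, modelling like this would typically live in dbt rather than in ad-hoc scripts; the snippet just shows the plumbing underneath.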
Your role will be to:
- Help build, maintain and improve our data pipelines
- Work with data stakeholders to make data easily accessible and available to product and functional teams across Kivra (this data is used for business intelligence, analytics, reporting and billing, among other things)
As a team, we have full ownership of and autonomy over what we build and how we build it, from idea and design to implementation and operations. We are responsible for our solutions end-to-end.
Who you are
You have the relevant experience and qualifications. You want to join a fun, young team and work hands-on in a fast-paced environment with quick decision-making. You are analytical and enjoy solving problems.
Furthermore, we think that you:
- Have experience working with data pipelines in cloud environments
- Have experience in programming (Java, Scala, Python and/or Golang)
- Are familiar with infrastructure-as-code and CI/CD, and have used both in your daily work
- Have experience in data modelling and orchestration
- Have an understanding of the implications of GDPR for data processing and storage
It's an added bonus if you:
- Have experience working with data pipelines based on the Google Cloud Platform (Pub/Sub, Dataflow, BigQuery, Cloud SQL)
- Have previous experience with Terraform and/or GitHub Actions
- Are familiar with Kafka and the principles of event-driven architecture, and can help drive good practices around event creation and domain models within Kivra (see the sketch after this list)
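To illustrate the event-driven side, here is a minimal sketch of producing a domain event to Kafka. It assumes the confluent-kafka Python client; the broker address, topic name and event fields are invented for illustration and are not Kivra's actual event model.

```python
# Minimal sketch: publish a domain event to Kafka using confluent-kafka.
import json

from confluent_kafka import Producer

# Hypothetical broker address.
producer = Producer({"bootstrap.servers": "localhost:9092"})

# A made-up domain event; real event schemas and names would differ.
event = {
    "event_type": "letter.delivered",
    "letter_id": "abc-123",
    "recipient_id": "user-456",
}

def delivery_report(err, msg):
    # Called once per message when delivery succeeds or fails.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

producer.produce(
    "letter-events",                          # hypothetical topic
    key=event["letter_id"],                   # keeps a letter's events ordered
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()  # block until outstanding messages are delivered
```

Keying by an entity id like this keeps all events for one entity on the same partition, which is one of the ordering guarantees that good event design tends to rely on.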