Data is the lifeblood of Supermetrics and part of the core value we deliver to our customers. We value data-informed decision-making, data quality, and building data-intensive products.
As a Data Engineer at Supermetrics, you will be a core part of designing and implementing ETL (extract-transform-load) data flows both for our customers and for our internal teams, including reporting for company management, marketing, and sales.
We are especially looking for a person who wants to evolve into a senior data engineer with the ability to design enterprise-grade ETL workflows and systems for ETL processing.
Typical data flows involve some of the following steps:
Understanding the input and output formats of the data and other relevant constraints on the data processing
Helping customers and/or stakeholders understand how the processing is best done so they can achieve their non-technical objectives
Understanding and defining the best possible data model for each stage of data processing
Pulling data from REST APIs, flat files and SQL data stores
Doing transformation tasks on the data – data cleansing and validation, filtering, partitioning, aggregating, calculating metrics – and figuring out the best pipeline for the process
Storing the data into various targets such as flat files, data stores or outgoing REST APIs
Scheduling jobs that recover from errors, such as hitting rate limits or quotas, and retry when failures occur
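In simplified form, the steps above could be sketched with Python's standard library alone. Everything here is hypothetical for illustration: the field names, the `RateLimitError` type, and the flaky `fetch` source stand in for a real REST API, and a production pipeline would use a real HTTP client, scheduler, and data store.

```python
import csv
import io
import time

class RateLimitError(Exception):
    """Raised when the upstream source reports a rate limit (hypothetical)."""

def with_retries(fn, attempts=3, backoff=0.01):
    # Retry transient failures such as rate limits, with exponential backoff.
    for attempt in range(attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * 2 ** attempt)

def transform(rows):
    # Cleanse and validate: drop rows missing required fields, normalize types.
    cleaned = [
        {"campaign": r["campaign"].strip(), "clicks": int(r["clicks"])}
        for r in rows
        if r.get("campaign") and r.get("clicks") is not None
    ]
    # Aggregate: total clicks per campaign.
    totals = {}
    for r in cleaned:
        totals[r["campaign"]] = totals.get(r["campaign"], 0) + r["clicks"]
    return totals

def load_csv(totals):
    # Store the result as a flat file (an in-memory CSV here for illustration).
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["campaign", "clicks"])
    for campaign in sorted(totals):
        writer.writerow([campaign, totals[campaign]])
    return buf.getvalue()

# Hypothetical extract step: a source that rate-limits once before succeeding.
calls = {"n": 0}
def fetch():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RateLimitError("quota exceeded, retry later")
    return [
        {"campaign": " search ", "clicks": "10"},
        {"campaign": "display", "clicks": "5"},
        {"campaign": "search", "clicks": "7"},
        {"campaign": "", "clicks": "3"},  # dropped during cleansing
    ]

rows = with_retries(fetch)
print(load_csv(transform(rows)))
```

The same extract-transform-load shape recurs regardless of the source and sink; the engineering work is in choosing the right data model and tooling for each stage.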
What you need to succeed
You should have at least 3 years of full-time backend programming experience in a major programming language such as Python, Java, Scala, PHP or Ruby.
As part of the job, be prepared to learn at least one of Python, Scala, or Java for writing ETL scripts, as well as some PHP to integrate your work with our web applications.
We’re looking for people with:
Good practical knowledge of (ANSI) SQL and database design
Excellent understanding of data modeling techniques
Good practical knowledge of programmatically fetching and sending data over REST APIs with a programming language such as Python, Ruby, Java, or PHP
Experience in programming ETL (extract-transform-load) workflows with a programming language such as Python or Scala
Experience, confidence, and humility in selecting the tooling for the acquisition, processing, and storage of data for each ETL use case
At least a basic understanding of, and some experience with, data quality and query performance
What’s in it for you?
A chance to grow with us as we scale, and to shape your own career path
A fun, transparent, collaborative atmosphere with lots of team spirit
No strict hierarchy, no vicious boss
A competitive salary and benefits
A top-notch office in the heart of downtown Helsinki