At Trustly, we're passionate about simplifying the way people transfer, pay, and get refunded online. Our payment solutions serve merchants across a range of industries, including e-commerce, financial services, gaming and travel, linking the world's top brands with millions of consumers who expect security and convenience.
Our people come from all corners of the globe, and we're proud of that. We strongly believe that diversity helps us create more inclusive solutions and keeps us constantly innovating. Our fast-growing team is headquartered in Stockholm, Sweden, with additional offices across Europe and the Americas. Together, we are modernizing payments, and the work you'll do here will make a lasting impact.
About the Data & Insights team
The Data & Insights team's mission is to transform Trustly into a truly data-driven organization. We are responsible for every aspect of the data journey, from collection and modeling to analysis and visualization. We provide the whole organization with analytics artifacts that improve our decisions, our operations and the customer experience.
About the role
Trustly is developing fast, and so are the demands on data made available for analytics, data science, reporting, BI and operational use cases. The Data Engineering team's part in this is to provide a platform and tools that enable data producers and consumers to share and analyze data in a frictionless way. Trustly's data platform is mostly built on GCP, using components such as Airflow (Cloud Composer), Beam (Dataflow), BigQuery and Google Cloud Storage, but it also relies on Kafka for event ingestion. We also maintain the company's dbt Cloud installation to enable Analytics Engineers and other downstream SQL power users to build their own data pipelines. Parts of Trustly's payment product run on AWS, so knowledge of that ecosystem is important to us as well.
What you'll do
- Develop and operate Trustly's analytics data lake on Google Cloud Platform
- Develop and maintain batch and streaming data pipelines from internal as well as external data producers
- Work with transformations of data from one or multiple source systems to feed our analytics and ML applications
- Have the opportunity to take part in building a new operational data platform to fulfill Trustly's operational data needs
- Help us identify new use cases for our data and work with stakeholders across the organization on how to realize them
Who you are
- MSc in Software Engineering or equivalent, with at least 3-5 years of experience in a Data Engineering role
- Good coding skills in Python and proficiency in SQL
- Experience building streaming data pipelines with Kafka (or similar)
- Experience building cloud data solutions on GCP and/or AWS
- Experience with Terraform
- Experience with dbt
We are looking for someone who is not afraid to voice and act on new ideas, and who values good communication with internal and external stakeholders. If you are passionate about working with different areas across the organization, this would be an interesting role for you.
Apply now, we would love to talk to you!