Atlassian is looking for a Senior Data Engineer to join our Go-To-Market Data Engineering (GTM-DE) team, which is responsible for building our data lake, maintaining our big data pipelines and services, and facilitating the movement of billions of messages each day. We work directly with business teams and many platform and engineering teams to support growth and retention strategies at Atlassian. We do this by providing reliable, trustworthy metrics and other data elements, as well as services and data products that help teams better self-serve and improve their time to reliable insights. We are looking for an open-minded, structured thinker who is passionate about building services that scale. You will report to the Senior Data Engineering Manager.
What you'll do
As a senior data engineer on the GTM-DE team, you will have the opportunity to apply your strong technical experience to building highly reliable data products. You enjoy working in an agile environment. You are able to take vague requirements and transform them into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases.
On a typical day, you will help our partner teams ingest data faster into our data lake, find ways to make our data products more efficient, or come up with ideas to help build self-serve data engineering within the company. You will also build microservices and architect, design, and promote self-serve capabilities at scale to help Atlassian grow.
On your first day, we'll expect you to have:
- 5+ years of professional experience as a software engineer or data engineer
- A BS in Computer Science or equivalent experience
- Strong programming skills (some combination of Python, Java, and Scala)
- Experience writing SQL, structuring data, and applying data storage practices
- Experience with data modeling
- Knowledge of data warehousing concepts
- Experience building data pipelines and microservices
- Experience with Spark, Airflow, and other streaming technologies to process large volumes of streaming data
- A willingness to accept failure, learn, and try again
- An open mind to try solutions that may seem impossible at first
- Experience working on Amazon Web Services (in particular EMR, Kinesis, RDS, S3, SQS, and the like)
It's preferred, but not technically required, that you have:
- Experience building self-service tooling and platforms
- Experience building and designing Kappa architecture platforms
- A passion for building and running continuous integration pipelines
- Experience building pipelines using Databricks and familiarity with its APIs
- Contributions to open source projects (e.g., operators in Airflow)