About the company
Our client is a global leader in energy technology. Present in 90 countries, the company operates across the whole energy landscape, from conventional and renewable power to grid technology, storage, and the electrification of complex industrial processes. Its mission is to support companies and countries in reducing greenhouse gas emissions and making energy reliable, affordable, and more sustainable.
Job Summary
Responsibilities:
• Designing, building, and maintaining scalable data pipelines using technologies such as Dataform, Google BigQuery, Docker, Pub/Sub, and Airflow;
• Working in a cross-functional team on requirements, testing, development, and deployment, and maintaining a dialogue with stakeholders around the solutions;
• Collaborating with other developers on the team by coaching, performing code reviews, and taking part in design decisions;
• Technology stack: Google Cloud Platform (or experience with another cloud provider), Dataform, Google BigQuery, Docker, Pub/Sub, Airflow, Terraform, Python, SQL (a brief illustrative sketch follows this list).
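As a rough illustration of how some of these pieces fit together, here is a minimal sketch of an Airflow DAG that runs a BigQuery query job. It is illustrative only: the DAG id, schedule, and query are placeholders rather than anything from this posting, and it assumes Airflow 2.4+ with the apache-airflow-providers-google package installed (in practice, Dataform would typically own the SQL models that such a task triggers).

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="example_daily_load",      # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Submit a SQL transformation as a BigQuery query job.
        transform = BigQueryInsertJobOperator(
            task_id="transform_events",
            configuration={
                "query": {
                    "query": "SELECT CURRENT_DATE() AS load_date",  # placeholder query
                    "useLegacySql": False,
                }
            },
        )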
Requirements
• A minimum of 5 years of software development experience, including experience with a cloud or containerized stack on virtual infrastructure, preferably GCP;
• A deep understanding of data modeling concepts, ETL processes, and how to build solid data infrastructure with a modern setup;
• Experience with modern DevOps practices, including CI/CD pipelines, version control with GitLab, and infrastructure as code with Terraform;
• Experience with modern data architectures such as event-driven (illustrated after this list), Data Lake, Data Warehouse, and Lakehouse;
• Experience with modern data modeling techniques such as Dimensional Modeling and Data Vault;
• Advanced knowledge of Python and SQL.
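To make the event-driven item concrete, below is a minimal sketch of a Pub/Sub pull subscriber in Python, following the standard pattern of the google-cloud-pubsub client library. The project and subscription IDs are placeholders, not anything from this posting.

    from concurrent import futures

    from google.cloud import pubsub_v1

    project_id = "example-project"        # placeholder, not from the posting
    subscription_id = "example-sub"       # placeholder

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(project_id, subscription_id)

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        # A real pipeline would validate the event and land it,
        # e.g. in BigQuery, before acknowledging.
        print(f"Received {message.data!r}")
        message.ack()

    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

    with subscriber:
        try:
            # Block for a short demo window, then shut down cleanly.
            streaming_pull_future.result(timeout=30)
        except futures.TimeoutError:
            streaming_pull_future.cancel()
            streaming_pull_future.result()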