About the company
Our client is a global leader in energy technology. Present in 90 countries, the company operates across the entire energy landscape: from conventional to renewable power, from grid technology to storage, to electrifying complex industrial processes. Its mission is to support companies and countries with what they need to reduce greenhouse gas emissions and make energy reliable, affordable, and more sustainable.
Job Summary
Responsibilities:
📍Design, develop, and maintain data pipelines and ETL processes, including data modeling and data cleansing;
📍Design and implement ETL/ELT solutions for transferring data between various sources and platforms;
📍Automate data processing workflows using tools such as Airflow or other workflow management tools;
📍Optimize database performance, including designing and implementing data structures and using indexes appropriately;
📍Implement data quality and data governance processes;
📍Technologies: MS SQL, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, dbt, S3, Airflow, Python.
Requirements
📍3+ years of experience as a Data Engineer;
📍Experience developing and administering ETL processes in a cloud environment (Azure, AWS, or GCP);
📍Experience developing Data Lake / Data Warehouse solutions;
📍Strong programming skills in Python and SQL;
📍Experience with ETL tools and data integration;
📍Strong problem-solving and analytical skills.