About the company
At Huron, we're redefining what a consulting organization can be. We go beyond advice to deliver results that last. We inherit our client's challenges as if they were our own. We help them transform for the future. We advocate. We make a difference. And we intelligently, passionately, relentlessly do great work…together. Are you the kind of person who stands ready to jump in, roll up your sleeves and transform ideas into action? Then come discover Huron. Whether you have years of experience or come right out of college, we invite you to explore our many opportunities. Find out how you can use your talents and develop your skills to make an impact immediately. Learn about how our culture and values provide you with the kind of environment that invites new ideas and innovation. Come see how we collaborate with each other in a culture of learning, coaching, diversity and inclusion. And hear about our unwavering commitment to make a difference in partnership with our clients, shareholders, communities and colleagues.
Job Summary
Responsibilities:
Data Pipeline Development:
- Build and maintain scalable ETL/ELT pipelines using Databricks.
- Leverage PySpark/Spark and SQL to transform and process large datasets.
- Integrate data from multiple sources, including Azure Blob Storage, ADLS and other relational/non-relational systems.

Collaboration & Analysis:
- Work closely with multiple teams to prepare data for dashboards and BI tools.
- Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.

Performance & Optimization:
- Optimize Databricks workloads for cost efficiency and performance.
- Monitor and troubleshoot data pipelines to ensure reliability and accuracy.

Governance & Security:
- Implement and manage data security, access controls and governance standards using Unity Catalog.
- Ensure compliance with organizational and regulatory data policies.

Deployment:
- Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks and configurations across environments.
- Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
Requirements:
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.).
- Proficiency in Azure cloud services.
- Solid understanding of Spark and PySpark for big data processing.
- Experience with relational databases.
- Knowledge of Databricks Asset Bundles and GitLab.
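For context on the Databricks Asset Bundles requirement: bundles are driven by a databricks.yml file that declares the jobs, notebooks and target workspaces to deploy. A minimal sketch is below; the bundle name, workspace host and notebook path are all hypothetical placeholders:

```yaml
# databricks.yml -- minimal Asset Bundle sketch; names and paths are illustrative.
bundle:
  name: sales_pipeline

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-0000000000000000.0.azuredatabricks.net

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py
```

With a file like this in place, `databricks bundle deploy -t dev` pushes the job and notebook to the target workspace, which is what the "deployment across environments" responsibility above refers to.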