About the company
Kraken, the trusted and secure digital asset exchange, is on a mission to accelerate the adoption of cryptocurrency so that you and the rest of the world can achieve financial freedom and inclusion. Our 2,350+ Krakenites are a world-class team ranging from the crypto-curious to industry experts, united by our desire to discover and unlock the potential of crypto and blockchain technology. As a fully remote company, we already have Krakenites in 70+ countries (speaking 50+ languages). We're one of the most diverse organizations on the planet and this remains key to our values. We continue to lead the industry with new product advancements like Kraken NFT, on- and off-chain staking and instant bitcoin transfers via the Lightning Network.
The opportunity
📍 Work with and master blockchain data.
📍 Investigate notable cryptocurrency transactions and addresses to surface insights and detect issues.
📍 Build scalable, reliable data pipelines that collect, transform, load, and curate data from internal systems.
📍 Ensure high data quality for the pipelines you build and make them auditable.
📍 Support the design and deployment of a distributed data store that will serve as the central source of truth across the organization.
📍 Develop, customize, and configure self-service tools that help our data consumers extract and analyze data from our massive internal data store.
📍 Evaluate new technologies and build prototypes for continuous improvement in data engineering.
Skills you should HODL
📍 Deep hands-on experience with at least one major blockchain, ideally more (e.g., UTXO- or EVM-based chains).
📍 2+ years of work experience in a relevant field (Analytics Engineer, Data Engineer, DWH Engineer, Software Engineer, etc.).
📍 Excellent SQL and data-manipulation skills using common frameworks such as Spark/PySpark, Pandas, or Polars.
📍 Experience with data warehouse technologies and relevant data modeling best practices.
📍 Experience building data pipelines (ETL or ELT) and familiarity with their design principles (Apache Airflow is a big plus).
📍 Experience with at least one major programming language (e.g., Python, Scala, Java).
📍 Experience gathering business requirements for data sourcing while working remotely and asynchronously.