About the company
Kraken, the trusted and secure digital asset exchange, is on a mission to accelerate the adoption of cryptocurrency so that you and the rest of the world can achieve financial freedom and inclusion. Our 2,350+ Krakenites are a world-class team ranging from the crypto-curious to industry experts, united by our desire to discover and unlock the potential of crypto and blockchain technology. As a fully remote company, we already have Krakenites in 70+ countries (speaking 50+ languages). We're one of the most diverse organizations on the planet, and this diversity remains key to our values. We continue to lead the industry with new product advancements like Kraken NFT, on- and off-chain staking, and instant bitcoin transfers via the Lightning Network.
The opportunity
📍Build scalable and reliable data pipelines that collect, transform, load, and curate data from internal systems
📍Augment the data platform with data pipelines from select external systems
📍Ensure high data quality for the pipelines you build and make them auditable
📍Drive data systems to be as near real-time as possible
📍Support the design and deployment of a distributed data store that will be the central source of truth across the organization
📍Build data connections to the company's internal IT systems
📍Develop, customize, and configure self-service tools that help our data consumers extract and analyze data from our massive internal data store
📍Evaluate new technologies and build prototypes for continuous improvements in data engineering
Skills you should HODL
📍4+ years of work experience in a relevant field (Data Engineer, DWH Engineer, Software Engineer, etc.)
📍Experience with data warehouse technologies and relevant data modeling best practices (Presto, Athena, Glue, etc.)
📍Experience building data pipelines/ETL and familiarity with design principles (Apache Airflow is a big plus!)
📍Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark or similar
📍Proficiency in a major programming language (e.g. Scala, Python, Golang)
📍Experience gathering business requirements for data sourcing