About the company
Bitquery is a leading blockchain data provider with a set of software products that parse, index, and store blockchain data in a unified way. Today we process over 1 petabyte of data and run our infrastructure end to end in-house. We also serve governments worldwide. Our aim is to become the most prominent data company in crypto.
Job Summary
Role & Responsibilities:
📍 Design and implement a data quality layer in large-scale data infrastructure
📍 Execute a data quality testing framework to validate data at various stages of the processing lifecycle (see the sketch after this list)
📍 Carry out hands-on test preparation and execution in an Agile and DevOps environment
📍 Become familiar with Bitquery's blockchain products, data sets, and processing pipelines
📍 Coordinate with subject matter experts to develop, maintain, and validate test scenarios
📍 Meet with internal stakeholders to review current testing approaches and suggest ways to improve, extend, and automate them
📍 Prepare, review, and update test cases and relevant test data consistent with system requirements, covering functional, integration, and regression testing
📍 Analyze, debug, and document quality issues
📍 Record and report test status at the respective stages
📍 Be proactive and follow a shift-left testing approach, identifying issues early and following up on them efficiently
📍 Support quality assurance initiatives and help institutionalize best practices
📍 Make the most of an open work culture that recognises good work; be a problem solver and a team player, contributing to shared achievements
📍 Be open to learning from teammates and from day-to-day experience
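To give a flavour of the kind of check a data quality layer might run, here is a minimal sketch in Python. The record layout, field names, and the block-batch example are illustrative assumptions for this posting, not Bitquery's actual schema or tooling; it simply validates a batch of block records for missing fields, duplicates, and gaps in block numbers.

```python
from typing import Iterable

# Illustrative block records; in practice these would come from the data warehouse (assumption).
SAMPLE_BLOCKS = [
    {"number": 100, "hash": "0xabc", "tx_count": 12},
    {"number": 101, "hash": "0xdef", "tx_count": 7},
    {"number": 103, "hash": "0x123", "tx_count": 0},  # gap: block 102 is missing
]


def validate_blocks(blocks: Iterable[dict]) -> list[str]:
    """Return a list of human-readable data quality issues found in one batch."""
    issues: list[str] = []
    seen_numbers: set[int] = set()
    for block in blocks:
        # Completeness: every record must carry the required fields.
        for field in ("number", "hash", "tx_count"):
            if block.get(field) is None:
                issues.append(f"block {block.get('number')}: missing field '{field}'")
        # Uniqueness: no block number should appear twice in the batch.
        if block["number"] in seen_numbers:
            issues.append(f"block {block['number']}: duplicate record")
        seen_numbers.add(block["number"])
    # Continuity: block numbers should form an unbroken range within the batch.
    if seen_numbers:
        expected = set(range(min(seen_numbers), max(seen_numbers) + 1))
        for missing in sorted(expected - seen_numbers):
            issues.append(f"block {missing}: missing from batch")
    return issues


if __name__ == "__main__":
    for issue in validate_blocks(SAMPLE_BLOCKS):
        print(issue)
```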
Requirements
📍 5+ years of hands-on experience with databases
📍 Experience working with big data products
📍 Good knowledge of SQL, data warehousing, data analytics, APIs, etc.