About the company
At PINTU, we are building the #1 crypto investment platform focused on new investors in Indonesia and Southeast Asia. We know that 99% of new investors are underserved because existing solutions cater to the 1% who are pros and early adopters, so we built an app that helps them learn about, invest in, and sell cryptocurrencies with one click.
Job Summary
In this role, you will:
- Conceptualize and build infrastructure that allows big data to be accessed and analyzed
- Reformulate existing frameworks to optimize their functioning
- Test such structures to ensure that they are fit for use
- Liaise with coworkers and relevant stakeholders to clarify the requirements for each task
- Keep up to date with blockchain standards and technological advancements that will improve the quality of your output
Who We Are Looking For
- A Bachelor's degree in Computer Science or a related field preferred
- 5+ years of data integration experience
- 3+ years of hands-on experience with at least one of the following technologies: Apache Spark, SQL, BigQuery, or PostgreSQL
- Proficiency in at least one of the following programming languages: Python, Scala, or Java
- Experience writing Apache Spark or Apache Beam jobs, including an understanding of optimization techniques
- Experience in data streaming and integration with Kafka
- Experience with GCP products such as BigQuery, Dataflow, Pub/Sub, Bigtable, Composer, and GCS
- Proficiency in traditional RDBMSs, with an emphasis on PostgreSQL and MySQL
- General understanding of ETL/ELT frameworks, error handling techniques, data quality techniques, and their overall operation
- Proficiency in developing and supporting all aspects of a big data cluster: ingestion, processing, integration (Python, Spark, Scala), data cleansing, workflow management (Airflow), and querying (SQL)
- Proficiency in Docker and containerization
- Ability to navigate and work effectively in a DevOps model, including leveraging related technologies: Jenkins, GitLab, Git, etc.