About the company
Public is an investing platform that allows people to invest in stocks, ETFs, treasuries, crypto, art, collectibles, and more – all in one place. Public’s platform helps people be better investors with access to custom company metrics, live shows about the markets, and real-time analysis. Members control how they invest with a suite of powerful tools, and get insights from a community of millions of investors, creators, and analysts. Since 2019, Public has raised over $300 million. Investors include Accel, Tiger Global, Will Smith's Dreamers VC, The Chainsmokers' Mantis VC, and Shari Redstone's Advancit Capital, as well as renowned figures in business and culture, like Maria Sharapova, Tony Hawk, and NYU Stern professor Scott Galloway.
Job Summary
What you’ll do
📍Develop an in-depth understanding of our business fundamentals and success metrics in order to build data solutions that capture top-line initiatives and inform decision-making
📍Partner closely with our engineering and product pods to provide data and insights on our most important brokerage features
📍Build and manage advanced reporting, dashboards, data models, and tools to support cross-functional teams and initiatives, and enable teams to self-serve where possible, while being the primary point of contact for ad hoc data requests
📍Extract, manipulate, cleanse, and synthesize data from a variety of sources through ETL technologies
📍Identify, manage, and implement data-driven recommendations for product and lifecycle marketing through alerts, automated reverse ETL pipelines, A/B testing, and machine learning applications
📍Be a thought leader in refining the ways we develop, test, deploy, organize, and document our data infrastructure, data transformation models, and BI tools for self-service reporting
Who you are
📍5+ years of experience in a data-professional capacity such as Data Analyst, Data Scientist, and/or Analytics Engineer
📍Expert SQL knowledge and experience developing data models using DBT
📍Fundamental understanding of D2C businesses and unit economics
📍Ability to collect required data from internal and external systems, design data storage structures, and build automated data pipelines that are reliable and scalable in a fast-growing data ecosystem; experience with Fivetran, Airflow, and/or Airbyte preferred
📍A strong understanding of data modeling principles and modern data platforms to build data model architecture, ETL processes, reporting, and analytics solutions; experience with Snowflake, DBT, Looker, HighTouch, and Segment preferred
📍The ability to push the boundaries of analytical insights with modern machine learning and data science stacks (including Python)