As a Data Engineer, you will help build the architecture that drives our business. You will design and implement scalable systems that handle 50K events per second, amounting to 40TB of compressed data, run hundreds of batch processes, and feed powerful homegrown front-end visualization tools.
This is you:
- 2+ years of programming or scripting experience (Scala, Python, PHP...)
- 2+ years of experience implementing BI solutions. You have strong dimensional modeling skills and know how to design the best data model for each type of use case (ad-hoc queries, dashboards, statistics...).
- 2+ years of experience designing and implementing architectural components of a Data Warehouse at TB scale, preferably using Spark and big data/column-store DBs (Redshift, Presto, Vertica...).
- 2+ years of experience implementing large-scale ETL architectures supporting multiple data sources (internal and external) and multiple use cases (product analytics, marketing, statistical modeling, executive dashboards).
- Excellent SQL skills: you can optimize a query for a specific database and understand the different capabilities of open-source and vendor DBs.