Qualifications:
- Minimum 5 years of experience designing, building, and maintaining data
platforms.
- Proficiency in building and testing software systems, with a strong grasp of fundamental software design principles (e.g., SOLID).
- Proficiency in Python and experience with data engineering libraries and
frameworks.
- Experience optimizing query performance on cloud data warehouse platforms (e.g., Snowflake) and relational databases.
- Experience with dbt for data transformation.
- Experience with data pipeline orchestration tools (e.g., Airflow).
- Familiarity with DevOps tooling (e.g., Docker, Kubernetes, Helm, GitHub Actions).
- Familiarity with core services of cloud computing platforms (e.g., AWS).
- Excellent team player with strong communication skills.
- Comfortable collaborating with stakeholders and navigating ambiguity to understand product requirements.
- Bachelor’s degree in Computer Science or Engineering, or equivalent experience.
Key Responsibilities:
Build key components of our client's rocketship:
- Design, build, and maintain scalable and reliable data pipelines using Python,
Airflow, and other tools.
- Collaborate closely with data scientists and analysts to understand their
requirements and translate them into efficient data processing workflows.
- Manage and optimize data storage and querying in Snowflake to ensure high
performance and reliability.
- Implement best practices for data governance, security, and privacy.
- Coordinate and participate in on-call rotations.
- Foster a culture of high ownership, innovation, empathy, and collaboration.
- Invest in engineering reliability, productivity, and excellence.
- Collaborate with stakeholders across the world to drive teams toward high-impact outcomes.