Senior Data Engineer - ETL, SQL, Data Warehousing

SF, Seattle, or Remote
Engineering
Full-time
Coffee Meets Bagel’s vision is to inspire singles to share and connect authentically. We make lasting impressions on people’s lives by helping them form meaningful connections with other amazing singles. In this role, you will work closely with the Platform Engineering, Analytics, Data Science, and Product Management teams to power insights that improve the online dating experience for our users.

Responsibilities:

    • Lead the ongoing vision, evolution, and innovation of our big data warehouse infrastructure on AWS
    • Architect and maintain reliable ETL processes and data pipelines
    • Scale our data warehouse infrastructure as our products and users grow
    • Help teams analyze business intelligence data and provide any necessary reports (e.g. LTV, UA, ROI, Cohorts, Attribution)
    • Develop reusable tools that make data warehousing turnkey and easy to use
    • Troubleshoot and tune existing data warehouse applications (e.g. Redshift)
    • Assist users with writing correct and performant analytics queries

Qualifications:

    • B.S. degree in computer science, mathematics, statistics, or a similar quantitative field
    • 6+ years of work experience in a relevant role (e.g. Data Engineer, BI Engineer, Software Engineer)
    • Experience with Redshift and Postgres
    • Extremely meticulous about the correctness of query results
    • Experience building ETL/ELT pipelines and familiarity with the relevant design principles
    • Experience with Business Intelligence and Analytics tools (e.g. Mode, Tableau, Looker)
    • Experience scaling and optimizing schemas, optimizing SQL queries, and performance-tuning ETL pipelines in OLTP, OLAP, and data warehouse environments
    • Solid understanding of methods and approaches to relational and NoSQL data stores (e.g. logging, columnar storage, star and snowflake schemas, dimensional modeling)
    • Knowledge of database performance concepts such as indexes, segmentation, projections, and partitions
    • Hands-on experience with Big Data technologies (e.g. Hadoop, Hive, Spark)
    • Proficiency with object-oriented and/or functional programming languages (e.g. Java, Scala, Python, Go), or strong scripting skills
    • Excellent written and verbal communication and interpersonal skills; ability to collaborate effectively and troubleshoot issues with technical and business partners
    • Strong business acumen, attention to detail, and analytical problem-solving skills