Senior Data Engineer

Los Angeles or Remote
Data / Full-time
Raya is hiring a Senior Data Engineer to play a pivotal role in empowering the company to be data-driven. The ideal candidate is experienced and confident in building out core data infrastructure and is looking for an opportunity to have more ownership and impact.

We prioritize learning and teamwork and love giving people the opportunity to champion big challenges and grow into better versions of themselves. A great candidate is excited to support our growing use of data, including working closely with the machine learning and analytics teams, to build a better user experience. Most importantly, you believe in Raya’s vision: to enrich lives by fostering meaningful relationships and experiences through quality, in-person interactions.

We offer comprehensive medical and dental coverage, a $50-a-day food delivery budget, equity-based compensation, a great culture, learning opportunities, unlimited vacation, and 12 weeks of paid parental leave. In keeping with our values of human connection, empathy, and curiosity, we also pay all employees $1,000 a year to go somewhere in the world they’ve never been.

In this role, you will:

    • Design and build our data architecture, and play a central role in our data strategy.
    • Collaborate with the Machine Learning team to architect and deploy the next generation of our ML platform.
    • Evaluate, select, and integrate best-in-class tools and frameworks, guiding the company on "build vs. buy" decisions to optimize data processes and infrastructure.
    • Work closely with other engineering teams to ensure our data platform integrates with our wider architecture in a secure and performant fashion.
    • Partner with the Analytics/Data Science team to ensure our data platform can support all analytical data needs across the organization.

Qualifications

    • 5+ years of experience in data engineering with a proven track record of successful data architecture design as well as platform implementation and oversight.
    • Expertise in handling large-scale data systems and cloud platforms (preferably AWS).
    • Expert in Python and SQL.
    • Familiarity with Postgres, MongoDB, and/or Snowflake is a plus.
    • Strong experience with data pipeline and workflow management tools like Apache Airflow, Dagster, or Prefect.
    • In-depth knowledge of real-time data processing frameworks such as Kinesis, Kafka, or Flink; experience with Segment is a plus.
    • Experience with data modeling practices and ETL frameworks, with a strong emphasis on performance optimization and security.
    • Excellent communication and collaboration skills to work effectively across teams.