Senior Data Engineer

Remote / ML engineering / Full-time
Build a better, more sustainable world!

Consumer packaged goods (CPG) companies all over the world – including names you already have in your fridge, toiletries bag, make-up case, and pantry – rely on Turing’s AI technology to create new products faster, safer, smarter, and leaner. Our industry-leading platform combines data with industry-specific AI technology to create new products 10x faster than the existing physical trial-and-error process.

Turing Labs is a Series A-funded company backed by Insight Partners, Y Combinator, Moment Ventures, and several industry leaders.

The Turing team has decades’ worth of experience and a passion for creating the next generation of products. Today, we employ some of the world’s best machine learning algorithms – and the people who create them – to help drive product innovation that impacts billions of people’s lives.

You can learn more about our journey and where we are going in TechCrunch, VentureBeat, The Wall Street Journal, Business Insider, and Forbes.

Join one of the fastest-growing SaaS companies on the path to becoming the default platform for product development.

You will get exposure to the world's top algorithms and AI in a truly stimulating environment, with the opportunity to work with the industry's leading minds.

As a Senior Data Engineer, you will own all things data engineering, including building and maintaining data models, integrations, and pipelines for our AI software platform. This role is very hands-on and requires a structured mindset and solid implementation skills.

Responsibilities

    • Build data pipelines, data processing tools, and scripts
    • Perform tasks such as writing scripts, web scraping, and pulling data from APIs, and integrate data from multiple sources
    • Automate data pipelines using scheduling tools like Airflow
    • Turn prototypes into production-ready data flows
    • Own the technical solution design; lead the technical architecture and implementation of data acquisition and integration projects, both batch and real-time. Define the overall solution architecture needed to implement a layered data stack that ensures a high level of data quality and timely insights.
    • Participate in the development of documentation, technical procedures, and user support guides
    • Articulate benefit analyses of open-source and proprietary technologies; pilot and choose the optimal tools and solutions
    • Support data science research by designing, developing, and maintaining all parts of the data pipeline

How you will lead

    • Communicate with stakeholders to clarify requirements. Craft technical solutions and assemble design artifacts (functional design documents, data flow diagrams, data models, etc.).
    • Serve the team as a subject matter expert & mentor for ETL design.
    • Proactively identify performance & data quality problems and drive the team to remediate them. Advocate architectural and code improvements to the team to improve execution speed and reliability.
    • Champion operational excellence and continuous improvement with a can-do leadership demeanor.
    • Be prepared for changes in business direction and understand when to adjust designs.
    • Work effectively in an unstructured, fast-paced environment, both independently and in a team setting, with a high degree of self-management, clear communication, and commitment to delivery timelines.

You already have

    • A BS/MS degree in Computer Science, Engineering, Mathematics, Physics, or an equivalent/related field.
    • Built programmatic ETL pipelines with SQL-based technologies and platforms.
    • A solid understanding of databases and experience working with sophisticated datasets.
    • Shown strong problem-solving skills, acute attention to detail, and the ability to meet tight deadlines and project plans.
    • Shown the ability to research, analyze, interpret, and produce accurate results within reasonable turnaround times, with an iterative, rapid-prototyping mentality.
    • Shown technical leadership with an emphasis on data lake and data warehouse solutions, business intelligence, big data analytics, and enterprise-scale custom data products.
    • Knowledge of data modeling techniques and high-volume ETL/ELT design.
    • Experience with version control systems (GitHub, Subversion) and deployment tools (e.g., continuous integration).
    • Experience working with public cloud platforms like GCP, AWS, or Snowflake.
    • Familiarity with scrum/agile project management methodologies and SDLC stages.
    • At least 3 years of expert-level experience with SQL.
    • At least 3 years of experience with the AWS ecosystem.
    • At least 2 years of Python development experience using Pandas.

What we offer

    • An opportunity to take a crucial, early role in an exciting, rapidly growing startup
    • You’ll work with a smart, no-ego team
    • Competitive compensation and equity package
    • Medical, dental, and vision package
    • A monthly health & wellness stipend
    • Weekly virtual social events and annual company retreats
    • Work-from-home stipend

Benefits

* Platinum-level medical insurance 100% covered by Turing
* $1,500 home office budget
* Unlimited PTO + 10 Federal Holidays
* 4% 401k employer match
* Dental & Vision insurance
* Turing Swag
* Travel to meet with our team yearly (pending pandemic)

Turing Labs is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or any other applicable legally protected characteristic. Turing Labs considers qualified applicants with criminal histories, consistent with applicable federal, state, and local law. Turing Labs is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures.