Data Engineer

Cambridge, MA
Engineering / On-site

Our Mission
Our mission is to solve the most important and fundamental challenges in AI and Robotics to enable future generations of intelligent machines that will help us all live better lives.

Data Engineers will work cross-functionally, creating new technology to support software development for robots. If you have a passion for developing data collection and processing infrastructure for robots and robotic learning, you will want to join us! We work on-site in our new Cambridge, MA office, where we are building a collaborative and exciting new organization.

Responsibilities

    • Work collaboratively with research scientists and software engineers on software development for a range of robotic platforms.
    • Develop and maintain our data storage solutions and data pipelines in cloud and on-premises infrastructure.
    • Use Python and Terraform to develop and scale cloud-native data stores.
    • Build event- and batch-driven ingestion systems for machine learning and R&D.
    • Write and maintain user guides for internally developed tools.
    • Create and use systems to clean, integrate, or fuse datasets to produce data products.
    • Establish and monitor data integrity and quality through visualization, profiling, and statistical tools.
    • Perform updates, migrations, and administration tasks for data systems.
    • Develop and implement data governance and data retention strategies.

Requirements

    • BS/MS in computer science or robotics, or equivalent experience.
    • 6+ years of experience in a data engineering, software engineering, DevOps, or MLOps role.
    • Strong experience building event-driven data ingestion systems.
    • Strong experience with distributed data/computing tools, such as Spark, Ray, EMR, Dataproc, Dask, or Pandas on Spark.
    • Strong experience with ETL design and implementations in the context of large, multimodal, distributed datasets.
    • Strong experience with workflow orchestration tools, such as Airflow, Argo Workflows, Cloud Composer, MWAA, Step Functions, or Prefect.
    • Demonstrated experience building containerized applications using tools and frameworks such as Docker, Docker Compose, Podman, or OCI.
    • Demonstrated experience with schema management and schema evolution.
    • Demonstrated experience with databases and data storage solutions, such as Google Cloud Storage (GCS), S3, BigQuery, and NoSQL and/or SQL databases.
    • Experience with container orchestration tools, such as Kubernetes, GKE, EKS, or AKS.
    • Experience with UNIX/Linux, including basic commands and shell scripting.

Bonus (Not Required)

    • Associate- or Professional-level GCP certifications.
    • 3+ years of experience working on time-series data and streaming applications.
    • 3+ years of experience with NoSQL implementations such as MongoDB, Cassandra, DynamoDB, Datastore, or Bigtable.
    • 3+ years of experience working with on-prem compute and storage appliances.
    • 3+ years of experience with data streaming tools, such as Kafka, Flink, Kinesis, Beam, Spark Streaming, or Dataflow.
    • 2+ years of experience customizing package managers or build tools, such as Make, Poetry, or Bazel.
    • 2+ years of experience with Infrastructure as Code tools, such as Terraform, Go CDK, or AWS CDK.
    • 2+ years of experience using data quality tools, such as Great Expectations or Cerberus.

We provide equal employment opportunities to all employees and applicants for employment and prohibit discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.