Research Scientist/Engineer, Robot Foundation Models (Large Behavior Models)

Los Altos, CA; Cambridge, MA
Robotics /
Full-time /
Hybrid
At Toyota Research Institute (TRI), we’re on a mission to improve the quality of human life. We’re developing new tools and capabilities to amplify the human experience. To lead this transformative shift in mobility, we’ve built a world-class team in Robotics, Human-Centered AI, Human Interactive Driving, and Energy & Materials.

The Mission
Make general-purpose robots a reality.

The Challenge
We envision a future where robots assist with household chores and cooking, help the elderly maintain their independence, and free people to spend more time on the activities they enjoy most. To achieve this, robots need to operate reliably in messy, unstructured environments. Our mission is to answer the question: what will it take to create truly general-purpose robots that can accomplish a wide variety of tasks in settings like human homes with minimal human supervision? We believe the answer lies in cultivating large-scale datasets of physical interaction from a variety of sources and building on the latest advances in machine learning to learn general-purpose robot behaviors from this data.

The Team
Our goal is to revolutionize the field of robotic manipulation, enabling long-horizon dexterous behaviors to be efficiently taught, learned, and improved over time in diverse, real-world environments.

Our team has deep cross-functional expertise across simulation, perception, controls, and machine learning, and we measure our success by fundamental capability development as well as research impact via open-source software and publications. Our north star is fundamental technological advancement in building robots that can flexibly perform a wide variety of tasks in diverse environments with minimal human supervision. Come join us and let’s make general-purpose robots a reality.

We operate a fleet of robots, and robot-embodied teaching and deployment is a key part of our strategy. Some of our ongoing work is highlighted here.

The Opportunity
We’re looking for a driven research scientist or research engineer with a strong background in embodied machine learning and a “make it happen” mentality. The ideal candidate is able to operate independently when needed, but works well as part of a larger integrated group at the cutting edge of state-of-the-art robotics and machine learning. Experience with robots or other embodied systems (such as autonomous vehicles) is strongly preferred, particularly in the manipulation domain.

If our mission of revolutionizing robotics through machine learning resonates with you, get in touch and let’s talk about how we can create the next generation of AI-powered capable robots together.

Responsibilities

    • Work as part of a dynamic, closely-knit research team building general-purpose robot foundation models.
    • Implement, extend, and create state-of-the-art methods for robot behavior learning from a mixture of interactive embodied data and online data sources such as text, images, and video.
    • Implement high-performance machine-learning pipelines and optimize data and learning stacks for scalability, efficiency, and performance.
    • Be a key member of the team and play a critical role in rapid progress measured by both the development of internal capabilities and high-impact external publication.
    • Collaborate with internal research scientists and our partner labs at top academic research universities including MIT, Stanford, Berkeley, CMU, Columbia, and Princeton to drive pioneering research at scale.

Qualifications

    • 4+ years of relevant industry experience or a PhD.
    • Experience training large models and deploying them on embodied systems.
    • Extensive practical experience with machine learning using a major framework such as PyTorch or TensorFlow. Familiarity with data pipelines, model serving and optimization, cloud training, and dataset management is also useful.
    • Strong understanding of the state-of-the-art in behavior learning, language, and/or computer vision.
    • Some familiarity with robots and the challenges inherent in conducting research on physical hardware platforms.
    • Strong proficiency in Python.
    • An ability to move fast and switch between modes of rapid prototyping and robust implementation as required.
    • A strong track record of impact, either via first author research publications at top-tier machine learning or robotics conferences, or via meaningful contributions to successful industry initiatives.
The pay range for this position at commencement of employment is expected to be between $165,760 and $238,280/year for California-based roles; however, base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. Note that TRI offers a generous benefits package (including 401(k) eligibility and various paid time off benefits, such as vacation, sick time, and parental leave) and an annual cash bonus structure. Details of participation in these benefit plans will be provided if an employee receives an offer of employment.

Please refer to our Candidate Privacy Notice, which describes the categories of personal information we collect from individuals who inquire about and/or apply to work for Toyota Research Institute, Inc. or its subsidiaries, including Toyota A.I. Ventures GP, L.P., and the purposes for which we use such personal information.

TRI is fueled by a diverse and inclusive community of people with unique backgrounds, education and life experiences. We are dedicated to fostering an innovative and collaborative environment by living the values that are an essential part of our culture. We believe diversity makes us stronger and are proud to provide Equal Employment Opportunity for all, without regard to an applicant’s race, color, creed, gender, gender identity or expression, sexual orientation, national origin, age, physical or mental disability, medical condition, religion, marital status, genetic information, veteran status, or any other status protected under federal, state or local laws.