Sr. Data Scientist / Machine Learning Engineer (Chicago, IL)

Chicago, IL
Data Science
Full time
About Us
Mastery Logistics Systems is reimagining the technology that moves goods from one place to another. We are building an intuitive, efficient, comprehensive solution that brings cutting-edge technology, the power of data and AI, and user experience to the forefront. For those who know the jargon and the logistics industry, we are building the next generation of technologies to allow large brokers, 3PLs, and shippers to benefit from the progress of technology and connectivity. Our initial offering - a TMS built for freight brokerages - targets large, distributed brokerages and 3PLs, with a follow-on shipper TMS available in the future. If you don't know the industry (yet!), that's cool – we want your voice and perspective to help us break out of established patterns and truly innovate.

The problems in this industry are big and exciting! There's also a lot of low-hanging fruit where other industries and types of applications have found solid solutions. We are tackling everything from fast and efficient data input (natural language processing? Voice? Better forms?), to ingesting large amounts of data and applying AI, to exploring blockchain to securely digitize paperwork. If you are passionate about humanizing an industry, automating in innovative ways, building for quality and scale, helping make people's lives easier, touching every part of our economy, or all of the above, then this is the place for you.
Mastery Logistics Systems is committed to providing a great, inclusive working environment by challenging our team members while being respectful of their time and the needs of their personal lives. That's why we encourage time exploring innovative ideas alongside more standard daily tasks. We have a 4-day work week (with 1 Friday a month reserved for company meetings) and require employees to schedule at least one week off every four months.
Our team has the domain knowledge and connections to make an impact, and we're looking for experienced and thoughtful people to help. We need people who are flexible problem solvers, collaborate consistently, and know how to communicate their solutions well. We are small and nimble, and each member of the team can make a tremendous impact both technically and culturally. While we are a start-up, we are well-funded, have an initial paying customer with whom to test and launch, and are founded by top experts and veterans in the logistics industry. Join us and help make something great.

Tasks & Responsibilities

    • Parse, standardize, and analyze large volumes of unstructured and semi-structured text data
    • Create services to productionize machine learning models and implement numerical optimization techniques in real-time
    • Design, implement, and maintain data processing pipelines in Python, Scala or Ruby
    • Build cloud infrastructure necessary to process data from a wide variety of data sources utilizing Docker, Kubernetes, Terraform, Azure DevOps and orchestration tools such as Argo and Airflow
    • Conduct and present exploratory data analyses to stakeholders in order to drive data strategy
    • Collaborate with product managers, designers, and engineers to define requirements, formalize problems, collect data, and implement solutions
    • Design, implement, and monitor experiments in order to analyze user behavior and inform application feature development
    • Actively participate in code review and team architecture discussions


Requirements

    • Practical experience applying statistical techniques, machine learning, and/or optimization methodologies to real-world business problems
    • Strong SQL, RDBMS, and data modeling experience
    • Experience developing software in Python, including architecting, building, deploying, and maintaining data and ETL pipelines
    • Experience in cloud infrastructure (AWS, Azure, GCP, Heroku, etc.)
    • Strong communication skills
    • Compassion and empathy

Nice to Have

    • Experience designing and running experiments in a SaaS context
    • Experience building and maintaining RESTful APIs to expose machine learning models
    • Experience with NoSQL technologies
    • Experience with any of the following: probabilistic graphical models, network analysis, natural language processing (NLP), time-series forecasting
    • Experience with concurrency and distributed computing, including frameworks in the Hadoop ecosystem
Women, non-binary people and those with marginalized genders, people of color, LGBTQIA+ folks, veterans, differently-abled people, and other under-represented candidates are strongly encouraged to apply.

Please apply to the "Remote" position if you are not located in, or willing to relocate to, IL (Chicago or Champaign areas). Remote jobs are in time zones UTC-4:00 to UTC-8:00 (Atlantic to Pacific US time).

At this time, we are unable to sponsor visas.