Senior Data Engineer

San Francisco, CA / Engineering / Full Time

Before you read on, take a look around you. Chances are, pretty much everything you see has been shipped, often multiple times, to get where it is. E-commerce is exploding, and with it, parcel shipping is becoming a meaningful factor in a business's ability to succeed. Creating a compelling shipping experience for customers is hard but necessary.

At Shippo, our goal is to level the playing field by providing businesses access to shipping tools and terms that would not be available to them otherwise.

Shippo lowers the barriers to shipping for businesses around the world. As free and fast shipping becomes the norm, better access to shipping is a competitive advantage for businesses. Through Shippo, e-commerce businesses, marketplaces, and platforms are able to connect to multiple shipping carriers around the world from one API and dashboard. Businesses can get shipping rates, print labels, automate international documents, track shipments, and facilitate returns.

Internally, we think of Shippo as the building blocks of shipping. Shippos are a diverse set of individuals. We look for cultural and skill fit in every new person. Join us to build the foundations of something great, roll up your sleeves, and get important work done every day. Founded in 2013, we are a proud team based out of San Francisco. Shippo’s investors include D1 Capital Partners, Bessemer Venture Partners, Union Square Ventures, Uncork Capital, VersionOne Ventures, FundersClub, and others.

We are seeking a Senior Data Engineer! You will be responsible for building systems that collect and process events at massive scale to provide operational and business insight into the performance and optimization of shipping services.

You will work closely with product, engineering, and business leads to generate customer-facing and internal dashboards, ad hoc reports, and models that provide insights and affect platform behavior. This also includes building and maintaining the infrastructure to collect and transform raw data.

Responsibilities and Impact

    • Design, build, scale, and evolve large-scale data systems
    • Collaborate with Finance, Product, and Customer Success teams on joint fraud mitigation and investigation projects
    • Integrate data from various data stores to ensure consistency and availability of data insights 
    • Prioritize tasks and deliverables, optimizing for a balanced delivery on short-term and long-term goals 
    • Articulate and present findings and recommendations at different levels, with a clear bias towards impactful learning and results 
    • Drive the usability and impact of data projects across ad hoc analysis, real-time processing, and batch processing
    • Champion the engineering organization’s adoption and ongoing use of the data infrastructure

Requirements

    • Coding experience in server-side programming languages (e.g. Go, Python, Java, Ruby) as well as SQL
    • Experience working with server-side MVC frameworks (e.g. Django, .NET, Spring, Rails, Phoenix)
    • Coding experience with front-end JavaScript frameworks (e.g. React, Redux, Ember, Angular, Meteor)
    • Solid understanding of object-oriented programming and familiarity with various design and architectural patterns
    • Experience integrating with APIs that use REST, gRPC, SOAP, and other technologies
    • Exceptional verbal, written, and interpersonal communication skills
    • Deep understanding of customer needs and passion for customer success
    • Exhibit core behaviors focused on craftsmanship, continuous improvement, and team success
    • 5+ years of experience in software development
    • BS or MS degree in Computer Science or equivalent experience

Bonus Points

    • Experience with relational databases (e.g. PostgreSQL, MySQL) as well as NoSQL and columnar data stores
    • Experience implementing ETL processes
    • Experience with Big Data frameworks such as Hadoop, MapReduce, and associated tools
    • Experience building stream-processing systems using solutions such as Kinesis Streams or Spark Streaming
    • Experience with statistical analysis and a data visualization package such as R, Mathematica, Stata, Tableau, etc. 
    • Experience with cloud environments and DevOps tools; working experience with AWS and its associated products
    • Experience with machine learning

Benefits, Perks, and More

    • Medical, dental, and vision healthcare coverage for you and your dependents. Pet coverage also available!
    • Flexible policies for PTO and work arrangements
    • 3 VTO days for ShippoCares volunteering events
    • $2,500 annual learning stipend for your personal and professional growth
    • Charity donation match up to $100
    • Free daily catered lunch, drinks, and snacks
    • Fun team events outside of work hours - happy hours, “escape room” adventures, hikes, and more!