Data Analytics Engineer
San Francisco, New York, or Remote within the U.S.
Medium’s mission is to help people deepen their understanding of the world and discover ideas that matter. We are building a place where ideas are judged on the value they provide to readers, not the fleeting attention they can attract for advertisers. We are creating the best place for reading and writing on the internet—a place where today’s smartest writers, thinkers, experts, and storytellers can share big, interesting ideas.
We are looking for a Data Analytics Engineer who will help build, maintain, and scale our business-critical data warehouse and BI platform. In this role, you will lead the development of both transactional and data warehouse designs, mentoring our team of cross-functional engineers and data scientists. You'll gain a deep understanding of how we use data in our business and help make self-serve data a reality at Medium.
At Medium, we are proud of our product, our team, and our culture. Medium's website and mobile apps are accessed by millions of users every day. Our mission is to move thinking forward by providing a place where individuals and publishers can share their stories and perspectives. Behind this beautifully crafted platform is our engineering team, who work seamlessly together. From frontend to API, from data collection to product science, Medium engineers work cross-functionally with open communication and feedback.
What you will do
- Work on high impact projects that improve data availability and quality, and provide reliable access to data for the rest of the business.
- Be the go-to Looker expert at the company, helping bridge the gap between understanding business needs and knowing how to design efficient, usable data models.
- Work with engineers, product managers, and data scientists to understand data needs and implement data exploration tools and dashboards.
- Build data expertise and own data quality for the areas you are responsible for.
- Help define the self-serve data strategy at Medium, advocate for best practices, lead trainings, and investigate new technologies.
- Design, architect, and support new and existing ETL pipelines and Looker data models, and recommend improvements and modifications.
- Analyze, debug, and maintain critical data pipelines. Tune SQL queries and Snowflake data warehouse configurations to improve performance while keeping costs in mind.
- Identify and help triage issues with our ETL infrastructure.
Who you are
- You have 2+ years of software engineering and/or data analytics experience.
- You have experience with schema design and dimensional data modeling.
- You have experience writing and optimizing large, complex SQL and ETL processes, particularly on column-oriented databases and event-based data structures.
- You have designed and built data models in Looker, and you know how to balance trade-offs between performance and usability.
- You have a BS in Computer Science / Software Engineering or equivalent experience.
Nice to have
- Knowledge and experience using Spark
- Proficiency with Python
- Experience with Snowflake
At Medium, we foster an inclusive, supportive, fun yet challenging team environment. We value having a team that is made up of a diverse set of backgrounds and respect the healthy expression of diverse opinions. We embrace experimentation and the examination of all kinds of ideas through reasoning and testing. Come join us as we continue to change the world of digital media. Medium is an equal opportunity employer.
Interested? We'd love to hear from you.
Please note that communication regarding your application, interviews, and job offers will only come from e-mail addresses ending in "@medium.com". Anything else is not legitimate outreach.