Data Engineer

Taipei
Deeplearning AI – Engineering / Full-time / Hybrid
AI is the new electricity. Millions of AI engineers will be required to transform industries with artificial intelligence, and we're building an education platform to train them. Deeplearning.ai wants to provide a world-class education to people around the globe so that we can all benefit from an AI-powered future.

Deeplearning.ai is looking for a Data Engineer with strong computer science fundamentals and a passion for improving learners' experiences. The ideal candidate will thrive in the early development stage of a leading AI-focused educational platform.

As a Data Engineer, you will be responsible for building and delivering high-quality data infrastructure and supporting the technical content deeplearning.ai provides. Our team is growing fast, and we are looking for a strong engineer to develop our educational products. In this role, you will work alongside a team of talented content creators, as well as our outside partners, to build the infrastructure behind world-renowned AI-driven education.

This is a full-time position based in Taipei.

Here’s what you’ll do:

    • Develop a learner-centered platform that delivers the best learning experience for deeplearning.ai learners.
    • Maintain quality and ensure the responsiveness and scalability of the applications you develop.
    • Design and develop data pipelines and visualizations to help our teams iterate quickly through data-driven decisions.
    • Maintain a high-quality code base and write clear documentation.
    • Develop data infrastructure for grading tools and neural network training.

Here are the experiences you should have:

    • Broad and solid CS foundation knowledge, including data structures & algorithms, OS, computer networks, and databases.
    • 3+ years of software development experience, including building end-to-end projects.

Here are the skills you should have:

    • 3+ years of experience with data pipeline and analytics tools such as Airflow, Redshift, and Metabase.
    • Bonus: experience with general backend development (Linux, SQL, Python), web frameworks (Django, Flask, FastAPI), and APIs (REST, GraphQL).
    • Strong ability to convert ideas to running code.
    • Proficient with Docker.
    • Knowledge of security concerns and best practices.
    • Excellent written and verbal communication skills in both Mandarin and English.

By working with us you will:

    • Be part of a world-class technical team, collaborating with offices in different parts of the world.
    • Have the opportunity to help build and scale a quickly growing startup.
    • Have access to state-of-the-art infrastructure and technology.
    • Earn a competitive salary at a well-funded, high-growth company.

We hope you will fit well with our team’s culture:

    • Strong work ethic: All of us believe in our work’s ability to change human lives. Consequently, we work not just smart, but also hard.
    • Growth mindset: We are eager to teach you new skills and invest in your continual development. But learning is hard work, so this is something we hope you’ll want to do.
    • Good team member: We care and watch out for each other. We’re humble individually and go after big goals together.
    • Flexibility: You should be flexible in your tasks and do whatever is needed, ranging from lower-level tasks such as coordinating complicated schedules to high-level work such as thinking through corporate strategy.