Big Data Engineers (Airflow, Python, Spark)

Bangalore, India
Technology – Engineering /
Full Time /
Hybrid
At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future. 

About the Role

As a Software Data Engineer, you will be responsible for building data pipelines that serve both scheduled and real-time use cases. Scheduled pipelines typically run in Airflow, while real-time processing may run in Airflow or as integrated querying within backend services.

Responsibilities

    • Work closely with team leads and backend developers to design and develop functional, robust pipelines to support internal and customer needs.
    • Write both unit and integration tests, and develop automation tools for daily tasks.
    • Develop high quality, well documented, and efficient code.
    • Manage and optimize scalable pipelines in the cloud.
    • Optimize internal and external applications for performance and scalability.
    • Develop automated data quality tests to ensure business needs are met.
    • Communicate regularly with stakeholders, project managers, quality assurance teams, and other developers regarding progress on the long-term technology roadmap.
    • Recommend systems solutions by comparing advantages and disadvantages of custom development and purchased alternatives.

Key Skills (Domain Expertise)

    • 2+ years of experience as a software/data engineer.
    • Bachelor’s degree in Computer Science, MIS, or Engineering.

Technical Skills

    • Experience with database systems, with knowledge of SQL and NoSQL stores (e.g., MySQL, Oracle, MongoDB, Couchbase).
    • Experience developing sophisticated data-driven software platforms.
    • Experience with relational databases and/or analytical data processing tools (e.g., Spark, Presto, Pandas).
    • Experience delivering and supporting enterprise software.
    • Experience with cloud service technologies (e.g., AWS).
    • Experience with container technologies (e.g., Docker).
    • Experience with source control systems (e.g., Git).
    • Experience with information visualization.
    • Experience with "big data" systems and analysis.
    • Experience with data warehouses or data lakes.

Mindset and Attributes

    • Strong communication skills, with the ability to communicate complex technical concepts and align the organization on decisions.
    • Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply.
    • Collaborates with the team to create innovative solutions efficiently.