Data Quality Engineer (R-15088)

Hyderabad - India / Data & Analytics / Employee: Full Time / Hybrid
Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us!

About Us
Our global community of colleagues brings a diverse range of experiences and perspectives to our work. You'll find us working from a corporate office or plugging in from a home desk, listening to our customers and collaborating on solutions. Our products and solutions are vital to businesses of every size, scope, and industry. And at the heart of our work, you’ll find our core values: to be data inspired, relentlessly curious and inherently generous. These values are foundational to our evolving culture and guide how we work with each other every day!
About the role
You will be part of the team responsible for measuring the quality of our Global Inventory data. As a Data Quality Engineer, you will play a crucial role in maintaining the quality of our data, enabling our organization to make informed decisions and drive business success. This role involves working closely with cross-functional teams to ensure the accuracy, consistency, and reliability of our data assets. You will collaborate closely with stakeholders and other data engineers to create data quality monitors, automate data quality processes, and drive continuous improvements.

Key Responsibilities:

    • Execute a comprehensive data quality monitoring strategy which aligns with the organization's Data Quality Standards and business objectives.
    • Develop a strong understanding of Dun & Bradstreet’s inventory data.
    • Perform baseline data quality monitoring to proactively identify data quality issues.
    • Employ advanced data analysis and profiling techniques.
    • Automate data quality monitoring solutions and internal processes.
    • Operate within data models that ensure data is stored in an organized structure.
    • Utilize PowerBI and/or Looker to design, create, connect and administer dashboards which derive insights from data quality monitoring results.
    • Execute on a robust data validation framework with automated testing processes.
    • Communicate with the globally distributed stakeholders using JIRA and Confluence.
    • Capture requirements accurately and seek a strong understanding of use cases.
    • Recommend improvements to data quality team’s internal processes.
    • Generate regular reports on data quality metrics.
    • Review data to identify patterns or trends that may indicate errors in processing.
    • Maintain comprehensive documentation of data quality processes and findings.
    • Comply with data governance policies and procedures.
    • Educate yourself on industry best practices and technologies related to data quality.

Required Traits Include:

    • Bachelor's degree in Computer Science, Information Technology, or a related field.
    • 2+ years of experience and demonstrated in-depth knowledge of data analysis, querying languages, data modelling, and the software development life cycle.
    • Strong skill in SQL (preferably BigQuery).
    • Agile mindset and understanding of agile project management (Scrum/Kanban).
    • Understanding of Database design, modelling, and best practices.
    • Experience with cloud computing technologies (preferably GCP).
    • Experience with PowerBI, Looker or similar data visualisation tool.
    • Analytical, process improvement and problem-solving skills.
    • Good communication skills and the ability to articulate data issues and solutions.
    • Commitment to meeting deadlines, upholding the release schedule, and modelling good teamwork for colleagues.

Valuable Traits Include:

    • Skill in Python and/or Scala for data wrangling and data analysis.
    • Understanding of DevOps best practices including CI/CD, automation, monitoring, observability, agile project management, version control, and continuous feedback.
    • Experience with data observability tools such as Acceldata or Informatica DQ.
    • Experience with XML and JSON data structures.
    • Understanding of ETL processes and their impact on data quality.
    • Knowledge of Machine Learning, specifically anomaly detection.
    • Experience collaborating across time zones as part of a global team.

What we’re looking for:

    • Dynamic and results-driven team members with a focus on facilitating action and effecting change.
    • An innovative and inspirational approach.
    • Self-motivation with the desire to learn new techniques – relentless curiosity.
    • Ability to prioritize work with a flexible approach to balancing multiple tasks.
    • Ability to work independently.
    • A great team player.
All Dun & Bradstreet job postings can be found at Official communication from Dun & Bradstreet will come from an email address ending in

Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever. Your use of this page is subject to Lever's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.