Software Engineer II (R-13419)

Pune - India /
Technology /
Full Time - Remote
Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us!

Job Description

Eyeota is looking for an exceptional Software Developer for our Data Engineering team who can contribute to building a world-class big data engineering stack used to fuel our Analytics and Machine Learning products. This person will contribute to the architecture, operation, and enhancement of:

Our petabyte-scale data platform, with a key focus on finding solutions that support the Analytics and Machine Learning product roadmap. Every day, terabytes of ingested data must be processed and made available for querying and insight extraction across a variety of use cases.
Our bespoke Machine Learning pipelines. This also provides opportunities to contribute to the prototyping, building, and deployment of Machine Learning models.

Requirements

At least 6 to 8 years of experience.
Deep technical understanding of Java or Golang.
Production experience with Python is a big plus and an extremely valuable supporting skill for us.
Exposure to modern big data technologies such as Scylla, Kafka, Spark, Cassandra, Snowflake, and BigQuery, while understanding that certain problems may require completely novel solutions.
Experience working in an Agile/Lean model.
Experience supporting and troubleshooting large systems.
Exposure to configuration management tools such as Ansible or Salt.
Experience with any cloud platform such as AWS, GCP, or Azure.
Good addition: experience working with large-scale data.
Good addition: experience architecting, developing, and operating data warehouses, big data analytics platforms, and high-velocity data pipelines.