[Job - 22687] Master Data Developer, Colombia

Colombia
Hadar – Prod_Hadar /
Home office /
Remote
We are tech transformation specialists, uniting human expertise with AI to create scalable tech solutions.
With over 6,500 CI&Ters around the world, we’ve built partnerships with more than 1,000 clients over our 30-year history. Artificial Intelligence is our reality.


We are looking for a Data Platform Developer to join our team in our development center, working with a major US client in a fast-paced, high-impact environment. In this role, you will architect and implement robust data pipelines and modern analytics platforms, working in close collaboration with cross-functional teams across the US and Brazil. You will be responsible for building scalable data solutions leveraging modern cloud-native tools.
Fluent English communication is essential for engaging with our global stakeholders and ensuring alignment across distributed teams.

Requirements for this challenge:
- Solid experience as a Data Developer
- Strong SQL expertise with the ability to optimize, refactor, and validate large-scale data transformations.
- Proficiency in Python (or similar language) for scripting and automation of data workflows.
- Hands-on experience with Snowflake, including performance tuning, data governance, masking, and workload management.
- Advanced knowledge and production experience with dbt for transformation logic, testing, documentation, and CI/CD integrations.
- Proven experience implementing Data Vault 2.0 models, including Hubs, Links, Satellites, PIT tables, and business vault patterns using AutomateDV or similar frameworks.
- Experience orchestrating ETL/ELT pipelines using Airflow, with knowledge of DAG structuring, dependency management, and dynamic task generation.
- Familiarity with modern data orchestration tools, such as Prefect, Dagster, or AWS Glue.
- Comfortable working in environments using CI/CD pipelines with GitHub Actions, integrating dbt, testing, and deployment to Snowflake or similar platforms.
- Solid understanding of data modeling best practices, including normalization, dimensional modeling, and historization.
- Ability to translate business requirements into scalable data architectures, and to communicate technical concepts effectively with stakeholders.

Nice to have:
- Experience with data observability tools like Monte Carlo, Datafold, or Great Expectations to ensure trust in data pipelines.
- Experience with containerization technologies like Docker or Kubernetes for reproducible environments and scalable deployments.
- Exposure to SAP.
- Knowledge of GraphQL, RESTful APIs, or streaming ingestion frameworks such as Kinesis or Firehose.
- Experience working in hybrid architectures, including data lakehouses or multi-cloud strategies.

#LI-JP3

Our benefits include:

- Premium Healthcare
- Meal voucher
- Maternity and Parental leaves
- Mobile services subsidy
- Sick pay
- Life insurance
- CI&T University   
- Colombian Holidays
- Paid Vacations
And many others. 


Collaboration is our superpower, diversity unites us, and excellence is our standard. 
We value diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment. We encourage candidates from diverse and underrepresented groups to apply for our open positions.