ELT Developer (GCP Stack: BigQuery, Python, Firestore)
Argentina / Brazil / Mexico / Poland / Georgia
Application Development – Back-end Development / Remote
Solvd is an AI-first advisory and digital engineering firm on a mission to redefine how AI transforms business. Working at the intersection of strategy and execution, we help clients move from experimentation to real ROI through the industry’s most advanced strategic advisory, data and platform engineering, and AI integration services. We’re an AI-native firm with over 12 years of experience, supported by offices in the USA, Brazil, Mexico, Ukraine, Poland, Argentina and Georgia.
We are seeking a skilled ELT Developer with expertise in Google Cloud Platform (GCP) services such as BigQuery and Firestore, along with strong Python skills. The ideal candidate will design and implement efficient ELT pipelines to support data integration, analytics, and operational workloads, collaborating with cross-functional teams to ensure seamless data processing and transformation.
Responsibilities:
- ELT Pipeline Development:
  - Design, develop, and maintain ELT pipelines using GCP tools such as BigQuery, Dataflow, and Firestore.
  - Optimize data pipelines for performance, scalability, and reliability.
- Data Transformation & Integration:
  - Transform raw data into structured formats suitable for analytics and reporting.
  - Integrate data from multiple sources into centralized repositories or operational systems.
- Collaboration & Documentation:
  - Work closely with data engineers, analysts, and stakeholders to gather requirements and deliver solutions.
  - Document ELT processes, workflows, and best practices.
- Monitoring & Troubleshooting:
  - Monitor ELT jobs to ensure timely data availability.
  - Debug and resolve issues in data pipelines to maintain data integrity.
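To give a flavor of the transformation work above, here is a minimal Python sketch of normalizing raw records into a structured shape. The field names (`event_ts`, `user_id`, `amount_usd`) are hypothetical stand-ins; in a real pipeline this logic would typically run as SQL inside BigQuery or as a Dataflow step.

```python
from datetime import datetime, timezone

def transform_record(raw: dict) -> dict:
    """Normalize one raw event into the structured shape the warehouse expects.

    Field names here are illustrative; real schemas come from the source system.
    """
    return {
        # Epoch seconds -> ISO-8601 UTC timestamp
        "event_ts": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        # Trim whitespace and force a string key
        "user_id": str(raw["uid"]).strip(),
        # Coerce to float, default missing amounts to 0, round to cents
        "amount_usd": round(float(raw.get("amount", 0)), 2),
    }

raw_events = [
    {"ts": 1700000000, "uid": " 42 ", "amount": "19.991"},
    {"ts": 1700000060, "uid": "7"},  # missing amount defaults to 0
]

structured = [transform_record(r) for r in raw_events]
```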
Mandatory requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- 3+ years of experience in ELT/ETL development on cloud platforms.
- Proven expertise in GCP services like BigQuery, Firestore, Dataflow, and Pub/Sub.
- Strong proficiency in Python for scripting and automation.
- Solid understanding of data warehousing concepts and data modeling.
- Experience with SQL for querying and transforming data.
- Familiarity with NoSQL databases like Firestore.
- Knowledge of Apache Beam for building scalable data processing pipelines.
- Strong problem-solving abilities.
- Excellent communication skills for collaboration across teams.
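The SQL-based load-then-transform pattern these requirements describe can be sketched with Python's built-in sqlite3 standing in for BigQuery (table names and SQL dialect are illustrative only): raw rows land untouched, then a SQL statement materializes a cleaned table inside the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Load: land raw data as-is (the "EL" of ELT).
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("A-1", "10.50"), ("A-2", "3.25"), ("A-1", "10.50")],  # note the duplicate
)

# Transform: dedupe and cast inside the warehouse (the "T").
conn.execute("""
    CREATE TABLE orders AS
    SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount
    FROM raw_orders
""")

rows = conn.execute("SELECT order_id, amount FROM orders ORDER BY order_id").fetchall()
```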
Optional requirements:
- Experience with GCP Cloud Scheduler or other orchestration tools.
- Google Professional Data Engineer certification or equivalent.
- Familiarity with CI/CD pipelines for deploying ELT workflows.