Data Platform Architect
Jakarta
IT Infrastructure, Operations, Data & Monitoring – Data Engineer
Full-time – Contract
On-site
The Data Platform Architect is responsible for the design and development of Amar Bank’s end-to-end data platform architecture. This role emphasizes enterprise-grade design, operational excellence, scalability, and security, ensuring that Amar Bank’s data infrastructure is tightly aligned with business goals, supports high availability, and seamlessly integrates with the Unified Customer Profile (UCP).
The ideal candidate brings deep expertise in Google Cloud Platform (GCP) while maintaining a broad perspective across data tools, frameworks, and other cloud technologies. They must be capable of implementing and managing data solutions at scale, leading strategic initiatives, architecting cloud-native pipelines, and establishing platform standards. Equally important, they ensure full-lifecycle data governance, observability, and compliance.
Beyond technical excellence, the candidate must demonstrate an insatiable hunger for learning, strong self-drive, and resilience in navigating today’s rapidly evolving AI-driven ecosystem. In an era where the ground is constantly shifting, humility, adaptability, and a relentless curiosity to explore and master new technologies will distinguish those who thrive in this role.
Responsibilities:
- Architect and implement a robust and scalable data platform leveraging GCP-native services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and GKE.
- Collaborate with the existing team to design and deliver the Unified Customer Profile (UCP) initiative, integrating data across multiple sources into a single trusted view.
- Define and enforce enterprise data architecture standards related to schema design, ingestion, metadata, transformation standardization, security parameters (DLP rules), and lineage.
- Work closely with data engineers, analysts, and scientists to provide a platform that meets their needs.
- Partner with teams across the data organization to translate business requirements into technical architecture.
- Optimize cloud resource usage and ensure cost-efficient operations of the data platform.
- Implement and maintain Infrastructure-as-Code for repeatable deployments using Terraform or Deployment Manager.
- Drive the adoption of observability, monitoring, and automated alerting within the data platform ecosystem.
- Ensure data governance through policy-based access, auditability, classification, and metadata tracking.
- Mentor junior engineers and guide platform evolution to support both real-time and batch processing patterns.
- Monitor and troubleshoot performance issues in data platforms, providing timely resolution and optimization strategies.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data/platform engineering, with 3+ years working on GCP.
- Proven expertise with GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Composer, GKE.
- Strong skills in Infrastructure-as-Code (Terraform or Deployment Manager).
- Experience designing and managing scalable batch and streaming data pipelines.
- Solid understanding of data governance, security, access control, and observability.
- Proficient in Python, SQL, and data modeling; familiarity with Java/Go is a plus.
- Demonstrated leadership in cross-functional projects and mentoring engineers.
- Comfortable working in fast-paced, agile environments with a focus on results.
- GCP certification and knowledge of modern data tools (e.g., dbt, Airflow, DataHub) are a plus.