Data Integration Engineer (GCP)

Mexico City, Mexico / Lingaro / Payroll / Remote

Lingaro Group is the end-to-end data services partner to global brands and enterprises. We lead our clients through their data journey, from strategy through development to operations and adoption, helping them to realize the full value of their data.

Responsibilities

    • You will join a team responsible for designing, modeling, and developing the entire Google Cloud Platform (GCP) data ecosystem for one of our clients, encompassing Cloud Storage, Cloud Functions, and BigQuery.
    • Your involvement will span the entire process, from gathering, analyzing, and modeling business/technical requirements to documenting them. This role will require direct interaction with clients.
    • You'll be tasked with modeling data from diverse sources and technologies, as well as troubleshooting and supporting the most challenging, high-impact issues to deliver new features and functionality.

Requirements

    • Minimum of 4 years of experience as a Data Engineer, including at least 3 years working with GCP cloud-based infrastructure and systems.
    • Deep expertise in cloud computing platforms, particularly Google Cloud, with the demonstrated ability to design, build, and deploy data pipelines and applications in cloud environments.
    • Mastery of data modeling techniques and database optimization, with a strong emphasis on query optimization, indexing, and performance tuning for efficient data retrieval and processing.
    • Advanced proficiency in managing database systems, including SQL (BigQuery expertise required) and NoSQL databases, with the ability to architect, configure, and administer databases for optimal performance and reliability.
    • Extensive experience with data integration methodologies, including ETL and ELT, is essential. Candidates must exhibit proficiency in integrating data from various sources and transforming it into an analyzable format.
    • Exceptional communication skills are necessary to effectively collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders. The ability to articulate technical concepts to non-technical audiences clearly and concisely is crucial.
    • Proficiency in SQL, Python, and other scripting languages.
    • Familiarity with essential tools including Git, Jira, and Confluence, among others.
    • Willingness and enthusiasm to learn new technologies and solutions.
    • Experience working in a multinational environment.

Nice to have

    • Certifications in big data technologies and cloud platforms are advantageous.
    • Proficiency in BI solutions, such as Tableau.
    • Familiarity with Azure cloud-based infrastructure and systems.
    • Exposure to ETL tools such as Talend and Alteryx.
    • Previous engagement in distributed team environments.