Data Integration Engineer

Sliema
Engineering / Full-time / Remote
We are seeking a dynamic, results-driven professional to join our team as a Data Integration Engineer. As a rapidly growing player in the industry, we are looking for an individual with a strong background in data warehousing and pipeline development. You will play a pivotal role in designing, developing, and maintaining our data infrastructure, ensuring the accuracy and efficiency of our data processes. This role reports directly to our Director of Business Intelligence.

Responsibilities

    • Design, develop, maintain, and document Python-based data processing and ETL jobs, ensuring data is complete and accurate and that system failures are minimized and resolved efficiently
    • Develop and maintain a state-of-the-art Data Warehouse, ensuring it meets the evolving needs of the business
    • Utilize Apache Airflow to orchestrate and manage ETL workflows, ensuring optimal performance and reliability
    • Adhere to coding standards, participate in code reviews, and assist with controlled releases as part of best practice workflows
    • Build API integrations with third-party software to retrieve data and integrate it seamlessly into the system
    • Optimize data pipelines to minimize costs while meeting the reporting needs of the business
    • Capture requirements effectively and efficiently, translating business needs into technical solutions
    • Communicate the results of technical projects to both technical and non-technical users, ensuring clear understanding and alignment with business goals

Requirements

    • Minimum of 5 years of experience in a Data Engineering role, with a strong focus on data warehousing and pipeline development
    • Expertise in writing complex SQL and programming in Python
    • Deep, hands-on experience with dbt, Apache Airflow, and Google BigQuery
    • Proficient in developing and integrating APIs with third-party software to facilitate data exchange and workflow automation
    • Experience in building and maintaining robust data processing pipelines and ETL/ELT processes
    • In-depth knowledge of data warehousing methodologies and hands-on experience with major relational database systems
    • Experience with event streaming is a plus
    • Demonstrated ability to work effectively in a distributed team environment, with a strong sense of ownership, urgency, and drive
    • Comfortable with Git and experienced in conducting code reviews and adhering to best coding practices
    • Strong communication skills and the ability to document all code and features for maintainability and future reference
    • A Master’s degree in Engineering, Computer Science, or a related field, or equivalent work experience