Data Principal Engineer

Ebury is a FinTech success story, positioned among the fastest-growing international companies in its sector.

Headquartered in London, with offices across Europe and new openings in Dubai, North America and APAC, we empower businesses that want to trade and transact internationally. We drive innovation through technology, providing businesses with the tools they need to manage their international trade and support their growth, including import lending, currency and risk products, and payment and collection facilities.

The talent of our 700+ multicultural staff, combined with our cutting-edge technology, tailored product range, and exceptional customer service, has enabled us to double in size year after year. Today we have 18 offices across the world, with even more exciting expansion plans to come.

Even through our tremendous growth, we maintain a vibrant and enjoyable company culture, and those who excel in our highly meritocratic and fast-paced environment will be generously rewarded.

Role Highlights

    • Hands-on coding (60%)
    • Regular travel to London required
    • Design and drive the implementation of a data pipeline from Ebury production that integrates data models from relational and non-relational sources for BI consumption, working with the Data and Tech teams
    • Analyse and identify performance bottlenecks within our current Ebury back office system data architecture
    • Work with the SRE team to ensure high availability and performance of our data storage solutions
    • Define and implement a strategy to decouple data processing for BI from Data models used in production
    • Research and develop PoCs for near-real-time data pipelines
    • Contribute to the product roadmap to drive work through tech teams
    • Take responsibility for the high availability of the API

Must Haves

    • Proven experience building data pipelines from live production environments
    • Proven experience optimising SQL database performance, including CQRS patterns
    • Experience designing and building real-time data pipelines and streaming solutions
    • Strong backend development track record (Python)
    • Experience defining work for engineering teams to implement (product stories)

Nice to Have

    • Experience managing large database infrastructure deployments
    • Background in the financial sector
    • Postgres optimisation
    • AWS data pipeline infrastructure, Django signals, Airflow, and AWS Kinesis, Kafka, Flink or Spark

Due to the high number of applications received, only successful candidates will be contacted.