Senior Data Engineer

Belgrade
Information Technology /
Full-Time /
On-site
We are looking for a seasoned Senior Data Engineer to design and deliver enterprise-grade data solutions. You will build and optimize ingestion pipelines and scalable data architectures, and establish governance and security frameworks, laying the foundation for the AI Agents that will power Unlimit's next generation of intelligent products.

What You'll Be Doing:

    • At Unlimit, we are reimagining the future of financial technology by putting data and AI at the core of everything we do.
    • As a Senior Data Engineer, you won’t just be moving data from point A to point B – you’ll be building the backbone of an AI-native company.
    • From day one, you’ll shape the way our data is collected, structured, and secured, ensuring it can serve as trusted fuel for real-time decisions, advanced analytics, and the AI Agents we are developing to drive smarter, more adaptive services.
    • Your pipelines will support our architects, BI specialists, and data scientists – but more importantly, your work will enable Unlimit to transform data into intelligence and intelligence into action.
    • This is not a “keep the lights on” role. It’s a chance to build from the ground up, influence standards and governance, and ensure that our AI Agents rest on a foundation that is reliable, resilient, and scalable.
    • The systems you design will determine how fast and how far we can go in delivering the next-generation financial solutions worldwide.

Qualifications You Bring:

    Must-Have:
    • Deep knowledge of Snowflake, with proven hands-on experience; certification is a strong plus.
    • 5+ years in data-focused technical roles (data engineering, database development, ETL, data management, data warehouses, and pipelines).
    • Expertise in evaluating, selecting, and integrating ingestion technologies to solve complex data challenges.
    • Demonstrated experience designing and developing enterprise data warehouses (e.g., Teradata, Oracle Exadata, Netezza, SQL Server, Spark).
    • Proficiency in building ETL/ELT ingestion pipelines using tools such as DataStage, Informatica, or Matillion.
    • Strong SQL scripting skills.
    • Cloud experience with AWS (Azure and GCP are a plus).
    • Solid programming skills in Python; Scala experience required.
    • Familiarity with data catalogues (e.g., Polaris, Data.World).
    • Ability to support and collaborate with cross-functional, agile teams in a dynamic environment.
    • Advanced English communication skills.
    • Bachelor’s degree in Engineering, Computer Science, or a related field (or equivalent experience).

    Nice-to-Have:
    • Experience with large-scale data migrations into Snowflake.
    • Background in data governance and security.