Senior Data Engineer at Finonex

Warsaw, Poland
Hybrid
Finonex develops innovative online financial trading platforms, focusing on accessible, secure, and user-friendly trading experiences powered by cutting-edge technology and backed by 24/7 support.

We’re united by a collaborative, Agile culture that values creative and productive teams, transparent communication, and shared success. As a company that thrives on innovation and curiosity, we actively explore emerging technologies and industry trends—including AI and generative systems—through internal sessions, tech talks, and a culture of experimentation.

Now, as we establish a brand-new Data Department, we are looking for highly skilled professionals who want to help build our data platform and shape the future of Finonex’s data strategy from the ground up.

Responsibilities

    • Design, implement, and maintain scalable data pipelines using Python and orchestration tools like Apache Airflow
    • Take a leading role in evaluating and selecting foundational components of our data stack (e.g., Databricks vs. Snowflake)
    • Architect and optimize data workflows across AWS and on-premise environments
    • Work with modern file formats (Iceberg, Parquet, ORC) and understand their trade-offs for performance, cost, and scalability
    • Collaborate with analysts, engineers, and stakeholders to provide accessible and reliable datasets
    • Help assess and implement frameworks such as DBT for data transformation and modeling
    • Develop best practices and internal standards for data engineering, monitoring, and testing
    • Contribute to the long-term vision and roadmap of Finonex’s data infrastructure

Requirements

    • 5+ years of hands-on experience as a Data Engineer or in a similar role
    • Strong proficiency in Python for data processing and workflow automation
    • Experience with Apache Airflow (managed or self-hosted)
    • Solid understanding of data lake and table formats like Iceberg, Parquet, Delta Lake, and their use cases
    • Familiarity with big data processing (e.g., Apache Spark) and query engines (e.g., Trino)
    • Experience working in AWS environments, and comfort with hybrid cloud/on-premise infrastructure
    • Proven ability to work independently, drive decisions, and evaluate technologies with minimal oversight

Nice to Have

    • Experience with DBT or other data modeling and transformation tools
    • Prior involvement in building or migrating data platforms from scratch
    • Exposure to Superset or other modern BI tools
    • Knowledge of data governance, cataloging, or lineage tracking solutions
    • Interest in shaping team culture, mentoring, and knowledge sharing

Benefits

    • Work in a highly professional team with an informal, friendly atmosphere
    • Paid vacation — 20 business days per year, 100% sick leave payment
    • Equipment provision
    • Partially compensated educational costs (for courses, certifications, professional events, etc.)
    • Legal and Accounting support in Poland
    • Ability to work from our comfortable office in Warsaw at Prosta 51
    • 5 sick days per year
    • Medical insurance (after the end of the probationary period)
    • Flexible working hours — we care about you and your output
    • English and Polish classes twice a week (online)
    • Bright and memorable corporate life: company parties and gifts for employees on significant dates