Data Engineer (Contractor)

São Paulo
Product – Analytics & Insights /
Contract /
Remote
Jeeves is a groundbreaking financial operating system built for global businesses, providing corporate cards, cross-border payments, and spend management software within one unified platform. The company operates in more than 20 countries, including Brazil, Canada, Colombia, Mexico, the United Kingdom, and the United States, as well as markets across Europe, and serves over 5,000 clients ranging from venture-backed startups to SMBs around the world. With a mission to empower businesses with more efficient and cost-effective financial solutions worldwide, Jeeves combines cutting-edge financial technology with exceptional team expertise to transform the business financial landscape. Jeeves has been recognized as one of The Information's 50 Most Promising Startups in 2023 and a Y Combinator Top Company 2021–2023, and won "Fintech of the Year" at the European Fintech Awards.

Since graduating from Y Combinator in 2020, Jeeves has raised over $380 million and is backed by world-class investors including Andreessen Horowitz, Y Combinator, CRV, Tencent, Stanford University, Clocktower Ventures, and the founders of more than 15 unicorns, including David Velez (Nubank), Carlos Garcia (Kavak), and Sebastián Mejía (Rappi).

We’re looking for a Data Engineer to help scale the data infrastructure that powers our payments platform. You’ll be responsible for building and maintaining robust, efficient data pipelines and models that support everything from operational reporting to product insights. This is a hands-on role focused on developing and optimizing the modern data stack — using tools like dbt, Airflow, and cloud-based data platforms.

If you’re passionate about building clean, reliable data systems and have experience working with high-volume transactional or payments data, we’d love to hear from you.

Location: This role is a full-time remote position. #LI-REMOTE

What You’ll Do:

    • Design, build, and maintain ETL/ELT pipelines using dbt and Apache Airflow to transform raw payment and customer data into usable datasets.
    • Develop and optimize data models that support analytics, dashboards, and data science workflows.
    • Collaborate closely with engineering, product, and analytics teams to define data needs and deliver reliable, scalable solutions.
    • Monitor and improve the performance, reliability, and scalability of data pipelines.
    • Implement data quality checks, anomaly detection, and alerts to ensure trust in our data.
    • Manage and optimize usage of our cloud data warehouse (e.g., Snowflake, BigQuery, or Redshift).
    • Contribute to internal tooling, documentation, and best practices that help scale our data operations.
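As a flavor of the data quality and anomaly detection work described above, here is a minimal z-score check on daily payment volumes. The function name, threshold, and sample counts are illustrative assumptions for this posting, not part of Jeeves' actual stack; in practice this kind of check would likely live in dbt tests or an observability tool.

```python
import statistics

def flag_anomalies(daily_counts, threshold=3.0):
    """Flag days whose transaction count deviates from the mean
    by more than `threshold` standard deviations (z-score check)."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.stdev(daily_counts)
    if stdev == 0:
        return []  # no variation, nothing to flag
    return [
        (day, count)
        for day, count in enumerate(daily_counts)
        if abs(count - mean) / stdev > threshold
    ]

# Example: a sudden drop in daily payment volume is flagged for alerting.
counts = [1020, 980, 1005, 995, 1010, 40, 1000]
print(flag_anomalies(counts, threshold=2.0))  # → [(5, 40)]
```

A check like this would typically run as an Airflow task after the transformation step, pushing any flagged days to an alerting channel.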

Must-Have Skills:

    • 3+ years of experience as a Data Engineer or in a similar role working with modern data tools.
    • Strong experience with dbt for data modeling and transformation.
    • Proficiency with Airflow or other orchestration tools.
    • Deep understanding of SQL and data warehousing concepts.
    • Experience with cloud-based data warehouses (e.g., Snowflake, BigQuery, or Redshift).
    • Solid programming skills in Python for data pipeline development and automation.
    • Strong grasp of data architecture, including building for scale, modularity, and maintainability.
    • Familiarity with version control (Git) and CI/CD workflows in a data engineering context.
    • Strong experience with data governance principles, including data cataloging, lineage, classification, and security.

Nice-to-Have:

    • Experience working with payments data, such as transaction logs, settlement files, or ledger systems.
    • Familiarity with tools for data observability and monitoring (e.g., Monte Carlo, Great Expectations).
    • Exposure to real-time or near-real-time data processing frameworks (e.g., Kafka, Spark Streaming).
    • Experience working in a fintech or high-growth startup environment.

Why Join Us:

    • Work on high-impact data systems powering a growing fintech payments platform.
    • Own and influence the core data infrastructure from early stages.
    • Collaborate with a smart, driven team that values autonomy and continuous improvement.