Reporting Engineer III (Remote)

China (Remote)
IT – Cloud Platform Services / Full Time / Remote
Why should you join dLocal?

dLocal enables the biggest companies in the world to collect payments in 40 emerging-market countries. Global brands rely on us to increase conversion rates and simplify payment expansion. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world’s fastest-growing emerging markets.

By joining us you will be part of an amazing global team that makes it all happen, in a flexible, dynamic, remote-first culture with travel, health, and learning benefits, among others. Being part of dLocal means working with 800+ teammates from 25+ different nationalities and developing an international career that impacts millions of people’s daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive on our team.

We are looking for a Software Engineer who wants to build high-performance, scalable, enterprise-grade applications. You'll be part of a talented software team working on applications that deliver insights to big clients like Netflix, Amazon, Nike, Facebook, and more!

What will I be doing?

    • Contributing to all phases of the analytical application development life cycle.
    • Designing, developing, and delivering high-volume applications for data analytics systems.
    • Writing well-designed, testable, and efficient code.
    • Ensuring designs comply with specifications.
    • Supporting continuous improvement by investigating alternative approaches and technologies and presenting them for architectural review.

What skills do I need?

    • Strong knowledge of Python, Scala, or Java
    • Strong knowledge of SQL and database management systems (DBMS)
    • Deep understanding of data modelling (star schema, snowflake schema) and data manipulation/cleansing
    • Good knowledge of non-relational (NoSQL) databases and semi-structured/unstructured data
    • Good knowledge of distributed processing (Apache Spark, Hadoop, Hive, Presto, or similar)
    • Experience with the AWS environment (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar)
    • Experience with code versioning (GitHub, Bitbucket, or similar)
    • Experience with batch processing (ETL/ELT)
    • Advanced/fluent English

Desirable skills:

    • Experience with Agile/Kanban methodologies (Jira)
    • Experience with Unix-based operating systems
    • Experience with data visualization tools (Tableau, Looker, Data Studio, Power BI, or similar)
    • Understanding of file formats and how to manipulate them (Avro, JSON, Parquet, CSV, etc.)
    • Knowledge of GCP
    • Knowledge of streaming data processing
    • Knowledge of orchestration tools (Apache Airflow, Prefect, Mage, or similar)

What happens after you apply?

Our Talent Acquisition team is invested in creating the best candidate experience possible, so don’t worry: you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process!

Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!