Senior Data Engineer
Mexico City / Bogotá / São Paulo
Engineering – Credit Products / Full-Time / Hybrid
About PayJoy
PayJoy is a mission-first financial service provider dedicated to helping underserved customers in emerging markets achieve financial stability and success. We lend through our patented technology that turns a smartphone into digital collateral, and our cutting-edge machine learning, data science, and anti-fraud AI allow us to offer the lowest cost and qualify the most customers in the industry. As of 2024 we have brought billions of dollars in credit to 12 million customers, doubling in the last two years while remaining strongly profitable and sustainable for the long term.
This role
As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining scalable and efficient data pipelines to support the organization's data needs. You will be responsible for ensuring the smooth and reliable flow of data across various systems and platforms, enabling teams to access accurate and actionable data for decision-making. Your expertise will be crucial in optimizing data architecture, transforming raw data into usable formats, and ensuring data integrity, security, and availability.
A successful Senior Data Engineer will bring strong technical expertise in programming languages such as Python, along with experience in big data technologies such as Spark and Kafka and cloud platforms like AWS or GCP. They will demonstrate strong problem-solving skills, with a focus on data quality, governance, and scalable architecture, ensuring reliable, high-performance data pipelines. Excellent collaboration and communication skills are essential, enabling them to work effectively with cross-functional teams and translate technical complexities for non-technical stakeholders. With a mindset of continuous learning, they will stay current on emerging tools and best practices and provide leadership and mentorship to junior team members.
Responsibilities
- Design and Develop Data Pipelines: Build, optimize, and maintain reliable, scalable, and efficient data pipelines for both batch and real-time data processing.
- Data Strategy: Develop and maintain a data strategy aligned with business objectives, ensuring data infrastructure supports current and future needs.
- Tool & Technology Selection: Evaluate and implement the latest data engineering tools and technologies that will best serve our needs, balancing innovation with practicality.
Performance Tuning:
- Regularly review, refine, and optimize SQL queries across different systems to maintain peak performance.
- Identify and address bottlenecks, query performance issues, and resource utilization.
- Establish best practices and educate developers on what they should be doing throughout the software development lifecycle to ensure optimal performance.
Database Administration:
- Manage and maintain production AWS RDS MySQL, Aurora, and Postgres databases and their replicas, ensuring reliability and availability.
- Perform routine database operations, including backups, restores, and disaster recovery planning.
- Monitor database health and diagnose and resolve issues in a timely manner.
Knowledge and Training:
- Serve as the primary point of contact for database performance and usage knowledge, providing guidance, training, and expertise to other teams and stakeholders.
- Monitoring & Troubleshooting: Implement monitoring solutions to ensure high availability and troubleshoot data pipeline issues in real time.
- Documentation: Maintain comprehensive documentation of systems, pipelines, and processes for easy onboarding and collaboration.
- Collaboration: Work closely with data science, analytics, and product teams to understand data requirements and deliver tailored data solutions.
Requirements
- Experience: 5+ years of experience in data engineering.
- Technical Expertise: Deep understanding of data engineering concepts, including ETL/ELT processes, data warehousing, big data technologies, and cloud platforms (e.g., AWS, Azure, GCP).
- Programming Skills: Proficiency in programming languages such as Python, Scala, or Java, and experience with SQL and NoSQL databases.
- Database Administration: Knowledge of best practices in cloud database administration, including parameter tuning, backups, capacity management, and performance tuning.
- Architectural Knowledge: Strong experience in designing and implementing data architectures, including real-time data processing, data lakes, and data warehouses.
- Tool Proficiency: Hands-on experience with data engineering tools such as Apache Spark, Kafka, Snowflake, Airflow, Databricks, and modern data orchestration frameworks.
- Innovation Mindset: A track record of implementing innovative solutions and reimagining data engineering practices.
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
Benefits
- Local benefits vary according to the candidate's location
- $2,000 USD annual Coworking Travel allowance
- $2,000 USD annual Professional Development allowance
- $500 USD annual Fitness allowance
- $500 USD Phone Finance allowance
- $250 USD Home Office equipment allowance
- $200 USD Headphone allowance
PayJoy is proud to be an Equal Employment Opportunity employer and we welcome and encourage people of all backgrounds. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Finance for the next billion * Ownership * Break Through Walls * Live Communication * Transparency & Directness * Focus on Scale * Work-Life Balance * Embrace Diversity * Speed * Active Listening