Data Engineer (Remote)
Waymark is a team of healthcare providers, technologists, and builders whose mission is to bring the best healthcare to people with Medicaid benefits. Guided by the communities we serve, we bring support and technology-enabled care to help primary care providers keep Medicaid patients healthy. We are building the tools and designing an approach to enable care to reach the patients who can benefit most.
Our core values embody the essence of what makes Waymark a unique team today, and what we look for, nurture, and sustain as a team. We are bold builders, believing that the greatest challenges in care delivery can be solved when we harness the power of community and technology. We are humble learners, seeking feedback and perspectives different from our own and welcoming challenges to our conclusions. We experiment to improve, actively seeking data to inform decisions and to assess our own performance. We act with focused urgency: our commitment to our mission drives us to tirelessly pursue results.
If this vision resonates with you, we hope you will consider bringing your creativity, your energy, and your curiosity to Waymark.
As a data engineer at Waymark, you will tackle a critical healthcare challenge that will determine our success: optimizing how we identify which patients to reach and how we get the right information about them to our provider teams.
You will help create and maintain data pipelines that bring healthcare data securely into our environment, transform that data intelligently, and move it to our internal CRM so that team members as varied as social workers and pharmacists can help our patients. You will also help our analytics team identify which interventions are effective, shaping our understanding of how to improve and scale our services and technologies.
- Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS/GCP and SQL technologies.
- Building analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
- Working with stakeholders across the data, design, product, and executive teams and assisting them with data-related technical issues.
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or an equivalent field.
- 3+ years' experience with distributed big data systems.
- Strong experience with Python and Java.
- Strong understanding of data structures and algorithms.
- Experience with database architecture testing methodology, including executing test plans, debugging, and writing test scripts and tools.
- Fluency in relational database systems and writing complex SQL.
- Fluency with complex, distributed, and massively parallel systems.
- Experience with healthcare data, privacy, and/or systems, including claims and/or electronic health record data and FHIR/HL7.
- Experience with containerization and orchestration tools such as Docker, Kubernetes, and ECR.