Senior Data Engineer

Irving, TX
Corporate – Business Intelligence / Remote
Overview

We are looking for a Senior Data Engineer to join our growing team of analytics experts. This hire will be responsible for expanding our data pipeline architecture. The ideal candidate is an experienced data pipeline builder with solid SQL skills. 

You will be responsible for creating and optimizing data pipeline architecture, assembling large, complex data sets that meet business requirements, and identifying, designing, and implementing process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability. At New Western, you will be part of our team of data engineers, data analysts, and data scientists. You will ensure optimal data flow and collection for these cross-functional teams and keep data delivery architecture consistent across ongoing projects.

As the Sr. Data Engineer, you must be self-directed and comfortable supporting the needs of multiple teams, systems, and products across a wide variety of data sources. You must also be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

The Senior Data Engineer is responsible for building and maintaining data systems, and for constructing data schemas and datasets that enhance the value, interoperability, reliability, and quality of the company’s data while making it easy to use for analytics.

This is a unique opportunity to join our dynamic Business Intelligence team, which leverages cloud-first technologies to develop data-driven tools and solutions aimed at enhancing the success of our Agents and business.

Although we are headquartered in Dallas, Texas, this is a remote-first role that can be located in the Eastern or Central time zones.

About New Western
 
New Western is on a mission to ease the affordable housing shortage by reviving distressed homes across the country. Along the way, it has become one of the largest and fastest-growing real estate investment marketplaces in the nation, buying a home every 13 minutes. At a company recognized as a Glassdoor Best Place to Work in 2023 & 2024, you’ll have a chance to make a real and visible impact — you’d be joining a lean, nimble, close-knit team of high performers where your contributions can make a difference from day one, and have some fun along the way.

As a Senior Data Engineer, Your Main Responsibilities Will Be:

    • Develop and maintain data cloud (Snowflake) capabilities and ELT/ETL data integrations using Python, SQL, and other languages in conjunction with tools such as Matillion and dbt.
    • Design, develop, and maintain data models within dbt using SQL, Python, and Jinja.
    • Design and implement scalable data pipelines to ingest and refine large and diverse data sets. 
    • Collaborate with cross-functional teams to understand data needs and flows to enable the design of the best possible solution for each unique need.
    • Provide data quality solutions in a timely manner and be responsible for data governance, protection, and integrity.
    • Design and implement data mappings and conversions to enable systems migration activities.
    • Be a key contributor in analyzing the enterprise data architecture and leading schema changes that streamline our business process flows while ensuring data usability and integrity.
    • Work closely with data analysts, data scientists, software developers, business process and integration engineers, and product managers to develop appropriate and cost-efficient solutions.
    • Perform additional duties as assigned.

We're Looking For A Teammate That Has the Following Qualifications:

    • 5+ years of experience programming in a language such as Python, SQL, PHP, C++, or Java
    • 5+ years of professional experience working with both relational and analytical databases (Snowflake experience is preferred)
    • 3+ years of professional experience in building robust data pipelines and writing ELT/ETL jobs using custom code (not just drag-and-drop pipelines)
    • Good understanding of Enterprise Architecture concepts related to dimensional data modeling
    • Experience building and deploying data-related infrastructure (messaging, storage, compute, transforms, scheduling and execution, monitoring, and/or CI/CD pipelines) across development/staging/production environments
    • Experience with business systems including Salesforce Sales Cloud, HubSpot, Jira, and GitHub is a plus
    • Experience with, and an understanding of, AWS-related technologies including S3, EC2, and Lambda functions is a plus
    • Experience with Unix/Linux OS and basic shell scripting is a plus
    • Experience with Apache Airflow is a plus
    • Strong communication, writing, presentation, and interpersonal skills are a must
    • A strong architectural mindset, self-starter initiative, and a drive for completion
    • BS or MS in a technical field: Computer Science, Engineering, or similar

We Offer The Following Benefits and Perks:

    • PPO Medical - No cost to employees and low cost options to add coverage for dependents.
    • Dental, vision, short-term disability, long-term disability, and life insurance.
    • 401K Plan with up to a 3.5% match.
    • Flexible PTO policy - Be at your best by taking the time you need when you need it.
    • Remote First - We work remotely and take the opportunity to meet and collaborate when it makes sense.

#LI-CT