Sr. Data Architect

Nevada
Engineering & Information Technology – Data Engineering & Machine Learning / Full-Time / Remote
Clear Capital is building the future of real estate data, and we need your help! We are seeking experienced product builders: as a Data Architect, you will help us reach our goal of knowing more about a property than anyone else, and in the process make home valuations fairer and more equitable for millions of people.

Become part of an innovative team supporting and developing the data and machine learning products that will shape Clear Capital’s future. As Data Architect at Clear Capital, you will work closely with data engineering and product teams to build next-generation data products, working alongside Software Engineers, Data Quality Analysts, ML Engineers, and DevOps Engineers who, like you, are dedicated to building exceptional, cutting-edge solutions.

We are looking for a Data Architect to help develop and implement systems that leverage structured and unstructured data to deliver data solutions at scale. As Data Architect at Clear Capital, you are committed to enabling the best work of others on the team. You help yourself and your team consistently “level up,” and you think ahead to anticipate the needs of others and provide concise information for decision-making.

What you will work on

    • Design, develop, automate, monitor, and maintain data movement applications using a variety of AWS services and techniques.
    • Design data pipelines by developing processes and tooling for acquisition and transformation of large datasets.
    • Participate in stand-up meetings, planning meetings, and review sessions (using Scrum/Agile methodology).
    • Interact with cross-functional teams to ensure complete delivery of solutions.
    • Build and drive a comprehensive data governance strategy and solution from POC to production.

Who we are looking for

    • Qualified candidates should have a Bachelor’s degree or relevant experience focused on information technology, and 6+ years of overall experience building complex data solutions.
    • Understanding of data warehouse, data lake, and master data management approaches, as well as ETL industry standards and best practices.
    • Experience with public and private cloud migration and data pipelines.
    • Systems engineering, coding and debugging skills in Python.
    • Working knowledge of shell scripting languages, e.g., Bash.
    • Experience with data pipeline and workflow management tools (Airflow a plus).
    • Experience with compute-at-scale technologies, such as MapReduce, stream processing, Spark, and serverless architectures.
    • SQL performance tuning skills for relational platforms (PostgreSQL).
    • Experience creating and maintaining scalable and robust AWS solutions (EC2, Kubernetes, EKS, Lambda, QuickSight, Terraform, Glue, Athena, Redshift).
    • A passion for logging, both for analytical use and for determining fault when things go awry (Logstash, SumoLogic, CloudWatch).
    • Ability to outline data governance standards and principles and help develop a reference architecture that includes data domain specifications.
    • Ability to define data quality standards and processes for the organization.
    • Strong verbal and written business communication skills; capable of selling ideas to both technical and executive audiences.