Senior Data Engineer

Roseville, CA or Reno, NV / Technology – Data / Full-Time
Who We Are Looking For
We are looking for a courageous Senior Data Engineer to join our team and help build new real estate property data pipelines, infrastructure, and services. Not only do we design, deploy, and operate our organization's data infrastructure, but we are also a product team responsible for delivering data products and services. We believe real estate property data can be beautifully designed, easy to explore and consume, and user-friendly. If this sounds interesting, we want to meet you.

The Team
On the outside, our team resembles most others, with a hierarchy, roles, and responsibilities. On the inside, we operate side by side, in a manner best described as flat. We believe in maintaining a safe environment where team members can express strong opinions that are loosely held. We rely on each other to collectively deliver the highest-quality real estate property data to our company and the industry at large.

What You Will Work On
You will work with our team on data ingestion, standardization, record linkage, lineage, analytical objects, and delivering curated datasets to our internal services, partners, and products. We invest time and attention in our infrastructure to identify and develop solutions that use the best tools and patterns for each engineering challenge. Our solutions are elegant and scalable, enabling us to quickly integrate additional datasets into our real estate property ecosystem.

Requirements

    • Minimum of eight (8) years of experience with relational database technologies (e.g., Oracle, PostgreSQL, MySQL)
    • Minimum of five (5) years of experience with the Hadoop ecosystem (e.g., Hive, Pig, Sqoop, Flume, Oozie)
    • Minimum of five (5) years of experience with AWS cloud technologies (e.g., EMR, S3, EC2, Redshift, QuickSight, Athena, DynamoDB, Elastic, RDS, Lambda, API Gateway, SQS, Kinesis, Data Pipeline, Glue)
    • Minimum of five (5) years of experience authoring data pipelines (e.g., with Apache Airflow, Pentaho, Informatica, or your own custom tooling)
    • Minimum of five (5) years of experience working with RESTful and SOAP APIs
    • Demonstrated solid understanding and practice of the Software Development Life Cycle
    • Minimum of five (5) years of experience with monitoring and alerting platforms (e.g., OMD, Nagios, CloudWatch)
    • Minimum of five (5) years of experience with centralized logging solutions (e.g., Graylog, ELK Stack)
    • Working knowledge of Git version control
    • Intermediate to advanced Linux and system administration skills
    • Working knowledge of disaster recovery and how it applies to recovery point objective (RPO) and recovery time objective (RTO)
    • Ability to apply advanced database administration, management, and performance tuning
    • Ability to provide and develop processes for administering hot/cold backups, database recovery, and security
    • Ability to manage standby databases, backups, and snapshots
    • Ability to write and maintain documentation describing changes and modifications to all systems and databases
    • Demonstrated solid understanding of ETL/ELT patterns, functional programming, and data partitioning

About Us

Clear Capital is the premier provider of real estate valuation, analytics, and technology solutions. Powered by more than 45 years' worth of information on nearly every U.S. metro, neighborhood, and property, Clear Capital's solutions are trusted by community credit unions and billion-dollar financial institutions alike. Clear Capital is headquartered in Reno-Tahoe with a team of more than 500 nationwide, dedicated to going wherever it leads and doing whatever it takes.

To all recruitment agencies: Clear Capital does not accept agency resumes. Please do not forward resumes to our jobs alias, Clear Capital employees or any other company location. Clear Capital is not responsible for any fees related to unsolicited resumes.

Compensation
Salary commensurate with experience

Clear Capital is an equal opportunity employer.