DevOps Engineer (Data) - Salt Lake City

Varo Money is seeking to launch a Bank that delivers more affordable traditional financial products, provides leading-edge innovation, and helps consumers manage their financial lives more effectively and with minimal friction. When organized, the Bank will be a national retail bank. Varo’s business model combines a traditional relationship-banking approach with modern technology to create a better banking experience backed by robust controls and sound risk management. The Bank’s primary products will be offered through mobile, online, and phone-based banking channels. They will include traditional banking products such as DDAs, savings accounts, CDs, unsecured installment loans and lines of credit, credit cards, home equity loans, and brokered mortgages, integrated with financial health tools on the smartphone to help customers solve everyday financial problems and achieve better financial outcomes. Varo has raised $78M to date, led by Warburg Pincus and The Rise Fund / TPG Growth.

We are currently looking for a DevOps Engineer to join our Engineering team. If you are looking for growth, this position is for you. We are a startup, and we need to build many new systems. This is a unique opportunity to learn amazing technologies, be part of a great team, improve the financial lives of others, and have a lot of fun too! On our Engineering team we value teamwork, technical excellence, and quality!

You will own and manage our Postgres RDS and Redshift data infrastructure and will be responsible for the successful delivery and availability of data throughout the company. You should possess a DevOps mindset, think in terms of infrastructure as code, and create solutions that are platform-oriented rather than service-oriented. You will be in charge of thinking strategically and executing on scaling our data-serving operations in a cost-efficient and secure manner. You should also possess a passion for making data ubiquitous to drive decisions, and you should endeavor to keep up to date with industry trends. We run a very AWS-centric infrastructure, and your ability to think creatively about how to integrate various technologies in the AWS ecosystem of services will be a critical success factor for this role.

This is a high-impact role that requires out-of-the-box thinking, forward vision, and an ability to move quickly and execute amidst competing demands. It presents a great opportunity for someone looking for a greenfield project where they can make their mark and have an outsized influence on the outcome.


Requirements:

    • Prior experience in a DevOps, Data Engineer, or DBA role in a Linux environment
    • Extensive on-the-job experience managing AWS resources such as VPC, EC2, Route53, Aurora, RDS, ECS, and IAM
    • Prior experience administering relational databases such as PostgreSQL, MySQL, or Oracle, and/or data warehouses such as Redshift or Vertica
    • Ability to write and tune complex SQL queries
    • Expert coding skills in at least one of Python, Java, Scala, or Go
    • Experience with data modeling and building ETL data pipelines
    • Expertise in Linux administration
    • Good knowledge of basic networking technologies and concepts: TCP/UDP, SSL/TLS, HTTP, NAT


Nice to have:

    • Experience with ETL tools such as Talend, Informatica, or Pentaho