Solutions Architect - West
San Mateo, CA
AtScale bridges the gap between business users and their data.
The average enterprise sits on a mountain of data that business users could use to drive better performance. Unfortunately, most enterprises are not equipped to gain value from their data: according to Forrester, close to 75% of the data stored within an enterprise goes unused. Enterprises need ways to deliver data to business users with simplicity, speed, and security.
Back in 2009 at Yahoo!, AtScale’s founding team experienced this firsthand and witnessed the power and potential of Hadoop. However, they struggled to make the data in Hadoop accessible to the business users who needed it. Tired of moving data into expensive, legacy databases, the AtScale team decided to build what they could not buy.
As a Solutions Architect, you will work as a consultative team member in our Field Services organization. AtScale Solutions Architects function as Hadoop team leads at our customer locations during short-term engagements to support the implementation of AtScale. Engaging with customers from the Proof of Concept (POC) stage through to implementation in complex distributed production environments, you work collaboratively with them to successfully deploy AtScale in production. You are an all-around player, also helping the AtScale sales team successfully communicate and demonstrate AtScale’s capabilities. You’ve got the technical depth to roll up your sleeves and work with Hadoop and Hive, and the polish to represent AtScale with the utmost professionalism.
- Designing and architecting solutions with our customers, scoping new engagements and implementations both short and long term, and guiding the team during product implementations.
- Resolving technical issues and advising the customer on best practices for big data, Hadoop environments and AtScale.
- Driving successful installation, configuration, tuning, and performance of the product.
- Assisting the customer with capacity planning for their environment to scale.
- Producing technical documentation.
- Collaborating with internal teams to channel client feedback and solutions into future releases of the product.
- Advocating for feature requests and bug fixes on behalf of the customer.
- Being meticulous about tracking issues and following through.
- Visiting customer sites.
Experience and Requirements
- BS or higher in Computer Science or a related field
- 5+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
- Experience with Hadoop-related tools and technologies is a must (HDFS and/or MapReduce, HBase, Hive, Spark, Impala)
- BI experience is a must (Tableau, Qlik, Cognos, MicroStrategy, BO, SSAS)
- Java, Scala, Python, or shell scripting a plus
- Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based deployments
- Ability to understand and translate customer requirements into technical requirements
- Knowledge of distributed systems
- Familiarity with data warehousing concepts
- Knowledge of complex data pipelines and data transformation
- Willingness to roll up your sleeves in a fast-paced, highly varied environment
- Ability to travel to customer sites
The preferred location is the SF Bay Area, but we will consider applicants from the LA or Seattle areas.