Shop Your Way - Cloud Data/Systems Engineer

U.S.A. Remote
Technology Platforms, Engineering & Development – Cloud Platforms /
Full-Time /
Remote
Do you consider yourself an innovator, a builder, a game-changer? Do you have what it takes to transform the payments, loyalty, and commerce space? Do you want to help develop and scale truly “never been done before” services and technologies? 

If yes, then we want to hear from you at Shop Your Way.

SYW Overview
Shop Your Way (“SYW”) is a high-growth FinTech and e-Commerce enablement platform that leverages a proprietary technology infrastructure to service some of the largest and most innovative brands in America. SYW powers multiple business and consumer services, including SYW Rewards and the SYW Mastercard, along with a leading mobile app and digital destination. 

When bundled together as part of our Pay Your Way (“PYW”) service, we have a truly differentiated loyalty + credit solution for our business partners and their customers, without the hassle of setting up a direct credit relationship with a bank or underwriter. This integrated module makes it easy for Partners of any size to improve their conversions, drive new revenue, save on transaction and payment costs, and create more repeat business. 

Today, we’re providing this “loyalty-card as a service” to Raise (gift cards) and Way (mobility) – two of the “Andreessen Horowitz Top 50 Marketplaces” – and other national brands. Now, we’re expanding our offering to include other new application services while also unlocking new channels. SYW is at an exciting inflection point, built for scale and driving growth.  

Key Highlights include:
●  Recent investment of $30M+ of "growth capital" from a Private Equity / Hedge Fund sponsor
●  Long-term strategic partnership with Citibank powering a multi-billion-dollar credit portfolio
●  Compelling pipeline of New Business Development initiatives, focused on SYW 5321 Card Externalization (“loyalty-card as a service”)
●  Proprietary, best-in-class loyalty and data technology platform built to power $10+ billion of partner business
●  A sizable (9 million) yet hyper-local base of active shoppers
●  A valuable rewards currency, with more new places to burn – from gift cards and sports media to parking, car washes, electric vehicle charging, crypto, and ETFs
●  A multi-tender, proprietary “wallet” with rewards, gift cards, and credit
●  Platform can be white-labeled and quickly integrated into existing retail and loyalty systems

https://business.syw.com/

Role Summary: Cloud Data/Systems Engineer

The Cloud Data/Systems Engineer will be responsible for architecting the transformation and modernization of enterprise data solutions on GCP, integrating native GCP services and third-party data technologies. Solid experience with, and understanding of, the considerations for large-scale architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must. The candidate must have a broad set of technology skills across these areas and demonstrate the ability to design scalable, efficient solutions using the appropriate combination of GCP and third-party technologies.

Responsibilities:

    • Work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise, using modern data/analytics technologies on premise and in the cloud
    • Build and implement solution architectures; provision infrastructure; and deliver secure, reliable, data-centric services and applications in GCP
    • Work with the data team to efficiently use cloud infrastructure to analyze data, build models, and generate reports/visualizations
    • Integrate massive datasets from multiple data sources for data modeling
    • Design, build, implement, and manage APIs and API proxies
    • Organize and implement all API development processes, internally and externally
    • Ensure that APIs satisfy business requirements, including features, infrastructure, and systems
    • Troubleshoot and test all features, systems, and functionality of end products
    • Implement methods for automating all delivery components to minimize labor in development and production
    • Formulate business problems as technical data problems, ensuring key business drivers are captured in collaboration with product management
    • Apply knowledge of machine learning algorithms, especially recommender systems
    • Extract, load, transform, clean, and validate data
    • Design pipelines and architectures for data processing
    • Create and maintain machine learning and statistical models
    • Query datasets, visualize query results, and create reports

Requirements:

    • BS, MS, or PhD in Computer Science, Engineering, Economics, Business, or Mathematics
    • 3+ years of experience designing and optimizing data models on GCP using GCP data stores such as BigQuery and Bigtable
    • 3+ years of experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on GCP using GCP and third-party services
    • 3+ years of hands-on experience architecting and designing data lakes on GCP that serve analytics and BI application integrations, and implementing scalable API solutions at production scale
    • A minimum of 3 years of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
    • A minimum of 3 years of experience designing and building production data pipelines, from ingestion to consumption, within a hybrid big data architecture using Java, Python, Scala, etc.
    • Hands-on experience with Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
    • Experience architecting and implementing metadata management, data governance, and security for data platforms on GCP
    • Experience designing operations architecture and conducting performance engineering for large-scale data lakes in a production environment
    • Experience architecting and operating large production Hadoop/NoSQL clusters on-premises or using cloud services

Preferred Requirements:

    • Expert knowledge of network, internet, and data storage security; software engineering best practices across the development lifecycle; agile methodologies; coding standards; source management; and build processes for testing and operations
    • Extremely comfortable with communicating with users, other technical teams, and senior management to collect requirements and describe data modeling decisions and data engineering strategy.
    • Skilled in communicating with remote teams across different time zones.
    • Accustomed to working with cross-functional teams to troubleshoot and resolve errors
    • Google Cloud Platform certification is a plus.