AWS Technical Lead
Ann Arbor, MI
Consulting Practice – Consulting Director
Don’t just dream about What’s Next – build it!
Nexient specializes in bringing a “Product Mindset” to our work, using small, nimble teams, Agile thinking and other techniques from product development to build great custom software people love to use.
And as America’s leading 100% US, 100% Agile software services partner, we’re building it here instead of sending tech jobs offshore.
People are noticing.
Our clients include some of America’s favorite brands in retail, healthcare, financial services and more – but also some very cool product companies you may not have heard of yet. We’re recognized as a Gartner Cool Vendor, HfS Hot Vendor and the only 100% US tech company in the World’s Top 100 Outsourcers. You might have read about us in The New York Times.
- Silicon Valley innovation (through our Bay Area innovation hub and network of startups and emerging technology partners)
- Scalable, Midwest delivery (with 65+ Agile teams in high-talent cities like Ann Arbor, Michigan)
- Onsite collaboration (with product managers and other key roles working face-to-face with client teams around the country)
Come build with us!
Role and Responsibilities:
- Partners with teams across the organization and becomes a subject-matter expert, solving complex business problems and forming and testing hypotheses using a wide array of skills.
- Provides a consultative approach with business users, asking questions to understand the business need and deriving the data flow based on those needs.
- Enables deep business analytics and statistically based data analysis projects to drive business value.
- Gathers, reviews and interprets business and technical requirements, and develops an understanding of use case objectives.
- Responsible for evaluating and recommending data query and modeling tools to assist with data architecture activities, development and communication of best practices for those tools, and activities to improve effectiveness and quality.
- Develops strategic data architecture for data warehouses that support analytics use cases.
- Responsible for creation, comprehensive understanding, maintenance and communication of logical and physical data models.
- Provides best practice direction and even hands-on implementation of optimized relational database solutions.
- Responsible for creation and communication of data architecture standards and best practices, and data movement and mapping design.
- Determines structured and unstructured data requirements by cohesively blending the needs of the business with the feasibility of current and emerging data technologies and architecture standards.
- Develops database and big data solutions by designing data architectures, including business architecture definition, data storage, data analytics, data transformation, and data access
- Seamlessly works with data engineers, database and big data administrators, application interface developers, DevOps and IT operations personnel, as well as product management and clients.
- Organizes tasks and resources to complete work and meet deadlines according to established departmental procedures.
- Works closely with the data engineering teams to design highly performant and scalable data solutions.
- Assists developers in the design and improvement of database queries, and provides training and learning opportunities.
- Builds scalable, high-availability analytics solutions.
- Builds unit tests, integration tests, system tests and acceptance tests in keeping with Agile practices.
- Validates data quality and the integration of all data architecture components, and captures the data architecture metrics needed for benchmarking and trending to support the quality of processes and related artifacts.
- Provides delivery leadership in the design, deployment and data conversion of complex data solutions, including data validation, UAT and data quality practices.
- Establishes and participates in data management processes including data lineage, data profiling, data quality management, data stewardship and governance.
Technical Skills Required:
- BS or MS degree in a quantitative field such as computer science, physics, math, engineering, or economics. Equivalent education and/or experience may be substituted for the minimum education requirement.
- Minimum of 10 years of experience designing large scale data models for data warehouse environments and 2 years with big data environments, including star schemas, OLAP cubes, modeling structured and semi-structured data, and ETL and streaming workflows.
- Deep understanding of traditional and modern data ingestion, data storage, data processing and data access technologies.
- Demonstrated expertise in building highly automated self-service data platforms that scale in both throughput and capacity.
- Extensive background with transactional database systems along with an expert knowledge of SQL and comfort with related querying languages/dialects.
- Experience with various modern database and distributed data systems and a strong grasp of their strengths/weaknesses, e.g., graph databases, MongoDB, Cassandra, CockroachDB, Hadoop, Flink, Spark, Kafka, etc.
- Experience working in public cloud environments like AWS, with an understanding of tools/techniques for data ingestion and analytics such as Glue, Kinesis, S3, SQS, SNS, Lambda, Step Functions, Redshift and Postgres.
- Experience with DevOps, including continuous integration and continuous delivery (we use Jenkins, Nexus, Git), automated testing of data and transformations, and cloud-native designs and tools, e.g., Docker, Kubernetes, etc.
- Experience with decoupled architectures, real-time or near-real-time event/data streaming, enterprise integration and common messaging patterns, as well as workflow management technologies such as Airflow, Oozie, Azkaban, Luigi, etc.
- Experience with modern programming languages like Java, Python, Ruby, etc., as well as scripting and configuration management
- Experience implementing monitoring, quality assurance and data validation processes in relational and big data environments (metrics-driven reliability and performance).
- Expertise in data security best practices, especially in a cloud or hybrid setting, including data access controls and tools, auditing, encryption/securing data at rest and in transit, privacy/anonymization constraints, etc.
- Experience with Agile/Scrum development methodologies.
- Ability to interpret business requirements to formulate technical solutions and to decide among competing technical solutions.
- Excellent verbal and written communication skills and experience in collaborative environments.
- Ability to constructively resolve conflict, manage expectations for competing priorities, and drive towards business goals.
Benefits & Perks:
Company sponsored health, dental, vision insurance and flexible spending account (FSA) for employees and their families including domestic partners
Paid Vacation Time and company holidays
Lunch & Learn
Passionate, collaborative and awesome co-workers
World class training and career mentorship