At Alkymi, we’re on a mission to supercharge human productivity and enable intelligent decision making with instantly actionable data. We’re working every day to push the boundaries of what’s possible using machine learning, computer vision, world-class engineering, and outstanding user experience.
Our core products, Alkymi Data Inbox and Patterns, make it possible for anyone to build intelligent data extraction workflows, driving transformative productivity gains for our customers and unlocking a new automation paradigm that will fundamentally change the way humans interact with and leverage data.
As we continue to scale, we're seeking a DevOps Engineer who will work directly with our CTO and our engineering and data science teams to improve our continuous integration and deployment pipelines and to build automated packaging of our application for on-premises deployment onto our enterprise customers' infrastructure.
You'll build upon a GitOps workflow running on Kubernetes in AWS, as well as machine learning pipelines deployed to AWS SageMaker and Kubernetes.
Your Day to Day
- Develop tooling to improve engineer productivity through better Continuous Integration (CI) and Continuous Deployment (CD) processes.
- Monitor production deployments with Datadog and triage issues as they arise.
- Participate in daily scrum stand-up meetings and weekly sprint demos.
What You Bring
- Extensive knowledge of *nix-based operating systems, including familiarity with Debian / Ubuntu, Fedora, and BSD-based distributions.
- Familiarity with building and deploying containers on Kubernetes or similar container orchestration systems.
- Experience automating infrastructure deployment with tools like Terraform.
- Experience building virtual machines (VMs) with tools like Vagrant.
- Familiarity with git, shell scripting, and supervisord.
- Experience managing infrastructure deployed on AWS.
- Experience working with PostgreSQL and Redis.
- RabbitMQ experience is a plus.
- BS in Computer Science, Engineering, or a related field.