SDE 3 - DevOps

Bengaluru, India / Gurugram, India
Engineering – DevOps / Full Time / Hybrid
Who is AiDash?

AiDash is making critical infrastructure industries climate-resilient and sustainable with satellites and AI. Using our full-stack SaaS solutions, customers in electric, gas, and water utilities, transportation, and construction are transforming asset inspection and maintenance, complying with biodiversity net gain mandates, and advancing carbon capture goals. Our customers see ROI in their first year of deployment through reduced costs, improved reliability, and progress toward their sustainability goals. Learn more at www.aidash.com.

What will you do?

    • Develop and enhance engineering products in line with our infrastructure goals of building cloud-native, resilient applications.
    • Apply advanced technologies to automate operations and workflows.
    • Contribute across the software development lifecycle in collaboration with Engineering and Operations teams, including writing efficient code and setting up automated deployment processes.
    • Assist in the integration of new technologies and practices in DevOps and cloud infrastructure, with a focus on improving automation and security measures.
    • Participate in a 24x7 on-call rotation, collaborating with teams globally during critical incidents.
    • Manage project priorities and deadlines, focusing on the integration and deployment of software solutions.
    • Provide guidance to team members and contribute to the improvement of processes and procedures.
    • Effectively communicate with a range of stakeholders, both technical and non-technical.
    • Be actively involved in all stages of product development.
    • Write clean and efficient code, adapting to a dynamic work environment.
    • Collaborate on automating workflows and improving code testing and deployment processes, including the development of automated deployment pipelines.

What are we looking for?

    • 5+ years of experience in software development with skills in Java, Go, Python, or similar languages.
    • Proficiency with cloud and infrastructure technologies such as AWS, GCP, Kubernetes, Docker, and Terraform.
    • Understanding of networking concepts and data stores such as Redis, MySQL, and Elasticsearch.
    • Familiarity with deploying ML models and big data technologies.
    • Bachelor’s degree in Computer Science, Engineering, or a related field.
    • Strong communication skills and the ability to work well in a team.
    • Experience with technologies such as ECS, eBPF, CNI, Envoy, and Istio.
    • Working knowledge of Python, particularly in the context of MLflow.
    • Good problem-solving and interpersonal skills.