Big Data DevOps Engineer

Europe /
Engineering – Big Data /
Full-time: Remote
Binance is the global blockchain company behind the world’s largest digital asset exchange by trading volume and users, serving a greater mission to accelerate cryptocurrency adoption and increase the freedom of money.

Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?

As the biggest service provider in cryptocurrency exchange technology, we have an immense amount of data; with careful mining, such information helps us build better strategies, optimize our stack, and drive business decisions.

We are looking for a motivated Big Data Engineer who will take ownership of our Big Data infrastructure — ideally someone with experience handling data produced by high-throughput, low-latency, mission-critical systems. With your strong technical capabilities, you will help architect infrastructure capable of ingesting this data flow in real time to trigger the necessary business actions and decisions, while also persisting the data for analytics and business intelligence.

This is a full-time position that can be remote from any location.

Responsibilities

    • Capture expectations from stakeholders and translate them into a technical vision and goals
    • Design, deploy and operate the data platform and infrastructure
    • Continuously monitor and improve the data services’ performance, capability, availability and scalability
    • Respond to incidents promptly and identify potential issues proactively
    • Keep yourself and the service up to date with the latest technology

Qualifications

    • 3-6+ years of Big Data DevOps experience
    • Familiar with data processing including ETL, ELT, and real-time streaming and transformation
    • Familiar with various data structures and messaging format specifications
    • Demonstrated knowledge of various big data facilities, their features, and their suitability for different use cases
    • Knowledge of common big data components such as Hadoop, Hive, Spark, Spark Streaming, Presto, HBase, ElasticSearch, Kafka, ZooKeeper, Redis and Airflow is a big plus
    • Solid experience in big data processing, covering data capture, transformation and data provisioning
    • Good experience in monitoring, optimizing and troubleshooting big data infrastructure
    • Proficient in shell scripting, Python and data query languages
    • Experience with AWS data services is a plus
    • Strong teamwork, an analytical mind, and a drive to keep striving for better solutions
    • Committed to quality of delivery

Conditions
Do something meaningful; be a part of the future of finance technology and the No. 1 company in the industry
Fast moving, challenging and unique business problems
International work environment and flat organisation
Great career development opportunities in a growing company
Possibility for relocation and international transfers mid-career
Competitive salary
Flexible working hours and casual work attire