Agile Data Engineer

Brussels
Engineering – Data /
Permanent contract (CDI) /
Hybrid
SFEIR is a digital strategy and technology consulting firm.

WHAT DRIVES US?

We coach companies in their digital transformation by providing them with support and specialized resources in the development of cutting-edge applications. We help our clients successfully design the latest generation of digital solutions and business applications.

As part of our international development in Belgium, we are looking for an Agile Data Engineer.

THE JOB 

What project will you be working on? 

This will be defined when you arrive; in the meantime, here are some of the projects carried out by our Data Engineers:

✅ Vincent is on a mission in the cosmetics sector, setting up a data platform on a cloud provider for reporting and marketing-automation use cases, with the following technologies: Python, Docker, SQL, BigQuery, Airflow, Looker, AI Platform, Kubernetes. 🏥

✅ Oussama is on a mission at a digital-native player, implementing a data platform (batch and stream) on Google Cloud for analytics and internal reporting (matchmaking, insurer performance, user behaviour, ...). The technical stack is: Cloud Storage, Dataflow, BigQuery, Cloud Composer (Airflow), Data Studio, Terraform, GitLab. 📦

✅ Pascal is on a mission in the healthcare sector. He is building a platform on a cloud provider that automatically scales with the workload to analyze genome-sequencing data from cancer patients and assist diagnosis. The technical stack: Kubernetes, Dataflow, BigQuery, MongoDB, Elasticsearch. 🏦

✅ Yuliana, currently on assignment in the luxury sector, is exploring new scopes and cloud technologies on GCP by merging a transactional platform with a data science / data analysis platform. The objective is to bring two different data worlds together for machine-learning purposes. The technical stack: Cloud Storage, PostgreSQL, BigQuery, Cloud Composer (Airflow), AI Platform, Dataiku, Prefect, Tableau, Data Studio, Cloud Scheduler, Pub/Sub, Cloud Run, Docker, Terraform, GitLab. 📸

IDEAL SKILLS

Technical stack: Python, Java, Scala, or Go; TensorFlow, Spark; SQL, NoSQL; GCP, AWS, or Azure.

✅ Be comfortable with standard development practices (agile methodology, GitHub/GitLab, CI/CD tools).
✅ Be comfortable with at least one programming language (Python, Java, Scala, Go)
✅ Good knowledge of machine-learning or data-manipulation libraries (pandas, scikit-learn, TensorFlow, Spark)
✅ Good knowledge of SQL and databases (relational, NoSQL)
✅ First experience with at least one cloud provider (GCP, AWS, Azure)
✅ First experience with an orchestration tool (Airflow, Dagster, ...)

This would be a plus:

✅ Knowledge of DevOps practices and experience with an infrastructure-automation tool (Terraform, Pulumi)
✅ Certification on a Cloud Provider (AWS, GCP, Azure)

RECRUITMENT PROCESS

We have set up a different, innovative recruitment process: the PlayOffs.
It consists of 3 technical pair-programming tests (algorithms, language, platform), but we will also pay attention to your personality, skills, potential, and leadership.

WHAT WE OFFER

 🏦 A competitive salary package (at the high end of the market)

⚖️ Remote work accepted, subject to the client's agreement

🎓 Access to our official training and certifications (AWS, GCP, Azure, K8S, Confluent, ...)

💬 Be surrounded by a Tech Community: you will no longer have to keep up with technology alone; SFEIR organizes various technical events to keep you up to date

🏥 DKV and hospitalization insurance

🚗 Company car and European fuel card
 
💻 Other: meal vouchers, equipment (Mac or PC), phone subscription, Sfeir Pack