Robin Jean

Big Data Engineer

Remote from Saint-Denis

  • Indicative rate €510 / day
  • Experience 2-7 years
  • Response rate 100%
  • Response time 12h

Confirmed availability


Location and geographical scope

Saint-Denis, France
Remote only
Works remotely most of the time


Project length
  • Between 3-6 months
  • ≥ 6 months



Languages

  • French

    Native or bilingual

  • English

    Full professional proficiency

Skills (16)

Robin in a few words

A computing enthusiast, I chose to become a Computer Science engineer specialising in Big Data, with the aim of understanding and helping to build tomorrow's world.

Over the past four years, I have actively contributed to the implementation of two distinct, business-critical distributed solutions: at three of the largest French companies, in the energy, banking, and luxury sectors, and at a major telecommunications company in New Zealand.

My main responsibilities were:
- To design and implement data ingestion and data aggregation jobs (batch / streaming)
- To develop CI/CD pipelines to automate deployments
- To develop monitoring and alerting solutions for Hadoop resource usage
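The batch/streaming aggregation work mentioned above can be illustrated with a toy sketch. This is plain Python standing in for what a Spark job would express with window functions; the function name, data shape, and epoch-aligned windowing are illustrative assumptions, not code from any of these projects:

```python
from collections import defaultdict

def aggregate_by_window(events, window_seconds):
    """Group (timestamp, value) events into tumbling windows and sum values.

    A toy stand-in for a batch/streaming aggregation job; window
    boundaries are aligned to the epoch (hypothetical helper, for
    illustration only).
    """
    totals = defaultdict(float)
    for ts, value in events:
        window_start = ts - (ts % window_seconds)
        totals[window_start] += value
    return dict(totals)

# Three events falling into two 60-second windows.
events = [(0, 1.0), (30, 2.0), (65, 4.0)]
print(aggregate_by_window(events, 60))  # {0: 3.0, 60: 4.0}
```

In Spark this same shape would typically be a `groupBy` over a time window, with the framework handling partitioning and late data.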

Along the way I developed a range of hard skills:
- Big Data technologies: Spark, Hadoop (YARN, HDFS, HBase), Nifi, Kafka, Cassandra, Solr
- Programming languages: Java, Scala, Python
- CI/CD: Jenkins, GitLab CI
- Testing: unit testing, Cucumber
- Monitoring: Elasticsearch, Kibana, Grafana
- Cloud: AWS, Azure

As well as some useful soft skills:
- Teamwork and Agile methods (Kanban, Scrum)
- Communication
- Leadership

I am currently looking for a position as a Big Data Engineer in an open-minded, competent, and motivated team where I can keep progressing and enjoy the work.

Do not hesitate to get in touch if you think we could collaborate!


Total Direct Energie - Total

Energy & Utilities

Big Data Engineer

Paris, France

October 2019 - May 2020

- Implementation of Spark Streaming jobs in Scala
- Improvement of code performance and resource usage
- Leading CI/CD pipeline development using Azure DevOps and Jenkins
- Implementation of Java REST web services

Spark NZ


Big Data Engineer

Auckland, New Zealand

October 2018 - May 2019

- Leading the implementation of the cluster monitoring and alerting solution
- Implementation of the Azure Databricks ingestion template flow, to be used across all Spark NZ teams

Kering

Luxury Goods

Big Data Engineer

Paris, France

January 2018 - September 2018

- Implementation of ingestion flows using Apache Nifi, from {Kafka, REST API, SFTP, SQL Server, Microsoft Server, Oracle} to {Kafka, Cassandra, Solr, S3, Elasticsearch, Slack}
- Development of different Spark batch jobs in Scala
- Proactive implementation of GitLab and Jenkins pipelines to deploy a real use case across different environments

Société Générale

Banking & Insurance

Big Data Engineer

Paris, France

October 2016 - November 2017

Within ITEC/DCC/DRM's growing team (5 people in September 2016, 15 today), in charge of storing and distributing risk metrics, my responsibilities were:
- To implement and improve non-regression tests
- To lead the implementation of monitoring tools for our various jobs, with the help of two trainees
- To implement an API and a Spark job allowing users to correct data during the certification process
- To participate in the recruitment process

Société Générale

Banking & Insurance

Full-stack Engineer

Paris, France

March 2016 - August 2016

Within the ITEC/CTT/EQD department, the Fin'IT initiative aims to create internal FinTech products.
Contribution to the development of two proofs of concept:
- Easy Margining: a tool to optimize the initial margin posted to different clearing houses
- DirectETF: an easy-to-use investment platform for individual investors

Scaled Risk

Banking & Insurance

Big Data Engineer

Paris, France

June 2015 - August 2015

Scaled Risk is a FinTech designing a Big Data solution for the financial market.
Implementation of a unicast bus for communication between components in a Big Data context, and updating of the fault-tolerance tests.

An Tian - Société Générale


Robin demonstrated great skill in the Big Data domain (Hadoop, Spark, Hive, Oozie), and the effectiveness of his work left us very satisfied. He has always been a reliable and responsible person, both in his work and in his relationships with colleagues. I very warmly recommend Robin for any position he may wish to pursue. He is exactly the kind of employee everyone would want to work with.

