Eloïse Gomez

DataOps Architect / Senior Data Engineer

Moves to Paris

  • Indicative rate €580 / day
  • Experience 7+ years

Availability not confirmed

Part-time, 3 days a week

Location and geographical scope

Location
Paris, France
Can work in your office at
  • Paris and within 50 km

Preferences

Project length
≤ 1 month

Verifications

Languages

  • English

    Full professional proficiency

  • French

    Native or bilingual

Categories

Skills (10)

Eloïse in a few words

Hello,
My name is Eloïse. I have been working for 7 years in start-ups on big data topics. I would be delighted to support you on assignments around the following subjects:
- Audit of your data (volume / quality / completeness) and of your use cases (a small sketch of such an audit follows this message)
- Technical architecture recommendations (big data or other solutions / data flows)
- Installation and deployment of that architecture (I can work with Ops teams to assist them)
- Help with functional architecture
- Setup of a BI / reporting / dashboarding solution (experience with Tableau)
- Definition and analysis of requirements with the relevant functional teams (marketing / technical / management control...)
- Development of ingestion / transformation / analysis workflows
- Data analysis and organization of "demos"
- Training teams on the technical and functional sides of the data approach

Feel free to contact me if you have any questions.
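
As an illustration of the data audit item above (volume / quality / completeness), here is a minimal sketch in Python with pandas; the file paths and the customer_id column are hypothetical placeholders, not taken from any actual engagement:

    import pandas as pd

    # Hypothetical input: a tabular extract handed over for the audit (placeholder path).
    df = pd.read_csv("extract.csv")

    # Volume: row count and in-memory footprint.
    print("rows:", len(df))
    print("memory (MB):", round(df.memory_usage(deep=True).sum() / 1e6, 2))

    # Quality: null rate per column and number of duplicate rows.
    print("null rate per column:")
    print(df.isna().mean().sort_values(ascending=False))
    print("duplicate rows:", df.duplicated().sum())

    # Completeness: compare against a reference list of expected keys (placeholder file).
    expected_ids = pd.read_csv("expected_ids.csv")["customer_id"]
    missing = set(expected_ids) - set(df.get("customer_id", pd.Series(dtype=object)))
    print("missing keys:", len(missing))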

Portfolio

Experience

Deezer

High Tech

Data Architect

Paris, France

September 2015 - April 2018

Employment duration: 2 yrs 2 mos
Data delivery and quality
  • Rework the big data project to decentralize the release process among the data teams
  • Refactor the central Data Warehouse to split core tables into specialized tables
  • Design data pipelines and ETLs used to build recommendations and other KPIs
  • Hadoop migration from the Cloudera to the Hortonworks distribution
  • Monitor our Hadoop cluster
  • Evaluate Druid, a new data store for sub-second queries on event data

Project management and communication
  • Define roadmaps by working closely with internal stakeholders (data teams) to translate their data requirements into data structures: aggregated tables, new ETLs, dashboards (see the sketch after this list)
  • Animate the data community of good practices for data developers and users
  • Provide daily support to the data teams
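
As a purely illustrative companion to the aggregated-tables bullet above, a minimal PySpark sketch of building a small KPI table from a raw event table; the table and column names (events, plays_per_day, event_ts, user_id, country) are hypothetical placeholders, not Deezer's actual schema:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_kpis").getOrCreate()

    # Hypothetical raw event table: one row per listening event.
    events = spark.table("events")

    # Aggregate raw events into a small, query-friendly KPI table.
    daily_plays = (
        events
        .withColumn("day", F.to_date("event_ts"))
        .groupBy("day", "country")
        .agg(
            F.count("*").alias("plays"),
            F.countDistinct("user_id").alias("active_users"),
        )
    )

    # Persist the aggregate so dashboards query it instead of the raw events.
    daily_plays.write.mode("overwrite").saveAsTable("plays_per_day")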

Drivy

High Tech

Data Engineer

Paris, France

April 2018 - April 2019

  • Migration from Redshift to Snowflake
  • Airflow upgrade: automation of cluster deployment and pipeline migration (see the sketch after this list)
  • Define and refactor the main ETL pipelines to optimize data delivery
  • Monitor the AWS cluster
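
To illustrate the kind of Airflow pipeline referred to above, a minimal, hypothetical DAG sketch; the DAG id, task names, and callables are placeholders, not Drivy's actual code:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull raw data from a source system.
        pass


    def load_to_warehouse(**context):
        # Placeholder: load transformed data into the warehouse
        # (e.g. Snowflake, via whichever connector the project uses).
        pass


    with DAG(
        dag_id="example_etl",  # hypothetical name
        start_date=datetime(2019, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load_to_warehouse)

        extract_task >> load_task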

Dataiku

High Tech

DataOps Architect

Paris, France

April 2019 - Present

  • Understand customer requirements in terms of scalability, availability, and security, and provide architecture recommendations
  • Deploy Dataiku DSS in a large variety of technical environments (on-premises/cloud, Hadoop, Kubernetes, Spark, …)
  • Design and build reference architectures, how-tos, scripts, and various helpers to make the deployment and maintenance of Dataiku DSS smooth and easy
  • Automate the operation, installation, and monitoring of the data science ecosystem components in our infrastructure stack (see the sketch after this list)
  • Provide advanced support to strategic customers on deployment issues
  • Coordinate with the Revenue and Customer teams to deliver a consistent experience to our customers
  • Train our clients and partners in the art and science of administering a Dataiku DSS instance
  • Evangelize the challenges of building enterprise data science platforms to technical and non-technical audiences
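
As a small illustration of the monitoring/automation bullet above, a sketch assuming Dataiku's publicly documented Python client (the dataikuapi package); the host URL and API key are placeholders:

    import dataikuapi

    # Placeholders: instance URL and an admin API key.
    HOST = "https://dss.example.com:11200"
    API_KEY = "replace-me"

    # Connect to a DSS instance with the public Python client.
    client = dataikuapi.DSSClient(HOST, API_KEY)

    # A minimal "is the instance alive and serving projects?" probe,
    # the kind of check an external monitoring job might run.
    project_keys = client.list_project_keys()
    print(f"DSS reachable, {len(project_keys)} projects visible")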

Education
