Currently, we are searching for a Big Data Engineer to join an ambitious project in the Internet search field in Bordeaux, France.
- Collaborate with the entire data team to enable data-driven decision making and data-based products
- Manage and evolve our cloud-based and on-premise Data Warehouse (BigQuery, Vertica and Tableau), which acts as the single source of all relevant data and ingests high-volume, near-real-time data from various sources
- Your ideas and work will continuously improve our real-time communication platform, which is used by several million users per day
- You are a subject matter expert and guide junior team members
- You enjoy exciting challenges and work scientifically on cutting-edge technology
- You will collaborate with many distributed and international teams
- You will develop, extend and maintain a wide variety of infrastructure components needed for stream processing in an AWS environment primarily based on Apache Spark and Kafka
- Be the in-house expert on systems, processes and software to best track, store, process and access data
- Design and evolve our data processing pipeline to feed both BI and machine learning needs for the business
- Work with NoSQL technologies including Cassandra, HBase and MongoDB
- Work with cloud platforms: AWS, Microsoft Azure, Google Cloud Platform
- 5+ years of professional experience in Big Data, Data Engineering or Business Intelligence. This might include ETL, data warehousing or data visualization
- Excellent programming skills in languages such as Python, Scala or Java
- Commercial solution architecture experience with Big Data technologies and environments such as Spark, Hadoop, Kafka, AWS, Microsoft Azure and GCP
- Ability to maintain and engineer ETL scripts and data pipelines (we use Jenkins)
- Extensive experience in writing SQL and experience with Bash scripting
- Experience with cloud-based data warehouses (Google Cloud Platform or AWS) and analytical databases like Vertica
- Master's degree in Computer Science or related studies
- Strong collaboration skills with data stakeholders
- Strong verbal and written communication skills are a must, as is the ability to lead effectively across internal and external organizations and virtual teams
Not a must, but nice to have:
- Experience with Golang, streaming frameworks like Kafka, and Tableau
- Knowledge of both Agile and Scrum methodologies
- Deep understanding of MPP & NoSQL databases
Perks and Benefits:
- A highly motivated, growing, diverse team made up of 22 different nationalities
- Flexible work schedules and options to work from home
- Flat hierarchies and open-door policy
- Social events ranging from company lunches and after-work drinks to annual off-site company events
- Competitive compensation and vacation entitlement
Sound like you, or would you like more information?
Let's Build the Future Together!
To apply for this job email your details to email@example.com