Data Architect (Big Data)
Ubisoft
Montreal, Canada
Summary:
The Data Architect is part of the Enterprise Data team and is responsible for data management in the Big Data platform, the Datalake. The incumbent will analyse, recommend, develop and maintain data and processes in the Datalake, and will interact with technical leads, programmers, BI developers, data scientists, the security team, IT (GNS), TG, etc., to ensure the data is accurate, optimized and scalable for its various uses.
The Datalake team aims to centralize all Ubisoft raw data in the Datalake, in near real-time and in batch, and to prepare it for various uses (data warehousing, data science and data exploration).
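The near-real-time-plus-batch ingestion described above is often implemented as micro-batching: events stream in continuously, are buffered, and are flushed to the lake in batches. A minimal sketch of that idea, in plain Python with illustrative names and thresholds (an in-memory sink stands in for HDFS; this is not Ubisoft's actual pipeline):

```python
import time

class MicroBatchIngester:
    """Toy micro-batching buffer: events are flushed to the sink either
    when the batch is full or when the flush interval has elapsed.
    Class name, thresholds and sink are illustrative assumptions."""

    def __init__(self, sink, batch_size=3, flush_interval_s=60.0,
                 clock=time.monotonic):
        self.sink = sink                    # callable that persists one batch
        self.batch_size = batch_size
        self.flush_interval_s = flush_interval_s
        self.clock = clock
        self.buffer = []
        self.last_flush = clock()

    def ingest(self, event):
        self.buffer.append(event)
        # Flush on size (near-real-time path) or on elapsed time.
        if (len(self.buffer) >= self.batch_size
                or self.clock() - self.last_flush >= self.flush_interval_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(list(self.buffer))    # hand off a copy of the batch
            self.buffer.clear()
        self.last_flush = self.clock()

# Usage: collect flushed batches in a list instead of writing to storage.
lake = []
ingester = MicroBatchIngester(lake.append, batch_size=3)
for e in ["a", "b", "c", "d"]:
    ingester.ingest(e)
ingester.flush()  # final drain, as a scheduled batch job would do
```

Real stream processors (Kafka consumers, Storm bolts, Spark Streaming) apply the same size-or-time trade-off between latency and write efficiency.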
Responsibilities:
The main and routine tasks of the Data Architect are to:
• Participate in the design, architecture and evolution of the Big Data platform project;
• Deliver optimal data performance and scalability in partnership with Production and internal services such as IT (GNS), TG, etc.;
• Drive data usage strategies, principles, standards and best practices through development and communication;
• Develop and maintain real-time data ingestion pipelines (Kafka and Storm);
• Develop and maintain the Datalake through HDFS, Hive, Impala and HBase data structures;
• Develop new benchmarks and tools for measuring data performance and capacity;
• Analyse and recommend the best available technologies for the services associated with the projects;
• Review, recommend and approve data management practices;
• Support the Developers and Software Architects to provide the best solutions to meet current project needs that are aligned with strategic direction;
• Provide technical mentoring to the Infrastructure team;
• Carry out all other related tasks.
Qualifications:
Training:
• Degree in computer science, or a certificate with equivalent experience.
Relevant Experience:
• A minimum of 7 years of experience in the industry, with a minimum of 4 years in this role.
Skills and Knowledge:
• Excellent knowledge of and experience in Java programming;
• Excellent knowledge of the distributed streaming platform Apache Kafka and its advanced features;
• Excellent knowledge of stream processing with Storm, Spark Streaming or a similar technology;
• Excellent knowledge of and experience with Cloudera Big Data technologies (HBase, HDFS, Hive, Impala);
• Good knowledge of programming in Python and C#;
• Good knowledge of and experience with Linux server administration;
• Knowledge of the Informatica ETL tool is an asset;
• Good knowledge of algorithm analysis and optimization strategies;
• Collaboration & communication skills;
• Knowledge of other emerging open-source projects (e.g. Druid) is an asset;
• Capacity to work under pressure and solve multifaceted problems;
• Fluency in English; French is an asset.