Data Architect (Java / Kafka)
Ubisoft
Montreal, Canada
Job Details
Summary:
The Data Architect is part of the Enterprise Data team and is responsible for data management in the Bigdata platform's Datalake. The incumbent will analyse, recommend, develop & maintain data and processes in the Datalake, and will interact with Technical Leads, Programmers, BI Developers, Data Scientists, the Security Team, IT (GNS), TG, etc., to ensure the data remains correct, optimized and scalable for its various uses.
Responsibilities:
The main and routine tasks of the Data Architect are to:
• Participate in the design, architecture and evolution of the Bigdata platform project;
• Deliver optimum Data performance and scalability in partnership with Production and internal services such as IT (GNS), TG, etc.;
• Drive Data usage strategies, principles, standards and best practices via development and communication;
• Develop and maintain real-time data ingestion pipeline (Kafka and Storm);
• Develop and maintain the Datalake through HDFS, Hive, Impala & HBase data structures;
• Develop new benchmarks and tools for Data performance measurement and capacity;
• Analyse and recommend the best available technologies for the services associated with each project;
• Review, recommend and approve data management practices;
• Support the Developers and Software Architects to provide the best solutions to meet current project needs that are aligned with strategic direction;
• Provide technical mentoring to the Infrastructure team;
• Carry out all other related tasks.
Qualifications:
Training:
• Degree in computer science, or a certificate with equivalent experience.
Relevant Experience:
• Minimum of 7 years' experience in the industry, with a minimum of 4 years in this role.
Skills and Knowledge:
• Excellent knowledge and experience of programming (Java, Python, etc.);
• Excellent knowledge and experience of Bigdata technologies (HBase, HDFS, Hive, Impala);
• Knowledge of real-time processing (Kafka & Storm);
• Excellent knowledge and experience working with hybrid hosting environments, such as Cloud, co-location, and in-house datacenters allowing for dynamic scalability of servers/services;
• Excellent knowledge and experience of server technologies under Linux;
• Knowledge of the Informatica ETL tool is an asset;
• Good knowledge of algorithm analysis and optimization strategy;
• Collaboration & communication skills;
• Business value oriented;
• Capacity to work under pressure and solve multifaceted problems;
• Bilingual (oral and written).