Senior Big Data Engineer

  • 37978
  • Non-Life - Data Science
  • Ontario, Canada
  • Aug 17, 2020
  • Insurance
RESPONSIBILITIES
  • Design and develop Big Data solutions for both operational and analytical requirements using Hadoop Open Source frameworks.
  • Design, develop and integrate data ingestion, ETL/ELT data pipelines and processing/transformation processes.
  • Work with senior stakeholders to develop a clear understanding of requirement drivers.
  • Address non-functional requirements including performance, data governance, scalability, continuous integration, migration and compatibility.
QUALIFICATIONS
  • BSc in Computer Science, Engineering or a related field.
  • Expertise with distributed Hadoop frameworks for handling large amounts of data using the Spark and Hadoop ecosystems.
  • Experience with Hadoop technologies such as Oozie, Kafka, Druid, Hive, Storm, Ignite, Kudu and NiFi.
  • Experience with data loading tools such as Flume and Sqoop, as well as the different layers of the Hadoop framework: storage, analysis and MapReduce jobs.
  • Must have hands-on Spark/Scala experience.
  • Advanced programming skills.
  • Strong experience with ETL tools.
  • Minimum of 4-5 years of experience handling a variety of data, data formats and data storage.
  • Strong communication skills, with the ability to communicate with internal and external teams at all levels.
  • Experience in the financial sector, specifically in the P&C Insurance industry, is considered a strong asset.
  • Firm understanding of database systems - data modelling, SQL and transactional processing.
  • Experience with API management best practices.
  • Knowledge of DevOps and Agile collaboration practices is necessary.
  • Experience developing big data solutions in the cloud is a strong asset.