Data Scientist

  • 37690
  • Non-Life - Data Science
  • Ontario, Canada
  • Jul 7, 2020
  • Insurance
RESPONSIBILITIES
  • Collect, parse, visualize, and analyze data sets from multiple sources to build predictive and prescriptive models using established or emerging open-source tools.
  • Design, develop, and implement end-to-end cloud-based data and AI model pipelines.
  • Code, test, and deploy data/model pipelines to create robust, scalable analytical applications, and continually improve them over time.
  • Ensure that data pipelines and models are repeatable, secure, and able to serve multiple user groups.
  • Translate complex functional and technical requirements into detailed architecture and design to build analytical applications.
  • Work with peers to ensure that automated processes protect data privacy and security and that integration processes follow data and model governance guidelines.
  • Contribute to the shared knowledge base through centralized documentation and a code repository, and facilitate workshops for end users.
  • Work in partnership with various business functions in an agile development environment.
  • Communicate results, analyses, and visualizations precisely and clearly to non-technical users from various business functions.
  • Identify new opportunities and develop customer-centric analytical solutions that meet business needs and align with business strategy.
  • Work with minimal direction, with a proven ability to coordinate complex projects and a problem-solving mindset.
QUALIFICATIONS
  • Master's or PhD in Operations Research, Computer Science, Finance, Economics, Mathematics, Statistics, Physics, or another data- and compute-intensive field.
  • Code efficiently in R, Python, JScript, Spark, shell scripting, and SQL, and be proficient with the Linux OS.
  • Awareness of C/C++/Java would be an asset.
  • Hands-on experience with one or more open-source machine learning and deep learning libraries, e.g. Scikit-learn, Spark MLlib, H2O, TensorFlow, Caffe2, Theano, CNTK, SystemML, Gluon, MXNet, Keras, PyTorch, etc.
  • Hands-on experience with front-end and back-end tools for deploying analytical solutions, such as Flask, Postgres/SQLAlchemy, Nginx, React.js, Redux, CSS, Auth0, Docker, and Kubernetes.
  • Hands-on experience with data architecture, data modeling, data pipelining, serverless computing, and parallel processing.
  • Working knowledge of design thinking and agile prototyping to develop and test business use cases and analytical applications.