Intermediate Data Engineer to design, build, and maintain data infrastructure

Montreal, QC
  • Number of positions available: 1

  • To be discussed
  • Contract job

  • Starting date: 1 position to fill as soon as possible

Our valued client is looking for an Intermediate Data Engineer to design, build, and maintain data infrastructure.


Initial 5-month hybrid contract in Ottawa (minimum 2 days a week onsite), with the possibility of conversion to a full-time permanent position.


Responsibilities:

  • Design, build, and maintain data infrastructure that supports the efficient extraction, transformation, and loading (ETL) of data from various sources.
  • Develop ways to increase the usability of data through changes in business processes and/or data cleansing techniques.
  • Design, build, and maintain data pipelines and ETL processes using tools such as Streamsets, Apache Spark, Hadoop, and other big data technologies (a minimal illustrative sketch follows this list).
  • Develop and maintain data infrastructure, including data warehouses and other data storage solutions, to support business intelligence and reporting needs.
  • Design and implement data security measures to protect sensitive data.
  • Develop and maintain data quality control processes to ensure the accuracy, reliability, and accessibility of data for all stakeholders.
  • Monitor the performance of scheduled ETL batches and streaming data processes, ensuring all systems work to an acceptable standard and that data pipelines are scalable, repeatable, and secure.
  • Plan, coordinate, and perform data migrations between development, UAT, and production systems.
  • Analyze and troubleshoot technical issues through to resolution, working with internal ITS sections and software vendors when required.
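
For illustration only (not part of the client's description): a minimal sketch of the kind of ETL pipeline these responsibilities describe, using PySpark. The paths, column names, and partitioning scheme are hypothetical.

```python
# Illustrative sketch only: a minimal PySpark ETL job. All paths, column
# names, and the partitioning scheme are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

# Extract: read a raw CSV drop from an upstream source (hypothetical path).
raw = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Transform: basic cleansing -- de-duplicate, normalize types, filter bad rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .filter(F.col("amount") > 0)
)

# Load: write to a partitioned curated zone for BI and reporting.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://curated-zone/orders/")

spark.stop()
```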


Must-Have Skills:

  • 5+ years' experience in a data analytics environment with progressively more technical responsibilities in an engineering role.
  • Designing and creating:
      • ETL processes, data pipelines, and workflows.
      • Logical and physical data models using data modelling best practices.
  • Developing scripts, applications, and APIs to automate data processing tasks using programming languages such as SQL, Python, Java, Scala, shell scripting, and JavaScript (see the sketch after this list).
  • Designing, building, and supporting data warehouses, data hubs, and other data storage and processing systems.
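
Again purely for illustration: a minimal sketch of the kind of scripted automation this calls for, loading a CSV extract into a staging table with Python's standard library. The file name, table, and columns are hypothetical, and sqlite3 stands in for whatever database the client actually uses.

```python
# Illustrative sketch only: automating a small data processing task in Python.
# File name, table, and columns are hypothetical.
import csv
import sqlite3

def load_extract(csv_path: str, db_path: str) -> int:
    """Load a daily CSV extract into a staging table; return rows loaded."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_orders "
        "(order_id TEXT PRIMARY KEY, order_ts TEXT, amount REAL)"
    )
    with open(csv_path, newline="") as f:
        rows = [(r["order_id"], r["order_ts"], r["amount"]) for r in csv.DictReader(f)]
    # Upsert so the job is safe to re-run (repeatable, per the responsibilities above).
    conn.executemany("INSERT OR REPLACE INTO staging_orders VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)

if __name__ == "__main__":
    print(load_extract("orders.csv", "warehouse.db"))
```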


Nice-to-Have Skills:

  • Experience with cloud computing platforms such as Azure, including setting up and managing cloud-based data storage and computing environments.
  • Working with stream processing frameworks such as Apache Kafka or Streamsets (a minimal consumer sketch follows this list).
  • Designing and implementing:
      • Real-time data processing pipelines using these tools.
      • Database solutions using technologies such as MySQL, PostgreSQL, or SQL Server.
  • Project implementation analysis and support in data management systems, data integrity, and security as they relate to environmental business systems.
  • Machine learning concepts and tools (R, Python, Jupyter Notebook).
  • Use of tools such as Apache Spark, Hadoop, and other big data technologies.
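
For illustration only: a minimal sketch of consuming a real-time stream, assuming the kafka-python client. The topic, broker address, and message format are hypothetical.

```python
# Illustrative sketch only: consuming a stream with the kafka-python client.
# Topic, broker address, and message fields are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",                        # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

# Minimal real-time processing loop: filter and forward valid events.
for message in consumer:
    event = message.value
    if event.get("amount", 0) > 0:
        # A real pipeline would write to a sink (database, warehouse, etc.).
        print(event["order_id"], event["amount"])
```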

Requirements

  • Level of education: undetermined
  • Work experience (years): undetermined
  • Written languages: undetermined
  • Spoken languages: undetermined