Int Data Engineer to design, build and maintain data infrastructure
S.i. Systems
Montreal, QC
Number of positions available: 1
Salary: To be discussed
Contract job
Published on April 28th, 2024
Starting date: 1 position to fill as soon as possible
Description
Our valued client is looking for an Int Data Engineer to design, build and maintain data infrastructure.
Initial 5-month hybrid contract in Ottawa (minimum 2 days a week onsite), with the possibility of conversion to full-time permanent.
Responsibilities:
- Design, build, and maintain data infrastructure that supports the efficient extraction, transformation, and loading (ETL) of data from various sources.
- Develop ways to increase the usability of data through changes in business processes and/or data cleansing techniques.
- Design, build, and maintain data pipelines and ETL processes using tools such as StreamSets, Apache Spark, Hadoop, and other big data technologies.
- Develop and maintain data infrastructure, including data warehouses and other data storage solutions, to support business intelligence and reporting needs.
- Design and implement data security measures to protect sensitive data.
- Develop and maintain data quality control processes to ensure the accuracy, reliability, and accessibility of data to all stakeholders.
- Monitor the system performance of scheduled ETL batches and streaming data processes, ensuring all systems work to an acceptable standard and that data pipelines are scalable, repeatable, and secure.
- Plan, coordinate, and perform data migrations between development, UAT, and production systems.
- Analyze and troubleshoot technical issues quickly to resolution, working with internal ITS sections and software vendors when required.
Must Have Skills:
- 5+ years' experience in a data analytics environment with progressively more technical responsibilities in an engineering role.
- Designing and creating:
- ETL processes, data pipelines and workflows.
- Logical and physical data models using data modelling best practices.
- Developing scripts, applications, and APIs to automate data processing tasks using programming languages such as SQL, Python, Java, Scala, shell scripting, and JavaScript.
- Designing, building, and supporting data warehouses, data hubs and other data storage and processing systems.
Nice to Have Skills:
- Experience with cloud computing platforms such as Azure and being familiar with setting up and managing cloud-based data storage and computing environments.
- Working with stream processing frameworks such as Apache Kafka or Streamsets
- Designing and implementing:
- Real-time data processing pipelines using these tools.
- Database solutions using technologies such as MySQL, PostgreSQL or SQL Server.
- Project implementation analysis and support for data management systems, data integrity, and security as they relate to environmental business systems.
- Machine Learning concepts and tools (R, Python, Jupyter Notebook).
- Utilization of tools such as Apache Spark, Hadoop, and other big data technologies.
Requirements
undetermined