Apply Now

Data Engineer, IT

Toronto, ON
  • Number of positions available: 1
  • To be discussed
  • Starting date: 1 position to fill as soon as possible

GENERAL FUNCTION

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable and reliable data pipelines, ensuring the quality and accessibility of data across the organization. This role will work closely with the engineering team to support our data-driven decision-making processes.

DUTIES/RESPONSIBILITIES

  • Enhance the data platform by acquiring additional data sources, both structured and unstructured
  • Design reporting data structures and oversee the design phases of projects, ensuring that designs conform to established architecture guidelines and standards
  • Develop and deliver long-term goals for data architecture standards
  • Create short-term tactical solutions to achieve long-term objectives and data management roadmap
  • Ensure the success of enterprise-level application rollouts; work with vendors and providers to select products and services that meet the company's goals
  • Develop a highly scalable and extensible data platform that enables the collection, storage, modeling, and analysis of massive data sets from numerous channels
  • Continuously analyze and learn the latest data technologies and their innovative applications in both business intelligence analysis and new service offerings; adopt and implement these insights and best practices
  • Drive projects that merge digital and transactional data for advanced analytics.
  • Design, develop, and implement data pipelines using AWS services such as AWS Glue, Lambda, S3, Kinesis, and Redshift to process large-scale data.
  • Build and maintain robust ETL processes for efficient data extraction, transformation, and loading, ensuring data quality and integrity across systems.
  • Design and manage data warehousing solutions on AWS, particularly with Redshift, for optimized storage, querying, and analysis of structured, semi-structured, and unstructured data.
  • Implement and manage scalable data lake solutions using AWS S3, Glue, and related services to support structured, unstructured, and streaming data.
  • Implement data security best practices on AWS, including access control, encryption, and compliance with data privacy regulations.
  • Optimize data workflows and storage solutions for cost and performance. Set up monitoring, logging, and alerting for data pipelines and infrastructure health.
  • Work closely with data scientists, analysts, and business stakeholders to understand data needs and deliver data solutions aligned with business goals.
  • Create and maintain documentation for data infrastructure, data pipelines, and ETL processes to support internal knowledge sharing and compliance.

EDUCATION

  • Bachelor's degree in computer science or a related scientific field
  • Preferred: Master's degree in computer science or a related technical field

EXPERIENCE

  • 3+ years of relevant experience in Data Engineering.
  • 3+ years of data analysis, data modeling and data management experience
  • Experience building a data hub on the cloud / AWS
  • Strong experience with SQL and at least one other programming language, such as Python
  • Hands-on experience parsing semi-structured file formats such as JSON, XML, Avro, and Parquet
  • Experience with Redshift, Postgres, SQL Server, or Snowflake
  • Exposure to traditional BI Tools (Tableau, Power BI, Qlik, SSRS, etc.)
  • Experience building web-services is a plus
  • Experience with AWS, Redshift, Postgres, Snowflake, and ETL is required
  • Experience in AI, BI and Data Lake projects.
  • SQL Server / SSIS experience; exposure to NoSQL databases
  • Understanding of Web services (SOAP, XML, UDDI, WSDL)
  • Redshift experience; well versed in Python scripting and the use of libraries such as NumPy and Pandas
  • Power BI (preferred), Tableau, Qlik

SKILLS

  • Functional knowledge of financial instruments such as installment loans and payday loans, and of data domains such as loans, underwriting, collections, promotions, call center data, fraud, etc.
  • Proven ability to conduct data analysis using SQL
  • Must have Python scripting experience and familiarity with libraries such as NumPy and Pandas
  • Power BI, Qlik
  • Ability to establish a data quality control framework while sourcing data
  • Experience in AI, BI, and Data Hub projects
  • Ability to analyze generated digital data and build visualizations
