
Data Architect - 2818

Toronto, ON
  • Number of positions available: 1

  • To be discussed
  • Contract job

  • Starting date: 1 position to fill as soon as possible

Duration: 6 months

Location: Hybrid (3 days a week required in the Markham office, subject to change)


Role Purpose:

• Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments.

• Migrate the existing data warehouse solution to Snowflake.

• Evaluate technology platforms in the data and analytics space.

• Collaborate with cross-functional teams (data engineers, AI engineers, business, solution architects) to translate business requirements into technical solutions aligned with the data strategy.

• Ensure data governance, security, and compliance within the Snowflake ecosystem, adhering to regulatory and organizational standards.


Experience and Capabilities:

• Extensive experience (8+ years) in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services.

• Proven experience in architecting and implementing advanced data solutions using Snowflake on AWS.

• Expertise in designing and orchestrating data acquisition pipelines, using AWS Glue for ETL/ELT and Snowflake OpenFlow and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources.

• Proven experience using dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic.

• Experience implementing lakehouse solutions and the Medallion architecture for financial services or insurance carriers.

• Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management.

• Experience architecting and leading migrations of workloads from Cloudera to Snowflake.

• Design Streamlit apps and define new capabilities and data products leveraging Snowflake ML and LLMOps capabilities.

• Experience evaluating data technology platforms, including data governance suites and data security products.

• Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and to Informatica tool sets (IDMC, PowerCenter, BDM).

• Develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity.

• Document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams.

• Strong knowledge of data lifecycle management, data retention, and data modelling, plus working knowledge of cloud computing and modern development practices.

• Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica).

• Experience converting policy and other data from legacy to modern platforms.

• Deep expertise in Snowflake (SnowPro Advanced certification preferred), with hands-on experience delivering Snowflake as an enterprise capability.

• Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation (preferably deployed on AWS ECS).

• Proficiency in SQL, data modeling, ETL/ELT processes, and scripting languages (Python/Java).

• Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms.
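As a hedged illustration of the modular, testable transformation logic the dbt and Medallion bullets above describe, a bronze-to-silver cleansing step might look like the sketch below. All record fields and rules here are hypothetical examples, not requirements from this posting.

```python
# Hypothetical bronze-layer records: raw insurance policy rows as ingested.
bronze = [
    {"policy_id": " P-001 ", "premium": "1200.50", "region": "ON"},
    {"policy_id": "P-002", "premium": None, "region": "on"},
    {"policy_id": " P-001 ", "premium": "1200.50", "region": "ON"},  # duplicate
]

def to_silver(rows):
    """Cleanse bronze rows into a silver layer: trim business keys,
    normalize region codes, drop rows missing a premium, de-duplicate."""
    seen, silver = set(), []
    for row in rows:
        if row["premium"] is None:
            continue  # data-quality rule: premium is required
        policy_id = row["policy_id"].strip()
        if policy_id in seen:
            continue  # de-duplicate on the business key
        seen.add(policy_id)
        silver.append({
            "policy_id": policy_id,
            "premium": float(row["premium"]),
            "region": row["region"].upper(),
        })
    return silver

print(to_silver(bronze))  # one clean, de-duplicated record
```

In a dbt project the same rule set would typically live as a version-controlled SQL model with schema tests; the Python above only makes the layering concrete.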



Requirements

  • Level of education: undetermined
  • Work experience (years): undetermined
  • Written languages: undetermined
  • Spoken languages: undetermined