
Data Architect - 2818

Toronto, ON
  • Number of positions to fill: 1

  • Salary: to be discussed
  • Contract position

  • Start date: 1 position to fill as soon as possible

Duration: 6 months

Location: Hybrid (3 days a week required in the Markham office, subject to change)


Role Purpose:

• Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments.

• Migrate the existing data warehouse solution to Snowflake.

• Evaluate technology platforms in the data and analytics space.

• Collaborate with cross-functional teams (data engineers, AI engineers, business, solution architects) to translate business requirements into technical solutions aligned with the data strategy.

• Ensure data governance, security, and compliance within the Snowflake ecosystem, adhering to regulatory and organizational standards.


Experience and Capabilities:

• Extensive experience (8+ years) in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services.

• Proven experience architecting and implementing advanced data solutions using Snowflake on AWS.

• Expertise in designing and orchestrating data acquisition pipelines using AWS Glue for ETL/ELT, and Snowflake OpenFlow and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources.

• Proven experience with dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic.

• Experience implementing lakehouse solutions and the Medallion architecture for financial or insurance carriers.

• Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management.

• Experience architecting and leading migrations of workloads from Cloudera to Snowflake.

• Design Streamlit apps and define new capabilities and data products leveraging Snowflake ML and LLMOps capabilities.

• Experience evaluating data technology platforms, including data governance suites and data security products.

• Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and to Informatica toolsets (IDMC, PowerCenter, BDM).

• Develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity.

• Document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams.

• Strong knowledge of data lifecycle management, data retention, and data modelling, plus working knowledge of cloud computing and modern development practices.

• Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica).

• Experience converting policies and data from legacy to modern platforms.

• Deep expertise in Snowflake (SnowPro Advanced certification preferred), with hands-on experience delivering Snowflake as an enterprise capability.

• Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation (preferably deployed on AWS ECS).

• Proficiency in SQL, data modeling, ETL/ELT processes, and scripting languages (Python/Java).

• Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms.



Requirements

Education level

not specified

Years of experience

not specified

Written languages

not specified

Spoken languages

not specified