Overview

You’ve got big plans. We have opportunities to match, and we’re committed to empowering you to become a better you, no matter what you do.

 

When you join KPMG, you’ll be one of over 219,000 professionals providing advisory and business enablement services across 147 countries.

 

With the support to do things differently, grow personally and professionally and bring your whole self to work, there’s no limit to the impact you can make. Let’s do this. 

 

The opportunity:

 

Join a growing cloud advisory practice focused on next-generation solutions in a fast-paced environment. We care about building great relationships, and we're looking for experienced candidates who are drawn to technology. This is a chance to get in on the ground floor: a unique leadership opportunity to be part of a team dedicated to developing tomorrow's cloud-based solutions.

We are seeking passionate and driven individuals who bring thought leadership to the exciting space of cloud data architecture!


What you will do

  • You’re a seasoned leader with a track record of enabling the adoption of next-generation data technologies within regulated industries.
  • You’re a builder at heart who has designed, architected, implemented, and deployed elegant solutions to complex problems.
  • In your career you’ve supported enterprises in modernizing their data platforms and BI systems.
  • You develop proofs of concept to demonstrate what is possible.
  • You perform data analysis, map data elements, and understand core BI concepts.
  • You’ve been significantly involved in migrating on-premises data silos to cloud-based data lakes within hybrid cloud architectures.
  • You have served the needs of key stakeholders in this space, including application developers, data scientists, and risk-controls partners.
  • You understand the importance of data governance and data stewardship, and how these concepts translate into data architecture.
  • You have a broad and deep understanding of the Canadian enterprise landscape, ideally with finance-industry experience.
  • You have a clear and empathetic understanding of where Canadian enterprises are in their cloud journeys, what makes them tick, and how to effectively drive change.

You’re ahead of the curve and understand where the industry is going. When working with clients, you strive to bring everyone along and to elevate the teams you work with.

These roles can be filled in various locations across Canada, though you will be asked to work on projects outside your local office. We are a dynamic and innovative team, equally passionate about what we do and about the quality of the services we provide to our clients.


What you bring to the role

  • 10+ years of experience across the SDLC, with a strong emphasis on current big data technologies, including cloud-based services. Examples: Spark, Scala, Spark MLlib, Hadoop, Tableau, Cassandra, Databricks, Synapse, ADLS, Data Factory, Aurora, DynamoDB, Kinesis, EMR, S3, ElastiCache, Redshift, RDS, Airflow, etc.
  • 10+ years of experience with tools such as Informatica, Python, and Cognos.
  • 10+ years managing BI frameworks, design principles, and standards, working with Azure (Data Lake, Hadoop, Snowflake), SQL databases (Oracle, MySQL, PostgreSQL), MongoDB, and microservices architectures.
  • Good understanding of data governance and data lineage, and of how current technology solutions intersect with these concepts across related technology platforms.
  • Experience implementing big data and data warehouse architectures, including relevant current and legacy patterns: star schema, snowflake schema, fact and dimension tables, and physical and logical data modeling using relevant tools.
  • Ability to architect, design, and develop big data solutions, including roadmap design and development, supporting infrastructure, and organizational structures that support big data initiatives.
  • Experience architecting and implementing analytics engines in both batch and streaming scenarios, using Hadoop MapReduce, Oozie, Spark SQL, Spark MLlib, Spark Streaming, Cassandra, Sqoop, Flume, Kafka, etc. Candidates should also have a good understanding of the public cloud platforms that provide these or competing options.
  • Excellent understanding of Hadoop architecture and its underlying framework, including Hive, HDP, Pig, Flume, Storm, and MapReduce open-source tools/technologies and storage management.
  • Extensive experience in data modeling, data architecture, solution architecture, data warehousing and business intelligence concepts, and master data management (MDM).
  • Expertise in architecting big data pipelines covering the key milestones: ingestion, staging, cleansing, transformation, modeling, analysis, and reporting.
  • Experience with NoSQL databases (HBase, Cassandra, MongoDB), database performance tuning, and data modeling.
  • Extensive knowledge of architecture design for ETL/ELT environments, leveraging popular platforms (e.g., Informatica PowerCenter, Teradata) and handling large data volumes.
  • Experience migrating data warehouses and databases to Hadoop/NoSQL platforms in cloud environments.
  • Experience in the financial industry is a plus.

Keys to your success:

  • Ability to work independently with minimal supervision, as well as effectively within a multi-disciplinary team.
  • Superior verbal and written communication skills.
  • Excellent client service skills.
  • Well organized, with good prioritization and workload-management abilities.
  • Professional discipline and a commitment to outstanding work.
  • Commitment to self-learning and continuous development of skills and professional knowledge.
  • Professional curiosity.

Learn more about where a career at KPMG can take you.


Our Values, The KPMG Way

Integrity, we do what is right | Excellence, we never stop learning and improving | Courage, we think and act boldly | Together, we respect each other and draw strength from our differences | For Better, we do what matters

KPMG in Canada is a proud equal opportunities employer and we are committed to creating a respectful, inclusive and barrier-free workplace that allows all of our people to reach their full potential. A diverse workforce is key to our success and we believe in bringing your whole self to work. We welcome all qualified candidates to apply and hope you will choose KPMG in Canada as your employer of choice.

If you have a question about accessible employment at KPMG, or to begin a confidential conversation about your individual accessibility or accommodation needs through the recruitment process, we encourage you to contact KPMG’s Employee Relations Service team for support at email: cdnersteam@kpmg.ca or phone: 416-777-8002 or toll free 1-888-466-4778 Option 3.

For general recruitment-related inquiries, please contact the HR Delivery Centre at cafmcdnhrsthotline@kpmg.ca.

