Int. Data Engineer to build a data pipeline to transfer the data from the enterprise data lake for enabling AI use cases - 19364

Toronto, ON
  • Number of positions available: 1

  • To be discussed
  • Contract job

  • Starting date: 1 position to fill as soon as possible

Duration of Contract: 7 months (until end of 2025)

Location: Canada (Remote)


As a Data Engineer, you will be responsible for designing, building, and running the data-driven applications that enable innovative, customer-centric digital experiences.

You will work as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.

Our development team uses a range of technologies to get the job done: Google Cloud Platform (GCP), PySpark, Dataflow, BigQuery, Looker Studio, Google Cloud Scheduler, shell scripting, Pub/Sub, Elasticsearch, LLMs, Gemini Pro, GitHub, Terraform, and more to provide a modern, easy-to-use data pipeline.


You will be part of the team building a data pipeline that transfers data from our enterprise data lake to enable our AI use cases.
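
To give a flavour of the work, below is a minimal, hypothetical sketch of such a transfer job in PySpark: it reads Parquet files from a data-lake bucket on Cloud Storage and lands them in BigQuery. All project, bucket, and table names are invented, and the spark-bigquery connector is assumed to be available on the cluster.

    # Hypothetical lake-to-BigQuery transfer job; all names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("lake-to-bq-transfer").getOrCreate()

    # Read raw Parquet data from the enterprise data lake (placeholder path).
    df = spark.read.parquet("gs://example-enterprise-lake/events/")

    # Light cleanup before landing the data for AI use cases.
    clean = df.dropDuplicates(["event_id"]).filter(df.event_ts.isNotNull())

    # Write to BigQuery via the spark-bigquery connector (assumed installed).
    (clean.write
        .format("bigquery")
        .option("table", "example_project.ai_staging.events")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("append")
        .save())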

You are a fast-learning, highly technical, and passionate data engineer looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.


Must-Have Skills:

  • 4+ years of experience with Python in Visual Studio Code, including AI-assisted coding with Cline, Copilot, or similar tools
  • Google Cloud: Vertex AI, ML pipelines, GenAI models, BigQuery, Dataflow, Terraform, YAML, PySpark
  • Extensive knowledge of building complex SQL queries and stored procedures (a short sketch follows this list)
  • Tableau or Looker Studio
  • JIRA and Agile methodology
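
As a rough illustration of the SQL side of the role, the snippet below runs a parameterized BigQuery query from Python with the google-cloud-bigquery client. The table and column names are invented for illustration only.

    # Hypothetical parameterized BigQuery query; the schema is invented.
    import datetime

    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
        SELECT customer_id, COUNT(*) AS event_count
        FROM `example_project.ai_staging.events`
        WHERE event_ts >= @start_ts
        GROUP BY customer_id
        HAVING event_count > @min_events
    """

    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "start_ts", "TIMESTAMP",
                datetime.datetime(2025, 1, 1, tzinfo=datetime.timezone.utc)),
            bigquery.ScalarQueryParameter("min_events", "INT64", 10),
        ]
    )

    # Iterate over results; each row exposes selected columns as attributes.
    for row in client.query(query, job_config=job_config).result():
        print(row.customer_id, row.event_count)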


Nice-to-Have Skills:

  • Google Workspace, Apps Script
  • JavaScript, React, Node.js
  • Telecom Domain Knowledge


You're the missing piece of the puzzle

  • A passion for data
  • Interest in and the ability to learn new languages and technologies as needed
  • Familiarity with the execution sequence of ETL flows on Google Cloud (see the sketch after this list)
  • Experience with Spark, Beam, Airflow, Cloud SQL, BigQuery, and MSSQL
  • A basic understanding of data warehouses, data lakes, and OLAP and OLTP applications
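
The execution sequence of an ETL flow is typically expressed as a DAG. Below is a minimal, hypothetical Airflow 2.x sketch of that extract-transform-load ordering; the DAG id and task callables are placeholders only.

    # Hypothetical ETL DAG; the callables are stubs for illustration.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():    # placeholder: pull raw files from the data lake
        ...

    def transform():  # placeholder: clean and reshape the data
        ...

    def load():       # placeholder: land curated tables in BigQuery
        ...

    with DAG(
        dag_id="lake_to_bq_etl",
        start_date=datetime(2025, 6, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Enforce the ETL execution sequence.
        t_extract >> t_transform >> t_load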


Great-to-haves

  • Intermediate-level candidates with 4-6 years of relevant experience
  • Experience with big data tools and technologies
  • Experience with SQL, Unix, and shell scripting
  • Experience with data visualization tools such as Tableau, Domo, and Looker


Interview process:

  • Round 1: offline coding assignment covering Google Cloud, BigQuery, and Python, followed by a technical interview over video call
  • Round 2: behavioral interview with the leadership team

Requirements

  • Level of education: undetermined
  • Work experience (years): undetermined
  • Written languages: undetermined
  • Spoken languages: undetermined