Int. Data Engineer to build a data pipeline to transfer the data from the enterprise data lake for enabling AI use cases - 19364

Toronto, ON
  • Number of positions to fill: 1

  • To be discussed
  • Contract position

  • Start date: 1 position to fill as soon as possible

Duration of Contract: 7 months (until end of 2025)

Location: Canada (Remote)


As a Data Engineer, you will be responsible for designing, building, and running the data-driven applications that enable innovative, customer-centric digital experiences.

You will be working as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.

Our development team uses a range of technologies to get the job done: Google Cloud Platform (GCP), PySpark, Dataflow, BigQuery, Looker Studio, Google Cloud Scheduler, shell scripting, Pub/Sub, Elasticsearch, LLMs, Gemini Pro, GitHub, Terraform, and more to provide a modern, easy-to-use data pipeline.
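To give a sense of how several of these pieces fit together, here is a minimal sketch (not the team's actual code) of a streaming Apache Beam pipeline, runnable on Dataflow, that reads events from Pub/Sub and appends them to BigQuery. The project, subscription, table, and schema names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True because Pub/Sub is an unbounded source; add
    # --runner=DataflowRunner plus project/region flags to run on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(json.loads)  # Pub/Sub messages arrive as bytes
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))


if __name__ == "__main__":
    run()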


You will be part of the team building a data pipeline that transfers data from our enterprise data lake to enable our AI use cases.
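To make that mandate concrete, here is a minimal sketch of such a batch transfer, assuming the data lake exposes Parquet files on Cloud Storage and the Google spark-bigquery connector is available on the cluster; every bucket, path, and table name is a hypothetical placeholder.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lake-to-bq-transfer").getOrCreate()

# Read raw Parquet from the enterprise data lake (path is hypothetical).
raw = spark.read.parquet("gs://enterprise-data-lake/customers/2025/")

# Light conformance so downstream AI use cases get clean, deduplicated rows.
curated = (raw
           .dropDuplicates(["customer_id"])
           .withColumn("ingested_at", F.current_timestamp()))

# Land the curated slice in BigQuery via the spark-bigquery connector.
(curated.write.format("bigquery")
    .option("table", "my-project.ai_features.customers")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("append")
    .save())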

You are a fast-learning, highly technical, and passionate data engineer looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.


Must-Have Skills:

  • 4+ years of experience with Python in Visual Studio Code, including AI-assisted coding with Cline, Copilot, or similar tools
  • Google Cloud: Vertex AI, ML pipelines, generative AI models, BigQuery, Dataflow, Terraform, YAML, PySpark
  • Extensive knowledge of building complex SQL queries and stored procedures (see the sketch after this list)
  • Tableau or Looker Studio
  • JIRA and Agile methodology
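For the SQL point above, here is a minimal sketch of what that can look like from Python, using the google-cloud-bigquery client to call a stored procedure and then run an analytical query with a window function. The project, dataset, procedure, table, and column names are hypothetical placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Call a stored procedure that refreshes a feature table, blocking until done.
client.query("CALL ai_features.refresh_customer_features()").result()

# A query with a window function ranking customers by churn score per region.
query = """
SELECT customer_id,
       churn_score,
       RANK() OVER (PARTITION BY region ORDER BY churn_score DESC) AS region_rank
FROM ai_features.customers
WHERE ingested_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
"""
for row in client.query(query).result():
    print(row.customer_id, row.region_rank)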


Nice-to-Have Skills:

  • Google Workspace, Apps Script
  • JavaScript, React, Node.js
  • Telecom domain knowledge


You're the missing piece of the puzzle

  • A passion for data
  • Interest and ability to learn new languages & technologies as needed
  • Familiarity with the execution sequence of ETL flows on Google platforms (see the sketch after this list)
  • Experience with Spark, Beam, Airflow, Cloud SQL, BigQuery, MSSQL
  • Basic understanding of data warehouses, data lakes, and OLAP and OLTP applications
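For the ETL-sequencing point above, here is a minimal sketch of how an execution sequence is commonly expressed on Google's platform: an Airflow DAG (the engine behind Cloud Composer) chaining two BigQuery jobs. The DAG ID, tables, and stored procedure are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="lake_to_bq_etl",
    start_date=datetime(2025, 6, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw events from the lake's external table into BigQuery.
    extract = BigQueryInsertJobOperator(
        task_id="extract_raw",
        configuration={"query": {
            "query": ("CREATE OR REPLACE TABLE staging.raw_events AS "
                      "SELECT * FROM lake_external.events"),
            "useLegacySql": False,
        }},
    )
    # Transform the staged rows by calling a stored procedure.
    transform = BigQueryInsertJobOperator(
        task_id="transform_features",
        configuration={"query": {
            "query": "CALL ai_features.refresh_customer_features()",
            "useLegacySql": False,
        }},
    )
    extract >> transform  # the execution sequence: extract, then transform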


Great-to-haves

  • Intermediate-level candidates with 4-6 years of relevant experience
  • Experience with big data tools and technologies
  • Experience with SQL, Unix, Shell scripting
  • Experience with data visualization tools such as Tableau, Domo, Looker


Interview process:

  • Round 1: Offline coding assignment covering Google Cloud, BigQuery, and Python, followed by a technical interview over video call.
  • Round 2: Behavioral interview with the leadership team.

Requirements

Education level: not determined

Years of experience: not determined

Written languages: not determined

Spoken languages: not determined