Databricks Developer to Migrate the current data source from a tactical infrastructure 89233-1
S.i. Systèmes
Toronto, ON
Number of position(s) to fill: 1
- Salary: To be discussed
- Job type: Contract
- Published on May 29, 2025
- Start date: 1 position to fill as soon as possible
Description
Position Title: Developer
Line of Business: Global Asset Management group
Reason the role is open: To migrate the current data source from a tactical infrastructure, which involves direct pulls from virtual machines (VMs), to a strategic infrastructure utilizing Azure Data Factory (ADF) and Databricks.
What project will this contractor be working on? The QI Dashboard Project (see the Job Description below).
Duration: 6 months
Remote/Hybrid: Hybrid. The core in-office day each week is Thursday, plus the last Friday of every month, without exception.
Office Location: 81 Bay, 11th Floor
Job Description:
The QI Dashboard Project aims to enhance the existing data infrastructure by transitioning from a tactical setup to a more strategic framework. This transformation will facilitate improved data accessibility, processing, and visualization for business intelligence purposes.
Objectives
Data Source Transformation: Migrate the current data source from a tactical infrastructure, which involves direct pulls from virtual machines (VMs), to a strategic infrastructure utilizing Azure Data Factory (ADF) and Databricks.
ETL Process Implementation: Develop an Extract, Transform, Load (ETL) process to efficiently transfer data from various sources, including SFTP, web services, and databases, into a Delta Lake, enabling seamless data consumption for business analytics (a minimal ingestion sketch follows this list).
Connectivity Assistance: Support the business in establishing a robust connection between Plotly Dash and Databricks, ensuring that data visualizations are powered by the latest data insights (a connection sketch also follows this list).
Databricks Setup and Management: Establish the Unity Catalog, configure clusters, define schemas, and set up permissions within Databricks to ensure secure and organized data management (see the catalog setup sketch after this list).
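As a minimal sketch of the ETL objective above, the snippet below shows a bronze-layer ingestion step on Databricks, assuming ADF has already copied the source extracts (SFTP, web service, and database pulls) into an ADLS landing path; the storage path, catalog, schema, and table names are hypothetical.

    # Bronze-layer ingestion sketch (PySpark, run in a Databricks notebook).
    from pyspark.sql import functions as F

    landing_path = "abfss://landing@examplestorage.dfs.core.windows.net/qi/source_a/"  # hypothetical ADLS path
    bronze_table = "qi_catalog.bronze.source_a_raw"  # hypothetical Unity Catalog table

    raw_df = (
        spark.read.format("csv")
        .option("header", "true")
        .load(landing_path)
        .withColumn("_ingested_at", F.current_timestamp())  # simple audit column
    )

    # Append the raw extract to the bronze Delta table.
    raw_df.write.format("delta").mode("append").saveAsTable(bronze_table)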
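As a sketch of the Plotly Dash connectivity objective, the snippet below uses the databricks-sql-connector package to query a Databricks SQL warehouse and render the result in a Dash data table; the host, HTTP path, token, and gold table name are placeholders rather than project specifics.

    # Plotly Dash pulling data from Databricks via the Databricks SQL Connector.
    import os
    from databricks import sql
    from dash import Dash, dash_table

    def fetch_rows(query):
        # Connection details come from the environment, never hard-coded.
        with sql.connect(
            server_hostname=os.environ["DATABRICKS_HOST"],
            http_path=os.environ["DATABRICKS_HTTP_PATH"],
            access_token=os.environ["DATABRICKS_TOKEN"],
        ) as conn, conn.cursor() as cursor:
            cursor.execute(query)
            columns = [c[0] for c in cursor.description]
            return [dict(zip(columns, row)) for row in cursor.fetchall()]

    rows = fetch_rows("SELECT * FROM qi_catalog.gold.qi_metrics LIMIT 100")  # hypothetical gold table
    app = Dash(__name__)
    app.layout = dash_table.DataTable(
        data=rows,
        columns=[{"name": c, "id": c} for c in (rows[0] if rows else [])],
    )

    if __name__ == "__main__":
        app.run(debug=True)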
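For the Databricks setup objective, a minimal Unity Catalog sketch using Spark SQL from a notebook is shown below; the catalog, schema, table, and group names are placeholders, and actual properties and grants would follow the line of business's security standards.

    # Unity Catalog setup sketch (run from a Databricks notebook with sufficient privileges).
    spark.sql("CREATE CATALOG IF NOT EXISTS qi_catalog")
    spark.sql("CREATE SCHEMA IF NOT EXISTS qi_catalog.silver")

    spark.sql("""
        CREATE TABLE IF NOT EXISTS qi_catalog.silver.positions (
            position_id  STRING,
            as_of_date   DATE,
            market_value DECIMAL(18, 2)
        )
        USING DELTA
        TBLPROPERTIES ('delta.autoOptimize.optimizeWrite' = 'true')
    """)

    # Read-only access for the dashboard consumers (hypothetical group name).
    spark.sql("GRANT SELECT ON TABLE qi_catalog.silver.positions TO `qi-dashboard-readers`")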
Expected Outcomes
The successful execution of this project will result in a more efficient data infrastructure, improved data accessibility for business users, and enhanced capabilities for data visualization and analysis through the QI Dashboard.
Must Have Requirements:
• 4-6 years of experience with ADF, Databricks, and Azure Functions (C#)
• In-depth development experience with Python notebooks on Databricks/Spark/pandas
• Experience loading datasets from various sources into Databricks using ADF, following the Medallion architecture
• Experience converting a Databricks workspace from the Hive metastore to Unity Catalog and setting up the associated permissions
• Experience creating Delta table schemas/tables with proper design, properties, and security permissions
• Experience building DevOps and deployment pipelines using Azure DevOps/GitHub Enterprise, especially for Databricks cluster/secret management and code (a secret-scope usage sketch follows this list)
• In-depth experience with Microsoft Entra ID for SSO/RBAC setup
• In-depth experience working with and deploying to Azure PaaS resources such as Azure SQL (PaaS), ADF/SHIR, ADLS, Databricks, and Synapse dedicated SQL pools
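As a brief illustration of the secret-management expectation above, the sketch below reads Azure SQL credentials from a Key Vault-backed Databricks secret scope at runtime rather than hard-coding them; the scope, key, server, database, and table names are hypothetical.

    # Secret-scope usage sketch: pull Azure SQL credentials at runtime.
    jdbc_password = dbutils.secrets.get(scope="qi-keyvault-scope", key="sql-paas-password")

    jdbc_url = (
        "jdbc:sqlserver://example-sql-paas.database.windows.net:1433;"
        "database=qi_staging;encrypt=true"
    )

    staging_df = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", "dbo.qi_positions")  # hypothetical source table
        .option("user", "qi_etl_user")          # hypothetical SQL login
        .option("password", jdbc_password)
        .load()
    )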
Nice to Have:
• Experience with Delta Live Tables
• Familiarity working in a locked-down Windows environment in a highly regulated industry
• Financial services industry experience is an asset