Senior Application Specialist (Salesforce) to build reliable data pipelines while partnering with stakeholders to deliver impactful analytics and reporting

Ottawa, ON
  • Number of positions to fill: 1

  • Salary: to be discussed
  • Contract position

  • Start date: 1 position to fill as soon as possible

Our valued telecommunications client is seeking a Senior Application Specialist (Salesforce) to build reliable data pipelines and partner with stakeholders to deliver impactful analytics and reporting.


Responsibilities:

  • Design, develop, and maintain data pipelines and workflows for ingestion, transformation, and delivery of clean, reliable business data for analysis and reporting.
  • Collaborate with business teams to gather requirements, perform data validation, and support UAT/demos.
  • Extract, integrate, and transform data from diverse systems including Workday, Salesforce, on-prem and SaaS applications using APIs, JDBC/ODBC, and native/direct connections.
  • Write and optimize advanced SQL for data modeling, transformation, and cost-efficient query execution.
  • Build and optimize Power BI datasets, models, and dashboards for business insights and performance tracking.
  • Use Databricks Notebooks with Python and/or Scala for data preparation, automation, and analysis.
  • Monitor and optimize compute resources and job performance for cost control and efficiency.
  • Document data pipelines, transformation logic, and architecture for transparency and maintainability.


Must have skills:

  • 2-3 years of Salesforce development experience
  • Salesforce administration experience
  • Experience integrating data from business applications such as Workday and Salesforce (via APIs, reports, or connectors)
  • Salesforce declarative tools (Flow Builder, Process Builder)
  • Advanced SQL skills for large-scale data transformation and optimization
  • Ability to manage and transform data from on-premises and cloud systems


Nice to have skills:

  • Strong hands-on experience with Databricks (including Delta Lake, Spark SQL, and Notebooks)
  • Strong working knowledge of Power BI (data modeling, DAX, dashboard design, publishing)
  • Fundamental knowledge of Apache Spark (architecture, RDDs, DataFrames, optimization)
  • Experience with query and compute cost optimization in Databricks or similar platforms
  • Familiarity with data governance, security, and metadata management
  • Exposure to CI/CD for data pipelines using Git or DevOps tools
  • GenAI agent and/or ML experience
  • Relevant certifications (e.g., Databricks Certified Data Engineer, Microsoft Power BI Data Analyst, Workday Reporting Specialist)
Disclaimer:
AI may be used in evaluating candidates.
This posting is for an existing vacancy.
Requirements

Education level

Not specified

Years of experience

Not specified

Written languages

Not specified

Spoken languages

Not specified