
Data BA

HireOn Tech

Canada · Hybrid · Contract · Today

About the role

Responsibilities

  • Design and maintain physical and logical data models for Capital Markets domains, including trade execution, portfolio positions, market data (ticks/pricing), and risk metrics.
  • Translate complex financial hierarchies (e.g., fund-of-funds, multi-asset class structures) into optimized Databricks structures.
  • Ensure data models support temporal requirements, such as Point-in-Time (PIT) analysis and "As-Of" financial reporting.
  • Lead the creation of comprehensive Source-to-Target Mapping (STM) documents, detailing the journey from legacy financial systems and external providers (e.g., Bloomberg, Reuters, Aladdin) to ADLS Gen2.
  • Define complex transformation logic, business rules, and data enrichment steps within the STM to guide Data Engineering squads.
  • Map data lineage to ensure full traceability for regulatory compliance.
  • Design Delta Lake structures that prioritize shuffle-free joins across massive capital markets datasets.
  • Implement optimized partitioning and Z-Ordering strategies specifically for time-series financial data to enable high-speed analytics.
  • Utilize Unity Catalog to govern data access and maintain a centralized metadata repository for the GWAM ecosystem.
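The Point-in-Time / "As-Of" requirement above boils down to "return the latest observation at or before a given timestamp." A minimal pure-Python sketch of that lookup logic (the `as_of` helper and the toy tick data are hypothetical illustrations, not any production API):

```python
# Sketch of "As-Of" (Point-in-Time) lookup logic over a price history
# sorted ascending by timestamp. Illustrative only.
from bisect import bisect_right

def as_of(history, ts):
    """Return the latest (timestamp, value) pair at or before ts.

    `history` must be a list of (timestamp, value) tuples sorted
    ascending by timestamp.
    """
    # bisect_right finds the insertion point after any entry equal to ts,
    # so history[i - 1] is the last observation known "as of" ts.
    i = bisect_right([t for t, _ in history], ts)
    if i == 0:
        return None  # no observation existed at that point in time
    return history[i - 1]

# Hypothetical tick history for one instrument: (timestamp, price)
ticks = [(1, 100.0), (5, 101.5), (9, 99.8)]

print(as_of(ticks, 7))  # latest price known as of t=7 -> (5, 101.5)
print(as_of(ticks, 0))  # before any observation -> None
```

In a Delta Lake setting the same semantics are typically expressed as an as-of join or via Time Travel rather than in application code; the sketch only pins down what "As-Of" correctness means.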

Required Skills & Qualifications

  • Industry Expertise: 5+ years of experience in Capital Markets or Wealth Management, with a deep understanding of financial instruments and trade lifecycles.
  • Technical Modeling: 7+ years of experience in data modeling, with mastery of Azure Databricks and ADLS Gen2.
  • STM Mastery: Proven track record of creating highly detailed Source-to-Target Mappings for complex data migration or integration projects.
  • Data Engine Proficiency: Expert-level Spark SQL and PySpark. Ability to optimize data structures for the Spark Catalyst Optimizer.
  • Storage Formats: Expertise in Delta Lake (ACID transactions, Time Travel) and Parquet optimization.
  • Governance: Hands-on experience implementing data governance and security via Unity Catalog.
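The Z-Ordering strategy called for above is based on the Z-order (Morton) space-filling curve: interleaving the bits of several columns yields a single sort key that keeps rows close on all of those columns at once, which is what makes file-level min/max data skipping effective. A toy sketch of the interleaving idea (not Delta Lake's actual implementation, which range-partitions real column values first):

```python
# Toy illustration of the bit-interleaving behind Z-ordering: two column
# values are woven into one Z-order key, so sorting by that key clusters
# rows that are close in BOTH dimensions.
def morton_interleave(x: int, y: int, bits: int = 16) -> int:
    """Interleave the low `bits` bits of x and y into one Z-order key."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)      # bits of x at even positions
        z |= ((y >> i) & 1) << (2 * i + 1)  # bits of y at odd positions
    return z

# x = 0b0011 (3), y = 0b0101 (5) interleave to 0b100111 (39)
print(morton_interleave(3, 5))  # -> 39
```

Sorting a time-series trade table by such a key over, say, instrument and timestamp columns lets each Parquet file cover a tight range of both, so queries filtering on either column can skip most files.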

Technical Stack

  • Compute: Azure Databricks (Jobs, SQL Warehouses).
  • Storage: ADLS Gen2 (Delta/Parquet).
  • Governance: Unity Catalog.
  • Analysis Tools: SQL, Python, Excel (for data profiling).
  • Documentation: Confluence/Visio for STM and ERDs.

Skills

ADLS Gen2 · Azure Databricks · Bloomberg · Confluence · Databricks · Delta Lake · Excel · Parquet · PySpark · Reuters · Spark SQL · SQL · Unity Catalog · Visio
