
Data Architect

Spruce InfoTech Inc.

Blue Bell · On-site · Full-time · Lead · Posted yesterday

Role Overview

We are seeking a visionary Microsoft Fabric Architect & Administrator to lead our transition into a unified, AI-ready data ecosystem. In this role, you will be the primary owner of our Microsoft Fabric tenant—responsible for both the high-level architectural design of our OneLake data estate and the rigorous governance, security, and capacity management required to keep it running efficiently.

The ideal candidate bridges the gap between deep technical data engineering (Spark, SQL, Delta Lake) and strategic platform administration.

Key Responsibilities

Administration & Governance

  • Tenant Management: Configure and audit tenant settings, ensuring feature parity and security compliance across the organization.
  • Capacity Optimization: Monitor Fabric Capacity (F-SKUs) usage via the Capacity Metrics app; implement scaling strategies to balance performance and cost.
  • Security: Enforce fine-grained access control (RBAC), Row-Level Security (RLS), and Object-Level Security (OLS) across the Fabric ecosystem.
  • Purview Integration: Manage data discovery and lineage through Microsoft Purview so that data democratization remains compliant; design and enforce sensitivity labels to protect data across Fabric.
  • Governance: Configure DLP policies to prevent unauthorized sharing or leakage of sensitive data from Lakehouses and Semantic Models.
  • Logging, Alerting & Capacity Management: Integrate Fabric with Azure Log Analytics for long-term auditing, trend analysis, and proactive monitoring; manage throttling and smoothing events, identify "noisy neighbor" workloads, and implement scaling or staggering strategies to maintain performance.
  • Power BI: Establish standards and conduct training for non-IT Power BI users in support of our data governance programs.
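The smoothing and throttling behavior referenced above can be illustrated with a deliberately simplified model. This sketch assumes background usage is spread evenly over a 24-hour window; the real Capacity Metrics accounting is more detailed, and the CU figures below are hypothetical:

```python
# Simplified illustration of Fabric-style capacity smoothing (assumption:
# background CU-seconds spread evenly over a 24-hour window; the actual
# Capacity Metrics app models bursting and smoothing in finer detail).

def smoothed_cu(cu_seconds: float, window_hours: float = 24.0) -> float:
    """Per-second CU draw a background job adds once its total
    CU-seconds are smoothed across the window."""
    return cu_seconds / (window_hours * 3600)

def is_throttled(interactive_cu: float, background_cu_seconds: list[float],
                 capacity_cu: float) -> bool:
    """True when the smoothed background draw, on top of the steady
    interactive load, exceeds the capacity SKU's CU limit."""
    total = interactive_cu + sum(smoothed_cu(s) for s in background_cu_seconds)
    return total > capacity_cu

# Hypothetical F64 capacity (64 CUs) carrying 40 CUs of interactive load:
# one refresh of 1,728,000 CU-seconds smooths to 20 CUs and fits,
# but two such refreshes smooth to 40 CUs and exceed the SKU.
print(is_throttled(40, [1_728_000], 64))             # False
print(is_throttled(40, [1_728_000, 1_728_000], 64))  # True
```

Staggering the second refresh into a quieter window, as in the "staggering strategies" above, is what keeps the smoothed total under the SKU limit.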

Architecture & Engineering

  • Design & Implement: Oversee the build-out of our Medallion Architecture (Bronze, Silver, Gold) using Fabric Lakehouses and Warehouses; provide best practices to data developers to ensure Fabric capacity is not over-consumed and remains optimized.
  • Data Integration: Develop efficient orchestration patterns using Data Factory pipelines and Dataflows Gen2, using Shortcuts to minimize data duplication; design, implement, and manage all connectivity to Microsoft Fabric, including data-load sources from SaaS providers.
  • Performance Tuning: Optimize semantic models for Direct Lake mode to provide near-real-time reporting without the overhead of traditional Import mode.
  • CI/CD & DevOps: Establish deployment pipelines using Git integration to manage lifecycle transitions between Dev, Test, and Production.
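As a sketch of the orchestration pattern above, Bronze → Silver → Gold dependencies can be expressed as a DAG and ordered before scheduling pipeline activities. The table names here are hypothetical illustrations, not an actual data estate:

```python
from graphlib import TopologicalSorter

# Hypothetical medallion dependencies: Gold aggregates read Silver tables,
# which are cleansed from Bronze raw ingests (illustrative names only).
deps = {
    "silver.sales":       {"bronze.sales_raw"},
    "silver.customers":   {"bronze.crm_raw"},
    "gold.sales_summary": {"silver.sales", "silver.customers"},
}

# static_order() yields a valid execution order: every Bronze ingest runs
# before the Silver table that reads it, and every Silver before Gold.
run_order = list(TopologicalSorter(deps).static_order())
print(run_order)
```

A Data Factory pipeline would typically encode the same ordering with activity dependencies; computing it explicitly like this is useful when generating pipeline definitions or validating that a proposed schedule respects the medallion layering.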

Required Qualifications

  • Experience: 5+ years in Data Architecture or Engineering, with at least 1-2 years of hands-on experience in Microsoft Fabric (or deep expertise in Power BI + Azure Synapse/Databricks).
  • Technical Mastery: Proficient in PySpark/Notebooks, T-SQL, and DAX.
  • Data Strategy: Proven track record of designing "OneLake" architectures that break down departmental data silos.
  • Certifications: (Preferred) DP-600: Microsoft Fabric Analytics Engineer Associate or AZ-305: Azure Solutions Architect Expert.

Main Skills

  • Microsoft Fabric
  • DAX
  • T-SQL
  • PySpark
  • Power BI


Skills

Azure Synapse · DAX · Databricks · Delta Lake · Git · Microsoft Fabric · Microsoft Purview · Power BI · PySpark · SQL · T-SQL
