Data Operations Engineer
Deloitte
About the role
As a Data Operations Engineer on Converge for Healthcare's Expert Services team, you will play a hands-on technical role connecting client source data to the foundational data models powering Deloitte's Data Studio platform - a growing portfolio of healthcare provider analytics products including Revenue Intellect™, Care Intellect™, SMarT Rapid Analytics, and Supply Chain Intellect™.
In this role, you will work at the intersection of data engineering, cloud platform operations, and applied AI - designing and operating the cloud-native data pipelines that turn messy, real-world healthcare data into reliable, decision-ready analytics. You will work across both subscription-based product delivery and Deloitte Consulting engagements where Data Studio is embedded as a core enabler, partnering primarily with engineering, data, and product teams, and occasionally engaging directly with client data teams to resolve integration challenges.
This position is well suited for engineers who enjoy building durable data systems, working through ambiguity in real-world data, and applying emerging AI tooling to push the ceiling on what a small team can deliver - within a rapidly evolving healthcare analytics product ecosystem.
Recruiting for this role ends on 05/21/2026.
Work you'll do
As a Data Operations Engineer on Converge for Healthcare's Expert Services team, you will be responsible for:
- Data integration & pipeline engineering. Design, build, and optimize cloud-native ETL/ELT pipelines that ingest client source data and conform it to the Data Studio platform's foundational data model - making real-world healthcare data ready to power production analytics.
- Data validation, profiling & quality. Profile, validate, and QA large, complex healthcare datasets for accuracy, completeness, and conformance to platform standards; combine traditional debugging with LLM-enabled data exploration and ML-based anomaly detection to find and resolve issues faster than manual approaches allow, partnering with client and Deloitte teams as needed when integration issues require it.
- Analytics & insight enablement. Develop the analytics layer of the Data Studio platform - including BI dashboards, self-service reporting, and ML Lab workflows - putting validated, production-ready data in the hands of consulting teams and clients.
- Automation & orchestration. Implement and maintain workflow automation, monitoring, and alerting using event-driven architectures and orchestration tools, with the goal of building systems that run reliably without constant intervention.
- Product collaboration & solution evolution. Act as a hands-on technical voice into the Data Studio platform's evolution - translating real-world delivery learnings into concrete product, data model, and platform enhancement opportunities, and partnering with product and engineering teams to validate and pressure-test new capabilities before they ship.
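As a flavor of the validation and profiling work described above, here is a minimal, hypothetical sketch of profiling a numeric column with a robust anomaly check. The column name and values are illustrative assumptions, not the platform's actual data model.

```python
from statistics import median

def profile_column(values, threshold=3.5):
    """Profile a numeric column: null count plus MAD-based outlier flags.

    The median-absolute-deviation (MAD) score is robust: a single huge
    bad value cannot inflate the measure of spread and hide itself,
    which a classic mean/stdev z-score can do on small samples.
    """
    clean = [v for v in values if v is not None]
    med = median(clean)
    mad = median(abs(v - med) for v in clean) or 1e-9  # guard divide-by-zero
    outliers = [v for v in clean if 0.6745 * abs(v - med) / mad > threshold]
    return {
        "count": len(values),
        "nulls": len(values) - len(clean),
        "outliers": outliers,
    }

# Hypothetical charge amounts: one missing entry, one implausible value
charges = [120.0, 135.5, 128.0, None, 131.2, 9_999.0, 125.4]
report = profile_column(charges)
print(report)  # -> {'count': 7, 'nulls': 1, 'outliers': [9999.0]}
```

In practice this kind of check would run inside a pipeline step and feed monitoring or alerting rather than print to stdout.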
A successful candidate will possess these skills:
- Expert SQL proficiency, including complex query authoring, data profiling, performance tuning, and query optimization across large-scale, messy datasets
- Strong Python proficiency for data wrangling, scripting, automation, and integrating ML/AI capabilities into data pipelines
- Hands-on experience designing and operating cloud-native data pipelines, with judgment around when to use which tool and how to debug distributed systems when things break; practical familiarity with AWS data services (e.g., Redshift, Glue, S3, Step Functions, Lambda) and exposure to AWS AI/ML services (e.g., Bedrock, SageMaker) a plus
- Sound data modeling judgment, including conforming heterogeneous source data to standardized analytics models without losing fidelity
- Demonstrated experience working with large, complex datasets across structured, semi-structured, and unstructured formats
- Forward-thinking engineering mindset, including fluency with modern code collaboration workflows (Git, pull requests, code review), practical use of AI-assisted development tools (e.g., Claude Code, GitHub Copilot), and curiosity about emerging AI/ML techniques such as agentic patterns, RAG, and vector databases
- Working familiarity with modern BI tools (e.g., Tableau, Power BI, Superset) and workflow orchestration platforms (e.g., Airflow, Step Functions)
- Strong ownership mindset and comfort with ambiguity - able to self-manage priorities, juggle concurrent workstreams, and adapt as priorities shift
- Clear communicator who works well with distributed engineering and product teams, and with occasional client or consulting stakeholders, including across international time zones
- Awareness of Responsible and Trustworthy AI principles, including data privacy, bias mitigation, and governance in AI-driven workflows
- Working knowledge of healthcare data formats and interoperability standards (e.g., claims, remittances, EMR data, HL7, FHIR, X12 EDI), with practical experience handling their quirks, version differences, and typical data quality patterns
- Working understanding of the broader healthcare data ecosystem - including how revenue cycle, clinical, and operational datasets relate; how core coding systems (ICD, CPT, HCPCS, DRG) interact; and basic awareness of HIPAA and PHI handling considerations
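To illustrate the SQL profiling skill listed above, here is a minimal sketch using Python's built-in sqlite3 module. The `claims` table, its columns, and the sample rows are hypothetical stand-ins; real platform schemas and warehouses (e.g., Redshift) will differ.

```python
import sqlite3

# Hypothetical claims table; real schemas will differ
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, payer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("c1", "payer_a", 120.0), ("c2", None, 95.5), ("c3", "payer_a", None)],
)

# Profile the table: row count, per-column null rates, distinct payers
row = conn.execute(
    """
    SELECT COUNT(*) AS n_rows,
           1.0 * SUM(CASE WHEN payer  IS NULL THEN 1 ELSE 0 END) / COUNT(*)
               AS payer_null_rate,
           1.0 * SUM(CASE WHEN amount IS NULL THEN 1 ELSE 0 END) / COUNT(*)
               AS amount_null_rate,
           COUNT(DISTINCT payer) AS distinct_payers
    FROM claims
    """
).fetchone()
print(row)
```

The same null-rate and cardinality pattern scales to large warehouse tables, where it is typically generated per column rather than written by hand.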
The team
This role sits within the Converge for Healthcare Expert Services team, part of Deloitte Consulting's Innovation & Delivery Transformation (I&DT) practice. I&DT brings an engineering- and innovation-led mindset to how Deloitte builds, delivers, and scales technology-enabled solutions - organizing teams to move quickly from idea to implementation and operate effectively in a rapidly evolving, technology-driven market.
Converge for Healthcare is Deloitte's industry-focused asset studio for healthcare, responsible for developing and operating analytics, data, and AI-enabled products purpose-built for healthcare organizations. The Data Studio platform powers the Intellect product suite - including Revenue Intellect, Care Intellect, and Supply Chain Intellect - and serves as the foundational data and analytics layer across Converge for Healthcare's product portfolio.
Data Operations Engineers operate at the intersection of data engineering, product, and delivery - primarily collaborating with internal engineering, data, and product teams, and occasionally engaging with client teams and Deloitte Consulting practitioners to ensure data flows are reliable, performant, and continuously improving based on real-world delivery experience.
Qualifications
Required:
- Bachelor's degree in Computer Science, Information Systems, Engineering, Health Informatics, or a related technical discipline
- 3+ years of hands-on experience with data operations, ETL/ELT development, and cloud-native data integration
- 3+ years of expert-level SQL experience
- 2+ years of Python experience
- Ability to travel up to 15%, on average, based on the work you do and the clients and industries/sectors you serve
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Preferred:
- Master's degree in Computer Science, Engineering, Information Systems, or a related technical discipline
Compensation
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $84,400 - $155,400.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.