
Senior Operations & Data Engineer (Informatica Specialist)

Talent Groups

Denver, CO · Hybrid · Contract · Senior · $54 – $64/hr

About the role

Location: Denver, CO (Hybrid)

Duration: 12 months to start

Job Description

• The client is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Informatica ecosystem.

• This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data is managed within a secure, high-uptime, and cost-effective environment.

Preferred Qualifications

To be considered for this role, candidates should provide proof of the following:

• Active Informatica Certification (e.g., Informatica Certified Professional in Cloud Data Integration)

• Background Clearance Readiness: Full eligibility to pass the client's background checks, including FTI (Federal Tax Information) and CJIS (Criminal Justice Information Services) clearances.

Key Responsibilities

Platform Operations & Administration

• Informatica Mastery: Act as the lead administrator for Informatica environments; manage platform uptime, vendor escalations, and patch/versioning communications.

• Environment Provisioning: Configure Informatica environments, including complex RBAC (Role-Based Access Control) and security permissions.

• Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.

• Financial Stewardship: Configure cost-management features such as Informatica resource monitors, budgets, and consumption tracking; consult on chargeback models.

• Governance Support: Support business processes for data governance and management.

Data Engineering & Transformation

• Pipeline Architecture & Metadata Ingestion: Develop robust ETL/ELT pipelines to ingest data from transactional (line-of-business) systems into the analytical Snowflake environment, and to ingest metadata and data from Secure Agents into Informatica.

• Data Lineage Management: Configure and manage data lineage maps.

• Data Quality Profile Management: Design and deploy automated data cleansing and quality-check policies and processes.

• Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.

Primary Deliverables

• Architectural Contributions: Design reviews, architectural plans, and scope documents.

• Deployment Assets: New account/environment deployments, security schemas, and permission assignments.

• Engineering Assets: Comprehensive ETL pipeline design documents, mapping documents, production-ready pipelines, and data quality profile scores.

• Product Backlog & Support Ticket Management, including performance reports.

• Weekly Status Reports.

#LI-Hybrid
