
Lead Data Engineer

Enormous Enterprise LLC

Pennington, NJ · Full-time · Lead

About the role

Company:

Enormous Enterprise LLC

Location:

Pennington, NJ

Pay:

DOE

Posted:

April 04, 2026


Description:

Lead Data Engineer

Location: Pennington, NJ (hybrid: 3 days onsite, 2 days remote)

Local or nearby candidates only

Visa: W2, with at least 12 months of visa validity

Duration: 12-month contract (with high likelihood of extension)

Description:

• We are seeking an experienced and results-driven Data Technology Leader to lead enterprise-scale data initiatives.
• This role requires strong technical expertise in data preparation, orchestration, and integration, combined with leadership skills to manage multiple scrum teams and deliver major rollouts.
• The ideal candidate will have hands-on experience with modern data tools, workflow automation, and architecture design for large-scale projects.

Key Responsibilities:

• Lead multiple scrum teams to deliver enterprise-level data solutions and architecture.
• Administer and manage data preparation tools such as Alteryx, RapidMiner, and Tableau Prep.
• Design, optimize, and tune SQL/PL-SQL queries for high performance.
• Develop and maintain Python-based solutions for data processing and automation.
• Implement and maintain CI/CD pipelines across multiple environments.
• Mentor development teams on best practices for performance optimization and code quality.
• Troubleshoot and resolve performance issues, ensuring scalability and reliability.
• Collaborate in a shared services environment to support cross-functional initiatives.
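The SQL query-tuning responsibility above can be illustrated with a toy sketch. This is not the employer's environment: SQLite stands in for the actual database, and the table and index names are hypothetical. The point is the tuning workflow itself, i.e. reading the query plan before and after adding an index:

```python
import sqlite3

# Toy illustration of SQL query tuning (hypothetical table/index names):
# an index turns a full-table scan into an indexed search.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before indexing: the planner scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After indexing: the planner searches via the index instead.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

print(plan_before)  # e.g. "SCAN orders"
print(plan_after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same before/after plan comparison applies to Oracle PL/SQL or any other engine named in the posting; only the `EXPLAIN` syntax differs.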

Required Qualifications:

• Hands-on experience with workflow orchestration for data-driven solutions.
• Expertise in optimizing data integration pipelines using ETL tools.
• Strong knowledge of task dependency tuning and scheduling for scalability.
• Proficiency with OpenShift containers and containerized deployments.
• Experience with Airflow or similar batch processing tools.
• Solid understanding of CI/CD automation and pipeline management.
• Ability to lead large-scale architecture initiatives and enterprise rollouts.
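The orchestration and task-dependency requirements above amount to scheduling a DAG of tasks. A minimal stdlib sketch of the idea (an Airflow-style DAG in miniature; the task names are hypothetical, not from the posting):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it can run,
# mirroring the dependency declarations in an Airflow DAG.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

ts = TopologicalSorter(dag)
ts.prepare()

run_order = []
while ts.is_active():
    # get_ready() yields every task whose dependencies are satisfied,
    # so independent tasks could be dispatched in parallel at this point.
    ready = list(ts.get_ready())
    run_order.extend(sorted(ready))
    for task in ready:
        ts.done(task)

print(run_order)  # ['extract', 'validate', 'transform', 'load', 'report']
```

In Airflow the same chain would be declared with operators and `extract >> validate >> transform >> load >> report`; the scheduler performs the equivalent readiness bookkeeping shown here.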

Desired Skills:

• Familiarity with Operational Insights (preferred).
• Experience in ETL automation and orchestration.
• Strong Python programming and scripting skills.
• Ability to mentor and guide teams in best practices.
• Exposure to shared services environments and enterprise-level governance.




Skills

Alteryx, RapidMiner, Tableau Prep, SQL, PL-SQL, Python, OpenShift, Airflow, CI/CD, ETL
