
Telecom ETL Data Architect

Jobs via Dice

Philadelphia · On-site · Full-time · Senior · Posted yesterday

About

Tata Elxsi brings together the best technology and user‑centric design expertise to help customers deliver innovative solutions and great consumer experiences. Our integrated Design and Technology teams help enterprises reimagine their products and services – from strategy, consumer research and insights, to service and experience design, technology implementation, integration, launch, and beyond.

Role Overview

We are looking for an experienced Telecom ETL Data Architect who can design, architect, and optimize large‑scale data pipelines within the telecom ecosystem. This role requires deep hands‑on skills in PySpark, Spark clusters, Databricks, and ETL orchestration, along with strong communication, articulation, and storytelling abilities. The ideal candidate will translate complex technical workflows into clear business narratives, influence stakeholders through structured communication, and derive use cases from domain data insights.

Key Responsibilities

Data Architecture & Engineering

  • Design and architect scalable, high‑performance ETL/ELT data pipelines using PySpark, Python, and Spark clusters.
  • Develop data models and frameworks aligned with telecom processes such as buyflow, billing, usage, customer management, and order processing.
  • Build, optimize, and monitor pipelines on Databricks (Delta Lake, Workflows, Cluster configuration).
  • Define and enforce ETL standards, data quality rules, and engineering best practices.
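To make the "data quality rules" responsibility concrete, here is a minimal, framework‑agnostic sketch of rule‑based record validation of the kind such a pipeline would enforce (in Databricks this would typically be expressed with Delta Lake constraints or PySpark filters instead). All field and rule names below are illustrative, not taken from the posting.

```python
# Illustrative data-quality enforcement: each rule is a named predicate
# applied per record; failing records are quarantined, not silently dropped.
# Field names (account_id, usage_mb, bill_amount) are hypothetical telecom
# billing fields chosen for the example.

RULES = {
    "has_account_id": lambda r: bool(r.get("account_id")),
    "non_negative_usage": lambda r: r.get("usage_mb", 0) >= 0,
    "bill_amount_present": lambda r: r.get("bill_amount") is not None,
}

def apply_quality_rules(records):
    """Split records into (clean, quarantined) lists based on RULES."""
    clean, quarantined = [], []
    for record in records:
        failed = [name for name, rule in RULES.items() if not rule(record)]
        if failed:
            quarantined.append({"record": record, "failed_rules": failed})
        else:
            clean.append(record)
    return clean, quarantined
```

Keeping rules as named predicates makes the rule set auditable and lets the quarantine table record exactly which standard each bad record violated.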

Domain & Business Integration

  • Work closely with business and product teams across telecom functions:
    • Customer / Account onboarding
    • Buyflow journeys
    • Billing & payments
    • Network provisioning
    • Customer service & troubleshooting
  • Translate domain processes into logical and physical data flows.

Communication & Storytelling

  • Clearly articulate technical solutions to non‑technical stakeholders.
  • Create data stories and high‑impact presentations connecting data insights to business outcomes.
  • Communicate architectural decisions, trade‑offs, and roadmap recommendations.
  • Present complex architecture in simplified, visual storytelling formats.

Orchestration & Operations

  • Implement and maintain job scheduling/orchestration using Rundeck or similar tools.
  • Build monitoring, logging, and automated recovery mechanisms for pipelines.
  • Ensure end‑to‑end performance tuning, cost optimization, and SLA adherence.
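The "automated recovery" bullet above can be sketched as a simple retry‑with‑backoff wrapper around a pipeline step, with the final failure surfaced to the scheduler (Rundeck, Airflow, Control‑M) for alerting. This is a minimal stdlib sketch under assumed defaults; real retry counts and delays would be tuned per job.

```python
import time

def run_with_retries(step, max_attempts=3, base_delay_s=1.0, sleep=time.sleep):
    """Run step(); on exception, retry with exponential backoff.

    After max_attempts failures the exception is re-raised so the
    orchestrator can mark the job failed and trigger its own alerting.
    The sleep parameter is injectable so tests can skip real waiting.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise  # hand the failure back to the scheduler
            sleep(base_delay_s * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

In practice this logic often lives in the orchestrator's own retry configuration rather than in code; the sketch only shows the recovery semantics being asked for.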

Collaboration & Leadership

  • Work with cross‑functional engineering teams, product owners, and solution architects.
  • Mentor junior data engineers and set coding and architecture standards.
  • Drive design sessions, code reviews, and architecture assessments.

Required Skills

Technical Skills

  • Strong expertise in PySpark, Spark SQL, Spark Streaming.
  • Proven experience building pipelines in Databricks (Delta Lake, Jobs, Clusters).
  • Solid Python programming and modular ETL development experience.
  • Experience with telecom systems, data domains, and operational workflows.
  • Knowledge of CI/CD, version control (Git), and cloud platforms (Azure/AWS/GCP).
  • Experience with orchestration tools such as Rundeck, Airflow, or Control‑M.

Soft Skills

  • Excellent articulation and structured communication.
  • Ability to simplify complex data concepts into business‑friendly narratives.
  • Strong problem‑solving, analytical, and decision‑making skills.
  • Ability to lead conversations with senior stakeholders.
  • A natural storyteller with the ability to craft architecture and data stories.

Preferred Qualifications

  • Background working with large telecom providers.
  • Experience with event‑driven architecture, Kafka, or real‑time streaming.
  • Understanding of telecom KPIs, customer journeys, and operational systems.
  • Experience in performance tuning of Spark workloads.

Benefits & Perks

  • Excellent healthcare options: medical, vision, prescription & dental.
  • Family focus & balance: medical, commuter & dependent FSA, competitive PTO, sick time, and Employee Assistance Program.
  • Financial security: competitive 401(k) match with Safe Harbor Plan.
  • Employee recognition programs.
  • Perks at work: exclusive one‑stop online discount marketplace.

Skills

AWS, Azure, Databricks, Delta Lake, Git, Google Cloud Platform, Kafka, Python, PySpark, Rundeck, Spark, Spark SQL, Spark Streaming, event-driven architecture
