
Data Platform Engineer (DevOps + Data Engineering)

EpochGeo

Reston, VA · Hybrid · Full-time · Entry Level · $100k – $140k/yr

About EpochGeo

EpochGeo is looking for a junior- to mid-level Data Platform Engineer (DevOps + Data Engineering) to support one of the most exciting, fast-growing automated analytics projects in the national security space.

You’ll work at the intersection of DevOps and data engineering across Airflow, Spark, Kubernetes, and modern table/storage patterns (Iceberg/Parquet/HDFS-like environments).

We are looking for a software engineer who will focus on DevOps and data engineering tasks to help us maintain our data platform.

We have built custom partitioning and data-management solutions to optimize analytical queries over our data lake, and we have a long-term goal of developing an analytical agent that can assist users in working with the data.

Your Background

Your background likely includes:

  • 2-5 years of experience in software, data engineering, platform engineering, or DevOps.
  • Strong Python and SQL skills.
  • Experience with Linux and command-line troubleshooting.
  • Working knowledge of Kubernetes fundamentals (pods, events, logs, resources).
  • Familiarity with workflow orchestration (Airflow preferred).
  • Familiarity with distributed data processing (Spark preferred).
  • Strong debugging habits and a bias for practical, measurable improvements.

Nice to Have

  • Experience with Iceberg/Trino/Presto ecosystems.
  • Experience with HDFS or large-scale object/file data platforms.
  • Experience tuning Spark jobs (partitioning, shuffle, memory).
  • Experience with incident response and postmortem-driven improvements.
  • Exposure to LLMs, retrieval patterns, semantic search, or agentic data tools.

What you will be doing

  • Operate and improve batch data pipelines in Airflow and Spark.
  • Troubleshoot production incidents: failed jobs, OOMs, stuck deployments, and performance regressions.
  • Support Kubernetes/Helm operations for data services and scheduled workloads.
  • Improve observability, alerting, and runbooks so incidents are easier to detect and resolve.
  • Strengthen CI/CD and release quality for DAG/operator changes.
  • Partner with engineers on backfills, schema/table maintenance, and data platform migrations.
  • Contribute to foundational data and platform capabilities that will power an analytical AI assistant for internal users.

Additional Requirements

  • Must hold a TS/SCI clearance and be willing to submit for a CI polygraph.

Job Details

  • Job Type: Full-time
  • Pay: $100,000.00 - $140,000.00 per year

Benefits

  • 401(k)
  • 401(k) matching
  • Dental insurance
  • Flexible schedule
  • Flexible spending account
  • Health insurance
  • Health savings account
  • Life insurance
  • Paid time off
  • Parental leave
  • Referral program
  • Tuition reimbursement
  • Vision insurance

Experience

  • Python: 1 year (Required)

Security Clearance

  • Top Secret (Required)

Location Details

  • Ability to Commute: Reston, VA 20191 (Required)
  • Ability to Relocate: Reston, VA 20191: Relocate before starting work (Required)
  • Work Location: Hybrid remote in Reston, VA 20191

Skills

Airflow · CI/CD · HDFS · Helm · Iceberg · Kubernetes · Linux · LLMs · Parquet · Python · SQL · Spark · Trino · Presto
