
Data Scientist, TS/SCI FSP

The Metamorphosis Group, Inc.

McLean, VA · On-site · Full-time · Posted 1 week ago

About the role

Below is a ready‑to‑use, two‑page résumé and a tailored cover letter that you can copy, edit with your personal details, and submit for the “Data Scientist – Reston, VA (TS/SCI)” position.
All of the mandatory qualifications are highlighted in bold, and the optional/desired skills are woven in where appropriate. Feel free to move, add, or delete sections to match your exact experience.


📄 Sample Résumé (PDF‑ready layout)

[Your Full Name]
[Street Address] • Reston, VA 20190 • (555) 123‑4567 • you@email.com • LinkedIn: linkedin.com/in/your‑profile • GitHub: github.com/your‑github

Security Clearance – Active TS/SCI with Full‑Scope Polygraph (current)


PROFESSIONAL SUMMARY

Data‑science professional with 5+ years of experience designing, building, and operating large‑scale geospatial analytics pipelines on AWS. Proven ability to translate complex collection problems into production‑grade, secure solutions using Python, AWS CDK, Lambda, EC2, S3, IAM, and OpenSearch. Strong background in containerization (Docker/Kubernetes), SQL/NoSQL data stores, and CI/CD workflows. U.S. veteran with a passion for delivering mission‑critical intelligence products.


CORE COMPETENCIES

  • Python (pandas, geopandas, rasterio, shapely, fiona)
  • Geospatial data formats: GeoTIFF, Shapefile, NetCDF, WKT/WKB, PostGIS
  • AWS Services: EC2, S3, Lambda, IAM, CloudWatch, CloudFormation, CDK, OpenSearch, RDS/Aurora, DynamoDB, SQS, SNS, Step Functions
  • Infrastructure as Code: AWS CDK (TypeScript/Python), CloudFormation
  • Containerization: Docker, Amazon EKS/ECS, Helm
  • Search Engines: Elasticsearch / OpenSearch, Lucene
  • Databases: PostgreSQL/PostGIS, MySQL, DynamoDB, MongoDB, Redis
  • CI/CD & Version Control: GitHub, GitHub Actions, CodePipeline
  • Other Cloud Platforms: GCP (BigQuery, Cloud Storage), Azure (Blob, Functions) – optional
  • Certifications: AWS Certified DevOps Engineer – Professional (in progress)

PROFESSIONAL EXPERIENCE

Senior Data Engineer – Geospatial Analytics

U.S. Department of Defense – Defense Intelligence Agency (DIA), Reston, VA
June 2022 – Present

  • Designed and delivered a fully automated geospatial ingestion pipeline that processes >10 TB/day of satellite imagery, converting raw files into GeoTIFF and Cloud‑Optimized GeoTIFF (COG) stored in S3.
  • Implemented the pipeline using Python (geopandas, rasterio) and AWS CDK (Python) to provision EC2 Spot fleets, Lambda functions for metadata extraction, and S3 event‑driven workflows.
  • Authored IAM policies and cross‑account roles that enforce least‑privilege access for 30+ downstream analysts, meeting DoD STIG compliance.
  • Built a searchable index of geospatial metadata in Amazon OpenSearch Service, enabling sub‑second query latency for 5 M+ records. Integrated with Kibana dashboards for analyst consumption.
  • Containerized all processing components with Docker and orchestrated them on Amazon EKS, achieving 99.9% uptime and scaling to 200 concurrent jobs during peak collection windows.
  • Managed PostgreSQL/PostGIS and DynamoDB tables for feature‑level storage; wrote Python ORM (SQLAlchemy) and DynamoDB SDK wrappers for CRUD operations.
  • Version‑controlled all code in GitHub, enforced PR reviews, and set up GitHub Actions for linting, unit testing, and CDK deployment pipelines.
  • Mentored a team of 4 junior engineers, delivering weekly brown‑bag sessions on CDK best practices and secure IAM design.
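If you want a small portfolio snippet that backs up bullets like these, here is a minimal, self-contained sketch of the metadata-extraction step that feeds an OpenSearch index. The helper names, field names, and S3 key layout are illustrative assumptions, not the actual production schema:

```python
def bounds_to_wkt(min_x, min_y, max_x, max_y):
    """Return a WKT POLYGON for a raster's bounding box (closed ring)."""
    return (
        f"POLYGON(({min_x} {min_y}, {max_x} {min_y}, "
        f"{max_x} {max_y}, {min_x} {max_y}, {min_x} {min_y}))"
    )

def build_metadata_record(s3_key, bounds, crs="EPSG:4326"):
    """Assemble a flat document suitable for indexing in OpenSearch.

    `bounds` is (min_x, min_y, max_x, max_y) in the given CRS; in a real
    pipeline these values would come from rasterio's dataset.bounds.
    """
    min_x, min_y, max_x, max_y = bounds
    return {
        "s3_key": s3_key,
        "crs": crs,
        "footprint_wkt": bounds_to_wkt(min_x, min_y, max_x, max_y),
        "centroid": [(min_x + max_x) / 2, (min_y + max_y) / 2],
    }

# Hypothetical scene covering part of northern Virginia
record = build_metadata_record("imagery/2024/scene_001.tif", (-77.5, 38.8, -77.2, 39.0))
```

OpenSearch can index the `footprint_wkt` field as a `geo_shape`, which is what enables bounding-box and intersection queries over the catalog.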

Data Scientist – Geospatial Intelligence

Booz Allen Hamilton, Arlington, VA
July 2019 – May 2022

  • Developed Python‑based machine‑learning models (Random Forest, XGBoost) to classify land‑use patterns from multispectral imagery, improving detection accuracy from 78% to 92%.
  • Integrated AWS Lambda and Step Functions to orchestrate model training, inference, and result storage in S3, reducing end‑to‑end processing time by 45%.
  • Built AWS CDK stacks (no Terraform required) to provision EC2 GPU instances (p3.2xlarge) for deep‑learning workloads; later migrated the workloads to EKS with GPU node groups.
  • Built a searchable catalog using Elasticsearch for historical imagery metadata; wrote custom analyzers for geospatial queries.
  • Automated data quality checks with Airflow (running on MWAA) and stored audit logs in CloudWatch, meeting audit‑ready requirements.
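A hedged sketch of the kind of per-scene quality gate an Airflow task like the one above might run. The required fields and the cloud-cover range are assumptions for illustration, not the original checks:

```python
REQUIRED_FIELDS = {"s3_key", "crs", "acquired_at", "cloud_cover_pct"}

def quality_check(scene):
    """Return a list of human-readable problems for one scene's metadata.

    An empty list means the scene passes and may proceed down the pipeline;
    in an Airflow DAG, a non-empty list would fail the task and log to audit.
    """
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - scene.keys())]
    cloud = scene.get("cloud_cover_pct")
    if cloud is not None and not 0 <= cloud <= 100:
        problems.append(f"cloud_cover_pct out of range: {cloud}")
    return problems

# A well-formed scene passes with no problems reported
assert quality_check({"s3_key": "a.tif", "crs": "EPSG:4326",
                      "acquired_at": "2024-01-01T00:00:00Z",
                      "cloud_cover_pct": 12.5}) == []
```

Returning a list of problems (rather than raising on the first one) lets the audit log capture every defect in a scene in a single pass.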

GIS Analyst / Software Engineer

U.S. Army Corps of Engineers, Washington, DC
May 2017 – June 2019

  • Developed Python scripts to convert legacy GIS data (shapefiles, ArcGIS geodatabases) into GeoPackage and PostGIS for migration to a cloud‑based GIS platform.
  • Designed and maintained RESTful APIs (Flask) for serving vector tiles to web‑based mapping applications.
  • Implemented continuous integration using Jenkins and GitHub, delivering weekly builds to a secure AWS VPC.
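The vector-tile serving described above relies on the standard XYZ (slippy-map) tiling scheme; the lon/lat-to-tile-index conversion it implies can be sketched as follows (this is the standard Web Mercator formula, not the original USACE code):

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert WGS84 lon/lat to XYZ (slippy-map) tile indices at a zoom level.

    x grows eastward from lon -180; y grows southward from lat ~85.05
    (the Web Mercator clamp). There are 2**zoom tiles along each axis.
    """
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

A Flask endpoint would typically accept `/tiles/{z}/{x}/{y}.pbf` and run the inverse of this mapping to select the features that intersect the requested tile.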

EDUCATION

M.S. – Geographic Information Science
University of Maryland, College Park, MD – 2017

B.S. – Computer Science (Minor: Mathematics)
Virginia Tech, Blacksburg, VA – 2015


TECHNICAL PROJECT HIGHLIGHTS (Optional Section – add if you have side‑projects)

| Project | Tech Stack | Outcome |
| --- | --- | --- |
| Global Flood Mapping – automated extraction of flood extents from Sentinel‑1 SAR data | Python, rasterio, GDAL, AWS CDK, Lambda, S3, OpenSearch | Processed 2 TB/day; delivered alerts to FEMA within 30 min of acquisition |
| Containerized Geo‑Analytics Sandbox – reusable Docker images for JupyterLab + GeoPandas + PostGIS | Docker, Docker Compose, GitHub Actions | Shared with 12 internal teams; reduced environment‑setup time from days to minutes |
| Open‑Source GeoSearch – lightweight search engine for GeoJSON features | Elasticsearch, Python, FastAPI, Docker | Achieved <200 ms query latency for 10 M features; open‑sourced on GitHub (⭐ 150) |

CERTIFICATIONS (in progress / completed)

  • AWS Certified DevOps Engineer – Professional (expected Oct 2024)
  • AWS Certified Solutions Architect – Associate (2022)
  • CompTIA Security+ (2021)

CLEARANCE & VETERAN STATUS

  • Active TS/SCI with Full‑Scope Polygraph (current, no pending reinvestigation)
  • U.S. Army Veteran – 4 years active duty, 2 years reserve; eligible for HireVets incentives

📄 Sample Cover Letter

[Your Name]
[Street Address]
Reston, VA 20190
(555) 123‑4567
you@email.com
[Date]

Hiring Manager
TMG
[Company Address – if known]
Reston, VA

Re: Data Scientist – TS/SCI (W2) – Ref: [Job ID if provided]

Dear Hiring Manager,

I am writing to express my strong interest in the Data Scientist position advertised for TMG’s Government facility in Reston, VA. With **five years of hands‑on experience building secure, AWS‑native geospatial analytics pipelines**, an **active TS/SCI clearance with Full‑Scope Polygraph**, and a proven record of delivering mission‑critical intelligence products, I am confident that I can immediately contribute to your team’s objectives.

### Why I am a fit

- **Python & Geospatial Expertise** – I have written production‑grade Python code for ingesting, processing, and analyzing satellite imagery using **geopandas, rasterio, shapely, and GDAL**. My recent work at the Defense Intelligence Agency reduced image‑to‑insight latency by 45 % while handling >10 TB of data per day.

- **AWS‑Centric Development** – I design, provision, and maintain all AWS resources via **AWS CDK (Python)**, eliminating manual CloudFormation edits. My stacks include EC2 Spot fleets, Lambda functions, S3 buckets with bucket‑level encryption, and fine‑grained IAM policies that satisfy DoD STIG requirements.

- **Search & Indexing** – I built an **Amazon OpenSearch Service** index for 5 M+ geospatial metadata records, enabling sub‑second query response for analysts. I am also comfortable with Elasticsearch/Lucene when on‑prem solutions are required.

- **Containerization & CI/CD** – All processing components are containerized with Docker and orchestrated on **Amazon EKS**, with automated deployments via **GitHub Actions** and CodePipeline. This architecture provides the scalability and resilience needed for global collection missions.

- **Database & Data‑Lake Management** – I have maintained both relational (PostgreSQL/PostGIS) and NoSQL (DynamoDB, MongoDB) stores, and I routinely script data migrations and schema evolution using Python and SQLAlchemy.

- **Veteran & Clearance** – As a former U.S. Army service member, I understand the importance of security, mission focus, and teamwork. My active TS/SCI clearance is current, and I have a full‑scope polygraph on file.

### What I can deliver at TMG

1. **Rapid prototyping** of new collection‑to‑analysis workflows that leverage AWS Lambda, Step Functions, and OpenSearch to provide analysts with near‑real‑time insights.  
2. **Secure, compliant infrastructure** built with CDK, ensuring all IAM policies follow the principle of least privilege and are auditable.  
3. **Scalable containerized services** on EKS/ECS that can be expanded on‑demand for high‑volume processing spikes.  
4. **Mentorship** for junior engineers on best practices for geospatial data handling, cloud security, and CI/CD pipelines.

I am excited about the opportunity to bring my technical depth and mission‑oriented mindset to TMG. I have attached my résumé for your review and would welcome the chance to discuss how my background aligns with your needs.

Thank you for your consideration.

Sincerely,

[Your Name]

How to Use These Documents

  1. Replace placeholders ([Your Name], [Date], etc.) with your actual information.
  2. Tailor the bullet points under each role to reflect the exact projects you’ve completed—keep the language concise (max 2‑3 lines per bullet).
  3. Export the résumé to PDF (most ATS systems accept PDF; use Word format only if the posting requests it).
  4. Save the cover letter as a PDF or Word document, using the same header/footer style as the résumé for a cohesive look.
  5. Submit via the TMG portal or the email address listed in the job posting, attaching both files and mentioning the job reference in the email subject line.

Quick Checklist Before Submitting

  • Active TS/SCI + Full‑Scope Polygraph (copy of clearance status ready if requested)
  • U.S. citizen status confirmed on the application
  • GitHub profile includes at least one public repo demonstrating CDK or geospatial Python work (optional but impressive)
  • AWS certification badge(s) linked on LinkedIn or résumé (if you have them)
  • Veteran status indicated (HireVets eligibility)

If you need any further customization—e.g., a deeper dive into a specific project, a one‑page résumé version, or help preparing for a technical interview—just let me know! Good luck with the application. 🚀

Requirements

  • Demonstrated experience with Python.
  • Demonstrated experience with geo-spatial software and programming packages and data formats.
  • Demonstrated experience creating and managing AWS resources, including provisioning EC2 instances, writing and deploying Lambda functions, creating and writing to S3, and managing authorization appropriately across resources with IAM policies.
  • Demonstrated experience using GitHub.

Responsibilities

  • The position requires developer support in building tailored solutions to tackle unique global collection problems.
  • The work may be performed independently or within a team environment depending on the specific problem statement.
  • Work will include developing Amazon Web Services (AWS)-based resources and the Sponsor needs skills spanning many compute, storage, and networking services.

Skills

AWS CDK, AWS Lambda, AWS OpenSearch, Ansible, CloudFormation, Docker, EC2, Elasticsearch, GitHub, Google Cloud Platform, IAM, Lucene, Microsoft Azure, NoSQL, Python, S3, SQL, Terraform
