
Senior Data Engineer with Credit Risk

Apptad Inc

New York · Hybrid · Full-time · Senior · Posted today

About the role

We are seeking a highly skilled and experienced Application Engineer and Data Architect to join our dynamic team. As a senior member of the team, you will play a critical role in designing, implementing, and maintaining the application infrastructure. Your expertise will help drive innovative data solutions and ensure platform reliability, security, and performance.

Responsibilities

  • Lead architecture and technical design discussions, considering industry-standard technologies and best practices.
  • Support production operations and resolve complex production issues as a senior developer within the Credit Risk application team.
  • Design and implement batch and ad‑hoc data pipelines based on Medallion Lakehouse architecture using modern cloud data engineering patterns, primarily in Databricks.
  • Build and maintain data ingestion flows from upstream systems into object storage (e.g., S3, ADLS) using formats like Parquet, including advanced features such as partitioning, z‑ordering, and schema evolution.
  • Integrate with external XVA/risk engines and implement orchestration logic to manage long‑running external computations.
  • Model and optimize risk measures (e.g., EPE, PFE) for efficient querying and consumption by BI tools, notebooks, and downstream applications.
  • Ensure platform reliability, observability, security (IAM roles, OIDC/Bearer token authentication, encryption), and auditability.
  • Contribute to API design for internal and external customers, focusing on versioning, error handling, and SLAs, with proper documentation.

Requirements

  • 12‑15 years of work experience as an application developer.
  • AWS Certified Cloud Practitioner (or an equivalent cloud certification; Level to be confirmed).
  • Proficiency in REST API development using frameworks such as Django, Flask, FastAPI, or similar.
  • Strong domain expertise in Credit Risk and Counterparty Risk.
  • Expert‑level proficiency in Python, including experience with PySpark/Spark for data engineering and analytics.
  • Hands‑on experience with Azure Databricks, including Medallion Lakehouse Architecture.
  • Solid understanding of SQL, including joins, unions, stored procedures, and query optimization.
  • Familiarity with front‑end and back‑end development (hands‑on experience is a plus).
  • In‑depth knowledge of CI/CD pipelines utilizing Git, Jenkins, and Azure DevOps.
  • Exposure to technical architecture design (preferred).
  • Experience in creating product specifications, architecture diagrams, and design documents.
  • Proven experience working in an Agile environment using tools like JIRA, Confluence, and Zephyr.
  • Strong communication skills to clearly articulate complex ideas.
  • Collaborative team player with a proactive, self‑starter attitude.
  • Demonstrated ability to quickly learn new technologies.
  • Passion for coding, development, and continuous improvement.

Preferred (but not required)

  • Advanced degree in Finance, Computer Science, or related discipline.
  • Experience with risk modeling and financial analytics.
  • Knowledge of deployment, operational support, and monitoring tools.

Benefits

Max Salary: $145K/Annum

Skills

ADLS · API · AWS Certified Cloud Practitioner · Azure Databricks · CI/CD · Confluence · Databricks · Django · FastAPI · Flask · Git · IAM · Jenkins · JIRA · Medallion Lakehouse Architecture · OIDC · Parquet · Python · PySpark · REST API · S3 · Spark · SQL · Zephyr
