Data Engineer
Ascii Group, LLC
Boston · On-site · Contract · $57/hr
Key Responsibilities
- Design and develop scalable, high-performance data pipelines using Databricks and PySpark
- Build, optimize, and maintain ETL/ELT workflows for structured and unstructured data
- Implement data processing solutions using Delta Lake and lakehouse architecture principles
- Ensure data quality, integrity, and reliability through validation and monitoring frameworks
- Optimize data workflows for performance, scalability, and cost efficiency
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements
- Develop and maintain data models to support analytics and reporting needs
- Implement security, governance, and compliance best practices in data pipelines
- Troubleshoot and resolve data-related issues in production environments
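As a hedged, framework-free sketch of the "data quality, integrity, and reliability" responsibility above, the core idea is a validation step that gates rows before they are loaded. The function and column names below are invented for illustration and are not from the posting; a real pipeline would express the same check in PySpark or a monitoring framework.

```python
# Minimal sketch of a row-level validation gate (illustrative only).
# Rows missing a required, non-null field are quarantined with a reason
# instead of being loaded downstream.

def validate_rows(rows, required_columns):
    """Split rows into (valid, errors); errors pair a row index with its missing fields."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required_columns if row.get(c) is None]
        if missing:
            errors.append((i, missing))
        else:
            valid.append(row)
    return valid, errors

raw = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": None},   # fails the null check
]
valid, errors = validate_rows(raw, ["order_id", "amount"])
```

The same pattern scales up in Spark as a filter plus a quarantine write, with the error counts feeding a monitoring dashboard.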
Must-Have Skills
- Databricks and Apache Spark (PySpark)
- SQL for data transformation and querying
- Delta Lake and lakehouse architecture
- ETL/ELT design patterns and data engineering
- Cloud platforms (Azure, AWS, or Google Cloud Platform)
- Data modeling, data warehousing, and big data technologies
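As a small, hedged illustration of the "SQL for data transformation and querying" skill, here is a self-contained ELT-style aggregation using Python's built-in sqlite3 module. The table and column names are hypothetical; in the role itself the same SQL would typically run against Delta tables in Databricks.

```python
import sqlite3

# Illustrative ELT-style transform: load raw events, then aggregate in SQL.
# Table and column names are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Transform step: total spend per user, highest spenders first.
totals = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY total DESC"
).fetchall()
```

The GROUP BY/ORDER BY pattern shown here carries over directly to Spark SQL, where the same statement runs unchanged against a registered table.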
Skills
Apache Spark · AWS · Azure · Databricks · Delta Lake · ETL · Google Cloud Platform · PySpark · SQL