Data Engineer Python + AWS
Jobs via Dice
Atlanta · On-site · Full-time · Posted yesterday
About the role
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Compunnel Inc., is seeking the following. Apply via Dice today!
Job Summary
We are seeking a skilled Data Engineer with a strong background in Python and AWS to design, build, and maintain scalable data pipelines and data infrastructure. The ideal candidate will have hands-on experience working with large datasets, cloud-based architectures, and modern data engineering tools.
Key Responsibilities
- Design, develop, and optimize scalable data pipelines using Python
- Build and manage data infrastructure on the AWS cloud platform
- Develop ETL/ELT processes for ingesting, transforming, and loading data from multiple sources
- Work with large-scale structured and unstructured datasets
- Implement data quality, validation, and monitoring frameworks
- Collaborate with data scientists, analysts, and cross-functional teams
- Optimize data workflows for performance and cost efficiency
- Ensure data security, governance, and compliance standards are followed
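The pipeline and data-quality bullets above can be sketched as a single transform-and-validate step in Python. This is a minimal illustrative sketch only; the record shape and field names are hypothetical assumptions, not taken from the posting.

```python
# Hypothetical ETL step: normalize raw records, then apply a data-quality rule.
# Field names ("user_id", "amount") are illustrative, not from the posting.

def transform(records):
    """Normalize raw records: strip whitespace from IDs, cast amounts to float."""
    cleaned = []
    for rec in records:
        cleaned.append({
            "user_id": rec["user_id"].strip(),
            "amount": float(rec["amount"]),
        })
    return cleaned

def validate(records):
    """Simple data-quality rule: every amount must be non-negative."""
    return all(rec["amount"] >= 0 for rec in records)

raw = [
    {"user_id": " u1 ", "amount": "12.50"},
    {"user_id": "u2", "amount": "0"},
]

out = transform(raw)
assert validate(out)  # fail fast before loading downstream
```

In a real AWS deployment, a step like this would typically read from and write to S3 (e.g. via boto3 or AWS Glue), with the validation result feeding a monitoring framework rather than a bare assert.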
Required Qualifications
- Strong programming experience in Python
- Hands-on experience with AWS services such as S3, Lambda, Glue, Redshift, EMR, and Athena
- Experience with ETL tools and data pipeline frameworks
- Solid understanding of SQL and database systems (RDBMS and NoSQL)
- Familiarity with data modeling and data warehousing concepts
- Experience with version control systems such as Git
- Strong problem-solving and analytical skills
Preferred Qualifications
- Experience with big data technologies such as Spark or Hadoop
- Experience with orchestration tools such as Airflow
- Exposure to containerization tools such as Docker or Kubernetes
- Knowledge of CI/CD pipelines and DevOps practices
- Experience working in Agile environments
- Experience with real-time data processing technologies such as Kafka or Kinesis
- Exposure to machine learning pipelines or analytics platforms
- AWS certifications such as AWS Certified Data Analytics or Solutions Architect
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
Education
- Bachelor's Degree
Certification
- AWS Certified Data Analytics
- AWS Certified Solutions Architect
Skills
Athena, AWS, AWS Glue, AWS Lambda, AWS Redshift, Docker, EMR, Git, Hadoop, Kafka, Kinesis, Kubernetes, NoSQL, Python, RDBMS, S3, SQL, Spark, Airflow