Dynamic GCP Data Engineer Focusing on Data Pipeline Efficiency
Bell
About the role
Elevate your career as a GCP Data Engineer specializing in data pipeline construction and optimization. Join a hybrid role where your skills in ETL processes and data management will shine.
This position is ideal for a detail‑oriented data engineer with 1–3 years of experience in data engineering practices. You will be responsible for developing robust data management systems while collaborating with stakeholders to design efficient data solutions. Bring your SQL expertise and familiarity with Google Cloud Platform to this rewarding opportunity.
Key Responsibilities
- Develop and maintain efficient data management systems
- Implement ETL processes for seamless data movement
- Optimize data pipelines for performance and reliability
- Monitor and resolve data pipeline bottlenecks
- Collaborate with engineers to define data requirements
Requirements
- Bachelor’s degree in a relevant field
- 1–3 years of data engineering experience
- Proficient in SQL and Google Cloud Platform
- Hands‑on experience with Airflow
- Strong analytical and problem‑solving skills
Additional Information
Drive the future of data management with your innovative approach to pipeline development and data solutions.