Data Engineer - Databricks
Birlasoft Limited
About the role
Area(s) of responsibility
Skills: Databricks, PySpark, SQL, Python
Experience: 4-7 years
Position Summary
The Data Engineer will lead the design, development, and optimization of scalable data pipelines and analytics platforms, working with large and complex datasets to help the organization make informed, data‑driven decisions. The role involves data collection, cleansing, analysis, development, testing, visualization, and reporting, in close collaboration with cross‑functional business and technology teams.
Key Responsibilities
Data Preparation & Management
- Clean, prepare, and validate data for analysis.
- Acquire data from primary and secondary sources; build and maintain data systems and databases.
- Identify, analyze, and interpret patterns or trends in complex datasets.
Data Analysis & Insights
- Perform exploratory and statistical data analysis to support business decisions.
- Provide insights that drive performance improvements, revenue optimization, and customer experience enhancements.
- Support business case creation with data‑driven analysis.
- Conduct ad‑hoc analysis for leadership and product teams.
Data Transformations
- Build large-scale data processing solutions using Databricks (Scala, Spark SQL, PySpark, Python).
- Design, optimize, and manage Snowflake data warehouse structures and workloads.
- Ensure data quality, performance, governance, and automation across pipelines.
- Build and optimize SSAS Tabular Models and support enterprise BI solutions.
- Lead end-to-end design and delivery of ETL/ELT pipelines using SQL, SSIS, and Databricks jobs.
Reporting & Visualization
- Develop high-quality dashboards, reports, and visualization assets for stakeholders using Power BI and Databricks dashboards.
- Translate complex datasets into clear, concise business insights using tools such as Power BI.
Collaboration
- Work with technology, product, and management teams to define business needs and analytical requirements.
- Communicate results clearly to cross‑functional stakeholders.
Continuous Improvement
- Recommend methods to improve performance of data loads, data collection, governance, and reporting systems.
Technical Skills
Required Skills & Experience
- Expertise in SQL, including experience with large datasets and query optimization.
- Good experience with BI tools (Power BI).
- Good experience in Databricks (Scala, Spark SQL, PySpark, Python) and Databricks dashboards.
- Good experience in Snowflake.
- Experience in SSAS (Tabular model) and SSIS (advanced ETL workflows, performance tuning).
- Strong statistical knowledge and analytical mindset.
Soft Skills
- Strong problem-solving and systems-thinking mindset.
- Excellent communication and stakeholder management skills.
- Ability to handle multiple complex initiatives independently.