Core Data Analyst
HireAlpha
Role Details
Role: Data Analyst
Employment Type: Full Time / WFO
Location: Dubai
Notice Period: Immediate / 15 Days / 30 Days
Experience Level: 3+ Years
Key Skills (Mandatory)
- Databricks + Azure
- Strong Banking Domain experience
- Data Migration, Data Mapping, Data Mart
- ETL, SQL, Apache Spark
- Experience in BRF or Tax/VAT Invoicing (any one)
Note
- Strictly looking for Data Analyst profiles
- Data Scientists and Data Engineers will not be considered
About the Role
As a Data Analyst, your main role is to understand business requirements and provide the data mapping and analysis that forms the blueprint of data flow from raw source systems to a customized data mart, dashboards, or any other API required for data consumption. You will play a crucial role in the data management process, ensuring that data is collected, transformed, and made accessible for analytical purposes. Your expertise in data technologies will enable the organization to make informed decisions and derive valuable insights from large datasets.
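To make the data mapping responsibility concrete, here is a minimal sketch of a mapping document expressed in code: each source-system column is mapped to a target mart column plus a type conversion. All column names and values below are hypothetical, invented purely for illustration.

```python
# Hypothetical source record, as it might arrive from a raw banking system.
source_row = {"CUST_ID": "1001", "ACCT_BAL": "2500.75", "BR_CD": "DXB01"}

# Mapping document: source column -> (target mart column, type cast).
mapping = {
    "CUST_ID": ("customer_id", int),
    "ACCT_BAL": ("account_balance", float),
    "BR_CD": ("branch_code", str),
}

def apply_mapping(row, mapping):
    """Translate one raw source record into its data-mart shape."""
    return {target: cast(row[src]) for src, (target, cast) in mapping.items()}

mart_row = apply_mapping(source_row, mapping)
print(mart_row)
```

In practice the mapping document is usually a spreadsheet or wiki page rather than code, but the structure is the same: source field, target field, transformation rule.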
Key Responsibilities / Accountabilities
- Understand project requirements by conducting Business Discovery Sessions
- Work with internal teams such as system support, DBA, and MIS teams to translate existing requirements into the desired mart and reports
- Prepare the data mapping documents needed to develop the various layers of the data platform (raw – bronze, EDM – silver, project-specific – gold)
- Support the various phases of projects as needed by data engineers, data modelers, QA testers, and Power BI developers
- Collaborate with other developers, data analysts, and stakeholders to ensure the software meets the needs of the business or organization
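The bronze/silver/gold layering named above can be sketched end to end with plain SQL; the following uses Python's built-in sqlite3 so it runs anywhere, and all table and column names are hypothetical (a real platform would use Databricks/Spark SQL, but the layering idea is the same).

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Bronze: raw, untyped data landed as-is from the source system.
con.execute("CREATE TABLE bronze_txn (txn_id TEXT, amount TEXT, branch TEXT)")
con.executemany("INSERT INTO bronze_txn VALUES (?, ?, ?)", [
    ("t1", "100.0", "DXB"), ("t2", "250.5", "DXB"), ("t3", "80.0", "AUH"),
])

# Silver: cleaned, typed view of the bronze data.
con.execute("""CREATE VIEW silver_txn AS
               SELECT txn_id, CAST(amount AS REAL) AS amount, branch
               FROM bronze_txn WHERE amount IS NOT NULL""")

# Gold: project-specific aggregate, ready for a dashboard or report.
gold = con.execute(
    "SELECT branch, SUM(amount) FROM silver_txn GROUP BY branch ORDER BY branch"
).fetchall()
print(gold)  # [('AUH', 80.0), ('DXB', 350.5)]
```

The analyst's mapping documents specify exactly which columns flow between these layers and how each one is cleaned or aggregated.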
Core Skills
- Strong analytical and problem‑solving skills to effectively design data solutions.
- Knowledge and understanding of banking Retail, Business & Wholesale products
- Attention to detail and a commitment to delivering high‑quality work.
- Excellent communication and teamwork skills to collaborate with cross‑functional teams.
- Ability to manage time effectively and prioritize tasks in a dynamic work environment.
Technical Skills
- Strong data querying and processing skills using SQL, Oracle, Apache Spark, or similar technologies
- Data visualization tools such as Power BI, Business Objects, Crystal, or similar
- Data Warehousing and ETL concepts
Competencies
- Expert‑level proficiency in SQL and Python, with SAS/R/Spark a plus
- Deep understanding of distributed data processing frameworks, particularly Apache Spark.
- Experience with cloud platforms (Azure, AWS), including object storage, compute, networking, and data integration services.
- Familiarity with data modeling techniques, including schema design, partitioning, and performance tuning.
- Experience working with structured and semi‑structured data formats (e.g., JSON, Parquet, Avro, XML).
- Knowledge of data governance, quality assurance, and compliance best practices.
- Solid grasp of core computer science fundamentals (e.g., data structures, search algorithms, queues) relevant to performance‑critical data engineering
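As a small illustration of the semi-structured-data competency above, the sketch below flattens a nested JSON record into the flat rows a warehouse table expects; the record shape and field names are hypothetical, chosen only to show the pattern.

```python
import json

# Hypothetical semi-structured record from an upstream banking API.
record = json.loads(
    '{"customer": {"id": 7, "name": "A. Khan"},'
    ' "accounts": [{"iban": "AE01", "balance": 500.0}]}'
)

def flatten(rec):
    """Emit one flat row per nested account entry."""
    return [
        {"customer_id": rec["customer"]["id"],
         "customer_name": rec["customer"]["name"],
         "iban": acct["iban"],
         "balance": acct["balance"]}
        for acct in rec["accounts"]
    ]

rows = flatten(record)
print(rows)
```

The same flattening idea applies to Parquet, Avro, or XML inputs; only the parsing step changes.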
Minimum Formal Education
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Demonstrated expertise in working with Databricks for data processing and analysis.
- Experience in implementing and optimizing data solutions using multiple programming languages in an Azure environment.
Certification Requirement
- Any Cloud Technology Certification an advantage
Experience
- Proven experience as a Data Analyst or in a similar role, with at least X years of hands‑on experience in developing data pipelines and managing data infrastructure within Azure.
- 3 years of experience in banking data
- Databricks and SQL experience required
- Experience in reporting, analysis and MIS teams preferred
- Worked on end‑to‑end Data Product deployment