
Senior Analyst, BI & Data Insights

Synoptek

Remote (Global) · Full-time · Senior

About the role

Overview

We are seeking a highly skilled Senior Analyst, BI & Data Insights to join our growing team. In this critical role, you will build and maintain a best‑in‑class data platform that drives innovation and growth for our transportation and logistics business. You will leverage your expertise in end‑to‑end Azure data engineering, Snowflake, and a variety of data engineering tools to design, develop, and implement scalable data pipelines for ingesting, transforming, and loading (ETL/ELT) large datasets. You will partner closely with stakeholders across the organization to understand their needs and translate them into clear, concise data visualizations backed by robust data engineering solutions.

Responsibilities

  • Design and implement end‑to‑end data pipelines utilizing Azure services (Data Factory, Databricks, Synapse Analytics, Event Hubs) to ingest, process, and transform data from various sources (e.g., Oracle, MercuryGate, D365, NetSuite, telematics systems, sensors).
  • Leverage Kafka and StreamSets to ensure real‑time data ingestion and low‑latency processing for critical operational insights.
  • Develop and maintain data models that optimize data storage, retrieval, and performance within Azure Data Lake Storage and Snowflake.
  • Write, optimize, and troubleshoot SQL queries within the Snowflake environment.
  • Collaborate with business stakeholders to understand data requirements and translate them into actionable data solutions.
  • Implement data governance best practices to ensure data quality, consistency, and compliance with industry regulations.
  • Monitor and optimize data pipelines for performance, scalability, and cost‑effectiveness.
  • Develop and maintain data quality checks and data cleansing routines to ensure data integrity throughout the data lifecycle.
  • Develop reports and visualizations using Tableau to communicate data insights effectively to technical and non‑technical audiences.
  • Stay current with the latest advancements in Azure data services, data management practices, and transportation & logistics data solutions.

Qualifications

  • 4+ years of experience as a Data Engineer with a strong focus on Microsoft Azure data technologies.
  • Proven experience with data pipelines, data ingestion, transformation, and loading (ETL/ELT) processes.
  • In‑depth knowledge of data architecture, data modeling, and data governance principles.
  • Proficiency in SQL Server, MySQL, PostgreSQL, and NoSQL databases (DynamoDB, MongoDB).
  • 2+ years of experience with Snowflake data warehouse.
  • Experience with Python for data manipulation and automation.
  • Expertise in cloud‑based data warehousing solutions like Snowflake.
  • Working knowledge of data streaming technologies like Kafka and StreamSets.
  • Familiarity with data visualization tools (Tableau, Power BI, Qlik Sense) a plus.
  • Experience working in the transportation and logistics domain is highly preferred.
  • Excellent communication and collaboration skills to work effectively with cross‑functional teams.
  • Strong analytical and problem‑solving skills with a passion for data‑driven solutions.

Technical Skills

  • Cloud Platforms: Azure, AWS
  • Data Warehouse: Snowflake
  • Data Engineering Tools: Kafka, StreamSets, Databricks, Spark
  • Data Management: ETL/ELT, Data Architecture, Data Modeling, Data Governance
  • Programming Languages: Python, Java, Scala
  • Databases: SQL Server, MySQL, PostgreSQL, DynamoDB, MongoDB, etc.
  • Data Visualization Tools: Tableau, Power BI, Qlik Sense

Education

  • Bachelor’s degree in Computer Science, Information Technology, or related field from an accredited college or university (preferred).
  • In lieu of an undergraduate degree, relevant work experience may be substituted on a 1:1 basis (one year of college equals one year of work experience, and vice versa).

Experience

  • At least 1 year of job‑related experience.
  • Familiarity with acquiring and managing data from various sources, including primary and secondary data sources.

Skills / Attributes

  • Clarity: Excellent communication skills; ability to speak the customer’s language and provide concise, well‑constructed responses.
  • OwnIT: Demonstrates integrity, innovation, and accountability in daily assignments.
  • Results: Solutions‑focused; resolves conflict quickly and precisely; proactively seeks opportunities to contribute to business goals.
  • Growth: Willing to learn, ask questions, and continuously improve; adaptable in a fast‑paced environment.
  • Team: Embraces customers and colleagues as team members; flexible, respectful, engaged, and collaborative.

Working Conditions

Work is performed primarily in an office or remote environment. May be subject to time constraints and tight deadlines. Occasional travel may be required. We live by the motto “work hard, play hard” and strive to support employees in both professional and personal goals.

EEO Statement

We are proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, veteran status, sexual orientation, gender identity, marital status, pregnancy, genetic information, or any other characteristic protected by law and will not be discriminated against on the basis of disability. Employment decisions are based on job‑related factors.


Skills

AWS, Azure, Databricks, Data Architecture, Data Governance, Data Modeling, Data Warehouse, DynamoDB, ETL/ELT, Event Hubs, Java, Kafka, MongoDB, MySQL, NoSQL, PostgreSQL, Power BI, Python, Qlik Sense, Scala, Spark, SQL Server, StreamSets, Synapse Analytics, Tableau, Snowflake
