Associate Data Engineer
Gallagher
Remote (US) · Contract · Entry level · Posted today
About the role
The Associate Data Engineer reports to the Data Engineering Lead and supports the development and ongoing operation of enterprise data integrations and pipelines across Gallagher. This role partners with data engineers, data scientists, analysts, and application teams to deliver reliable, secure, and well-documented data solutions that enable analytics and product initiatives.
- Temp‑To‑Hire, W‑2 position (no 1099 or C2C)
- Fully remote, based in the U.S.
- Must meet U.S. eligibility requirements for work authorization
Responsibilities
- Develop integration workflows to ensure solutions are built accurately and according to spec.
- Develop and maintain requirements, design documentation, and test plans.
- Design and implement internal process improvements, including automating manual processes and optimizing data delivery.
- Coordinate with BI Engineers, Financial Applications, and Oracle HR teams around data management, reconciliation, and test data setup.
- Develop and maintain data pipelines to ingest data from a wide variety of sources (structured and unstructured) into Snowflake.
- Construct and maintain enterprise‑level integrations using Snowflake, Azure Synapse, Azure SQL, and SQL Server.
- Create data tools for analytics and data science teams to build and optimize products.
- Design analytics tools that leverage the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Assist with building Semantic Views and Agents using Cortex AI Functions in Snowflake to support deep learning and interactive user queries.
- Troubleshoot issues, conduct root‑cause analysis, and work with infrastructure teams to resolve incidents permanently.
- Partner with data and analytics teams to enhance functionality of data systems.
- Coordinate development and support with globally located resources.
- Understand existing integrations that send and receive data between Oracle, Concur, JDE, Corporate Data Platform, and other systems.
About You (Requirements)
- Relevant technical BS degree with 2+ years of experience, or Master’s degree in Information Technology, Data Science, Computer Engineering, or related field.
- 1+ years of experience writing SQL queries against any RDBMS, including query optimization.
- 1+ years of experience with Snowflake, Azure Data Factory, and SQL Server.
- Strong experience with Python, Java, and XML.
- Familiarity with structuring a Data Lake for reliability, security, and performance.
- Familiarity with Medallion architecture, AI frameworks, AI data readiness, and machine‑learning algorithms.
- Ability to write effective, modular, dynamic, parameterized, and robust code following established standards.
- Strong analytical, problem‑solving, and troubleshooting abilities.
- Good understanding of unit testing, software change management, and software release management.
- Knowledge of DevOps, MLOps, and AIOps processes.
- Experience performing root‑cause analysis on data and processes to answer business questions and identify improvement opportunities.
- Experience working within an agile team (preferred).
- Excellent communication skills.
Compensation and Benefits
- Competitive base salary (range reflects low‑end to high‑end for the position).
- Medical, dental, and vision plans starting day one.
- Life and accident insurance.
- 401(k) and Roth options.
- Tax‑advantaged accounts (HSA, FSA).
- Educational expense reimbursement.
- Paid parental leave.
- Digital mental health services (Talkspace).
- Flexible work hours (availability varies by office and job function).
- Training programs.
- Gallagher Thrive program – health challenges, workshops, and digital fitness programs.
- Charitable matching gift program.
- Additional benefits may apply based on job level.
Skills
Azure Data Factory, Azure SQL, Cortex AI Functions, Data Lake, JDE, Java, Machine learning algorithms, Medallion architecture, Oracle, Python, RDBMS, SQL, SQL Server, Snowflake, XML