AI Data Analytics Engineer || Stamford, CT (Hybrid)
Jobs via Dice
About the role
AI Data Analytics Engineer
Location: Stamford, CT (Hybrid)
Contact: Phone + Skype
As the AI Data Analytics Engineer, you will build and maintain the data foundation that powers decision‑making across the firm. You’ll sit at the intersection of analytics, data engineering, and automation—owning the work required to collect, standardize, and operationalize data across systems so teams can trust and use it consistently.
You’ll partner with stakeholders across investing, portfolio, operations, and leadership to translate business questions into durable data products: standardized metrics, curated datasets, reliable pipelines, and clear insights. You’ll also bring AI fluency to accelerate analysis and automation while maintaining strong validation and controls.
Responsibilities
- Define and standardize key metrics, data definitions, and reporting logic to create a shared source of truth across teams.
- Identify data gaps and recommend process improvements to strengthen data capture upstream.
- Build and maintain pipelines to ingest, clean, transform, and validate data from systems across the firm (CRM/investment pipeline, portfolio/company data, operational tools, third‑party/market data, etc.).
- Design and maintain analytical layers and “systems of record” (curated datasets, metric layers/semantic models, documentation) that enable consistent reporting and self‑serve analytics.
- Develop and operate ETL/ELT workflows, including API‑based ingestion, scheduled refreshes, monitoring, and data quality checks.
- Build dashboards and visualizations that turn complex datasets into clear, actionable insights for stakeholders across the firm.
- Automate recurring data preparation and reporting workflows; apply AI where appropriate to speed analysis and improve data operations (e.g., assisted investigation, summarization, anomaly surfacing), with rigorous validation.
- Work cross‑functionally to translate requirements into technical specifications and deliver working data products—datasets, metrics, dashboards, automations, and system improvements.
- Communicate clearly with technical and non‑technical stakeholders, documenting assumptions, definitions, and logic.
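To give a concrete flavor of the pipeline and data‑quality work described above, here is a minimal sketch in Python. All field names, checks, and sample data are hypothetical illustrations, not drawn from any system this role actually uses:

```python
# Illustrative only: the field names ("deal_id", "amount_usd", "updated_at")
# are assumptions for this sketch, not taken from any named system.
REQUIRED_FIELDS = ("deal_id", "amount_usd", "updated_at")

def validate_records(records):
    """Split raw records into clean rows and rejected rows with reasons.

    Checks: required fields present and non-null, amount is numeric,
    and duplicate deal_ids are dropped (first occurrence wins).
    """
    seen, clean, rejects = set(), [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if rec.get(f) is None]
        if missing:
            rejects.append((rec, f"missing fields: {missing}"))
        elif not isinstance(rec["amount_usd"], (int, float)):
            rejects.append((rec, "amount_usd is not numeric"))
        elif rec["deal_id"] in seen:
            rejects.append((rec, "duplicate deal_id"))
        else:
            seen.add(rec["deal_id"])
            clean.append(rec)
    return clean, rejects

# Sample batch: one good row, one duplicate, one bad amount, one missing field.
raw = [
    {"deal_id": 1, "amount_usd": 100.0, "updated_at": "2024-01-01"},
    {"deal_id": 1, "amount_usd": 200.0, "updated_at": "2024-01-02"},
    {"deal_id": 2, "amount_usd": "n/a", "updated_at": "2024-01-01"},
    {"deal_id": 3, "amount_usd": 50.0},
]
clean, rejects = validate_records(raw)
```

The design point: rejected rows carry an explicit reason rather than being silently dropped, so data‑quality failures stay auditable and can be surfaced upstream.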
Requirements (What You’ll Bring)
- Proficiency with data warehouses and lakehouses, and designing curated datasets for analytics and reporting.
- Strong SQL and experience with relational databases; ability to debug issues end‑to‑end from source to output.
- Experience building and maintaining ETL/ELT workflows, including orchestration, monitoring, and data quality validation.
- Experience integrating data via APIs (building/configuring ingestion patterns and maintaining connectors).
- Demonstrated ability to build dashboards and visualizations that drive decisions.
- Ownership‑driven mindset: proactive, detail‑oriented, and comfortable operating in a fast‑moving environment with evolving priorities.
- Familiarity with crypto markets and on‑chain analytics (protocol metrics, wallet/contract activity, indexers/data providers).
- Experience building and maintaining metric layers/semantic models and documentation that scale across teams.
- Practical AI fluency: experience applying AI to accelerate analysis and automation, with strong validation habits and good judgment.
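As a sketch of what a metric layer (a shared source of truth for definitions) can look like in practice, here is a toy example in Python. The metric names and row fields are assumptions chosen for illustration:

```python
# Toy "metric layer": each metric has exactly one canonical definition,
# so every report computes it the same way instead of re-deriving its
# own logic. Metric names and the "stage"/"amount_usd" fields are
# illustrative assumptions.
METRICS = {
    "total_invested_usd": lambda rows: sum(
        r["amount_usd"] for r in rows if r["stage"] == "closed"
    ),
    "open_pipeline_count": lambda rows: sum(
        1 for r in rows if r["stage"] != "closed"
    ),
}

def compute(metric_name, rows):
    """Look up a metric by its standardized name and evaluate it."""
    return METRICS[metric_name](rows)

rows = [
    {"amount_usd": 100.0, "stage": "closed"},
    {"amount_usd": 50.0, "stage": "diligence"},
]
```

In production this role would more likely express such definitions in a warehouse semantic model, but the principle is the same: one named, documented definition per metric.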
“Believe you can and you’re halfway there.” – Theodore Roosevelt
Recruiter: Yogesh Sharma | Lead Tech Recruiter
Company: An E-Verified Company
Email: (E:)
Phone: +1 (P:)
Apply via Dice today!