Spatial AI Engineer
BigGeo
About BigGeo
BigGeo is the Spatial Cloud.
We help companies manage and access the world’s spatial data.
Any size, any slice, any insight.
Delivered in seconds.
We’re building something that hasn’t existed before: a new layer of the internet where the “where” and “when” behind every decision are instantly clear, programmable, and actionable. Our platform removes the complexity that has kept spatial data locked in silos for decades and replaces it with speed, precision, and control.
We’re a Calgary-based company, early and moving fast, with real customers, real infrastructure, and a clear point of view on where the world is going.
Why BigGeo Exists and Why People Build Here
Most companies are spatially blind. They know what their data says, but not where or when things actually happen. That gap costs real money, creates real risk, and limits what AI can do in the physical world.
BigGeo exists to close that gap.
We’re not building another tool. We’re building the rails that connect the planet’s moving data to the systems that run the world. That’s a big problem, and it takes people who care about doing things right, not just fast.
People build here because:
- The problem is real and the category is open. We’re not competing for the middle of an existing market; we’re defining a new one. Your work shapes what the category becomes.
- Your fingerprints are on the architecture. We’re at the stage where the decisions you make today become the foundation tomorrow. What you ship matters.
- We run on clarity, not politics. We move with purpose. No bureaucratic drag, no HiPPO decisions, just a team that agrees on the mission and gets to work.
- You’ll grow fast because the problems are hard. Spatial data at scale is a genuinely difficult domain. If you want to be stretched, you’ll be stretched.
- We’re building for longevity. We’re not chasing hype cycles. We’re building infrastructure, the kind that compounds in value over time and earns the trust of the companies that depend on it.
The Role
The Spatial AI Engineer builds the systems that let AI models, applications, and intelligent agents understand and reason about the real world through spatial data. You will design and implement machine learning systems that operate directly on spatial datasets inside The Spatial Cloud, turning raw location and time data into intelligence that applications and AI agents can act on in seconds.
This role sits at the intersection of machine learning, spatial computing, and large-scale data infrastructure. You will build the models, pipelines, and inference services that make spatial intelligence operational: not research artifacts sitting in notebooks, but production systems answering real questions at global scale.
You will work alongside Core Systems Engineers building the Spatial Cloud platform, data platform engineers managing global spatial datasets, and product teams shipping spatial intelligence capabilities to developers and enterprises. The problems are hard, the datasets are enormous, and the impact is visible, because what you build is used in real-world environments.
If you want to build AI systems that reason about the physical world and shape a new category of infrastructure as it takes form, this is the place to do it.
What You Will Build and Own
As a Spatial AI Engineer, you will contribute to and own systems that include:
- Spatially-aware machine learning models that incorporate geometry, location, and temporal context as first-class inputs.
- AI-powered spatial analytics and pattern detection systems that find signal in global-scale geospatial data.
- Spatial reasoning systems that understand how places, movements, and events relate across space and time.
- Training and evaluation pipelines for spatial AI models, including dataset management, labeling workflows, and reproducible experiments.
- Real-time spatial inference services that deliver model outputs to applications and agents with low latency at scale.
- APIs and services that let developers, applications, and AI agents query spatial intelligence directly from The Spatial Cloud.
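To make the first bullet above concrete, here is a minimal sketch of what treating geometry, location, and temporal context as first-class model inputs can look like. The function name and feature choices are illustrative assumptions for this posting, not BigGeo's actual implementation:

```python
import math

def encode_spatiotemporal(lat: float, lon: float, hour: int) -> list[float]:
    """Encode a (lat, lon, hour-of-day) observation as model-ready features.

    Latitude/longitude are mapped onto the unit sphere so nearby points on
    the globe get nearby feature vectors (avoiding the -180/180 seam), and
    hour-of-day is encoded cyclically so 23:00 and 00:00 stay close.
    """
    lat_r, lon_r = math.radians(lat), math.radians(lon)
    spatial = [
        math.cos(lat_r) * math.cos(lon_r),  # x on unit sphere
        math.cos(lat_r) * math.sin(lon_r),  # y on unit sphere
        math.sin(lat_r),                    # z on unit sphere
    ]
    angle = 2 * math.pi * hour / 24
    temporal = [math.sin(angle), math.cos(angle)]  # cyclical hour encoding
    return spatial + temporal

features = encode_spatiotemporal(51.05, -114.07, 23)  # Calgary, 11 pm
```

The point of the encoding is that naive raw values (longitude as a scalar, hour as an integer) create artificial discontinuities at the antimeridian and at midnight; encodings like this one remove them before the data ever reaches a model.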
Key Responsibilities
Spatial AI Model Development
- Design, train, and iterate on machine learning models that operate on spatial and spatio-temporal datasets.
- Build models that detect patterns, relationships, and anomalies across geospatial signals, from dense urban data to sparse global datasets.
- Experiment with spatial reasoning approaches that incorporate location, geometry, and temporal context as explicit features, not afterthoughts.
- Evaluate model accuracy, calibration, reliability, and operational behavior against real production workloads.
Data Engineering and Pipelines
- Build pipelines for ingesting, cleaning, transforming, and preparing spatial datasets for machine learning.
- Manage training datasets, versioning, and evaluation frameworks with the rigor of a production system.
- Ensure spatial data pipelines are scalable, reliable, and observable as datasets and usage grow.
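As a hedged illustration of the cleaning stage described above, here is a small validation step of the kind such pipelines typically start with. The field names and filtering rules are assumptions for illustration, not BigGeo's pipeline:

```python
def clean_spatial_records(records: list[dict]) -> list[dict]:
    """Drop records with missing or invalid coordinates.

    Real-world location feeds routinely contain missing fields,
    out-of-range values, and (0, 0) "null island" sentinels; filtering
    them early keeps bad coordinates out of training data.
    """
    cleaned = []
    for rec in records:
        lat, lon = rec.get("lat"), rec.get("lon")
        if lat is None or lon is None:
            continue  # missing coordinate
        if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
            continue  # out of valid range
        if lat == 0.0 and lon == 0.0:
            continue  # likely a null-island sentinel, not a real point
        cleaned.append(rec)
    return cleaned

raw = [
    {"lat": 51.05, "lon": -114.07},  # valid (Calgary)
    {"lat": 0.0, "lon": 0.0},        # null island
    {"lat": 120.0, "lon": 10.0},     # latitude out of range
    {"lat": None, "lon": -114.0},    # missing latitude
]
```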
AI System Integration
- Deploy models into production systems used by applications, developer APIs, and AI workflows.
- Build inference services capable of delivering spatial insights in real time, with predictable performance characteristics.
- Integrate AI capabilities directly with The Spatial Cloud’s data and compute infrastructure, so intelligence lives where the data does.
Performance and Scalability
- Optimize AI models and inference pipelines for large spatial datasets and high-throughput query patterns.
- Make deliberate trade-offs across latency, cost, accuracy, and operational complexity.
- Ensure spatial AI systems scale with growing datasets, growing users, and growing use cases without constant rework.
Collaboration and Ownership
- Partner with Core Systems Engineers building the spatial compute layer and data platform engineers managing large spatial datasets.
- Work closely with product teams to translate real customer problems into model behavior and service design.
- Own systems end to end: design, build, ship, measure, and improve.
Advanced AI Skills
We expect engineers to use AI as a daily force multiplier, not a novelty. In this role, that means:
- Using modern coding assistants (such as Claude, ChatGPT, Cursor, and Copilot) to accelerate implementation, refactoring, debugging, and testing.
- Using AI to accelerate ML experimentation: drafting training scripts, generating evaluation frameworks, exploring alternative model architectures, and stress-testing your own assumptions.
- Using AI for data exploration and engineering: generating pipeline scaffolding, writing SQL, reasoning over schemas, and summarizing datasets before committing to modeling approaches.
- Using AI agents and workflows to automate repetitive engineering tasks so more of your time goes to high-impact design, modeling, and systems work.
- Bringing strong judgment about when AI output is good enough, when it isn’t, and when to push back. Your name goes on the system, not the tool’s.
What You Bring
Required:
- 3 to 7 years of experience building machine learning systems or AI-driven data products in production.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Strong programming experience in Python and deep familiarity with modern machine learning frameworks (PyTorch, TensorFlow, or equivalent).
- Experience building and deploying production machine learning models and inference systems, not just notebooks or prototypes.
- Hands-on experience working with large datasets and distributed data processing pipelines.
- Solid grasp of machine learning evaluation, model lifecycle management, and responsible experimentation.
- Demonstrated ability to collaborate across engineering, data, and product teams and to own outcomes, not just tickets.
- Working knowledge of SQL and comfort operating in cloud-native environments.
- Experience using AI development tools (such as Claude, ChatGPT, Cursor, and Copilot) to accelerate engineering work.
Nice to Have:
- Experience working with geospatial or location-based datasets in a production context.
- Background in spatial analytics, geospatial modeling, or spatial statistics.
- Familiarity with spatial indexing techniques (such as H3, S2, quadtrees, R-trees) and common geospatial data formats (GeoJSON, GeoParquet, PMTiles, FlatGeobuf, or similar).
- Experience building AI systems that interact with structured data platforms, data warehouses, or lakehouse architectures.
- Experience with real-time inference systems, streaming pipelines, or event-driven architectures.
- Experience with performance-critical programming in Rust or Go.
- Contributions to open-source AI, geospatial, or data infrastructure projects.
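For readers unfamiliar with the spatial indexing techniques named above (H3, S2, quadtrees, R-trees), the core idea can be illustrated with a simplified Z-order (Morton) cell id: interleave the bits of discretized latitude and longitude so that nearby points tend to share cell-id prefixes. This is a teaching sketch only, not how any of those libraries work internally:

```python
def morton_cell(lat: float, lon: float, bits: int = 16) -> int:
    """Interleave discretized lat/lon bits into a single Z-order cell id.

    Nearby points usually share the high-order bits of the id, so sorting
    records by cell id clusters spatially close records together -- the
    basic trick behind many spatial indexes.
    """
    # Discretize each coordinate into a `bits`-bit integer.
    y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
    x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
    cell = 0
    for i in range(bits):  # interleave one x bit, then one y bit
        cell |= ((x >> i) & 1) << (2 * i)
        cell |= ((y >> i) & 1) << (2 * i + 1)
    return cell
```

Truncating the id to its top 2d bits yields the enclosing cell of a depth-d quadtree, which is why range queries over sorted cell ids can skip large spans of irrelevant data.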
Success Measures
First 30 days:
- Onboarded onto the Spatial Cloud stack, datasets, and current modeling systems.
- Running local and cloud experiments against real spatial data and contributing code reviews.
- Clear picture of the current AI roadmap, key systems, and the people you work with most closely.
First 60 days:
- Shipped meaningful improvements to an existing model, pipeline, or inference service.
- Led design discussions on at least one model or system component and have a clear point of view on trade-offs.
- Helping shape how AI capabilities get exposed to product surfaces and developer APIs.
First 90 days and beyond:
- Owning one or more production spatial AI systems end to end: model, data pipeline, evaluation, and inference service.
- Driving measurable improvements in accuracy, latency, cost, or coverage against well-defined baselines.
- Influencing the spatial AI roadmap, including which problems we choose to solve and how.
- Known inside and beyond the engineering org as a technical leader in spatial AI at BigGeo.
Company Overview
Please visit http://biggeo.com for more information.