The most common question I hear from engineering teams beginning to build spatial capabilities is some variation of: “We cannot afford enterprise GIS, so what can we actually do?”

The answer, which consistently surprises people, is: almost everything you would want to do. The open source geospatial ecosystem, combined with cloud-native infrastructure, has reached the point where the capabilities that justified six-figure enterprise GIS licences can be replicated at a fraction of the cost. In many cases, the open source alternatives are not merely adequate substitutes — they are technically superior.

This article describes the architecture patterns and technology choices that enable high-capability spatial systems at low cost.

# The True Cost of Enterprise GIS

Before examining the alternatives, it is worth understanding what you are replacing. Enterprise GIS licensing is expensive in ways that are not always transparent.

A typical ArcGIS Enterprise licence for a 25-user organisation costs between £30,000 and £100,000 per year depending on the module selection. Each additional module — Network Analyst, Spatial Analyst, Image Analyst — adds cost. Server-side licences for ArcGIS Server add cost. ArcGIS Online credits for cloud services add cost. Vendor-certified professional services add cost.

The total cost of ownership for a moderately configured enterprise GIS environment often reaches £200,000–£500,000 per year when you include licences, infrastructure, professional services, and the staff time spent on administration and maintenance. Large government programmes can spend an order of magnitude more.

Against this baseline, an open source spatial architecture on cloud infrastructure typically costs £5,000–£30,000 per year for an equivalent capability — a saving of 80–95%.

# Core Architecture Principles

## Pay Only for What You Use

Traditional GIS requires you to provision infrastructure and licences for peak load, even if that peak load occurs only occasionally. Cloud-native architecture enables you to pay only for what you actually use.

Serverless compute (Lambda, Cloud Functions) costs nothing when it is not running. Managed database services can be scaled down during off-hours. Object storage costs scale linearly with actual data volume. This pay-per-use model eliminates the waste inherent in provisioned infrastructure.

## Separate Data from Process

In monolithic GIS, data and processing are tightly coupled — the data lives in the GIS server, which is also responsible for processing and serving it. Separating storage from compute is the fundamental move that makes cloud-native architectures cost-effective and flexible.

Store data in object storage (S3, GCS, Azure Blob) in open, cloud-native formats. Process that data using ephemeral compute that spins up, does its work, and shuts down. Serve results from lightweight services that scale independently of the processing layer.

## Build on Open Formats

Proprietary spatial formats are expensive: they require expensive software to read and write, they create lock-in, and they cannot be served efficiently from commodity infrastructure. Open formats — GeoJSON, GeoPackage, Cloud-Optimised GeoTIFF, GeoParquet, PMTiles — work with any tool, run on any infrastructure, and enable data portability.

The switch from proprietary formats to open formats is often the highest-impact step in a migration towards a low-cost spatial architecture, because it unlocks access to the full open source toolchain.
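As a minimal sketch of that migration step — assuming GDAL's `ogr2ogr` is on the PATH (3.5+ for the Parquet driver) and using illustrative file names — a proprietary-adjacent shapefile can be converted to open formats in one invocation per target:

```python
import subprocess

# GDAL output driver names; the file names are illustrative.
CONVERSIONS = [
    ("GPKG", "boundaries.gpkg"),
    ("Parquet", "boundaries.parquet"),  # GeoParquet via GDAL's Parquet driver
]

def conversion_command(src: str, driver: str, dst: str) -> list[str]:
    """Build an ogr2ogr invocation converting src into the given output driver."""
    return ["ogr2ogr", "-f", driver, dst, src]

if __name__ == "__main__":
    for driver, dst in CONVERSIONS:
        subprocess.run(conversion_command("boundaries.shp", driver, dst), check=True)
```

Once the data is in GeoPackage or GeoParquet, every tool mentioned below — PostGIS, GeoPandas, DuckDB, Tippecanoe — can read it directly.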

# The Minimum Viable Spatial Stack

For a team starting from scratch, here is the minimum viable spatial stack that covers the most common use cases:

Spatial database: PostGIS on a managed PostgreSQL service. For AWS, Amazon RDS for PostgreSQL is the standard choice; for GCP, Cloud SQL for PostgreSQL; for Azure, Azure Database for PostgreSQL. A db.t4g.small instance on RDS with PostGIS costs approximately £25/month for development use.

Data processing: Python with GeoPandas, Shapely, and Rasterio for spatial transformation and analysis. These run in any Python environment — Lambda functions, containers, or development machines — and have zero licensing cost.
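For a flavour of the kind of zero-licence-cost geometry work these libraries handle, here is a small Shapely sketch (coordinates are illustrative; buffering is done in degrees here — project to a metric CRS first when you need metres):

```python
from shapely.geometry import Point, Polygon

# A rough bounding polygon over the London area and two candidate points.
area = Polygon([(-0.5, 51.2), (0.2, 51.2), (0.2, 51.7), (-0.5, 51.7)])
inside = Point(-0.1, 51.5)   # roughly central London
outside = Point(-2.6, 51.5)  # roughly Bristol

# Point-in-polygon tests, then a buffer around the inside point.
print(area.contains(inside))    # True
print(area.contains(outside))   # False
buffered = inside.buffer(0.01)
print(buffered.area)
```

GeoPandas lifts the same operations to whole tables of features (spatial joins, group-bys, file I/O), and Rasterio does the equivalent for raster data.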

Tile serving: PMTiles files hosted on S3 or Cloudflare R2, served with appropriate CORS headers. For data that changes frequently, a small pg_tileserv instance on a t3.micro EC2 instance costs approximately £8/month.
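Generating the PMTiles archive is a single Tippecanoe invocation. A sketch, assuming Tippecanoe 2.x is installed and using illustrative file names:

```python
import subprocess

def tippecanoe_command(src_geojson: str, out_pmtiles: str) -> list[str]:
    """Build a Tippecanoe invocation that writes a PMTiles archive directly."""
    return [
        "tippecanoe",
        "-o", out_pmtiles,           # a .pmtiles extension selects PMTiles output
        "-zg",                       # guess an appropriate max zoom from the data
        "--drop-densest-as-needed",  # thin features to keep tiles under size limits
        src_geojson,
    ]

if __name__ == "__main__":
    subprocess.run(
        tippecanoe_command("features.geojson", "features.pmtiles"), check=True
    )
```

The resulting single file is what gets uploaded to S3 or R2 in the static tile serving pattern described below.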

Frontend mapping: MapLibre GL JS. Free, open source, and WebGL-accelerated. The only cost is CDN bandwidth, which is typically a few pounds per month.

Total infrastructure cost for a development environment: approximately £40–£100 per month. Production, with higher availability requirements, might run £200–£500/month. Compare to £30,000+ per year for an equivalent enterprise GIS setup.

# Open Data Sources: The Hidden Asset

One underestimated element of a low-cost spatial architecture is the availability of high-quality open data. The era of paying for base data — administrative boundaries, road networks, elevation data, population data — is largely over, at least for the major data categories.

OpenStreetMap is the most important open spatial dataset in the world. It is a freely editable map of the world, maintained by millions of contributors, and available under an open licence. The data quality in developed countries now rivals commercial alternatives for most use cases. Geofabrik provides regularly updated downloads of OpenStreetMap data by country and region.

For routing and geocoding based on OSM data, the open source tools OSRM (routing), pgRouting (SQL-based routing in PostGIS), and Nominatim (geocoding) provide the same core functionality as commercial services, at the cost of operational complexity.
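As an illustration of how thin the integration layer is: a self-hosted OSRM instance exposes a plain HTTP API, so a route request is just a GET against a URL. A sketch, assuming a hypothetical OSRM server at `localhost:5000` with the driving profile loaded:

```python
def osrm_route_url(start: tuple[float, float], end: tuple[float, float],
                   base: str = "http://localhost:5000") -> str:
    """Build an OSRM v5 route request URL (coordinates in lon,lat order)."""
    coords = f"{start[0]},{start[1]};{end[0]},{end[1]}"
    return f"{base}/route/v1/driving/{coords}?overview=full&geometries=geojson"

# Usage (network call, not run here):
#   urllib.request.urlopen(osrm_route_url((-0.1276, 51.5072), (-2.5879, 51.4545)))
```

The response is JSON with the route geometry as GeoJSON, ready to drop onto a MapLibre map.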

Natural Earth provides free, public domain cultural and physical vector and raster data at three scales. Country boundaries, coastlines, populated places, rivers, roads — all available for download, no licence required.

Copernicus (the EU Earth Observation programme) provides free access to Sentinel satellite imagery at 10-metre resolution, updated every 5 days for most of the globe. The imagery is radiometrically calibrated and analysis-ready. Combined with open source tools like the Sentinel Hub Python client or the Google Earth Engine API, it enables systematic monitoring applications that would have required expensive commercial imagery a few years ago.

OS OpenData (UK), DATA.gov (USA), and their equivalents in most developed countries provide national-level datasets — elevation models, administrative boundaries, transport networks — under open licences.

Global elevation data is available freely from SRTM (30-metre resolution, near-global coverage between 60°N and 56°S), ALOS World 3D (30-metre, improved accuracy), and the Copernicus DEM (30-metre, global, derived from the TanDEM-X mission).

# Architectural Patterns for Common Use Cases

## The Static Tile Serving Pattern

Problem: Serve a large spatial dataset as web map tiles to many concurrent users, at low cost.

Solution: Generate a PMTiles archive using Tippecanoe, upload to S3 with public read access and CORS enabled, serve from a CDN (CloudFront, Cloudflare). Configure MapLibre GL JS to use the PMTiles protocol handler.

Cost: S3 storage (~£0.02/GB/month) + CDN transfer (~£0.08/GB). A 1GB PMTiles archive served to 10,000 users per month at 10MB per session works out to roughly 100GB of egress — approximately £8/month through a CDN, or effectively £0 on Cloudflare R2, which charges no egress fees.

No servers required. Infinitely scalable. Suitable for any data that can be pre-generated.
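The egress arithmetic above can be sanity-checked in a few lines (the per-GB price is an assumption; actual CDN pricing varies by provider and region):

```python
def monthly_egress_gbp(users: int, mb_per_session: float,
                       gbp_per_gb: float = 0.08) -> float:
    """Approximate monthly CDN egress cost, one session per user per month."""
    gb_transferred = users * mb_per_session / 1024  # MB -> GB
    return gb_transferred * gbp_per_gb

cost = monthly_egress_gbp(10_000, 10)
print(f"~£{cost:.2f}/month")  # prints ~£7.81/month
```

The same function makes it easy to see when egress, rather than storage, becomes the dominant cost as traffic grows.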

## The Event-Driven Processing Pattern

Problem: Process spatial data as it arrives (incoming sensor data, uploaded files, API events) without provisioning always-on infrastructure.

Solution: Store incoming data in S3. Configure S3 event notifications to trigger a Lambda function. The Lambda function performs the spatial processing using GDAL, Shapely, or PostGIS (via a connection to RDS) and writes outputs back to S3 or the database.
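A minimal sketch of such a handler. The event-parsing part follows the S3 notification format AWS documents; the spatial step is left as a hypothetical `process()` call, since that is where the GDAL/Shapely/PostGIS work would go:

```python
import urllib.parse

def handler(event, context=None):
    """Lambda entry point: extract (bucket, key) pairs from an S3 event."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        results.append((bucket, key))
        # process(bucket, key)  # hypothetical spatial step, writing back to S3/RDS
    return results

# Example event shape (abridged from the S3 notification format):
sample_event = {"Records": [{"s3": {"bucket": {"name": "incoming"},
                                    "object": {"key": "uploads/site+plan.geojson"}}}]}
```

Decoding the key with `unquote_plus` matters: spaces and special characters in uploaded file names arrive encoded, and an undecoded key will fail the subsequent `GetObject` call.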

Cost: Lambda pricing is effectively free at low to moderate volume (the first 1 million requests per month are free). For a pipeline processing 100,000 events per month at 5 seconds each: approximately £1/month in Lambda costs.

Scales to zero when there is nothing to process. No idle infrastructure.

## The Analytical Query Pattern

Problem: Run ad-hoc spatial queries against large datasets, without provisioning a large database server that runs continuously.

Solution: Store data as GeoParquet files in S3. Use DuckDB (with the spatial extension) or AWS Athena to run SQL queries directly against the Parquet files, without loading them into a database.

```python
import duckdb

conn = duckdb.connect()
# httpfs enables reading s3:// URLs; spatial provides the ST_* functions.
# (Region and credentials for private buckets are configured separately.)
conn.execute("INSTALL httpfs; LOAD httpfs;")
conn.execute("INSTALL spatial; LOAD spatial;")

# Recent DuckDB versions read GeoParquet geometry columns as GEOMETRY
# automatically when the spatial extension is loaded.
result = conn.execute("""
    SELECT
        ST_AsText(geometry) AS wkt,
        population,
        name
    FROM read_parquet('s3://my-bucket/census.parquet')
    WHERE ST_Contains(
        ST_GeomFromText('POLYGON((-0.5 51.2, 0.2 51.2, 0.2 51.7, -0.5 51.7, -0.5 51.2))'),
        geometry
    )
    AND population > 10000
""").fetchall()
```

DuckDB runs entirely in-process — no server required. Athena charges per query based on data scanned. For analytical queries that run occasionally, this is far cheaper than a continuously-running database.

# The Importance of Right-Sizing

A common mistake in spatial architecture is over-engineering. Teams coming from an enterprise GIS background sometimes replicate the complexity of the old architecture using new tools — running GeoServer where pg_tileserv would suffice, using a large RDS instance where a small one would work, building complex processing pipelines where simple scripts would do.

The right spatial architecture for a given problem is the simplest one that meets the requirements. For a team that needs to:

  • Store and query a few hundred thousand features
  • Serve them as map tiles
  • Run analysis a few times per week

The right architecture is: a small managed PostGIS instance, Tippecanoe for tile generation, S3 for storage, MapLibre GL JS for rendering. Total cost: £50–£100/month. The temptation to build a more “enterprise” architecture before you have the scale to justify it is expensive and counterproductive.

Build for your current scale. Design for the scale you realistically expect. Do not architect for the scale you might hypothetically reach someday.

# When Enterprise Tools Are Worth the Cost

Honesty requires acknowledging that enterprise spatial tools are sometimes genuinely the right choice:

Regulatory requirements: Some industries have compliance requirements that effectively mandate support contracts with large software vendors. The cost of a support contract is real insurance against regulatory risk.

Highly specialised capabilities: Some GIS workflows — complex cartographic production, specific surveying and photogrammetry pipelines, specialist data formats in defence — remain better served by commercial tools.

Organisational risk tolerance: A small team without deep GIS engineering expertise may find that the operational complexity of a cloud-native open source stack exceeds what they can manage reliably. The managed services model of enterprise GIS, with vendor support and professional services, has genuine value for these organisations.

The key is making the decision deliberately, with full awareness of the cost and a clear view of what enterprise tools genuinely deliver that open source alternatives cannot.


Related reading: From Monolithic GIS to Cloud-Native Spatial Intelligence · Cloud-Orchestrated Geospatial Workflows · The Open Source Geospatial Stack