Open data pipeline
Import-ready map layer
This panel checks our own GeoJSON file and the public source registry. It is ready for MIT-licensed miner output, player-verified coordinates, or any export with explicit reuse permission.
wikily-gg/subnautica2-miner can export world placements, locations, biomes and boundaries from a local game install.
Run npm run import:map -- /path/to/miner-output after generating markers.geojson or world_map.json.
Use external maps to compare coverage and UX. Import only licensed, self-extracted or user-submitted data.
Waiting for imported GeoJSON.
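The check this panel performs can be sketched as a small helper, assuming the importer's FeatureCollection output described later on this page. The function and property names here are illustrative, not the site's actual component code.

```javascript
// Minimal sketch of the panel's status check. Assumes the importer wrote
// a GeoJSON FeatureCollection of point features; falls back to the
// "waiting" state when no valid data is present.
function poiStatus(geojson) {
  if (!geojson || geojson.type !== "FeatureCollection" || !Array.isArray(geojson.features)) {
    return { ready: false, message: "Waiting for imported GeoJSON." };
  }
  // Tally features per category so the panel can list counts
  // without inventing any coordinates.
  const counts = {};
  for (const f of geojson.features) {
    const cat = (f.properties && f.properties.category) || "uncategorized";
    counts[cat] = (counts[cat] || 0) + 1;
  }
  return { ready: true, total: geojson.features.length, counts };
}
```

With no file on disk the helper reports the waiting state; after import, the same call yields feature totals for the panel.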
Data source registry
Imported sample
- No imported coordinates yet.
Public data source found
The usable source is wikily-gg/subnautica2-miner, a public MIT-licensed extractor. It reads a local Subnautica 2 install and exports structured data such as world_map.json, locations.json, regions, boundaries and Leaflet-ready GeoJSON.
The repository does not ship the generated map database, so there is no coordinate dump to paste today. The right move is to wire our site to accept its output, then import generated data from a copy of the game or another explicitly licensed export.
| Source | Can use? | Action |
|---|---|---|
| wikily-gg/subnautica2-miner | Yes, extractor code is MIT | Use as import pipeline and keep attribution. |
| Generated miner output | Yes when we generate or receive it with permission | Import markers.geojson or world_map.json. |
| Official images/trailers | Yes for source-backed context | Use as visuals, not coordinate data. |
| Competitor live databases | Reference only unless licensed | Do not paste their marker DB into our data file. |
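To make "use as import pipeline" concrete, here is a hedged sketch of normalizing miner-style placed actors into GeoJSON points. The assumed actor shape ({ name, category, position: { x, y, z } }) is illustrative; the real world_map.json schema may differ and should be checked against actual miner output.

```javascript
// Convert placed actors into a GeoJSON FeatureCollection of points.
// Assumption: x/z form the horizontal map plane and y is depth; if the
// miner exports a Z-up coordinate system instead, swap y and z here.
function actorsToGeoJSON(actors) {
  return {
    type: "FeatureCollection",
    features: actors.map((a) => ({
      type: "Feature",
      geometry: { type: "Point", coordinates: [a.position.x, a.position.z] },
      properties: { name: a.name, category: a.category, depth: a.position.y },
    })),
  };
}
```

Keeping depth as a property rather than a third coordinate makes the output directly usable by Leaflet-style 2D map layers while preserving the vertical data for filters.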
What other sites are doing
subnautica2.gg publishes very large public marker counts for locations, resources, blueprints, and creatures, but no reuse license or data export was found. MapGenie and Wikily are useful live references. GameMappers credits MapGenie, which suggests several guide sites are embedding or referencing the same commercial map rather than publishing open data.
| Site | Map approach | What we should do |
|---|---|---|
| subnautica2.gg | Public map page with thousands of categorized markers visible in search snippets. | Use as research signal, not imported data unless a license appears. |
| MapGenie | Dedicated interactive world map with filters and progress tracking. | Reference UX patterns and link out. |
| Wikily | Live map plus public MIT miner repo under the same brand. | Use the MIT miner pipeline, not the live site's private database. |
| subnautica2map.com | Source-tagged POIs, confidence labels and cautious map-size notes. | Copy the evidence model, not the coordinates. |
Import workflow
The site now has its own map data files under assets/data and a Node importer. When we have miner output, run npm run import:map -- /path/to/miner-output-or-web-data, then rebuild. The importer writes assets/data/subnautica2-poi.geojson for the front end.
Until data is imported, the public page shows the pipeline status and does not invent coordinates. After import, the same component can list feature counts and sample points without changing the page template.
| Input file | Expected content | Output |
|---|---|---|
| markers.geojson | Leaflet-ready point features | Direct normalized POI GeoJSON |
| world_map.json | Placed actors with XYZ | Point GeoJSON with category/name/depth |
| locations.json | Named POIs with XYZ | Named location layer |
| meta.json | Bounds and layer manifest | Future map projection metadata |