Cognitive City Navigator

Semantic Urban Intelligence with Knowledge Graphs and Neural Retrieval


Overview

Cognitive City Navigator is an experimental system that explores how semantic reasoning, graph structures, and neural representations can be combined to support intelligent decision-making in urban environments.

Instead of treating a city as a collection of static routes and locations, the system models it as a living knowledge graph—where transportation, points of interest, and contextual signals interact dynamically. The goal is not just navigation, but context-aware guidance that adapts to user intent and real-world constraints.

Motivation

Traditional navigation systems expose users to fragmented information: static timetables, live vehicle maps, and separate weather or POI searches. This fragmentation creates decision fatigue, especially in unfamiliar cities.

Cognitive City Navigator investigates how a semantic abstraction layer—built on top of raw transport and geospatial data—can reduce this cognitive load by answering higher-level questions like: “What is the best decision to make right now, given my intent and the city’s current state?”

System Architecture

The system is designed as a modular pipeline with three core components:

1. Semantic Representation Layer (Knowledge Graph)

At the core lies a Neo4j knowledge graph that models the city as interconnected entities rather than isolated records. Examples of modeled relationships include:

  • Routes serving stops
  • Walkable proximity between stops
  • Points of interest located near transit nodes

This graph structure enables relational queries such as smart transfers, proximity-based recommendations, and multi-hop reasoning across the transport network.
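To make the multi-hop idea concrete, here is a minimal sketch in plain Python over a toy adjacency structure. The node names and relationship types below are illustrative only, not the project's actual Neo4j schema (a real deployment would express this as a Cypher query against the graph):

```python
from collections import deque

# Toy city graph: each node maps to (relationship, target) pairs.
# Labels like "Stop:Central" and "SERVED_BY" are hypothetical.
CITY_GRAPH = {
    "Stop:Central":  [("WALKABLE", "Stop:Museum"), ("SERVED_BY", "Route:12")],
    "Stop:Museum":   [("NEAR", "POI:ArtMuseum"), ("SERVED_BY", "Route:7")],
    "Route:12":      [("STOPS_AT", "Stop:Harbor")],
    "Route:7":       [("STOPS_AT", "Stop:Park")],
    "Stop:Harbor":   [("NEAR", "POI:Aquarium")],
    "Stop:Park":     [],
    "POI:ArtMuseum": [],
    "POI:Aquarium":  [],
}

def multi_hop(start, max_hops=3):
    """Breadth-first traversal: which entities are reachable from
    `start` within `max_hops` relationships, and at what depth?"""
    seen, frontier = {start}, deque([(start, 0)])
    reachable = []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # do not expand beyond the hop limit
        for _rel, target in CITY_GRAPH.get(node, []):
            if target not in seen:
                seen.add(target)
                reachable.append((target, depth + 1))
                frontier.append((target, depth + 1))
    return reachable

print(multi_hop("Stop:Central"))
```

A "smart transfer" query is then just a constrained version of this traversal: reachable stops within a hop budget, restricted to `WALKABLE` and `SERVED_BY` edges.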

2. Neural Retrieval & Inference

Unstructured user intent is processed using sentence-level embeddings to capture semantic meaning beyond keyword matching. These embeddings are used to:

  • Match user intent to relevant routes, areas, or POIs
  • Rank candidate decisions based on semantic similarity
  • Bridge natural language queries with structured graph data

This hybrid approach combines neural flexibility with graph precision.
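The ranking step can be sketched with cosine similarity over embedding vectors. In the real pipeline the vectors would come from a sentence-transformer's `encode(...)`; the tiny 3-dimensional vectors and candidate names below are stand-ins:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def rank_by_similarity(query_vec, candidates):
    """Rank candidate entities by similarity to the query embedding,
    highest first."""
    scored = [(name, cosine(query_vec, vec)) for name, vec in candidates.items()]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)

# Toy embeddings standing in for sentence-transformer outputs.
candidates = {
    "Route:Airport Express": [0.9, 0.1, 0.0],
    "POI:Botanical Garden":  [0.1, 0.9, 0.2],
    "Stop:Old Town Square":  [0.2, 0.3, 0.9],
}
query = [0.85, 0.15, 0.05]  # e.g. the embedded intent "fastest way to the airport"

ranking = rank_by_similarity(query, candidates)
print(ranking[0][0])  # the airport route ranks first
```

The top-ranked entities then become entry points into the graph, where symbolic traversal takes over.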

3. Context-Aware Application Layer

The application layer integrates static transport data (GTFS), real-time vehicle feeds (GTFS-RT), and contextual signals such as weather. A lightweight inference pipeline dynamically injects graph-derived context into a language model, allowing the system to generate structured, situation-aware recommendations rather than free-form text.
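The context-injection step can be sketched as simple prompt assembly: graph-derived facts and live signals are serialized into a structured prompt ahead of the user's intent. The section headers and field names here are illustrative, not the project's exact template:

```python
def build_prompt(intent, graph_context, live_context):
    """Assemble a structured prompt that injects graph-derived facts
    and real-time signals before the user's intent. Template layout
    is a hypothetical sketch."""
    sections = [
        "You are an urban navigation assistant. "
        "Answer with a single, structured recommendation.",
        "## Graph context",
        *(f"- {fact}" for fact in graph_context),
        "## Live context",
        *(f"- {key}: {value}" for key, value in live_context.items()),
        "## User intent",
        intent,
    ]
    return "\n".join(sections)

prompt = build_prompt(
    intent="I want a quiet cafe near my next stop",
    graph_context=[
        "Stop:Museum is a 4 min walk from POI:RiversideCafe",
        "Route:7 serves Stop:Museum every 10 min",
    ],
    live_context={"weather": "light rain", "next_vehicle_eta": "3 min"},
)
print(prompt)
```

Because the model only ever sees context the graph has already vetted, its output stays grounded in the city's actual state rather than free-form generation.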

Key Features

  • Semantic City Graph: A labeled property graph encoding transport topology, walkability, and spatial proximity.
  • Context-Aware Recommendations: Suggestions adapt to real-time conditions such as weather, live vehicle positions, and user location.
  • Hybrid Reasoning Pipeline: Neural embeddings for intent understanding combined with symbolic graph traversal for constraint satisfaction.
  • Real-Time Responsiveness: Live vehicle data is continuously ingested and filtered to reflect the city’s current state.
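The real-time filtering mentioned above can be sketched as a freshness-and-area filter over vehicle position records. The dicts below mimic GTFS-RT `VehiclePosition` fields but are simplified, illustrative records:

```python
import time

def filter_live_vehicles(vehicles, bbox, max_age_s=120, now=None):
    """Keep only vehicle positions that are recent and inside the
    area of interest. `bbox` is (lat_min, lat_max, lon_min, lon_max).
    Record shape is a simplified stand-in for GTFS-RT VehiclePosition."""
    now = time.time() if now is None else now
    lat_min, lat_max, lon_min, lon_max = bbox
    return [
        v for v in vehicles
        if now - v["timestamp"] <= max_age_s
        and lat_min <= v["lat"] <= lat_max
        and lon_min <= v["lon"] <= lon_max
    ]

NOW = 1_700_000_000
feed = [
    {"id": "bus-12", "lat": 50.08, "lon": 14.43, "timestamp": NOW - 30},
    {"id": "bus-99", "lat": 50.08, "lon": 14.43, "timestamp": NOW - 600},  # stale
    {"id": "tram-7", "lat": 49.20, "lon": 16.60, "timestamp": NOW - 10},   # outside bbox
]
fresh = filter_live_vehicles(feed, bbox=(50.0, 50.2, 14.2, 14.6), now=NOW)
print([v["id"] for v in fresh])  # only bus-12 survives
```

Applying this filter on ingest keeps the graph's "current state" view small and trustworthy before any reasoning happens.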

Technical Stack

Core Logic: Python
Knowledge Graph: Neo4j
Neural Models: Sentence Transformers, PyTorch
Data Sources: GTFS, GTFS-RT, Geospatial APIs
Deployment: Docker, Streamlit

Design Takeaways

  • Knowledge graphs provide interpretability and structure that pure neural systems lack.
  • Neural embeddings excel at translating ambiguous human intent into machine-usable signals.
  • Combining symbolic and neural approaches enables more robust urban reasoning.
  • Cities are better modeled as systems of relationships, not flat datasets.

Future Directions

  • Graph-based learning (GNNs) for predictive congestion modeling.
  • Deeper multimodal integration (bikes, scooters, pedestrian flows).
  • More explicit uncertainty modeling in recommendations.