Why Hospitals Should Forget Maps and Go Fully Egocentric
Allocentric vs egocentric perspective
On average, 40%-60% of hospital visitors are new. These are people who have not been to the hospital in the past 12 months for an outpatient appointment. You can safely assume that they are barely, if at all, familiar with the environment. Their spatial knowledge deficit usually includes a lack of destination knowledge, route knowledge, and survey knowledge.
Wayfinding tools exist to compensate for that deficit. But simply providing destination knowledge is not enough. That is like telling a firefighter there is still someone inside a burning house, without telling them how to get there.
So you always need to compensate for more than the lack of destination knowledge. However, you really cannot expect patients to acquire survey knowledge, that is, to understand the layout of a hospital, before they can find their appointment. That is far too much to ask.
This means the simplest way to help is by supporting route knowledge, which is exactly why we always say: give patients pure egocentric tools.
A pure egocentric wayfinding tool is a navigation aid that gives directions entirely from the user’s own perspective. It focuses on first-person, turn-by-turn guidance without relying on a map or an abstract overview. It reflects how we naturally move through space — based on where I am and which way I am facing.
What makes a tool purely egocentric?
A pure egocentric wayfinding tool is built entirely around the user’s own perspective. That means:
Turn-by-turn instructions based on where the user is right now. For example: “Turn left at the elevator” or “Walk straight for 20 meters.”
Lines on the floor to follow. These are usually impractical across an entire hospital, but they can work very well for the final 25 meters.
No map. No floor plan, no top-down view. Just simple steps.
Visuals that match what you actually see, whether that is photos or augmented reality overlays.
Orientation-aware. Instructions automatically adapt to the direction the user is facing. No guessing, no flipping the map in your head.
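To make the orientation-aware point concrete, here is a minimal sketch (not Eyedog's actual implementation; the function name and thresholds are illustrative) of how a first-person instruction can be derived: compare the user's current compass heading with the bearing of the next route segment and phrase the result from the user's own perspective, so there is no map to rotate.

```python
def egocentric_instruction(user_heading_deg: float,
                           segment_bearing_deg: float,
                           distance_m: float) -> str:
    """Turn an allocentric bearing into a first-person instruction."""
    # Relative angle in (-180, 180]: negative = turn left, positive = turn right.
    relative = (segment_bearing_deg - user_heading_deg + 180) % 360 - 180
    if abs(relative) < 30:
        action = "Walk straight"
    elif relative < 0:
        action = "Turn left, then walk"
    else:
        action = "Turn right, then walk"
    return f"{action} for {distance_m:.0f} meters."

# Facing north (0 degrees) while the next segment heads west (270 degrees):
print(egocentric_instruction(0, 270, 20))  # Turn left, then walk for 20 meters.
print(egocentric_instruction(90, 95, 12))  # Walk straight for 12 meters.
```

The key move is the modular subtraction: it recomputes the instruction whenever the user's heading changes, which is exactly what "instructions automatically adapt to the direction the user is facing" means in practice.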
A few examples
Photo-based navigation, like what we do with Eyedog, where users follow a sequence of images taken from their own point of view. What you see is what you follow.
AR-based navigation, where arrows or markers are overlaid on the live camera view, guiding users exactly where to walk, based on their real-world surroundings.
Voice guidance, especially helpful for visually impaired users, where a voice describes the next step in real time, for example: “The hallway curves left in 5 meters.”
And even static signage falls under egocentric navigation: passive egocentric guidance, as opposed to active, personalized egocentric systems. Signs provide instructions from the user’s current position and perspective. Users do not need to consult a map or imagine the layout; they simply follow local cues as they encounter them, and navigation unfolds sequentially, based on what the user sees in the environment.
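The photo-based approach above can be sketched as a simple data structure: a route is an ordered sequence of steps, each pairing a point-of-view photo with a short first-person instruction. The field names, filenames, and destination below are illustrative assumptions, not Eyedog's real format.

```python
from dataclasses import dataclass

@dataclass
class Step:
    photo: str        # image taken from the walker's own viewpoint
    instruction: str  # first-person cue; no map required

# Hypothetical route: main entrance to radiology.
route_to_radiology = [
    Step("entrance.jpg",   "Enter through the main doors and walk straight."),
    Step("elevators.jpg",  "Turn left at the elevators."),
    Step("corridor_b.jpg", "Follow the corridor to the blue signs."),
    Step("radiology.jpg",  "Radiology is on your right."),
]

# The user simply advances through the sequence; what you see is what you follow.
for i, step in enumerate(route_to_radiology, start=1):
    print(f"Step {i} [{step.photo}]: {step.instruction}")
```

Note what is absent: no coordinates, no floor plan, no global frame of reference. Each step only makes sense from where the previous step left the user, which is the defining property of egocentric guidance.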
Contrast with Allocentric Tools
Allocentric tools, such as map-based tools, show a route on top of a map and sometimes your position in relation to an external spatial reference, often as a dot on a floor plan.
But the problem is this:
They require users to mentally rotate the map, align it with reality, and reason their way through the layout. That takes effort, even when the map includes a planned route. And in a hospital setting, that kind of cognitive load is exactly what people do not need.
In complex environments where people need to move in and out quickly, where so-called ‘directed wayfinding’ really matters, egocentric tools are simply the better fit.
So skip the maps in hospitals. Too complex. Too much hassle.
That is why we built Eyedog the way we did. A pure egocentric tool. It just works.
Egocentric wayfinding support
This is not a navigation system; this is a route-planning system.