Indoor Mapping - coming soon in 3D
Over the last few months we’ve been rounding off R&D on our indoor mapping technology for the eeGeo Mapping Platform.
While there’s a bit more work to do before we’ll be offering indoor maps in our mobile SDK, we’re really pleased with the progress to date. So now seems like a good time to share some of what we’ve built and learned so far.
Is it still useful to distinguish between outdoor and indoor maps?
A fundamental issue we face is deciding how and when indoor maps should be shown.
I guess the bare minimum requirement for indoor mapping would be a 2D floor plan attached to each building footprint. The laziest UX would be to display that image modally, full screen, whenever the building is selected. From the outset we felt this wasn’t going to cut it, and that interiors should be treated as first-class citizens in our maps rather than as a bolt-on afterthought.
We’ve found from our user testing that the tactile experience of browsing around the map is really important to our users. We felt that shouldn’t be lost when moving indoors. By the time the user sees an indoor map, they’re already familiar with the view and control scheme that we provide outdoors. If they can retain the same control scheme indoors, the whole user experience will be much smoother.
Initially we considered a strongly modal “fade to black” when moving between outdoor and indoor map views. Some of us thought this would be cool and kinda retro, like the old Legend of Zelda games; others felt it was important that indoor maps be displayed in the context of the outdoor world surrounding them.
Displaying indoor maps in context also makes sense when we consider larger, public buildings like shopping malls, conference venues and sports stadiums. The way we use and navigate these buildings is more closely aligned to the way we use an outdoor area like a city centre than to the way we use smaller buildings.
By way of example, take the annual Mobile World Congress conference that we visited earlier this year. This conference is enormous, with 1,900 companies spanning 98,000 square metres of space across 8 halls and numerous outdoor spaces. When we navigate around a venue like this, the distinction between interior and exterior becomes less important as we’re more interested in visiting particular exhibition stands. The specific building in which a stand is located is of secondary importance.
Building 3D interior models, automatically
One of the most successful things we’ve achieved with our maps is the way users interact with them. The combination of a solid 3D model and accessible, tactile camera controls encourages users to spend time exploring the map. We think this offers a whole host of opportunities to use maps not only as a canvas but as a primary user interface to a host of geo-located services.
To that end we’ve been working hard to carry the successes of our outdoor maps over to our indoor mapping. Aside from user controls, this means having a 3D model of the interior that people actually want to explore. Happily, there’s been a lot of carry over from the knowledge we’ve built up in creating the automated 3D model building tools we use for our outdoor maps.
Lighting for indoor maps
A key difference between indoor and outdoor environments is how they are illuminated.
Outdoors (even here in Dundee sometimes!) the principal source of illumination is the sun, resulting in prominent directional shadows. On mobile devices we show this in our exterior models using a fairly standard 3-point lighting model, and the venerable stencil shadowing technique.
Interior lighting is very different: there are typically more light sources, they are less strongly directional, and more of the illumination comes from light that has been bounced off other surfaces. To account for this we’ve used ambient occlusion to illuminate our interior models. This gives us localised shading in the nooks and crannies of the model, lending it a feel of solidity. The screenshots below show a before and after for a section of our test model.
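To make the idea concrete, here’s a minimal sketch of baking per-vertex ambient occlusion by casting sample rays over the hemisphere above a point and counting how many escape the scene. The sphere occluders, sample count and scene layout are all invented for illustration; this isn’t our production bake, which runs over full interior meshes.

```python
import math
import random

def ray_hits_sphere(origin, direction, centre, radius):
    """Geometric test: does the ray (unit direction) intersect the sphere?"""
    oc = [c - o for o, c in zip(origin, centre)]
    t = sum(a * b for a, b in zip(oc, direction))  # project centre onto the ray
    if t < 0:
        return False  # sphere is behind the ray origin
    closest = [o + t * d for o, d in zip(origin, direction)]
    dist2 = sum((a - b) ** 2 for a, b in zip(closest, centre))
    return dist2 <= radius * radius

def sample_hemisphere(rng):
    """Rejection-sample a unit direction in the upper (z > 0) hemisphere."""
    while True:
        v = [rng.uniform(-1, 1) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in v))
        if 0 < n <= 1 and v[2] > 0:
            return [x / n for x in v]

def ambient_occlusion(vertex, occluders, samples=256, seed=1):
    """Fraction of hemisphere rays that escape: 1.0 = fully open, 0.0 = buried."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        d = sample_hemisphere(rng)
        if any(ray_hits_sphere(vertex, d, c, r) for c, r in occluders):
            hits += 1
    return 1.0 - hits / samples

# A point in the open vs the same point under a large overhanging occluder.
open_ao = ambient_occlusion((0.0, 0.0, 0.0), occluders=[])
shaded_ao = ambient_occlusion((0.0, 0.0, 0.0), occluders=[((0.0, 0.0, 2.0), 1.5)])
```

The resulting per-vertex value is then multiplied into the surface colour at render time, which is what darkens the nooks and crannies.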
Scale: Still an issue for indoor mapping
When we started thinking about the kind of interiors that we might want to feature in our map, we soon came to the conclusion that a lot of the commercially interesting venues are really big. For instance, Westfield Valley Fair shopping mall has plans to reach 2.2 million square feet of retail space by 2017.
While the fraction of the world covered by interior maps is pretty small right now, we’re already at the point where the scale of the data warrants consideration. We’ve been working with Micello on this project, and they already have thousands of interior floor plans available. So we need a solution that can cope with at least that kind of volume. The processing certainly needs to be able to scale beyond a development laptop ;-)
Scale isn’t the only technical challenge that indoor maps share with exteriors. Here are a few more that occurred to us:
- Some buildings might be too big and too complex to load quickly as a single monolithic model.
- We need to be able to perform collision queries against them so users can pick and interact with them.
- We need to be able to render them at good frame rates on a broad range of low-end mobile devices.
- Entities within the interior need to be labelled; in some views those labels might overlap and become difficult to read.
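Taking the collision-query point as an example, a common approach (and only a sketch of one possible implementation, not necessarily what we ship) is to test the camera’s pick ray against axis-aligned bounding boxes of interior entities using the slab method, then select the nearest hit:

```python
def ray_vs_aabb(origin, direction, box_min, box_max):
    """Slab test: return the entry distance t if the ray hits the box, else None."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:          # ray parallel to this pair of slabs
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far or t_far < 0:
                return None
    return max(t_near, 0.0)

def pick(origin, direction, entities):
    """Return the name of the closest entity hit by the pick ray, or None."""
    best = None
    for name, box_min, box_max in entities:
        t = ray_vs_aabb(origin, direction, box_min, box_max)
        if t is not None and (best is None or t < best[0]):
            best = (t, name)
    return best[1] if best else None

# Hypothetical shop units on a mall floor; the ray points down the corridor.
shops = [
    ("coffee_kiosk", (5.0, -1.0, 0.0), (6.0, 1.0, 3.0)),
    ("bookshop",     (9.0, -1.0, 0.0), (11.0, 1.0, 3.0)),
]
picked = pick((0.0, 0.0, 1.5), (1.0, 0.0, 0.0), shops)  # nearest hit wins
```

At the scale of a large venue you’d feed this from a spatial index rather than a flat list, but the per-box test stays the same.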
Once we came to the realisation that the usage, technical and scaling challenges for interiors are very similar to those we face for exteriors, we wondered whether the solutions might be the same, or at least similar.
As it turns out, we’ve been able to leverage many of the systems we already have in place for exterior maps, albeit with some additions and modifications.
For instance, we build our exterior maps in the cloud using the AWS / Hadoop / MRJob stack and we’ve continued this for interiors. Much of our 3D model compression, streaming and rendering technology is happily doing double duty for interiors.
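The actual jobs are more involved than we can show here, but the shape of the work is classic map/reduce: map each interior feature to the map tile that contains it, then reduce each tile’s features into one streamable chunk. Here’s a toy, pure-Python imitation of that shuffle; the tile size, feature format and chunk contents are all invented for illustration.

```python
from collections import defaultdict

TILE_SIZE = 100.0  # metres per tile edge -- an invented value for illustration

def mapper(feature):
    """Emit (tile_key, feature): which tile does this feature's anchor fall in?"""
    tile_key = (int(feature["x"] // TILE_SIZE), int(feature["y"] // TILE_SIZE))
    return tile_key, feature

def reducer(tile_key, features):
    """Bundle one tile's features into a single streamable chunk."""
    return {"tile": tile_key,
            "count": len(features),
            "names": sorted(f["name"] for f in features)}

def run(features):
    """Group mapper output by key, then reduce -- the map/reduce shuffle in miniature."""
    groups = defaultdict(list)
    for feature in features:
        key, value = mapper(feature)
        groups[key].append(value)
    return [reducer(key, values) for key, values in sorted(groups.items())]

features = [
    {"name": "hall_1_stand_a", "x": 10.0, "y": 20.0},
    {"name": "hall_1_stand_b", "x": 90.0, "y": 40.0},
    {"name": "hall_2_stand_c", "x": 150.0, "y": 20.0},
]
chunks = run(features)
```

The same per-key grouping is what Hadoop parallelises across machines, which is how the processing scales beyond a development laptop.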
We’re also using our smart labelling system to resolve the tightly clustered labels we find indoors into something that our users can actually read. Here’s a screenshot of a shopping mall to demonstrate.
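We won’t walk through the labelling system itself here, but a common baseline for this kind of decluttering is a greedy pass: place labels in priority order and drop any whose screen rectangle overlaps one that’s already placed. A minimal sketch, with made-up shop names, rectangles and priorities:

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rects are (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def declutter(labels):
    """Greedy placement: highest-priority labels first, drop any that overlap."""
    placed = []
    for name, rect, priority in sorted(labels, key=lambda l: -l[2]):
        if not any(overlaps(rect, placed_rect) for _, placed_rect in placed):
            placed.append((name, rect))
    return [name for name, _ in placed]

# Invented screen-space labels: two overlap, so the lower-priority one is dropped.
labels = [
    ("Anchor Store", (0, 0, 80, 20), 10),
    ("Tiny Kiosk",   (60, 10, 40, 20), 2),   # overlaps Anchor Store
    ("Food Court",   (200, 0, 60, 20), 8),
]
visible = declutter(labels)
```

A production system would re-run something like this every time the camera moves, and might try alternative anchor positions before dropping a label outright.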
What do you think?
We’ll be drawing the R&D phase of the project to a close over the next few weeks, so expect to see an MVP of indoor mapping in our Recce apps soon. In the meantime we’d love to hear your comments on the work to date, along with any feedback or suggestions.
We now have an API for users to create and submit their own indoor maps! Read more in the blog post, or go straight to the API documentation.