This concept was inspired by Google's live translate feature. Its ambition is to give you a window onto the world from your own perspective.
Interested in applying this to our concept, I investigated whether existing technology could offer such an experience to museum visitors. The challenge for us was to implement this idea on a physical 3D replica viewed through a tablet. Some solutions I came across were:
– Project Tango (Google): a phablet with special sensors able to map the environment as a 3D model. The problem is that we don't own one, and it is only meant for spatial mapping.
– Accelerometer sync with 3D model: this would be the ideal solution, as it would enable an 'exact' augmented replica. The problem is that it isn't easy, and it may be a risky path to try.
– 2D image mapping: this could work well, as there turn out to be many preexisting apps and SDKs that implement such a method. The problem is that it is quite difficult to use on a 3D surface, since shadows and lighting greatly affect performance.
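To give a feel for the accelerometer-sync option: the first step would be deriving the device's tilt from the gravity vector, which could then drive the virtual camera over the 3D model. Below is a minimal sketch in Python using the standard pitch/roll-from-gravity formulas; the function name and sensor values are hypothetical, and a real app would read them from the tablet's sensor API instead:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Derive device pitch and roll (in degrees) from a raw
    accelerometer reading (m/s^2). Assumes the device is roughly
    at rest, so the accelerometer measures gravity alone."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat on a table: gravity is entirely on the z axis,
# so both angles come out (near) zero.
print(tilt_from_accel(0.0, 0.0, 9.81))

# Device rotated 90 degrees onto its side: gravity shifts to the
# y axis, so roll comes out at 90 degrees.
print(tilt_from_accel(0.0, 9.81, 0.0))
```

In practice the raw readings are noisy and include motion acceleration, which is part of why this path felt risky: a usable version would need filtering (e.g. fusing with the gyroscope) before the replica alignment feels 'exact'.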