A device equipped with various sensors can observe its position in space and/or relative to its user, enabling a novel user interface. The screen may be regarded as a "peephole" into a larger virtual environment, such as a map. When presenting large amounts of data, the conventional approach is to move the environment behind the peephole using, for instance, a joypad. Dynamic peephole navigation inverts this: the virtual environment stays in place, and the peephole is shifted over it by physically moving the device. A map can then be navigated simply by moving the device in the desired direction. This paper proposes a series of experiments to assess the viability of dynamic peephole navigation using camera- and accelerometer-based scene analysis on hand-held devices.
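To make the interaction model concrete, the core idea can be sketched as a viewport that moves over a fixed map in response to measured device displacement. The class and parameter names below are hypothetical, not part of the paper; this is a minimal sketch assuming the sensor pipeline already yields a 2-D displacement estimate.

```python
class Peephole:
    """A movable viewport (the screen) over a fixed virtual environment (the map)."""

    def __init__(self, map_width, map_height, view_width, view_height):
        self.map_w, self.map_h = map_width, map_height
        self.view_w, self.view_h = view_width, view_height
        self.x, self.y = 0.0, 0.0  # top-left corner of the peephole

    def move(self, dx, dy):
        # Shift the peephole by the device's measured displacement,
        # clamped so it never leaves the virtual environment.
        self.x = min(max(self.x + dx, 0.0), self.map_w - self.view_w)
        self.y = min(max(self.y + dy, 0.0), self.map_h - self.view_h)

    def visible_region(self):
        # Rectangle of the map currently shown on screen.
        return (self.x, self.y, self.x + self.view_w, self.y + self.view_h)


# Hypothetical usage: device moved 150 units right and 80 units down.
peephole = Peephole(map_width=2000, map_height=2000, view_width=320, view_height=240)
peephole.move(150.0, 80.0)
print(peephole.visible_region())  # (150.0, 80.0, 470.0, 320.0)
```

Note that the joypad approach would instead translate the map under a fixed viewport; the peephole model simply swaps which coordinate frame is held constant.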