The National Center for High-performance Computing (NCHC) has developed large tiled display walls (TDWs) capable of displaying images at very high resolution. Traditionally, the user interface for these walls consists of a keyboard and mouse, just as on a regular PC. However, to take full advantage of such high-resolution display walls, a more dynamic controller is needed so that the user can engage with the space directly. Given the proliferation of sensor-enabled, network-connected portable devices such as smartphones and tablets, this project explored a novel controller application built on such a device. A tablet was chosen specifically for its large form factor and high computing power.
NCHC has a large dataset covering Typhoon Morakot and its aftermath. Our application uses the tiled display wall to show a "before" image of an area in Kaohsiung, Taiwan, while the "after" image is displayed on the tablet. The goal is for the user to hold the tablet up to the display wall and see the "after" image of exactly the area of the wall that the tablet covers. As the user moves the tablet around in front of the wall, the image on the tablet updates so that it always shows the "after" view of the area behind it. The project uses the Acer Iconia Tab A500, a 10.1-inch tablet running Android 3.0 (Honeycomb) that is equipped with a gyroscope, accelerometer, compass, and both rear- and front-facing cameras. The Iconia also features a dual-core 1 GHz ARM Cortex-A9 processor, an ULP GeForce GPU, and the Tegra 2 T20 chipset. While our software reads all of these sensors, currently only accelerometer and camera data are used by most of the application. The result is a working application, but the tracking of the tablet's location in front of the display wall needs improvement. Future work includes fusing data from multiple sensors for more robust position estimates, as well as filtering to smooth out jittery sensor data.
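The core of the tablet's "window onto the after image" behavior is a coordinate mapping: given the tablet's tracked position on the wall, select the matching rectangle of the "after" image to display. A minimal sketch of such a mapping is shown below; the function name, coordinate conventions (meters, origin at the wall's lower-left corner), and all parameters are illustrative assumptions, not the project's actual code.

```python
def after_image_crop(tablet_cx, tablet_cy, tablet_w, tablet_h,
                     wall_w, wall_h, img_w, img_h):
    """Map the tablet's position on the wall to the pixel rectangle of
    the "after" image it should display.

    tablet_cx, tablet_cy: tracked center of the tablet on the wall (meters,
                          origin at the wall's lower-left corner)
    tablet_w, tablet_h:   physical size of the tablet screen (meters)
    wall_w, wall_h:       physical size of the display wall (meters)
    img_w, img_h:         resolution of the "after" image (pixels)

    Hypothetical helper for illustration only.
    """
    # Scale factors from wall coordinates (meters) to image pixels.
    sx = img_w / wall_w
    sy = img_h / wall_h
    left = int((tablet_cx - tablet_w / 2) * sx)
    right = int((tablet_cx + tablet_w / 2) * sx)
    # Image y grows downward, wall y grows upward, so flip vertically.
    top = int((wall_h - (tablet_cy + tablet_h / 2)) * sy)
    bottom = int((wall_h - (tablet_cy - tablet_h / 2)) * sy)
    # Clamp to image bounds in case the tablet extends past the wall edge.
    left, right = max(0, left), min(img_w, right)
    top, bottom = max(0, top), min(img_h, bottom)
    return left, top, right, bottom
```

For example, a tablet held at the center of a 4 m x 2 m wall would map to a rectangle centered in the "after" image, and the crop shifts as the tracked position changes.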
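One common way to smooth jittery accelerometer readings, of the kind the future-work item describes, is a simple exponential (low-pass) filter that blends each new sample with the running estimate. The sketch below is a generic illustration of that technique, not the project's implementation; the class name and the choice of smoothing factor are assumptions.

```python
class LowPassFilter:
    """Exponential moving average for smoothing noisy 3-axis sensor data.

    alpha in (0, 1]: smaller values smooth more strongly but respond
    more slowly to real motion. Illustrative sketch only.
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None  # last smoothed (x, y, z) sample

    def update(self, sample):
        # sample: (x, y, z) accelerometer reading
        if self.state is None:
            # Seed the filter with the first reading.
            self.state = tuple(sample)
        else:
            # Move each axis a fraction alpha toward the new sample.
            self.state = tuple(s + self.alpha * (v - s)
                               for s, v in zip(self.state, sample))
        return self.state
```

In practice the smoothed values would feed the tablet-position estimate instead of the raw sensor stream, trading a small amount of latency for much steadier tracking.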
PARTICIPATING RESEARCHERS: NCHC: Fang-Pang Lin; Calit2/UCSD: Jurgen Schulze