Boxlab Visualizer



Below are some general guidelines for implementing the Boxlab visualization tool. It is expected to be flexible enough to support multiple end-user populations (e.g. pattern recognition, home ethnography, product interaction). Video tutorials will be provided for different scenarios of use.

Possible Visualization Modalities:
  • floorplan mapping of sensor activations
  • 3d mapping of sensor activations
  • visual representation of ambient environmental conditions
  • photos associated with object usage
  • trajectory/movement vectors in the home
  • timeline plotting of accelerometer/motion data
  • timeline plotting of biological signals
  • reconstructed bodily motion (stick figure, 3D)
  • inferred activity "hotspots" (kitchen, bathroom, etc.)
  • map of GPS location
  • graphical presentation of qualitative data (affect, etc.)

Video Playback:
  • multiple selectable camera views can be arranged to reflect user's preferred orientation (e.g. handling left/right transitions between frames)
  • per-view frame-rate indicator: each camera view is tinted with a color representing its frame rate, so the user can see at a glance which views have more activity
  • double-clicking on a video thumbnail will zoom that view to full screen
  • double-clicking full screen will restore previous video arrangement
  • auto-arrange option will allow visualization of all sources
  • play head will have label indicating current play time
  • tick marks or coloring under play bar will indicate density of sensor activations
  • right-clicking on a specific video stream will seed an annotation with that stream's room/source notation
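The frame-rate indicator above could be realized as a simple frame-rate-to-color mapping; this is a minimal sketch (function name, color endpoints, and frame-rate bounds are illustrative assumptions, not part of the spec):

```python
def frame_rate_color(fps, fps_min=1.0, fps_max=30.0):
    """Map a camera view's frame rate to an RGB tint.

    Low-activity views (few frames captured) render blue; views with
    more activity shade toward red. fps values outside the assumed
    [fps_min, fps_max] range are clamped.
    """
    # Normalize the frame rate to [0, 1], clamping out-of-range values.
    t = max(0.0, min(1.0, (fps - fps_min) / (fps_max - fps_min)))
    # Linear blend from blue (0, 0, 255) to red (255, 0, 0).
    return (round(255 * t), 0, round(255 * (1 - t)))
```

The same scalar-to-color mapping could also drive the tick coloring under the play bar that indicates sensor-activation density.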

Audio Playback:
  • switch for auto-select causes audio selection to be linked to video with highest frame rate
  • maximum of 2 audio streams can be selected simultaneously
  • audio waveform may be graphically notated to indicate high levels of activity
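The auto-select and 2-stream rules above might be sketched as follows (function and field names are assumptions for illustration):

```python
def auto_select_audio(frame_rates, manual=None, max_streams=2):
    """Choose which audio streams to play.

    frame_rates: dict mapping view id -> current frame rate.
    manual: optional list of user-selected stream ids; the spec's
    2-stream maximum is enforced by truncation.
    With auto-select on (manual is None), audio follows the video
    view with the highest frame rate.
    """
    if manual is not None:
        return manual[:max_streams]  # enforce the simultaneous-stream cap
    if not frame_rates:
        return []
    best = max(frame_rates, key=frame_rates.get)
    return [best]
```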

Sensor Playback:
  • right-clicking on sensors will bring up contextual search menu (search forward, back, or by value)
  • ambient sensors will include graphic representation of max/min values
  • sensor floor plan may feature translucent overlays indicating room-level hot-spots
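The contextual search (forward, back, or by value) could work over a sorted event log; a minimal sketch, assuming events are (timestamp, value) tuples:

```python
def search_activations(events, t, direction="forward", value=None):
    """Find a sensor activation relative to playback time t.

    events: list of (timestamp, value) tuples, sorted by timestamp.
    direction: "forward" (first event after t) or "back"
    (most recent event before t).
    value: if given, only activations with this value match.
    Returns the matching (timestamp, value) tuple, or None.
    """
    if direction == "forward":
        candidates = (e for e in events if e[0] > t)
    else:
        candidates = (e for e in reversed(events) if e[0] < t)
    for ts, v in candidates:
        if value is None or v == value:
            return (ts, v)
    return None
```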

Annotations:
  • annotation interface will always be visible
  • clicking on a sensor on the floor plan will activate annotation entry for the time of that sensor's next or most recent activation
  • when video is playing back, annotation entries will be automatically selected by time code
  • some annotations may be superimposed over video display
  • annotations may be stored and synchronized in a separate database server
  • redundant annotation should be supported to permit calculation of intercoder reliabilities
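Redundant annotation enables standard agreement statistics; Cohen's kappa for two coders is one common choice (the spec does not name a measure, so this is an illustrative assumption):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same segments.

    coder_a, coder_b: equal-length lists of category labels.
    Returns a float in [-1, 1]; 1.0 is perfect agreement beyond chance.
    """
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of segments labeled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's label distribution.
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    if expected == 1.0:
        return 1.0
    return (observed - expected) / (1 - expected)
```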

Data Structure:
  • data must be able to be indexed by room level
  • user should be able to define their own ways of specifying what data they want to download (e.g., room level, time, sensor type)
  • summary/average metadata should be computed during data acquisition (to facilitate rapid visualization)
  • daily activity summary may be accessible through community-generated URLs
  • data must indicate if a particular set of data has or has not been annotated for specified activities
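A user-defined download spec (room level, time, sensor type) amounts to a composable filter over indexed records; a sketch, assuming records are dicts with "room", "time", and "type" fields (field names are illustrative):

```python
def filter_records(records, room=None, start=None, end=None, sensor_type=None):
    """Select sensor records matching a user-defined download spec.

    Any criterion left as None is ignored, so users can combine
    room level, time window, and sensor type as they choose.
    """
    out = []
    for r in records:
        if room is not None and r["room"] != room:
            continue
        if start is not None and r["time"] < start:
            continue
        if end is not None and r["time"] > end:
            continue
        if sensor_type is not None and r["type"] != sensor_type:
            continue
        out.append(r)
    return out
```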

Server Issues:
  • data may be stored as fine-grained elements each with a persistent URL
  • "snatch-it" applet will allow background downloading of large segments of data
  • if local copy of file is unavailable, system will default to data server