TVisionarium Mk II (AKA Project T_Visionarium)

An inside view into iCinema's Project T_Visionarium:
(We tend to drop the underscore though, so it's referred to as TVisionarium or simply TVis).

 * NEW: EXHIBITION PIX *

This page summarises (for the moment, below the video) the contribution I made to TVisionarium Mk II:
an immersive, eye-popping, stereo 3D, interactive 360-degree experience in which a user can
search a vast database of television shows and rearrange their shots in
the virtual space that surrounds them, intuitively exploring their semantic similarities and differences.

It is a research project undertaken by the iCinema Centre for Interactive Cinema Research at
the University of New South Wales (my former uni), directed by Professor Jeffrey Shaw and Dr Dennis Del Favero.
More information about the project itself, Mk I and the infrastructure used is available online.

I worked directly with Matt McGinity (h4rdc0r3 graphics/VR guru), as well as the l337 likes of
Jared Berghold, Ardrian Hardjono and Tim Kreger, often working into the night and through the next day.

Here is an insider's view of TVis in operation:
(Watch it on youtube.com to leave comments/rate it if you like.)

We recently made the nightly news (well, Jeffrey, Jared and Ardo did - along with our creation):

<!--

This video can be downloaded in the following formats: AVI, WMV, MPG and MOV (and larger MOV).

-->

Here is another insider's view 'officially' made by iCinema:


I wrote the following major system components:

    1) A multi-threaded, load balancing MPEG-2 video decoder engine, which (for extra punch) comes with:
    • Automatic memory management & caching
    • Level-Of-Detail support and seamless transitions
    • Continuous playback or shot looping (given cut information)
    • Asynchronous loading and destruction (go critical sections!)

    Initial tests indicate it can play over 500 videos simultaneously on one computer (with two Hyper-Threaded CPUs and 1 GB of RAM, at the lowest LOD). TVisionarium can display a couple of hundred videos without any significant degradation in performance, and there is still so much left to optimise that I would be surprised if it couldn't handle in excess of 1000.

    With my latest optimisations, TVisionarium is able to play back 1000 shots simultaneously! (Videos to come...)
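The load-balancing core of such an engine can be sketched roughly as follows. This is a minimal, hypothetical illustration using standard C++ threads, not the real engine's code; all names (DecodeJob, DecoderPool) are made up. The idea is simply that decode jobs go into a shared queue and a pool of worker threads drains it, so the work spreads itself evenly across however many CPUs are available:

```cpp
#include <atomic>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// A decode request: one frame of a given shot at a given level of detail.
struct DecodeJob {
    int shotId;
    int lod;  // 0 = full resolution, higher = coarser level of detail
};

class DecoderPool {
public:
    DecoderPool(unsigned workers, std::atomic<int>& framesDecoded)
        : decoded_(framesDecoded) {
        for (unsigned i = 0; i < workers; ++i)
            pool_.emplace_back([this] { workerLoop(); });
    }

    // The destructor drains any queued jobs, then joins the workers.
    ~DecoderPool() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : pool_) t.join();
    }

    // Asynchronous submission: callers never block on the decode itself.
    void submit(DecodeJob job) {
        { std::lock_guard<std::mutex> lk(m_); queue_.push(job); }
        cv_.notify_one();
    }

private:
    void workerLoop() {
        for (;;) {
            DecodeJob job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !queue_.empty(); });
                if (queue_.empty()) return;  // shutting down and fully drained
                job = queue_.front();
                queue_.pop();
            }
            // Stand-in for the real MPEG-2 work: coarser LODs cost less.
            for (int spin = 1000 / (job.lod + 1); spin > 0; --spin) {}
            decoded_.fetch_add(1);
        }
    }

    std::vector<std::thread> pool_;
    std::queue<DecodeJob> queue_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
    std::atomic<int>& decoded_;
};
```

Because submission is decoupled from decoding, the queue naturally load-balances: a worker stuck on an expensive full-resolution frame simply takes fewer jobs, while idle workers pick up the slack.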



    <!--

    Watch video above, another (1), another (2) and another (3).

    -->

    2) (Yet another) distributed communications library:

    Although we use an existing system to efficiently send smaller pieces of information to each node in the cluster simultaneously (via UDP), a large amount of data must be transmitted using a guaranteed protocol (i.e. TCP). This library boasts:

    • Asynchronous I/O
    • Overlapped I/O
    • Smart memory management
    • Automatic master/slave/stand-alone configuration
    • Automatic reconnection on failure
    • Support for remote monitoring of the end application through (yet another) serialisation system I also wrote
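As an illustration of the guaranteed-delivery side, here is a minimal, hypothetical sketch of length-prefixed message framing, the standard way to carve complete messages out of a TCP byte stream (TCP guarantees order and delivery, but not message boundaries). The actual library's wire format and serialisation system are not shown; the function names and the 4-byte prefix are assumptions for illustration only:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Wrap a payload in a frame: 4-byte native-endian length, then the bytes.
std::vector<uint8_t> frame(const std::string& payload) {
    uint32_t len = static_cast<uint32_t>(payload.size());
    std::vector<uint8_t> out(4 + payload.size());
    std::memcpy(out.data(), &len, 4);
    std::memcpy(out.data() + 4, payload.data(), payload.size());
    return out;
}

// Extract every complete message currently in `buffer`, leaving any
// trailing partial frame in place for the next socket read to finish.
std::vector<std::string> deframe(std::vector<uint8_t>& buffer) {
    std::vector<std::string> messages;
    size_t pos = 0;
    while (buffer.size() - pos >= 4) {
        uint32_t len;
        std::memcpy(&len, buffer.data() + pos, 4);
        if (buffer.size() - pos - 4 < len) break;  // incomplete frame
        messages.emplace_back(
            reinterpret_cast<const char*>(buffer.data() + pos + 4), len);
        pos += 4 + len;
    }
    buffer.erase(buffer.begin(), buffer.begin() + pos);
    return messages;
}
```

The same receive buffer survives across reads, which is what makes the scheme play nicely with asynchronous/overlapped I/O: each completed read just appends bytes and asks `deframe` for whatever messages are now whole.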

    3) Integrating these components into the actual system running on top of Virtools Dev in a clustered environment, which required me to output some seriously cool code to achieve:

    • 'SynchroPlay' - the clustered video loading & synchronisation protocol/system to ensure videos would start playing back at the same time
    • 'Real-time playback' mode on both the master and slaves to ensure decoded video frames remain in lock-step with each other, despite high computational loads and late frame deliveries
    • Video playback 'Simulation mode' on the master node so it can spend its CPU time controlling the renderers instead of worrying about the video frames themselves
    • Physics-based modelling of video window layout in the 3D environment
      (The following video demonstrates how one half-side of the windows, which are modelled as spheres, arrange themselves around the master window.)

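The layout idea above can be sketched as a toy force simulation: each window-sphere is pulled toward the master window by a spring while repelling anything it overlaps, so the windows settle into a non-overlapping arrangement around the master. This is a hypothetical 2D illustration only; the real system's constants, dimensionality and integration scheme are not shown:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { double x = 0, y = 0; };

// One overdamped integration step. pos[0] is the master window, held fixed
// at the origin; every other window feels a spring toward the master plus
// a sphere-sphere repulsion from any neighbour closer than 2 * radius.
void step(std::vector<Vec2>& pos, double radius, double dt) {
    std::vector<Vec2> force(pos.size());
    for (size_t i = 1; i < pos.size(); ++i) {
        force[i].x -= 0.5 * pos[i].x;  // spring toward the master
        force[i].y -= 0.5 * pos[i].y;
        for (size_t j = 0; j < pos.size(); ++j) {
            if (j == i) continue;
            double dx = pos[i].x - pos[j].x, dy = pos[i].y - pos[j].y;
            double d = std::sqrt(dx * dx + dy * dy) + 1e-9;
            double overlap = 2 * radius - d;
            if (overlap > 0) {  // overlapping spheres push apart
                force[i].x += 5.0 * overlap * dx / d;
                force[i].y += 5.0 * overlap * dy / d;
            }
        }
    }
    for (size_t i = 1; i < pos.size(); ++i) {
        pos[i].x += force[i].x * dt;  // overdamped: velocity ~ force
        pos[i].y += force[i].y * dt;
    }
}
```

Run `step` every frame and windows that start bunched together spread out until spring pull and contact repulsion balance, which is essentially the clustered-around-the-master behaviour the video shows.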

    4) Various other utilities and apps that aren't worth going into in much detail (e.g. a DirectShow-based frame extractor, a TVisionarium/MPEG-2-based frame extractor, and a stand-alone MPEG-2 video player that served as the testing environment for the aforementioned video engine).