Here are some excerpts from the presentation Matt Robert and I gave at the October 2010 meetup of Dorkbot Sydney.
If you wish to see all of the photos from the set-up phase prior to the presentation on the roof of my apartment block, please have a look at the album in my gallery.
I wrote a Windows-based video analysis and processing framework to underpin the research I undertook for my undergraduate thesis.
Some of the features it boasts:
For my undergraduate honours thesis I conducted research into the unbuffered, real-time detection of commercials on television, with a view to muting the volume while ads are being broadcast. The research itself examined the features required for robust real-time detection. I developed a sophisticated video analysis and processing framework to underpin the experimentation and the compilation of results.
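To make the task concrete, here is a minimal, illustrative sketch of an unbuffered, frame-by-frame detection loop built around a single classic cue from the literature: the runs of black frames that often bracket commercial breaks. It uses OpenCV in Python purely for illustration; this is not the feature set or the framework from the thesis, and the thresholds and file name are hypothetical.

```python
# Illustrative sketch only: a black-frame cue for spotting commercial-break
# boundaries in a live stream. Not the thesis framework; it just shows the
# shape of an unbuffered, frame-by-frame detection loop.
import cv2
import numpy as np

BLACK_LUMA_THRESHOLD = 20    # mean luma below this => candidate black frame (assumed value)
MIN_CONSECUTIVE_FRAMES = 5   # require a short run to ignore single-frame flicker

def detect_black_runs(source=0):
    """Yield frame indices where a run of black frames ends (a likely break boundary)."""
    cap = cv2.VideoCapture(source)   # capture device index or file path
    run_length = 0
    frame_index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Greyscale mean intensity as a cheap luma proxy.
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if np.mean(grey) < BLACK_LUMA_THRESHOLD:
            run_length += 1
        else:
            if run_length >= MIN_CONSECUTIVE_FRAMES:
                yield frame_index    # boundary: e.g. toggle mute here
            run_length = 0
        frame_index += 1
    cap.release()

if __name__ == "__main__":
    for boundary in detect_black_runs("sample_broadcast.mp4"):  # hypothetical file
        print(f"Possible commercial-break boundary at frame {boundary}")
```

A single cue like this is nowhere near robust on its own, which is exactly why the analysis of which features are required mattered.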
The following screenshots show the system running live (click on one to see the full-res image):
An inside view into iCinema's Project T_Visionarium:
(We tend to drop the underscore though, so it's referred to as TVisionarium or simply TVis).
This page summarises (for the moment, below the video) the contribution I made to TVisionarium Mk II,
an immersive, eye-popping, stereo 3D, interactive 360-degree experience in which a user can
search through a vast database of television shows and rearrange their shots in
the virtual space that surrounds them, intuitively exploring their semantic similarities and differences.
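As a toy illustration of what "semantic similarity" between shots might mean in practice, here is a minimal tag-overlap measure. The actual TVisionarium metadata model and matching engine are not documented here, so the tags and the Jaccard measure below are assumptions for illustration only.

```python
# Illustrative sketch only: scoring "semantic similarity" between two shots,
# assuming each shot carries a set of descriptive tags. The real TVisionarium
# metadata and matching are not shown here.
def shot_similarity(tags_a: set[str], tags_b: set[str]) -> float:
    """Jaccard similarity over tag sets: 1.0 = identical tags, 0.0 = none shared."""
    if not tags_a and not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)

# Hypothetical example: two shots tagged by framing, content and mood.
shot_1 = {"close-up", "dialogue", "indoors", "tense"}
shot_2 = {"close-up", "dialogue", "outdoors", "calm"}
print(shot_similarity(shot_1, shot_2))  # 0.333... -> moderately related shots
```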