Aviation Mapper is LIVE! Click here to use it.
The Aviation Mapper launch video
Running live at Sydney Airport streaming AvMap via 3G mobile Internet
Running the Aviation Mapper desktop app
Presentation at Dorkbot Sydney Finale 2011
Presentation at Ruxcon 2011: "Hacking the wireless world with Software Defined Radio"
RFMap featured in GQ Australia (April/May 2012)!
You can also have a listen to this podcast of a radio interview I did with Ian Woolf on 2SER:
"Balint Seeber Mashed Up Radio"
UPDATE: I have written an extensive guide with plenty of screenshots explaining how to use the map to the full. The contents of the guide can be found to the left of this text (at the top of this page's left-hand column). Read it if you wish, or dive right in. Alternatively, open two tabs/windows and go through the guide while trying it out live.
The Australian Geographical Radio Frequency Map is a site that overlays all registered RF transmitters on top of Google Maps. Generic antenna sites are shown with the red RF icon, while mobile base stations are represented by the carrier that operates from the site (often multiple carriers do, but currently the first is chosen when determining the icon). A mouse-over will give you the site's description, and a click will tell you who broadcasts from there, and at what frequencies. Much more is to come...
Although people by-and-large won't be terribly interested in the positions of every antenna in the country, they might be interested in checking how close they are to their nearest mobile base station or how good a carrier's coverage is in a particular area, and prospective spectrum purchasers might use it to assess possible interference, etc. However, having said all that, it is primarily in the ham spirit.
Here is every transmitter site cached in the database, which is fully searchable from the web interface (shown below) by location, site name, client details, frequency range, emission designator and callsign:
Here is a browser preview:
Here is what it looks like on the iPhone:
And on the iPad:
UPDATE: NISRP has now been added to the official Winamp plugin catalogue!
More information is available on my wiki.
The mapping software brings all the data together and presents it on a map. Position and measurements are logged to a SQL database, and can be reviewed at will. The 'Level Layer Manager' allows customisation of data shown on the map. For example, one can choose to plot all measurements made on a particular ARFCN, and then further refine that to one cell by specifying the BSIC.
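For a flavour of what such a layer boils down to, here is a minimal sketch of the underlying query (the table and column names are hypothetical, and SQLite merely stands in for whichever SQL database is used; only the ARFCN/BSIC refinement itself comes from the real system):

    import sqlite3

    # Hypothetical schema: the real table/column names are not documented here.
    db = sqlite3.connect("measurements.db")
    rows = db.execute(
        """
        SELECT lat, lon, rxlev
        FROM measurements
        WHERE arfcn = ?      -- all measurements on one ARFCN...
          AND bsic = ?       -- ...refined to a single cell by its BSIC
        ORDER BY timestamp
        """,
        (62, 31),            # example ARFCN and BSIC values
    ).fetchall()

    for lat, lon, rxlev in rows:
        print("%.5f, %.5f: %d" % (lat, lon, rxlev))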
Two streams of data are of interest: trace information from the mobile and position information from a GPS receiver.
To acquire position information, I used a commercial Navman device, which was modified to boot into WinCE and transmit NMEA data over a TCP connection (via Bluetooth Dial-Up Networking and GPS2Blue) to a virtual serial port on my laptop, which in turn was connected to gpsd. Full details can be found here.
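As a rough sketch of what the laptop end of that plumbing does (the host, port and hand-rolled parsing below are illustrative assumptions; the real setup simply handed the stream to gpsd):

    import socket

    # Hypothetical endpoint where the NMEA stream is exposed over TCP.
    HOST, PORT = "127.0.0.1", 10110

    def parse_gga(sentence):
        """Extract (lat, lon) in decimal degrees from a $GPGGA sentence."""
        f = sentence.split(",")
        if len(f) < 6 or not f[2] or not f[4]:
            return None          # no fix yet
        lat = float(f[2][:2]) + float(f[2][2:]) / 60.0
        lon = float(f[4][:3]) + float(f[4][3:]) / 60.0
        if f[3] == "S":
            lat = -lat
        if f[5] == "W":
            lon = -lon
        return lat, lon

    sock = socket.create_connection((HOST, PORT))
    buf = b""
    while True:
        buf += sock.recv(4096)
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            if line.startswith(b"$GPGGA"):
                fix = parse_gga(line.decode("ascii", "replace"))
                if fix:
                    print("lat=%.6f lon=%.6f" % fix)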
An old Nokia mobile with DCT-3 firmware (for network monitor mode) is required. The phone needs to be connected to a computer via its FBUS serial interface, so some level conversion hardware is also required. Since most modern laptops do not have an external serial port, an RS-232 to USB converter is also a good idea.
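For the curious, reading frames off that link looks roughly like the sketch below. The FBUS-2 framing details (the 0x55 sync bytes, even-length padding and trailing checksum) are as I recall them from the Gnokii project's documentation, so treat this as an outline rather than a reference:

    import serial  # pyserial

    # FBUS-2 runs at 115200 8N1 over the level-converted serial link.
    port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

    # The phone expects a run of 0x55 'sync' bytes before the first command.
    port.write(b"\x55" * 128)

    def read_frame(port):
        """Read one FBUS-2 frame: 0x1E, dest, src, type, 16-bit length, payload."""
        while port.read(1) != b"\x1e":       # hunt for the frame delimiter
            pass
        hdr = port.read(5)                   # dest, src, msg type, length
        length = (hdr[3] << 8) | hdr[4]
        payload = port.read(length + (length & 1))   # padded to an even length
        port.read(2)                         # trailing checksum (not verified here)
        return hdr[2], payload[:length]

    msg_type, payload = read_frame(port)
    print("type=0x%02x, %d payload bytes" % (msg_type, len(payload)))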
This experiment involved acquiring CellID and signal strength information from the GSM cellular network, tracking one's position while acquiring this data, and finally presenting it nicely. It is summarised in the following pictures (full details are described in the sub-sections found top-left):
Navman GPS receivers, and the like, are great, until you actually want to use their received GPS data on a computer in real-time. Luckily there are plenty of resources to do this (NavmanUnlocked, the forum, MioPocket, GPSPasSion and GPSUnderground). In addition, I recommend SiRFTech for GPS testing. There are many tools available too, such as SSnap, which is extremely useful to track registry and filesystem changes. This is especially good when creating a one-off .reg file that you can import after a hard reset to restore the state of WinCE (in particular Bluetooth pairings).
Here, I give a quick guide to turning a Navman S150 into a Bluetooth GPS receiver that one can use with gpsd on a Bluetooth-enabled computer.
Behold the trusty S150 running WinCE Core 5 and PNADesktop (which is launched from \Program Files\Navman\appstartupsec.ini - the other apps, e.g. SmartST, are manually disabled):
I wrote the following major system components:
[Initial tests indicate it can play over 500 videos simultaneously on one computer (with 2 HT CPUs and 1GB of RAM at the lowest LOD). TVisionarium is capable of displaying a couple of hundred videos without any significant degradation in performance, but there's so much still to optimise that I would be surprised if it couldn't handle in excess of 1000.]
With my latest optimisations, TVisionarium is able to play back 1000 shots simultaneously!
Profiling the system shows total CPU usage averaging around 90-95% on a quad-core render node! This indicates that those optimisations have drastically minimised lock contention and enable far more fluid rendering.
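To give a flavour of the kind of change involved (an illustrative sketch only, not the engine's actual code): rather than holding a lock while a frame is decoded, each stream can decode into a back buffer and let the renderer swap pointers under a briefly-held mutex:

    import threading

    class FrameBuffer:
        """Double buffer: decode into 'back', then swap under a briefly-held lock."""
        def __init__(self):
            self.front = None            # frame the renderer is drawing
            self.back = None             # frame the decoder has just filled
            self.lock = threading.Lock()

        def publish(self, frame):
            # Decoder thread: all heavy work happens *before* taking the lock.
            with self.lock:
                self.back = frame        # O(1) reference swap => tiny contention

        def acquire_latest(self):
            # Render thread: swap rather than copy; again O(1) under the lock.
            with self.lock:
                if self.back is not None:
                    self.front, self.back = self.back, None
            return self.front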
Have a look at TVis in the following video:
This is an in-development 'video tube' test of the video engine:
(Watch it on youtube.com to leave comments/rate it if you like.)
T_Visionarium was officially launched on 08/01/2008 as part of the 2008 Sydney Festival. Please read my blog post about it. Here are some pictures:
The festival banner:
Crowd before the speeches:
The digital maestros (Matt McGinity & I):
This series of pages summarises the contribution I made to TVisionarium Mk II, an immersive, eye-popping, stereoscopic-3D, interactive 360-degree experience in which a user can search through a vast database of television shows and rearrange their shots in the surrounding virtual space to explore their semantic similarities and differences intuitively.
It is a research project undertaken by iCinema, the Centre for Interactive Cinema Research at the University of New South Wales (my former uni), directed by Professor Jeffrey Shaw and Dr Dennis Del Favero. More information about the project itself, Mk I and the infrastructure used, is available online.
I was contracted by iCinema to develop several core system components during an intense one-month period before the launch in September of 2006. My responsibilities included writing the distributed MPEG-2 video streaming engine that enables efficient clustered playback of the shots, a distributed communications library, the spatial layout algorithm that positions the shots on the 360-degree screen, and various other video processing utilities. The most complex component was the video engine, which I engineered from scratch to meet very demanding requirements (more details are available on the next page).
Luckily I had the pleasure of working alongside some wonderfully talented people: in particular Matt McGinity (3D graphics/VR guru), as well as Jared Berghold, Ardrian Hardjono and Tim Kreger.
Thanks to the generosity of Aras Vaichas, I came into possession of an old (1992) 60x8 dual-colour LED display. As it was just the display itself (no manual, instructions, software, etc) I set about reverse engineering the board. Using my multimeter I re-created the schematic for the board and found all the relevant datasheets online. Having figured out how to talk to the display, I interfaced it via the parallel port and wrote some control software for it. Once I could display various test patterns (multi-coloured sine waves), I 'net-enabled' the software so that the display could be controlled over a network via UDP packets (see the sketch below) - the resolution is so low that the entire LED configuration fits into a single packet! Finally, I wrote a plugin for Winamp that streams the frequency analysis of the playing song to the display, which produces results like this:
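As for the single-packet protocol, the layout below is made up for illustration (only the one-datagram-per-frame idea reflects the real design):

    import socket

    WIDTH, HEIGHT = 60, 8    # the display's resolution

    # Hypothetical layout: one byte per LED (0=off, 1=red, 2=green, 3=orange).
    # 60 * 8 = 480 bytes fits comfortably inside a single UDP datagram.
    frame = bytearray(WIDTH * HEIGHT)
    for x in range(WIDTH):               # simple test pattern: diagonal stripes
        frame[(x % HEIGHT) * WIDTH + x] = 1 + (x // 20)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(bytes(frame), ("127.0.0.1", 9999))   # controller's address/port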
Although there exists a plethora of programs that count lines of code, I thought I would write my own. It is designed to analyse C/C++ code and ignore whitespace and comments, both the single-line // and multi-line /* */ sort. It also counts the number of FIXMEs one has left in their code. Other languages (e.g. JavaScript, assembly) that use such commenting conventions are compatible too.
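A minimal sketch in the same spirit (not the original program, just an illustration of the approach):

    import re
    import sys

    def count_lines(path):
        """Count non-blank, non-comment lines (and FIXMEs) in a C/C++ file."""
        code = open(path).read()
        fixmes = code.count("FIXME")                 # FIXMEs usually live in comments
        code = re.sub(r"/\*.*?\*/", "", code, flags=re.DOTALL)  # strip /* */ blocks
        code = re.sub(r"//[^\n]*", "", code)         # strip single-line comments
        lines = [l for l in code.splitlines() if l.strip()]     # drop whitespace-only
        return len(lines), fixmes

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            loc, fixmes = count_lines(path)
            print("%s: %d lines of code, %d FIXMEs" % (path, loc, fixmes))

A production version would also need to leave comment markers that appear inside string literals alone.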
A system for distributed extensible particle simulations over multiple computers. Unfortunately I haven't exactly got around to distributing it, although, thanks to the generosity of Steven Foster, the many lab computers at my school are waiting. My report details the process and simulation design.
I have been continually developing a Verlet-integration based particle system inside Teh Engine and have produced a number of interesting results. The two main themes of simulated phenomena are tornadoes and cloth. You can read more about these individual experiments in the next sections, as well as watch videos of the results.
An excellent resource for Verlet-integration can be found at Gamasutra.
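The core update is pleasantly small; here is a standalone position-Verlet sketch (not Teh Engine's code), where the velocity is implicit in the difference between the current and previous positions:

    def verlet_step(pos, prev_pos, accel, dt, damping=0.99):
        """Advance one particle; returns (new_pos, new_prev_pos)."""
        new_pos = [
            p + (p - q) * damping + a * dt * dt
            for p, q, a in zip(pos, prev_pos, accel)
        ]
        return new_pos, pos

    # Example: a particle falling from rest under gravity.
    pos = prev = [0.0, 10.0, 0.0]
    gravity = [0.0, -9.81, 0.0]
    for _ in range(100):
        pos, prev = verlet_step(pos, prev, gravity, dt=1.0 / 60.0)
    print("height after ~1.7s: %.2f" % pos[1])

Cloth then falls out of iteratively enforcing distance constraints between neighbouring particles, much as the Gamasutra article describes.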
Here are some stills:
SKIP THE CHIT-CHAT, LET ME USE IT*!
* Please note: WebRadio is only available when I have the computers and radios switched on. (And I don't usually do this as electricity does not grow on trees and fire is bad. Did I mention I have to pay for uploads too?) If it says it "Can't connect to the server" and you'd like to give it a whirl, please do not hesitate to email me (bottom of front page) and I'll switch it on for you.
Dr John Smith told me about these types of receivers, so I thought I'd build the bare-bones-basic one.
Unfortunately it's not possible to use it anywhere near urbanised areas due to the 50Hz mains interference 'hum' that swamps out Mother Nature.
I wrote a Windows-based video analysis and processing framework to underpin the research I undertook for my undergraduate thesis.
Some of the features it boasts:
For my undergraduate honours thesis I conducted research into the unbuffered real-time detection of commercials on television with a view to muting the volume when ads are being broadcast. The research itself dealt with examining the features required to enable robust real-time detection. I developed a sophisticated video analysis and processing framework to underpin experimentation and compilation of results.
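As a flavour of the problem, one cue commonly used in the literature is the run of black frames that typically separates programme material from advertisements; a toy detector built on that single cue might look like this (an illustrative sketch using OpenCV, not the thesis framework itself):

    import cv2  # OpenCV, used here purely for illustration

    BLACK_MEAN = 12      # mean-luma threshold for an (almost) black frame
    MIN_RUN = 5          # consecutive black frames before flagging a boundary

    cap = cv2.VideoCapture("capture.mpg")    # hypothetical input file
    run, frame_no = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        run = run + 1 if luma.mean() < BLACK_MEAN else 0
        if run == MIN_RUN:
            print("possible ad boundary near frame", frame_no - MIN_RUN + 1)
        frame_no += 1

Robust detection, of course, needs far more than one such cue, which is precisely what the research examined.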
The following screenshots show the system running live (click on one to see the full-res image):
The last time I visited Wairunga in New Zealand I set about developing two control systems to aid in the running of the farm:
An inside view into iCinema's Project T_Visionarium:
(We tend to drop the underscore though, so it's referred to as TVisionarium or simply TVis).
This page summarises (for the moment below the video) the contribution I made to TVisionarium Mk II.