Posts Tagged ‘c++’

A Hybrid Transport Infrastructure

Monday, May 31st, 2010

This multi-layered communications infrastructure has chiefly been used for its real-time, multi-channel media processing and transport layers. The architecture is multi-tier, combining the strengths of the UDP, TCP and RTP/RTCP protocols into a powerful media transport framework. The transport layer abstracts the RTP, UDP and TCP transport mechanisms to simplify application integration. The application layer allows multi-channel, real-time media processing applications to be built by linking modules together into graph structures (in this respect it is similar to D2K). This project grew out of several lines of development during my time at NCSA, beginning with my remote augmented reality work with Motorola and continuing with my work on collaborative video avatars embedded in the virtual reality space of Virtual Director.
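As a rough illustration of those two layers, here is a minimal C++ sketch. All names are invented for this post rather than taken from the actual codebase: a Transport interface hides the wire protocol, and Modules link into a processing graph.

// Hypothetical sketch of the two layers described above. Real code would
// do actual socket I/O and RTP header packing; these are stubs.
#include <cstdint>
#include <iostream>
#include <memory>
#include <vector>

// Transport layer: one interface, interchangeable protocol backends.
struct Transport {
    virtual ~Transport() = default;
    virtual void send(const std::vector<uint8_t>& packet) = 0;
};

struct UdpTransport : Transport {
    void send(const std::vector<uint8_t>& p) override {
        std::cout << "UDP datagram, " << p.size() << " bytes\n"; // would call sendto()
    }
};

struct RtpTransport : Transport {
    uint16_t seq = 0;
    void send(const std::vector<uint8_t>& p) override {
        // would prepend a 12-byte RTP header (sequence, timestamp, SSRC)
        std::cout << "RTP packet #" << seq++ << ", " << p.size() << " bytes\n";
    }
};

// Application layer: modules process media buffers and feed downstream links.
struct Module {
    virtual ~Module() = default;
    std::vector<Module*> downstream;
    virtual void process(std::vector<uint8_t>& frame) = 0;
    void push(std::vector<uint8_t> frame) {
        process(frame);
        for (Module* m : downstream) m->push(frame);
    }
};

struct GainModule : Module {   // toy stand-in for a real media filter
    void process(std::vector<uint8_t>& f) override { for (auto& s : f) s = s / 2; }
};

struct SendModule : Module {   // graph sink: hands frames to the transport layer
    std::unique_ptr<Transport> out = std::make_unique<RtpTransport>();
    void process(std::vector<uint8_t>& f) override { out->send(f); }
};

int main() {
    GainModule gain;
    SendModule sender;
    gain.downstream.push_back(&sender);        // link modules into a (trivial) graph
    gain.push(std::vector<uint8_t>(320, 200)); // one synthetic media frame
}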

Visualizing the Global

Monday, May 31st, 2010

Visualizing the Global | Computer Modeling, Ecology, Politics

I organized this cross-disciplinary seminar at UIUC. I also developed several demos, including a 20-node tiled visualization of output layers from the Parallel Climate Model and an interactive visualization of global energy consumption from 1965-2002 for 73 countries, driven by data from the BP Statistical Review of World Energy. The Parallel Climate Model visualization used a distributed Python backend that consumed, processed and loaded model data onto a projected display cluster. To handle data visualization and navigation across the 20 parallel display nodes, I used Partiview, an open-source, C++-based, interactive data visualization tool written by Stuart Levy.
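To give a flavor of the data path, here is a small hypothetical C++ sketch that writes a toy lat/lon scalar field as an ASCII particle file in the style Partiview can read (a "speck" file: datavar declarations followed by one x y z value row per point, as I recall the format). The grid and the fake zonal temperature profile are synthetic stand-ins for real PCM output.

// Sketch: convert a toy lat/lon scalar field into a speck-style particle
// file, mapping each grid cell onto a globe in display space.
#include <cmath>
#include <fstream>

int main() {
    std::ofstream out("climate.speck");
    out << "datavar 0 temperature\n";
    const double PI = 3.141592653589793;
    const double R = 100.0;                       // display-space globe radius
    for (int lat = -88; lat <= 88; lat += 4) {
        for (int lon = -180; lon < 180; lon += 4) {
            double la = lat * PI / 180.0, lo = lon * PI / 180.0;
            double x = R * std::cos(la) * std::cos(lo);
            double y = R * std::cos(la) * std::sin(lo);
            double z = R * std::sin(la);
            double temp = 15.0 - 0.6 * (lat < 0 ? -lat : lat); // fake zonal profile
            out << x << ' ' << y << ' ' << z << ' ' << temp << '\n';
        }
    }
}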

Global Climate Model Visualization

Stephen Hawking Collaboration

Monday, May 31st, 2010

The Universe: Distributed Virtual Collaboration and Visualization, a collaboration with Stephen Hawking's lab for the International Grid Conference in Amsterdam

I was project manager for our group's efforts at this conference. We facilitated a collaboration between Stephen Hawking's lab in Cambridge, England and Mike Norman in San Diego via the International Grid (iGrid) conference in Amsterdam. We first traveled to Cambridge and set up Stephen Hawking's lab with our Virtual Director remote virtual collaboration software, and there we had the honor of meeting Stephen Hawking himself, the beginning of a long relationship of collaborations on big-science visualization projects.

From the StarLight website: “Virtual Director and related technologies enable multiple users to remotely collaborate in a shared, astrophysical virtual world. Users are able to collaborate via video, audio and 3D avatar representations, and through discrete interactions with the data. Multiple channels of dynamically scalable video allow the clients to trade off between video processing and scene rendering as appropriate. At iGrid, astrophysical scenes are rendered using several techniques, including an experimental renderer that creates time-series volume animation using pre-sorted points and billboard splats, allowing visualizations of very-large datasets in real-time.” (http://www.startap.net/starlight/igrid2002/universe02.html)
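The "pre-sorted points and billboard splats" technique mentioned in that quote boils down to two steps: sort the particles back-to-front along the view axis so that standard alpha blending composites correctly, then expand each point into a camera-facing quad. Here is a minimal C++ sketch of those two steps, with an assumed fixed camera basis and invented names, not the experimental renderer's actual code.

// Sketch of billboard splatting: depth-sort, then expand points to quads.
#include <algorithm>
#include <array>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Splat { Vec3 pos; float density; };

int main() {
    std::vector<Splat> splats = {{{0,0,-5},0.8f}, {{1,0,-2},0.3f}, {{-1,0,-9},0.5f}};
    // View looks down -z, so ascending z puts the farthest splat first:
    // back-to-front order makes standard "over" alpha blending correct.
    std::sort(splats.begin(), splats.end(),
              [](const Splat& a, const Splat& b) { return a.pos.z < b.pos.z; });

    const Vec3 right{1,0,0}, up{0,1,0};  // camera basis; quads face the eye
    const float r = 0.25f;               // splat radius
    for (const Splat& s : splats) {
        std::array<Vec3,4> quad;
        const float sx[] = {-1, 1, 1, -1}, sy[] = {-1, -1, 1, 1};
        for (int i = 0; i < 4; ++i)
            quad[i] = {s.pos.x + r*(sx[i]*right.x + sy[i]*up.x),
                       s.pos.y + r*(sx[i]*right.y + sy[i]*up.y),
                       s.pos.z + r*(sx[i]*right.z + sy[i]*up.z)};
        std::printf("splat depth %.1f alpha %.2f, first corner (%.2f, %.2f, %.2f)\n",
                    -s.pos.z, s.density, quad[0].x, quad[0].y, quad[0].z);
        // In a renderer these corners would go into a vertex buffer textured
        // with a radial-falloff "splat" texture.
    }
}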

And from one observer as reported in Calit2 news: “The Universe was the entry in the grand slam broadband sweepstakes from the National Center for Supercomputing Applications (NCSA) at University of Illinois at Urbana-Champaign, in collaboration with UCSD, USC, and the Stephen Hawking Laboratory at Cambridge University. It was my favorite — and clearly the biggest crowd pleaser. We wore 3-D glasses and viewed a latency-free video conferencing window that moved around showing video textures on floating cubes as areas of interest.”

Intellibadge | Tracking a Major Conference

Monday, May 31st, 2010

“IntelliBadge™, an NCSA experimental technology, is an academic experiment that uses smart technology to track participants at major public events. IntelliBadge™ was first publicly showcased at SC-2002, the world’s premier supercomputing conference, in the Baltimore Convention Center. This was the first time that radio frequency tracking technology, database management/mining, real-time information visualizations and interactive web/kiosk application technologies were fused into an operational, integrated system in production at a major public conference.”

I was involved in this project at many levels, including setup and administration of Linux machines, integration of my collaborative video streaming software, multi-threading the application, mirroring code for its P2P architecture, and developing post-conference data analytics with custom Java-based software.
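Multi-threading a streaming application like this typically means decoupling capture from network send with a thread-safe frame queue. The C++ sketch below shows that producer/consumer pattern in miniature; it is illustrative only, not IntelliBadge's actual code.

// A capture thread enqueues frames; a network thread drains them.
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct FrameQueue {
    std::queue<std::vector<unsigned char>> q;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    void push(std::vector<unsigned char> f) {
        { std::lock_guard<std::mutex> lk(m); q.push(std::move(f)); }
        cv.notify_one();
    }
    bool pop(std::vector<unsigned char>& f) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return !q.empty() || done; });
        if (q.empty()) return false;   // producer finished and queue drained
        f = std::move(q.front()); q.pop();
        return true;
    }
    void finish() {
        { std::lock_guard<std::mutex> lk(m); done = true; }
        cv.notify_all();
    }
};

int main() {
    FrameQueue fq;
    std::thread producer([&] {         // stands in for the capture thread
        for (int i = 0; i < 5; ++i) fq.push(std::vector<unsigned char>(640*480, 0));
        fq.finish();
    });
    std::vector<unsigned char> frame;
    while (fq.pop(frame))              // stands in for the send thread
        std::printf("streamed frame of %zu bytes\n", frame.size());
    producer.join();
}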

Motorola Mirage – Remote Augmented Reality

Monday, May 31st, 2010

Sound familiar? This highly experimental collaboration with Motorola Labs preceded the recent smartphone release of the same name by nearly a decade. In 2000-2001, a server hundreds of miles away at Motorola Labs would augment, in real time, the view of a user wearing a head-mounted display and a head-mounted mini-camera. The user's position and orientation in the room were inferred remotely from markers in the environment, and an updated 3D virtual scene was generated, returned and composited into the user's view. In 2001, this round trip of transport and processing benchmarked at less than 1/20th of a second. In other words, at a time when the average web page took 8 seconds to load (http://en.wikipedia.org/wiki/Network_performance), our app was remotely augmenting reality with a latency barely perceptible to humans. This project pushed the existing technologies to the limit: Windows, networking, compression, multi-threaded media transport and processing. I generally prefer a Unix-like environment for software development, but when hardware driver support and availability demanded that Windows be the target OS for our augmented reality platform, I made the transition without going into Linux withdrawal.
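For a sense of what that 1/20th-of-a-second figure means in practice, here is a hedged C++ sketch of the client round trip (capture, send, receive, composite) timed against a 50 ms budget with std::chrono. All the functions are stubs with invented names, not the real Mirage API; the sleeps stand in for network and server time.

// Stubbed remote-AR round trip with a latency check.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

using Frame = std::vector<unsigned char>;

Frame capture_camera()                { return Frame(320 * 240 * 3); }
void  send_pose_markers(const Frame&) { std::this_thread::sleep_for(std::chrono::milliseconds(10)); }
Frame receive_rendered_overlay()      { std::this_thread::sleep_for(std::chrono::milliseconds(25)); return Frame(320 * 240 * 4); }
void  composite_into_hmd(const Frame&, const Frame&) {}

int main() {
    const auto budget = std::chrono::milliseconds(50); // the 1/20 s figure above
    auto t0 = std::chrono::steady_clock::now();

    Frame view = capture_camera();               // head-mounted mini-camera
    send_pose_markers(view);                     // server infers pose from markers
    Frame overlay = receive_rendered_overlay();  // updated 3D scene comes back
    composite_into_hmd(view, overlay);           // merged into the user's view

    auto rtt = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - t0);
    std::printf("round trip %lld ms (budget %lld ms)\n",
                (long long)rtt.count(), (long long)budget.count());
}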