Posts Tagged ‘workflow’

Electronic Publishing on Rails

Monday, September 6th, 2010

I am currently part of a small but dynamic development team at Common Ground Publishing. We are building an application suite in Ruby on Rails to replace a single monolithic legacy application. Most of my efforts at Common Ground have centered on building their online authoring and publishing environment, with active test deployments in Illinois schools. Since starting with Common Ground, I have been involved in every stage of the application lifecycle: gathering user requirements, UI design, database and data model design, implementation, migrating legacy data, testing, and production deployment into schools. Key to this process has been its cyclical nature: developers and designers, along with stakeholders, attend test-bed deployments to better understand user requirements and gather usability insights. These lessons are fed back into our agile development cycle as prioritized backlog items so that we iterate toward an optimal solution.

For more information on Common Ground software and activities, see their website:
http://commongroundpublishing.com/software/

Pentaho Business Intelligence Plug-in

Tuesday, June 1st, 2010

I highlight this project because it demonstrates my ability to quickly grok a large, unfamiliar codebase with little documentation and to make meaningful modifications and contributions to that code. In this case, I wrote a Java-based plug-in for an open-source business intelligence suite by Pentaho Corporation. Grokking the internals of this powerful system was non-trivial, but it was aided by my experience as the designer of D2K, another data-flow RAD environment for data integration and data mining.

A Hybrid Transport Infrastructure

Monday, May 31st, 2010

This multi-layered communications infrastructure has been used chiefly for its real-time, multi-channel media processing and transport layers. The architecture is multi-tier, balancing the strengths of the UDP, TCP, and RTP/RTCP protocols to form a powerful media transport framework. The transport layer abstracts the RTP, UDP, and TCP transport mechanisms to simplify application integration. The application layer allows multi-channel real-time media processing applications to be built by linking modules together into graph structures (in this respect it is similar to D2K). This project was the result of several lines of development during my time at NCSA, beginning with my remote augmented reality work with Motorola and continuing with my work on collaborative video avatars embedded in the virtual reality space of Virtual Director.
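To make the transport abstraction concrete, here is a minimal Java sketch of the idea: one interface hides whether media frames travel over UDP, TCP, or RTP, and each concrete class wraps a single mechanism. The MediaTransport and UdpTransport names are illustrative only, not the actual API of the NCSA infrastructure described above.

// Hypothetical sketch: MediaTransport and UdpTransport are illustrative
// names, not the real API of the infrastructure described in this post.
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;

/** Common interface hiding whether frames travel over UDP, TCP, or RTP. */
interface MediaTransport extends AutoCloseable {
    void send(byte[] frame) throws IOException;
    byte[] receive() throws IOException;
}

/** UDP-backed implementation: low latency, no delivery guarantees. */
class UdpTransport implements MediaTransport {
    private final DatagramSocket socket;
    private final InetSocketAddress peer;

    UdpTransport(InetSocketAddress peer) throws IOException {
        this.socket = new DatagramSocket();
        this.peer = peer;
    }

    @Override public void send(byte[] frame) throws IOException {
        socket.send(new DatagramPacket(frame, frame.length, peer));
    }

    @Override public byte[] receive() throws IOException {
        byte[] buffer = new byte[64 * 1024];
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
        socket.receive(packet);
        byte[] frame = new byte[packet.getLength()];
        System.arraycopy(packet.getData(), 0, frame, 0, frame.length);
        return frame;
    }

    @Override public void close() {
        socket.close();
    }
}

An application built on the graph-based layer can then wire its processing modules to such transports without caring which protocol sits underneath; a TCP or RTP implementation of the same interface would be swapped in where reliability or media timing matters more than latency.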

D2K – Datamining Infrastructure

Monday, May 31st, 2010

I was the original architect and author of this 100% Java data-mining system. Once known as D2K (Data to Knowledge), this system was most fundamentally a model for designing custom data-mining solutions. It was also a rapid application development environment for building those solutions, backed by a powerful run-time environment. I wrote the original prototype for D2K while working for the Automated Learning Group (ALG) at NCSA. Tom Redman (from the Mosaic project) soon joined the team to create the interface and RAD component of the system. David Tcheng's ideas were the intellectual foundation of many of the algorithms implemented within the system. I carried out two more major rewrites of the infrastructure during my time in the ALG, over which the small research group grew from three to well over a dozen people, increasingly focused on some aspect of D2K. D2K quickly grew into a flagship effort of NCSA, and certainly of the ALG, and subsequently became the central tool for River Glass, a startup company specializing in real-time analytics. There is now a project underway to develop a next-generation evolution of this software, a semantic-driven system called Meandre, of which I am only an interested observer.
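As a rough illustration of the data-flow model, and purely as a hypothetical sketch rather than the actual D2K API, the core idea can be reduced to modules with typed inputs and outputs that are wired together and executed by a run-time. The Module and Itinerary names below are invented for the example.

// Hypothetical sketch of a data-flow module model in the spirit of D2K;
// Module and Itinerary are illustrative names, not the actual D2K API.
import java.util.ArrayList;
import java.util.List;

/** One processing step with a typed input and a typed output. */
interface Module<I, O> {
    O execute(I input);
}

/** A minimal run-time that executes modules wired in sequence;
 *  a full system would support arbitrary graphs of modules. */
class Itinerary {
    private final List<Module<Object, Object>> steps = new ArrayList<>();

    @SuppressWarnings("unchecked")
    <I, O> Itinerary then(Module<I, O> module) {
        steps.add((Module<Object, Object>) module);
        return this;
    }

    Object run(Object input) {
        Object value = input;
        for (Module<Object, Object> step : steps) {
            value = step.execute(value);
        }
        return value;
    }

    public static void main(String[] args) {
        // Wire two toy modules: tokenize a string, then count the tokens.
        Itinerary itinerary = new Itinerary()
                .then((Module<String, String[]>) text -> text.split("\\s+"))
                .then((Module<String[], Integer>) tokens -> tokens.length);
        System.out.println(itinerary.run("data to knowledge")); // prints 3
    }
}

The appeal of this style of system is that domain experts compose and rearrange modules visually in the RAD environment, while the run-time handles execution; the sketch above only hints at that separation of concerns.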