Hudson gets a new trick - Apache Hadoop
Hudson, the continuous integration platform which allows building and testing to be spread over a cluster of systems, has a new trick up its sleeve with the release of a Hadoop plug-in. Apache Hadoop is a platform for distributing compute tasks over a cluster of systems. With the plug-in, Hudson system administrators can simply add the ability to perform MapReduce-based computations to their existing clusters.
The plug-in's author, Kohsuke Kawaguchi, wrote it as a hobby project, with the aim of using the 30-40 node cluster he administers to analyse the access logs of java.net. He has a number of ideas for other uses, though, such as performing data analysis on build information, persisting test results, storing artefacts on the cluster, or running more complex distributed unit tests.
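For readers unfamiliar with the model, the sketch below shows the kind of job such a cluster could run: a standard Hadoop MapReduce program that counts requests per URL in a web server access log. It is illustrative only; the class names and log field positions are assumptions, not code from the plug-in itself.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical example: count requests per URL in an access log.
public class AccessLogCount {

    public static class UrlMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text url = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            // Common log format: the requested URL is the 7th field, e.g.
            // 1.2.3.4 - - [date] "GET /index.html HTTP/1.1" 200 1234
            String[] fields = line.toString().split(" ");
            if (fields.length > 6) {
                url.set(fields[6]);
                context.write(url, ONE);   // emit (url, 1) for each request
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text url, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            // Sum all the 1s emitted for this URL.
            int sum = 0;
            for (IntWritable c : counts) {
                sum += c.get();
            }
            context.write(url, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "access log count");
        job.setJarByClass(AccessLogCount.class);
        job.setMapperClass(UrlMapper.class);
        job.setCombinerClass(SumReducer.class);  // pre-aggregate on each node
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // log directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // results directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Hadoop splits the input logs across the nodes, runs the mapper on each split in parallel, then groups the emitted pairs by URL so the reducer can total them, which is why a job like this scales naturally across a 30-40 node cluster.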
(djwm)