Follow this story.
- Sun introduces the T2 with many CPUs on one chip - an impressive and appropriate work of technology.
- Tim Bray visits the question of how to make use of many CPUs for generic tasks, and poses the Wide Finder as an example task.
- Readers create a variety of interesting solutions over the next two months, for which Tim posts 15(!) updates on his weblog.
As sketched above, a very successful topic. The only problem is that the example task is too lightweight to need distribution. Tim Bray used his soapbox to stir interest in a good question, though not with the right example. A reasonable mistake - better to pose the question at the right time, at the risk of asking an imperfect question, than to wait indefinitely (maybe forever, if otherwise busy) until the question is perfect.
What does bother me a great deal is that no one asked whether the example task was suitable for distribution! A relatively small set of measurements shows that a simple Perl script can process the file data as fast as it can be read off disk, and will be done before the fancier solutions (those that require the file data already in memory) can get started.
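To make the point concrete, here is a minimal sketch of the kind of simple single-pass scan the task calls for - one regex match and one dictionary bump per log line. The sample lines and the exact regex are illustrative stand-ins, approximating the Wide Finder's article-fetch pattern, not a copy of any contributed solution:

```python
import re
from collections import Counter

# Hypothetical sample lines standing in for an Apache access log.
LOG_LINES = [
    'x - - [..] "GET /ongoing/When/200x/2007/09/20/Wide-Finder HTTP/1.1" 200 -',
    'x - - [..] "GET /ongoing/When/200x/2007/09/20/Wide-Finder HTTP/1.1" 200 -',
    'x - - [..] "GET /ongoing/When/200x/2007/10/01/Other HTTP/1.1" 200 -',
    'x - - [..] "GET /favicon.ico HTTP/1.1" 200 -',
]

# Approximation of the Wide Finder article-fetch pattern.
PATTERN = re.compile(r'GET /ongoing/When/\d\d\dx/(\d\d\d\d/\d\d/\d\d/[^ .]+) ')

def count_fetches(lines):
    """Single pass: one regex search and one counter update per line."""
    counts = Counter()
    for line in lines:
        m = PATTERN.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

counts = count_fetches(LOG_LINES)
```

A scan this simple is I/O-bound on any ordinary machine, which is exactly why the disk, not the CPU count, sets the finish line.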
Why did no one ask this question earlier?
Readers of Tim’s weblog picked up the challenge and came up with a variety of interesting answers … that solved the wrong problem. Presumably a good fraction of the readers are young and in or just out of school. That these folks charged off with great energy in the wrong direction does not bother me much. Surely some portion of Tim’s readership has a bit more experience. The performance of disks and file systems is not too difficult to understand. A relatively small set of measurements can clarify what sort of solutions fit the problem.
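The sort of measurement I have in mind is back-of-envelope simple: time how fast a regex scan chews through log-like bytes, then compare against the disk's raw sequential read rate. A rough sketch, using synthetic in-memory data (a real measurement would read a large file from disk; the line format and regex are illustrative assumptions):

```python
import re
import time

# Illustrative approximation of the article-fetch pattern, over bytes.
PATTERN = re.compile(rb'GET /ongoing/When/\d\d\dx/(\d\d\d\d/\d\d/\d\d/[^ .]+) ')

# Synthetic stand-in for log data - about 6 MB of repeated log-like lines.
line = b'x - - "GET /ongoing/When/200x/2007/09/20/Test HTTP/1.1" 200 -\n'
data = line * 100_000

start = time.perf_counter()
hits = sum(1 for _ in PATTERN.finditer(data))
elapsed = time.perf_counter() - start

# If this scan rate exceeds the disk's sequential read rate, the job is
# I/O-bound and parallelizing the CPU work cannot make it finish sooner.
mb_per_s = len(data) / (1024 * 1024) / elapsed
print(f"scanned {len(data) / 1e6:.1f} MB, {hits} hits, {mb_per_s:.0f} MB/s")
```

If the printed scan rate comfortably beats what the disk can deliver, the measurement has answered the question before any distributed machinery gets started.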
Does this specific case offer a more general clue? There are quite a number of open source projects that start with one or a few good ideas, then charge off (with great energy, initially) in what proves to be not really the right direction. Are we looking at the same root cause?