We've been working on this question for quite a while: can the small amounts of processing power made available through the Web Workers API turn a website's visitors into a distributed supercomputer?
So far we've tried a number of experiments with relatively small websites, including an HTML snippet that causes the visitor's browser to request [MapReduce](http://en.wikipedia.org/wiki/MapReduce) problems and solve them. It's still at a pretty early stage (details [here](http://crowdprocess.com)). With Web Workers the computation can be limited to a small percentage of the processor so it doesn't affect the browsing experience.
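To make the idea concrete, here's a minimal sketch of the map/reduce split (all names are illustrative, not our actual code). In the browser, each chunk would be handed to a visitor's Web Worker via `worker.postMessage(chunk)`; here the chunks just run sequentially so the splitting logic itself can be shown:

```javascript
// The "map" step: the work one visitor's worker would do on its chunk.
const mapFn = (n) => n * n;

// Split a job into fixed-size chunks, one per visitor.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

// The "reduce" step: combine the partial results the workers send back.
const reduceFn = (acc, partial) => acc + partial;

// Simulate distributing a sum-of-squares job across 10 visitors.
const job = Array.from({ length: 100 }, (_, i) => i + 1);
const partials = chunk(job, 10).map((c) => c.map(mapFn).reduce(reduceFn, 0));
const result = partials.reduce(reduceFn, 0);
console.log(result); // 338350, the sum of squares 1..100
```

The nice property is that each chunk is independent, so a visitor leaving the page only loses that one chunk, which can simply be re-issued to someone else.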
If it works, it could be a way to get rid of the godforsaken ads telling me I won an iPhone on every website, but we're not yet sure how this will react to latency and Amdahl's Law as it scales up. Has anyone worked with this kind of thing before?
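Amdahl's Law is the part that worries us most, so for anyone following along, here's the back-of-envelope version: if a fraction `s` of the job is serial (in our case, distributing chunks and collecting results over the network), the speedup with `n` workers is bounded by `1 / (s + (1 - s) / n)`, which approaches `1 / s` no matter how many visitors join. The 5% serial fraction below is just a made-up number for illustration:

```javascript
// Amdahl's Law: speedup with n workers when a fraction s of the job is serial.
const speedup = (s, n) => 1 / (s + (1 - s) / n);

// Even a small serial fraction imposes a hard ceiling on scaling:
console.log(speedup(0.05, 10).toFixed(2));   // 6.90  with 10 workers
console.log(speedup(0.05, 1000).toFixed(2)); // 19.63 with 1000 workers
console.log((1 / 0.05).toFixed(2));          // 20.00 asymptotic limit
```

So with browser-grade latency inflating the serial fraction, adding more visitors stops helping surprisingly quickly, which is exactly the scaling behaviour we'd like to hear about from anyone who has tried this.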