As I continue reading Seth Lloyd's fabulous book Programming the Universe, a thought keeps taking hold of me and I cannot shake it off: is the W3C interpretation of the Semantic Web a realizable plan according to the second law of thermodynamics?
In general, the central point of the W3C interpretation of the Semantic Web is to upgrade the present Web of linked documents into a future Web of linked data. Viewed in terms of useful information (that is, in terms of entropy in the sense of the second law of thermodynamics), this upgrade means that the entropy of the system (the entire Web) must inevitably decrease.
Certainly, the World Wide Web is an open system rather than a closed one, so the systematic decrease of entropy mentioned above does not directly violate the second law of thermodynamics. However, to truly decrease the entropy of an open system and keep it at a low level, useful work must continuously be supplied from outside the system. For the World Wide Web, the human mind is the only available source of this external useful work, and we must add the assumption that humans are not part of the World Wide Web (will this assumption hold into the future?).
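To make this reasoning a bit more explicit, here is a minimal sketch of the textbook entropy balance for an open system; the notation is generic thermodynamics of my own choosing, not anything taken from Seth Lloyd's book or from the W3C.

```latex
% Entropy balance for an open system: the change in the system's entropy splits into
% entropy produced inside the system and entropy exchanged with the surroundings.
\[
  \Delta S_{\text{system}} = \Delta S_{\text{produced}} + \Delta S_{\text{exchanged}},
  \qquad \Delta S_{\text{produced}} \ge 0 .
\]
% A net decrease is therefore possible only if enough entropy is exported, i.e. only
% if the surroundings keep supplying useful work:
\[
  \Delta S_{\text{system}} < 0
  \;\Longrightarrow\;
  \Delta S_{\text{exchanged}} < -\,\Delta S_{\text{produced}} \le 0 .
\]
% Reading "system" as the Web and "surroundings" as human minds: the Web's entropy
% stays low only as long as that external work keeps flowing in.
```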
Even with all of these assumptions granted, the W3C interpretation of the Semantic Web still has a few intrinsic difficulties. In his book, Seth Lloyd points out that the cost of decreasing the entropy of a system is significantly high, let alone that the W3C interpretation of the Semantic Web asks us not only to maintain the low-entropy environment for a long time (potentially forever) but also to push the system's total entropy continuously lower and lower. It must therefore demand an unbelievably large amount of human mental effort as the external useful work for the goal to be fulfilled. In other words, this requirement fundamentally contradicts the optimistic declaration, popular among Semantic Web researchers, that we may figure out a few low-cost, fabulous Semantic Web killer applications and the dream of the Semantic Web will suddenly come true.
The second law of thermodynamics tells us that such low-cost, fabulous Semantic Web killer applications simply may not exist; otherwise, they would be typical perpetual motion machines of the second kind. According to the second law of thermodynamics, it is theoretically impossible to build such machines. (If someone believes it possible, he will eventually discover that the amount of external input required quickly exceeds the original expectation, just as Powerset has experienced.)
Does this observation sentence the Semantic Web to death? I still don't think so. Making Web content more machine-processable is not impossible. However, we may need new thinking about how semantics can be added to the Web cost-efficiently. In other words, whatever tools we build should not violate the second law of thermodynamics, or else the business will not be sustainable (as with Powerset).
More and more, I lean toward the vision of a human-directed Semantic Web, in contrast to the traditional vision of a machine-enhanced Semantic Web. In this new vision, we abandon the assumption that humans are excluded from the Web. Instead, we let humans and the Web together form a comparatively closed system, rather than treating the Web alone as an open system. In this comparatively closed system (not a truly closed one, since it still excludes supporting equipment such as power plants), we accept a controlled, systematic increase of entropy in exchange for a few local, conditional decreases of entropy. It would be a much less perfect Semantic Web than the W3C interpretation, but it would be far more practical and realizable, and it would bring concrete benefits to human users. I look forward to Imindi being the first real-world example of this vision.
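As a rough sketch of this trade-off (again in generic notation of my own choosing, nothing defined by the W3C or by Imindi), the entropy budget of the comparatively closed system would look like:

```latex
% Treat humans and the Web together as one (approximately) closed system.
% The second law then constrains only the combined total:
\[
  \Delta S_{\text{humans}} + \Delta S_{\text{Web}} \ge 0 .
\]
% A local, conditional decrease in some part of the Web is permitted, provided it is
% paid for elsewhere in the combined system:
\[
  \Delta S_{\text{Web,local}} < 0
  \quad\text{is allowed as long as}\quad
  \Delta S_{\text{humans}} + \Delta S_{\text{Web,rest}} \ge \bigl|\Delta S_{\text{Web,local}}\bigr| .
\]
```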
- a philosophic view of the second law of thermodynamics
- entropy (Wikipedia)
- perpetual motion (Wikipedia)