Wednesday, January 26, 2011

The Horatio Principle and Logic on the Web

“There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy,” says Hamlet. From this line, Ken Binmore drew the following:

The Horatio Principle:
Some events in a large world are necessarily nonmeasurable.

The Web is a large world. Hence we may suspect that the Horatio Principle applies to the Web. If this suspicion is valid, there are a few interesting consequences. For example, we cannot safely adopt an arbitrary definition on the Web without some loss of information. The Web expands every day!

One essential assumption of the Semantic Web is that we can safely share terms defined by others online, as long as they have been coded properly in a well-designed logic description format. By the Horatio Principle, however, this assumption becomes questionable if the Web expands indefinitely, as we generally agree it does. The Horatio Principle does not claim that the Semantic Web approach is unrealistic. Rather, it suggests that we may still have underestimated the approach's complexity, even though we have already taken the problem very seriously.

The relationship between the Web and the Semantic Web resembles the relationship between the world at the macroscale and the world at the microscale. In theory they are the same world, because in the end there is only one world in which we live. Yet the two views of the world differ in subtle but crucial ways. At the macroscale, nearly everything in the world can be determined and measured by the laws of classical mechanics. At the microscale, however, many things become uncertain; the famous example is the Heisenberg uncertainty principle. We can contrast the Web and the Semantic Web in the same way. At the macroscale, nearly every object on the Web can be soundly defined in computable logic. At the microscale, by contrast, logic rules that are perfectly sound at the macroscale become questionable as the target objects grow more and more fine-grained. An effect similar to the Heisenberg uncertainty principle emerges: some events and definitions become nonmeasurable, and the Horatio Principle takes effect. Is this a small shadow over the great Semantic Web project?

The more I think about the Web 2.0 hype and the Semantic Web initiative, the more I feel that Semantic Web research begins a journey into the micro-structure of the Web, while Web 2.0 continues the Web's ordinary macroscale journey. From now on, we must keep this distinction in mind and apply different kinds of thinking to the two approaches. I have always hesitated to call the Semantic Web a direct successor of Web 2.0. That feeling is now even stronger, since the two study different aspects of the Web. It is like refusing to call quantum mechanics a direct successor of classical mechanics: the relation between the two is not one of succession, but one between two sibling views of the same target.

Without any kind of proof at the moment, I want to make a bold claim and then offer an intuitive explanation.

The claim: In a web of fine-grained, linked data, one cannot simultaneously accomplish both of the following: (1) precisely measure the definition of a piece of data, and (2) precisely compare that data to another definition.

Before explaining the claim, let's briefly review a classic explanation of the Heisenberg uncertainty principle (formally, Δx · Δp ≥ ħ/2). According to Heisenberg, in order to measure the position or momentum of a particle one must strike the target with a photon. In our classical world this is no problem at all, since the target is so massive that the momentum a photon transfers to it is negligible; introducing a new photon into the system does not affect any measurement result. We can therefore precisely measure the position and the momentum of the target simultaneously. But this argument becomes invalid at the microscale, where the photon's momentum is no longer negligible relative to the target's. Introducing a photon significantly disturbs the original state of the system, and as a result it becomes impossible for any observer to identify the target's original position and original momentum simultaneously.

Now let me carry this analogy over to the Web. Any time we measure the definition of an object, we actually assign new fine-grained properties to the target, such as measured-by and several attributes describing the observer. On the macroscale Web this property assignment does almost nothing to the target, because every object is coarsely defined. Since the definitions are coarse, the new properties introduced by measurement generally do not prevent us from retrieving the target's definition and comparing it to another definition simultaneously. When we study an object in a web of linked, fine-grained data, however, the situation changes. One can no longer neglect the impact of a new measurement on the interpretation of an object; in fact, every new measurement enriches the intension of the object non-negligibly. Therefore, simultaneously obtaining the precise definition of the data and its precise comparison to another given definition is impossible.
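The measurement-enriches-the-definition idea above can be sketched with a toy triple store. This is purely illustrative: the store, the resource names, and the ex:measuredBy property are all hypothetical, standing in for whatever provenance a real fine-grained linked-data system would record.

```python
# Toy illustration: in a fine-grained triple store, reading ("measuring")
# a definition records provenance triples about the observation, so the
# definition itself changes with every measurement.
# All names (ex:alice, ex:measuredBy, ...) are hypothetical examples.

class ToyTripleStore:
    def __init__(self):
        self.triples = set()  # each triple is (subject, predicate, object)

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def definition(self, subject):
        """All triples describing `subject` -- its current 'definition'."""
        return frozenset(t for t in self.triples if t[0] == subject)

    def measure(self, subject, observer):
        """Observe a definition, but record the observation as a new triple."""
        snapshot = self.definition(subject)
        self.add(subject, "ex:measuredBy", observer)
        return snapshot

store = ToyTripleStore()
store.add("ex:alice", "rdf:type", "foaf:Person")

before = store.measure("ex:alice", "ex:observer1")
after = store.definition("ex:alice")

# The act of measuring changed the very definition that was measured,
# so `before` can no longer be precisely compared against the live object.
print(before == after)  # False
```

At the macroscale the extra ex:measuredBy triple would be noise against a coarse definition; here, where the definition is only a handful of triples, it is a non-negligible change.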

This is not a proof. But in thinking about the Semantic Web, I incline toward the so-called "measurement = creation" principle, which has been studied as an explanation of the Heisenberg uncertainty principle. When we measure a definition at the microscale, we are actually creating a meaning rather than querying or retrieving one. This is how the microscale world differs from the macroscale world, and this is how the microscale Web will differ from the macroscale Web.
