Friday, February 09, 2007

Subjectivity, science and measuring

Having been involved in a serious attempt to create a natural language understanding system in the 1980s, I have been interested in understanding and modeling the inherent subjectivity of understanding. Philosophical discussions within hermeneutics have pointed out related themes for a long time already. However, it seems that much of scientific practice still goes on as if we could assume that objective knowledge exists, ready to be represented with formalisms such as predicate logic. This is not true, to use this problematic word: symbolic logic is actually of little use if we wish to deal successfully with many of the central issues in epistemology. The relationship between what is called reality and the expressions used to describe it is highly complex. One widely recognized phenomenon is vagueness or fuzziness, whose scientific history Rudolf Seising discusses in his interesting article. Fuzziness is, though, only one aspect among many. The dynamics and the adaptive processes at different levels of abstraction need to be taken into account. It is not realistic to say that language exists somehow miraculously without the community that learns it, uses it and molds it.
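To make the notion of fuzziness a bit more concrete, here is a minimal, hypothetical sketch in the spirit of fuzzy set theory: instead of a crisp predicate "tall(x)" that is either true or false, a fuzzy set assigns every height a degree of membership between 0 and 1. The predicate "tall" and the cutoff values 170 and 190 cm are my own illustrative assumptions, not taken from Seising's article.

```python
def tall(height_cm: float) -> float:
    """Degree to which a height counts as 'tall'.

    Illustrative piecewise-linear membership function; the
    cutoffs 170 and 190 cm are arbitrary assumptions chosen
    only to show graded membership.
    """
    if height_cm <= 170:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 170) / 20

# Classical two-valued logic forces a yes/no answer at some
# sharp boundary; graded membership does not.
print(tall(165))  # 0.0 (clearly not tall)
print(tall(180))  # 0.5 (borderline)
print(tall(195))  # 1.0 (clearly tall)
```

The point of the sketch is only that the border between "tall" and "not tall" is a matter of degree, and different speakers would draw the slope differently — which is exactly where the subjectivity discussed above enters.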

An illustration of the relationship between the "reality" at different levels
(an image, a model) and the idea of representing that reality as symbolic structures.

One consequence of taking the complexity of natural language understanding seriously is that the methodological toolbox of epistemology, in philosophy, science and technology alike, needs to include statistics, probability theory, dynamical systems theory, statistical machine learning, the simulation of agent communities, and so on.
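The mention of simulating agent communities can be illustrated with a minimal sketch of a "naming game" in the spirit of Steels' language-game models (the agent count, round count and update rule below are my own simplifying assumptions). Agents repeatedly try to name one shared object; when a hearer already knows the speaker's word, both discard their other words. No agent has global knowledge, yet a shared convention tends to emerge.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

N_AGENTS = 20
N_ROUNDS = 5000
# Each agent's current vocabulary for one shared object.
vocab = [set() for _ in range(N_AGENTS)]

def play_round():
    speaker, hearer = random.sample(range(N_AGENTS), 2)
    if not vocab[speaker]:
        vocab[speaker].add(f"w{speaker}")  # invent a new word
    word = random.choice(sorted(vocab[speaker]))
    if word in vocab[hearer]:
        # Success: both agents collapse to the agreed word.
        vocab[speaker] = {word}
        vocab[hearer] = {word}
    else:
        # Failure: the hearer merely learns the word.
        vocab[hearer].add(word)

for _ in range(N_ROUNDS):
    play_round()

distinct = {w for v in vocab for w in v}
print(len(distinct))  # typically 1 once the community has converged
```

The simulation makes the later point about standardization tangible: agreement is an emergent, statistical outcome of many local interactions, not something guaranteed in advance.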

Another very important consequence concerns the practices and basic assumptions of science: we should not consider science a practice that aims at collecting a set of true propositions about the world. Our collaboration with researchers like Prof. Mika Pantzar has brought to our attention the idea of considering human activities as practices. It might be interesting to ask: what is "sciencing"? (rather than: what is science?) The role of language is central within the practices of science: the results of science are reported and distributed mainly through language. If the statistical nature of language understanding among human beings (even among specifically educated scholars) is to be taken seriously, this should also be reflected in our scientific practices. How to do that is worth another story.

The discussion above is largely motivated by the complex relationship between "raw perceptions" of the world and descriptions of it in natural language. A related theme is measuring. Measuring is a central task in science and technology. In quantum physics one can consider, e.g., the Heisenberg uncertainty principle or the Kochen-Specker theorem. Within the 6th Framework Programme, the EU Commission has launched an interesting initiative within New and Emerging Science and Technology called Measuring the Impossible:
"Measurement underpins science, but how do you measure the subjective? Can science use advances in technology to uncover the quirks and imponderables of the human mind, and how people interact with the world? The answer is that science is trying to. A new set of EU research projects is looking at the interface between different disciplines and the human experience; somewhere between psychology, engineering and physiology comes Measuring the Impossible."


Martin Wurzinger said...

It may well be that elements from statistics, probability theory, dynamic systems theory, etc etc are deemed necessary when dealing with the complexity of natural language.

But isn’t this view a corollary of considering language from the top down? What if one approaches the phenomenon from the bottom up - then language could be seen as an emergent system in its own right, re-representative of content which has been processed from input as a consequence of affinity relationships amongst conceptual domains. The feedback loop with the outside ensures a certain standardisation.

Under this view the relationships suggested by the illustration of ‘aircraft <> fuel’ become just another functional type.


Unknown said...

Considering language from the bottom up and its emergent qualities is definitely important. A central aspect of the illustration, not expressed very explicitly, is the idea that nothing ensures that the conceptualization into entity-relationship structures becomes standardized. A certain degree of standardization emerges thanks to the feedback loop mentioned in the previous comment, but standardization is a matter of degree. For each person, the mapping between raw perceptions and their linguistic descriptions is unique. Of course, in most cases the mappings resemble each other so closely that useful communication is possible. Moreover, in bidirectional communication, differing mappings can be revealed and meaning negotiations can take place. The problem is greater when written texts and mass media are considered. In summary, assuming that one can build a theory of meaning without considering the phenomena "under" the symbolic level seems highly problematic.

Martin Wurzinger said...

I find the choice of the expression ("under") interesting.

If it is understood as "under the auspices of", then the divergence resulting from the mapping between perception and language, mentioned in the previous comment, would militate against a comprehensive theory.

But what if ("under") is taken literally - below the conceptual level of communication? In this case the theory would focus on the general dynamics giving rise to the phenomenon of symbolisation.

Or is this what was meant anyway?