
denkzeug
mind, music, metaphor - & more

Mapping Abstract Worlds

By Christoph Pingel, on 19 Feb 2003 at 23:46:58.

Abstract:
There seems to be something about New Media that invites attempts to ‘objectify’ the workings and contents of the mind, our associations as well as our most abstract ideas. How are these attempts motivated, both in the structure of our cognitive capacities and in social realities? And where do they lead? The text applies the theory of conceptual metaphor as developed by George Lakoff to these questions and outlines potential dangers of ill-chosen metaphoric models. The paradoxical result is that attempts to communicate most effectively via hypertext structures and participatory software can lead to digital representations of mental procedures and objects that close themselves against further interpretation.

Keywords:
Metaphor, objectifying the mind, conceptual theory of metaphor, new media, real abstractions

Introduction

Back in 1994, the night after I had browsed the web for the first time with a telnet application - the only kind of access my provider offered back then - I woke up in the middle of the night with a vision: Why shouldn't it be possible to represent the workings of my mind, with all those recurring thoughts that I liked to call 'strange attractors', with all the associations, conclusions, emotion-driven objections and enthusiasms, in a hypertext structure - even integrating what was already there (websites about certain theories, pieces of fiction, product descriptions) - and thus to create a representation or extension of my mind that seamlessly joined the process of thinking with its results, the texts, and the results of other people's thinking, making it visible and approachable for everybody? It felt like a revelation, but of course I was not the first to come up with this idea. In 1991, for example, Heiko Idensen had published an article titled 'Ideas as Objects' that dealt with exactly the kind of hypertext literature I had imagined, albeit not yet in the context of the internet.

And the idea to 'objectify the mind' in an upcoming technical medium is even older, as Lev Manovich mentions in his recent book 'The Language of New Media'. One of his examples is Sergei Eisenstein's »Notes for a Film of 'Capital'« from 1928: »[...] Eisenstein speculated that film could be used to externalize - and control - thinking. [...] In accordance with the principles of 'Marxist Dialectics' as canonized by the official Soviet philosophy, Eisenstein planned to present the viewer with the visual equivalents of thesis and anti-thesis so that the viewer could then proceed to arrive at synthesis, that is, the correct conclusion, as pre-programmed by Eisenstein.« (p.58) Another instructive example from Manovich's text is Francis Galton, who invented composite photography in the 1870s; he not only claimed to depict 'ideal faces' by his method, but also proposed to rename abstract ideas 'cumulative ideas', in an analogy along the lines of his photographic method. This thread was picked up about a decade ago, when designers created 'ideal faces' by blending dozens of faces in Photoshop and found that the most average face was commonly judged the prettiest (while Hollywood takes care that the most average faces become the most widespread). Ortega y Gasset seems to argue in a similar vein when he holds that the beauty of a face depends on how close it comes to its own platonic ideal.

Most probably, what Galton had in mind when he considered abstract ideas to be 'cumulative' by nature was something similar to the 'prototypes' of recent cognitive psychology, or perhaps an extensionalist semantic theory according to which the meaning of a class descriptor is the set of items it refers to. But obviously he was too enthusiastic about his analogy to consider that there are abstract ideas like 'happiness', 'number' or 'state' that don't lend themselves to composite photography quite as easily.

But now that we have the World Wide Web, a new 'Existenzform' for abstract ideas has been found. Brands, markets, communities, organisations are mapped onto an intermediate layer of semi-abstraction that is located right between the heaven of platonic ideas and physical reality. One might argue that this has been exactly their mode of existence ever since abstract entities were invented: there has never been such a thing as the 'physical realisation' of a brand or an organisation. And yes, they belong to that famous third realm beyond mind and matter that has been called 'Geist' (Hegel), 'the world of interpretation' (Royce), 'thirdness' (Peirce), or 'world 3' (Popper). But I will argue that the very nature of 'world 3' is changing considerably - and with severe consequences for our 'being in the world' as individuals and as societies - when machines are not only becoming part of, but very much the core of the abstract worlds we used to inhabit intellectually. My point here is to show how incredibly strange, unprecedented, almost bizarre this change is.

In Part 1 I will give some examples, looking at the kinds of abstract/digital mappings we find in our 'digitized' world so far; in Part 2 I will introduce a theory of metaphoric mappings that could help to explain the motivation and the exact functioning of the mind/new media mappings; Part 3 will deal with the pitfalls we have to be aware of - more specifically, I will try to show that the 'embodiment' of abstract terms or interpretations in a digital semantic sphere of its own carries the danger of mixing descriptive with prescriptive models, so that the 'interpretative software' we write and use tends to become a self-fulfilling prophecy and a closed system without any external point from which to observe or even criticize it.

1 A Layer of 'Real Abstractions'

Despite all the talk about the 'digital divide', in our everyday understanding the internet is thought to be pretty much co-extensive with 'the world', at least the social world as we know it - every single institution or event we may develop an interest in is expected to have a virtual (web) 'site' or at least a page where it is represented. The common background knowledge that 'servers' are machines that usually rest in some basement (or under a desk) at a specific geographic location (not too far from the person or institution they serve) contributes to this notion; of course the picture is wrong, as the recent breakdown of one of Germany's biggest internet providers (STRATO) has shown - a considerable part of Germany's internet rests in the rooms of this one company.

Nevertheless, the mapping of the 'real' onto the 'virtual' world goes on - and is perceived as such. When the World Wide Web was invented in the early 90s as a way to leverage communication among highly specialized scientific communities, the first webmasters - probably unconsciously - discovered an interesting analogy between the institutions they worked for and the computers they worked with: hierarchy. The hierarchic file system that all commonly used computer operating systems share to the present day lends itself perfectly to depicting the hierarchic structures of large institutions like universities or corporations. The model was adopted so readily and with so little reflection that only recently did 'usability engineers' come up with the idea that it could be a mistake to confront visitors of a web site with the organigram of the respective institution - because this is potentially what they are least interested in. Yahoo, one of the first and most successful orientation devices for internet users, adopted the hierarchic model as well and extended it 'upwards' to provide a comprehensive set of topics and subtopics that cover 'alltheweb' (the latter being the name of another search engine). Hartmut Winkler rightfully criticized this as a pre-modern endeavour - the re-introduction of a medieval world view in which every piece of information - and the thing this information is about - has one and only one place in a gigantic hierarchy. But this time, god is absent - the game begins exactly one level below, with "sports", "entertainment" and similarly exhaustive categories. The world of newsgroups (with its perhaps most famous alt.binaries) makes the structural similarity between the UNIX file system and a world divided into top-level and lower-level topics even more obvious, while the recent trouble over new top-level domains shows that decision makers in industry and politics, as well as groups of interested users, are quite aware of the things next in importance to god. Metaphorically, this can be read as a hint that the grand narratives of our times have more to do with the properly ordered access to information than with the question of what exactly constitutes a human way of living.
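
To make the analogy concrete, here is a minimal sketch (in Python, with entirely hypothetical unit names and the placeholder domain example.edu): an institution's organigram, once written down as a nested structure, translates one-to-one into the hierarchic paths of an early-90s web site - which is exactly the mapping the first webmasters performed, mostly without noticing it.

    # Hypothetical organigram of a university, modelled as a nested dict.
    organigram = {
        "university": {
            "faculty-of-arts": {
                "dept-philosophy": {},
                "dept-linguistics": {},
            },
            "administration": {
                "registrar": {},
            },
        }
    }

    def paths(tree, prefix=""):
        # Walk the hierarchy and yield one URL path per organisational unit.
        for name, children in tree.items():
            here = prefix + "/" + name
            yield here
            yield from paths(children, here)

    for p in paths(organigram):
        print("http://www.example.edu" + p)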

A more detailed account of the abstract becoming concrete reveals an intimate relation between economic necessities and technological opportunities. While this may hardly come as a surprise, it is nevertheless noteworthy how exactly the 'mapping' takes place. An example: traditional marketing theory has defined a 'brand' as constituted by price, availability, consumer-oriented communication and consistent quality. While in traditional economic settings this must be viewed not as a descriptive but as a normative definition (and a constant cause of managerial headaches), e-commerce promises to blur the difference between norm and description; and this might be the very reason why the so-called 'new economy' attracted so much attention. For marketing people, the internet, as soon as they discovered it, was like a dream come true. Internet + Amazon + DHL = the perfect realization of the abstract concept of 'brand'. Of course, there is no guarantee that this will work in every case, and it is hardly surprising that books, software and CDs are still the best-selling articles in electronic commerce - goods with which it is hard to disappoint customers in terms of quality (at least the customer can have a pretty good idea of what she will get before buying), mostly due to the 'ideal' nature of the product, and with which logistics are hardly an issue. One may wonder whether garden mould or furniture will ever make good e-commerce products. But even where the questions of availability, logistics and quality remain unresolved, the fact that branded products get their respective space on the net, where *the* idea of the product can be (and, very much to the concern of web designers, has to be) displayed, has turned large parts of the web into something that depicts the (abstract) 'sphere' of commerce in a unique and unprecedented way. Where price matters, similar developments can be observed: the companies themselves are evaluated and re-evaluated in real time by a global group of shareholders in such a concrete and immediate way that it must be regarded as dangerous just how perfectly a computerized stock market realizes the centuries-old abstract concept of supply and demand. Markets were less volatile back in the days of non-real-time decision making, when the stock market and its actors were more loosely coupled than they are today.

Brands are only one kind of idea that has acquired this new electronic concreteness through being written down, illustrated, and then fixed to an IP address. Social institutions that had remained faceless for decades suddenly have to follow suit - governments, secret services and social initiatives alike are forced into their 800 x 600 pixels (or more, if the web site is well done and 'scales'), much like 'the world out there', as Günther Anders observed, was forced into the format of a cathode-ray tube when television became popular in the 1950s. While major publishing houses used to be remembered and identified not only by their products but also by the architectural shape of the buildings that housed them, nowadays they are primarily perceived as the institution behind a certain web interface, and the 'depth' of the web site indicates how richly structured the organization is within itself. Local administrations present themselves on the web in a mixture of 'customer' orientation and their implicit organisational passion for clear-cut realms of responsibility. For a growing number of residents, the web is becoming the primary interface of interaction with a growing number of commercial and public organisations.

Even one of the most central of our abstract ideas, the idea that political power should be exerted by the people, is currently being incorporated in a digital medium; several institutions and initiatives in Europe and the US are researching the practical and theoretical implications of internet elections. At the University of Osnabrück, Germany, the Forschungsgruppe Internetwahlen is working on a project called I-VOTE [http://www.i-vote.de]. From the mission statement: »Ihre Zielsetzung besteht in der Klärung der technischen, juristischen und politischen Voraussetzungen für die Durchführung von rechtskräftigen Wahlen im Internet [...].« (»It aims to clarify the technical, legal and political requirements for conducting legally valid elections on the internet [...].«) While technical and legal issues are addressed throughout the work of the initiative at a very sophisticated level, the mission statement does not mention a single reason why it should be desirable in the first place to hold elections via the internet.

2 Why Map the Mind?

What is it, then, that makes hypertext seem so much more appropriate for depicting the workings of the human mind than linear text? What makes the idea of electronic elections so tempting and so obvious that it doesn't even have to be justified in the general layout of a research project? How are our ideas of democracy, commerce or community structured so that they lend themselves to being realized in computer-based media, and what is it about the internet and its technical infrastructure that makes it look appropriate for this very task? Is there a general trend, or do we have to look for individual reasons for each of the described phenomena? While in each case there are certainly special reasons for digitizing abstract ideas that differ from project to project, it will become obvious that there is a general trend. Let's look at technology first.

I propose four key features of current internet technology that make it appropriate for 'mapping the mind': Uniform Resource Locators, ubiquitous access, the use of algorithms, and dynamic storage. Uniform Resource Locators, commonly referred to as addresses, are necessary to map the world as a spatially extended structure of documents and programs. The common background knowledge that machines, databases and users are spread all over the world, together with the technical feature of globally unique addresses, provides the cognitive link between the web and the globe. Access is the necessary condition for software-embedded models of user actions to become themselves the 'field' for these very actions. Without access, they remain just that: models. With users logged in, they become market places, auction houses, game halls or virtual counters of local administrations. Algorithms provide the computer logic for these institutions or 'virtual organizations' to work. Dynamic storage, the fact that documents and resources change all the time, differentiates the web from a distributed traditional library.

It may be tempting to see these features as 'literal' features of computing machinery per se, and of course it's true that most users and programmers (apart from the small group who participated in the development of these technologies) are simply presented with 'literal' interfaces defined entirely in terms of parameters and allowed datatypes. But it must be emphasized that computing has been a metaphoric endeavour right from the beginning and on every level. Not only have most developments taken place in the context of a certain military, scientific or economic purpose, but the ways in which the tasks were described and the fundamental questions were asked have always contained metaphoric elements. Manovich mentions that even the Turing Machine, the chiffre of the digital computer as a general-purpose - and as such un-interpreted - algorithmic device, looks »suspiciously like a film projector« (p.24). So the idea that we have a 'pure', 'abstract' technology that waits for its interpretation in the form of application programs is at least partly misleading. Consider the concept of »memory«, or the fact that the networking protocols we use every day are organized in »layers« which operate »on top of each other« - there's nothing in software that even remotely resembles verticality. Verticality, and the idea of increasingly abstract software constructs that use the respective »underlying« protocol layer, is of course a metaphor derived from our experience with solid things in a three-dimensional world, where the »lower« construction often serves as a »foundation« for the upper ones. In the world of computing and software as we know it today, there's arguably not a single domain(!) that does without this kind of metaphoric model. Tell me if you find an article in a computer magazine that talks about technology in a purely literal manner.
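
To illustrate just how little verticality there actually is in the machinery, here is a toy sketch (Python, not a real protocol stack - the layer names merely echo the familiar ones): the 'layers on top of each other' amount to nothing more than nested wrapping of byte strings, one header prepended to the next; the vertical picture is supplied entirely by our spatial imagination.

    def wrap(layer_name, payload):
        # 'descending' a layer means nothing but prepending another header
        return ("[" + layer_name + "]").encode() + payload

    message = "hello".encode()
    segment = wrap("TCP", message)   # the 'transport layer'
    packet = wrap("IP", segment)     # the 'network layer' it sits 'on top of'
    frame = wrap("ETH", packet)      # the 'link layer' at the 'bottom'

    print(frame)  # b'[ETH][IP][TCP]hello' - a flat sequence of bytes, no up or down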

If we follow metaphor researchers George Lakoff and Mark Johnson, who hold that our whole conceptual system is based on metaphor and that we use metaphors in our everyday and scientific thinking all the time, it comes as no surprise that an abstract art form like the design of computers and software is highly metaphor-driven. Without metaphoric descriptions it would be extremely hard for programmers to conceptualize what they do. I noticed recently that there's even a mailing list especially for C++ programmers who use the Lakoff/Johnson approach to resolve software design questions.

At the core of the Cognitive Theory of Metaphor is the concept of metaphoric mapping: the 'source domain' of the metaphor, often a well-known, concrete concept or situation, is mapped onto the target domain, a new or highly abstract or very complicated or otherwise problematic phenomenon. In this mapping, the features of the source domain are implicitly transferred to the target domain. So if we say that a love relationship is - metaphorically - a journey, we know that it has a beginning and an end, that there can be interesting and less interesting stretches, that there are problems we may encounter, and that the general feeling is that of being on a certain 'way'. The metaphor commonly describes a highly detailed, sophisticated situation in terms of a simpler, better-known one that can be grasped at a glance, including the ways to deal with it, the feelings it provokes, and so forth. World society under the conditions of global communication is a village, the internet is an information highway, self-replicating computer programs are viruses, nations are containers and immigration is a natural disaster. As you can see from these examples, metaphors are not necessarily true; in fact, they are neither true nor false, but they can be more or less appropriate in what they say about the target domain. In the case of immigration as a natural disaster, for example, one of the many entailments of the metaphor is that immigrants are particles that metaphorically react to physical forces and themselves produce 'immigration pressure', rather than conscious beings who act according to reasons. While this may reflect the emotions of the xenophobic part of the German population, it is certainly wrong and makes the metaphor a cynical one.
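
The structure of such a mapping can itself be written down quite prosaically. The following sketch (Python, purely illustrative - the role names are my own choice) spells out the LOVE IS A JOURNEY example as a plain table from source-domain roles to target-domain roles; the point of the theory is that whatever we know about the left-hand column is silently carried over to the right-hand one.

    # Illustrative source-to-target mapping for LOVE IS A JOURNEY.
    love_is_a_journey = {
        "the travellers": "the lovers",
        "the vehicle": "the relationship",
        "the destination": "the couple's shared goals",
        "obstacles on the road": "difficulties in the relationship",
        "the distance covered": "the progress made together",
    }

    for source_role, target_role in love_is_a_journey.items():
        # entailments of the source domain are mapped wholesale onto the target
        print("whatever holds for", source_role, "is taken to hold for", target_role)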

However, as we can see from this rather political example, there is something dangerous about metaphors: experience shows that they effectively close themselves against attempts to interpret them further; in public discourse, issues are commonly seen as settled as soon as a comprehensive metaphor is found. Is Iraq Vietnam? No. Saddam is Hitler. End of discussion. Surprising as this may sound, it makes good sense in the context of the cognitive theory of metaphor. A strong interpretation of metaphor theory as a general theory of cognition might even hold that the very act of understanding is equivalent to finding a metaphoric description that I feel comfortable with. Look at the discussion about the interpretations of quantum theory for an example.

The situation is a little different if we use metaphors not as tools for the interpretation of social or political issues, but as tools to build human-computer interfaces or software logic. There, designers and software developers provide the metaphors that others will use to make sense of the machine. And theorists are pretty much in the position of Boris Groys when he was asked to talk about the movie 'The Matrix': he said it is hard to interpret because any interpretation will only find what the authors intentionally put into the work. It should be quite easy to interpret the history of 'metaphoric computing' by reconstructing the metaphoric mappings that took place at each step. We could ask the people who invented TCP/IP about the 'layer model' of networking where protocols are layered on top of each other. The history of the desktop metaphor, how it relates to real offices with real files and folders and why it made sense at the time to apply it - that's all common knowledge. And it's obvious that it can't have been a trivial task to invent 'streaming' media and teach a packet-based data protocol to behave more like an analog signal (again). But now a different kind of difficulty arises, because people will actually use (or misuse) the metaphors that were provided as interfaces. Or they will be trapped in inappropriate metaphors and have to swallow absurdities like throwing a disk away to get it out of the computer. And metaphors will appear that were invented unconsciously or only for marketing reasons and that create misleading expectations (there's a whole collection of internet buzzwords that belong to that category - push, portal, eyeballs).

Now I'm back at the very center of the discussion about 'mapping abstract concepts' and 'mapping the mind': trying to give democracy, brands, organisations, dialectical thinking or my personal associations a new form of existence in participatory digital media takes place in a highly uncertain mixed reality - information architects metaphorically interpret these intellectual entities, and their interpretations become tools, become interfaces that effectively shape and predetermine the way that the users who come after them will make sense of their world, will work and live. And all of this takes place in the unstable and yet self-contained world of metaphor where - to put it dramatically - truth is nothing, but the feeling of having the right impression is everything.

If you think that this is far-fetched, let me give you one last example: filtering software - in this case, filtering software for personal computers. When I first discovered the RSAC filter in Internet Explorer 4, I seriously thought that someone had made a joke - perhaps this »Dr. Donald F. Roberts of Stanford University, who has studied the effects of media on children for nearly 20 years«, as a text box reads, and whose work provides the basis for the »Recreational Software Advisory Council rating service«. The high praise for Mr. Roberts' achievements already leaves me with a rather mixed presentiment. In the preferences dialog of Microsoft's web browser, I can - or at least it seems so - choose the level of 'violence', 'sex', 'nudity' and 'language' I want to expose myself to. Each of these four categories has a pop-up menu that neatly lists increasing 'levels' of the respective ingredient. In the aforementioned text box in the lower part of the window, a detailed description of the chosen setting is given. For 'language: expletives' it reads: "Expletives; non-sexual anatomical references". The whole thing strongly reminds me of the stories of medieval Christian scholars who dwelt on detailed discussions of exactly how sinful certain sexual activities are. Now, it seems, these discussions have been translated directly into software. I have no idea how this software is supposed to work, but it's obvious that it simply can't work. It embodies a totally misguided conception of 'media contents'; in the world view of this filtering mechanism, web pages are containers that 'contain' certain doses of predefined forbidden substances (violence, sex, nudity and language) that have to be kept away. Theory, art, allusion, meta-language don't exist or are ignored. The implementation of morals in software fails - at first sight because of the wrong metaphor of a web page as a container and of words as substances, but more probably because more appropriate metaphors would simply show that it is impossible to leave moral judgements to computers.
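
As a caricature of the model at work - all category names, levels and page 'ratings' below are hypothetical, since I can only guess at the actual implementation - the container metaphor boils down to something like this: every page carries a numeric dose of each forbidden substance, the user chooses tolerated levels, and the filter compares numbers, blind to theory, art, allusion and meta-language.

    # Hypothetical tolerance levels chosen in the preferences dialog.
    user_limits = {"violence": 1, "sex": 0, "nudity": 1, "language": 2}

    def blocked(page_rating, limits):
        # A page is a 'container': block it as soon as any dose exceeds the limit.
        return any(page_rating.get(category, 0) > limit
                   for category, limit in limits.items())

    art_history_page = {"nudity": 3, "language": 0}   # a Titian retrospective, say
    print(blocked(art_history_page, user_limits))     # True - context counts for nothing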

3 Closed World

Why has it become so obviously important to objectify the mind? Lev Manovich suspects that it is modern society's demand for standardization, extended to the most private phenomenon of all: human thinking itself. »The subjects have to be standardized, and the means by which they are standardized need to be standardized as well. Hence the objectification of internal, private mental processes, and their equation with external visual forms that can be easily manipulated, mass produced, and standardized on their own. The private and individual are translated into the public and become regulated. (...) The very principle of hyperlinking, which forms the basis of interactive media, objectifies the process of association, often taken to be central to human thinking.« Hartmut Winkler, on the other hand, sees the root of these externalization tendencies in the utopian wish to overcome societal particularisation in a kind of ideal, extended language: »Die 'Wünsche', die sich an das Datenuniversum knüpfen, zielen (...) auf eine neue Sprache ab: eine Sprache, die der Arbitrarität entkommt und ihrem doppelten Schrecken von Willkür und historischer Determination, die der unendlichen gesellschaftlichen Differenzierung standhält und dennoch ihre Einheit bewahrt, deren Wuchern limitiert ist durch ein Skelett letztlich sehr weniger rationaler Prinzipien, und eine Sprache schließlich, die das Schwirren der sich ausdifferenzierenden Medien in einem einheitlichen 'Tableau' zum Stillstand bringt.« [The 'wishes' attached to the data universe aim (...) at a new language: a language that escapes arbitrariness and its twin horrors of caprice and historical determination, a language that withstands the endless differentiation of society and yet preserves its unity, a language whose proliferation is limited by a skeleton of ultimately very few rational principles, and finally a language that brings the buzzing of the ever-differentiating media to a standstill in a unified 'tableau'.]

I hope to have shown that at least two other reasons have to be taken into account for a complete picture of new media as an externalized mind. As far as pragmatics and social entities are concerned, new media as we know them today allow for a complete blurring of the border between action and interpretation; the very models that were once used to *understand* social processes and the actions of individuals now become the conceptual basis of the software tools that allow those very actions to take place right there in the sphere of electronic media, making after-the-fact interpretation seem obsolete.

As far as individual thinking - as a way to make sense of the world and to gain certainty - is concerned, the externalization of individual thought processes seems to have another reason: it is seen as a way to make up for the lack of a common language and value system. Since 'correct' mutual understanding via 'Letztbegründung' and a set of shared values must fail under the condition humaine postmoderne, the detailed observation of someone else's thought processes as documented in hyperspace becomes the only way to gain at least relative certainty. If this is an adequate account, we will see more attempts in this direction as the social fragmentation into independent and highly individual ways of codifying one's thoughts and one's lifestyle proceeds.

Both trends result in a communicative space that closes itself against interpretation, rendering the very idea of a "deeper" meaning fruitless. In a sense, the metaphoric mappings from the mental to the digital I have talked about can be seen as a modern version of magical thinking, which, according to Freud, consists in a mistaken application of mental associations (via similarity and spatial and temporal neighborhood) to the world. A direct 'swapping' of 'contents' from one mind to another - the exchange of 'materialized' ideas via electronic extensions of the mind: all of this happens in a space without an outside, without an 'unmarked space', to use the language of systems theory. Here, the mentalist approach that John Durham Peters forcefully attacks in his history of the idea of communication is driven to its extreme; a certainty of mutual understanding devoid of any possible irritation is achieved - which will finally lead to an equilibrium of mental contents where everything has been said and is immediately accessible by everyone: the heat-death of communication. Insofar as the way there is to be characterized as electrified magical thinking, we should remember that, according to Freud, an adult who is unable to leave behind the childhood state of magical thinking is called a neurotic.

4 References

Anders, Günther: Die Antiquiertheit des Menschen, Band 1, C.H. Beck, München, 1956
Freud, Sigmund: Totem und Tabu, Fischer Taschenbuch Verlag, Frankfurt, 1956
Hayles, N. Katherine: How We Became Posthuman, University of Chicago Press, Chicago, 1999
Lakoff, George, and Johnson, Mark: Metaphors We Live By, University of Chicago Press, Chicago, 1980
Lakoff, George: Women, Fire, and Dangerous Things, University of Chicago Press, Chicago, 1987
Manovich, Lev: The Language of New Media, MIT Press, Cambridge, MA, 2001
Peters, John Durham: Speaking into the Air, University of Chicago Press, Chicago, 1999
Winkler, Hartmut: Docuverse. Zur Medientheorie der Computer, München, 1997