Johanna Drucker


(for the MLA, Washington, D.C., December 29, 1996)

There are a wide variety of ways in which language and information intersect in

traditional and electronic media. My basic inquiry has two parts to it: what

constitutes the "information" of language and how does this "information"

change in moving language from a material to an electronic environment?

My old favorite topic -- the materiality of signification -- describes the ways in

which material substrate and visual/typographic/written (and by extension,

verbal) style encode history, identity, and cultural value at the primary level of

the mark/letter/material substrate (and in non-written form, the qualities of

voice, tone, tenor, rhythm, inflection, etc.). I'm not going to go back over

that ground today since in fact I am more interested in exploring the

"intimations of im-materiality" of my title (and if you haven't read my books

The Visible Word, or The Alphabetic Labyrinth, or work by Susan Howe or

Jerome McGann or Roland Greene or Marjorie Perloff or any of the many other

people working in this area, this isn't going to be the moment you get the brief

overview/encapsulated Cliff Notes version of "MATERIALITY AS

SIGNIFICATION"). But for those of you who doubt, here's your pictorial

proof: try translating this little item into Futura Bold and see what happens to the

meaning here intended. Some metaphor that would be -- or this one -- smooth

and equal sailing up the pole of virtue or vice -- with equal opportunity to sin

or be saved in the Pythagorean paths of Vice and Virtue (Y).

At the secondary level language contains information as format, using

spatial arrangement as a way of constituting meaning. A familiar example is the

outline form where headings, subheads, and sub-sub-subheads demarcate a

discourse into conceptual spaces and territories. Elaborately structured

hierarchies in this vein -- descriptive systems of cosmological breadth and

ambition -- develop in the Middle Ages (as in this work illustrating the concepts of

Petrus Ramus) and blossom in the late Renaissance work of such ambitious

polymath scholars as Athanasius Kircher and Bishop John Wilkins, whose Essay

Towards a Real Character, and a Philosophical Language includes a full outline of

all aspects of the universe -- part of his scheme to represent all of knowledge/the

world (in his work collapsed without argument) in a corresponding system of

notation. This may sound old and quaint and strange and remind us of ideas

which stretch back into antiquity about linking language and knowledge in a

guaranteed system (whether according to an atomistic logic or Adamic naming)

but when we stretch this concept forward to the all-elastic present such a

positively logical linguistic attitude turns out to underlie one of the major strains

of Artificial Intelligence research.

For now, consider simply that this relational/structural/schematic aspect

of materiality uses spatial relations as significant, as part of meaning. The old

memory theaters, also devised in antiquity and perfected in conceptual/practical

terms in the Renaissance, serve as another instance of intertwining meaning and

spatialized relations -- and here "space" is meant as something schematic,

metaphoric, and abstract simultaneously. Basically the relations among linguistic

components can be mapped in the following ways: hierarchically in an outline,

spatially according to the descriptive coordinates of solid geometry (with a

fourth dimension suggested), in tree diagrams, in grids, in various indexed

charts and two-dimensional graphs, or according to an iconographic form (as in

the case of certain concrete poems using shape to contribute to meaning). When

these concepts of schematization and spatialization intersect with electronic

media, they can, potentially, expand into the multi-dimensional structures

available in hypertext and internet architecture. The challenge is making spatial

organization clear enough -- logically, conceptually, metaphorically, visually --

for it to be useful rather than confusing.
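The hierarchical mode is easy to sketch in miniature. In the toy outline below, the categories are invented for illustration, loosely in the spirit of Wilkins's tables rather than drawn from any historical scheme; the point is only that nesting -- spatial indentation -- carries the conceptual relations among the terms:

```python
# A toy outline: nested tuples of (term, children). The categories are
# invented for illustration; only the spatial hierarchy is the point.
outline = ("Knowledge", [
    ("Nature", [("Animals", []), ("Plants", [])]),
    ("Art",    [("Music",  []), ("Letters", [])]),
])

def show(node, depth=0):
    """Print each term indented by its depth in the hierarchy."""
    name, children = node
    print("  " * depth + name)
    for child in children:
        show(child, depth + 1)

show(outline)
```

The same nested structure could just as readily be rendered as a tree diagram or a grid; the data and its spatialization are separable, which is exactly what makes the hypertextual expansion possible.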

As we move into considering the language to code/code to language

relationship, or the intimations of immateriality, another key theme will be that

of basic binarism: In another wonderfully persistent trajectory, visual language is

treated as the product of a binary code. This concept doesn't need the

electronic environment -- quite the contrary, it's these precedents which allow

for the electronic use of binarism to carry nice, profound, philosophical weight.

After all -- in a conception which is fully fleshed out by Renaissance type

designer Geoffroy Tory -- all letters -- the full set of symbols

of human language, and thus of cosmic as well as human thought -- are

composed at base of two elements -- the I and the O. For Tory these are essential

elements -- the masculine principle of the vertical thrust and the feminine

principle of procreative fullness. Translating this into binary code, the I/O basis

of all electronic activity, one can read machine language in the tradition of these

combined "essences" or -- according to a more contemporary deconstructive

logic -- as pure difference, as the fundamental non-essential and differentiating

binarism which brings the possibility of signification into being.
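The reduction can be made literal in a few lines. A minimal sketch, assuming nothing more than the standard ASCII table: any letter resolves into a run of the two elemental marks, 0 and 1 (or, read in Tory's register, O and I):

```python
# Render a word as the binary (ASCII) code beneath it -- every letter
# resolving into combinations of the two elemental marks, 0 and 1.
def to_binary(text):
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("IO"))  # 01001001 01001111
```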

(Now some of you, I know, are already better at reading binary code than

listening to outmoded speech patterns, so for this section there will be a

simultaneous translation by flashlight.) The essence of the "immaterial"

electronic universe as we know it is this binary code, but it isn't essential to

"computer" functions -- unless they are electronic, which wasn't always so:

Mechanical machines capable of performing computational functions -- running

punch-programs off cards -- were used for running looms in French mills in the

early 19th century -- and mechanical automatons had used encrypted sequences

of "instructions" on interlocking gears for a good long time before that. But the

introduction of electricity and the reduction of all/any "info" to be stored and

all/any instructions to be given to a binary code combined a level of

abstractability with a potential for rapidity of processing which led to the

modern computer.

Signal Formation: Claude Shannon, working as an assistant to the inventor

Vannevar Bush and operating the switches on his differential analyzer in the late

1930s, realized that the electromechanical relays in the switches could encode

the configurations of data directly, thus paving the way for the translation of

mathematical information into electrical form. Data became available to binary

encoding as a result -- and any communicative message, Shannon realized to the

delight of Bell Labs where he was working, could be sent as a simple electrical

signal as a result. One could translate anything into such a signal -- but

computational operations were more complex than mere communication -- and

data had to be conceptualized, not merely translated, to function in an electronic

environment. Here again language comes under reconsideration -- gets put, in

fact, into a certain bondage in order to function according to a

machine-acceptable decorum.

Rules and Regulations: In a highly constrained, rule-bound, and logical form,

natural language can serve as the basis of programming language, itself encoded

in a binary numerical system. The leap from numbers connected to cogs and

axles to sequences of interconnected switches would have had very little impact

if it hadn't been for two things: the possibility of logic, using "natural" language in

constrained form, to function as a set of precise instructions translatable into

mathematical equivalents, and the possibility for these mathematical equivalents

to be encoded in a binaristic form corresponding to the fundamental on/off of

current in an electrical gate/synapse/circuit.

George Boole's 1854 Laws of Thought realized the age-old philosophical

belief in the possibility of finding a set of logical rules and laws which

corresponded to the operations of the human mind. Gottlob Frege built on

Boole's system, adding predicates to Boole's set of terms, thus moving closer to

what their predecessor Gottfried Leibniz had envisioned as "a true reasoning

calculus." The collapse of mathematical and linguistic terms according to a

philosophical belief in real logic (and logic as real) was actual as well as

metaphoric -- and within the (many) constraints of logic language is able to

"perform" functions as precise as those of any other calculating system. The

philosophical underpinnings of such an approach show through in Rudolf

Carnap's 1928 book title, The Logical Structure of the World. Whatever one

thinks about the intellectual validity of presuming logic in the actual organization

of the "world" -- or even in the human system of knowledge which describes it --

the linguistic properties of the lineage stretching from Leibniz to Boole to Frege

and Carnap (with all their own -- many -- differences such as the more atomistic

conceptions of Leibniz/Boole and more gestalt oriented notions of Carnap) --

this work provided a means whereby linguistic terms could be made compatible

with -- even the basis of -- computational acts. It was this basic rule-boundedness

which allowed Alan Turing and John Von Neumann to interlink the concepts of

"reasoning calculus" with that of the "automata" of computational machines --

as Turing realized that logical/mathematical symbols could represent any kind of information -- instructions as well as data.
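Boole's insight can be sketched in a few lines: his connectives are ordinary arithmetic over the two values 0 and 1, so a statement in suitably constrained "natural" language becomes calculable. The propositions below are, of course, invented for illustration:

```python
# Boole's algebra of logic as arithmetic over {0, 1}:
# AND is multiplication, NOT is subtraction from 1,
# and OR follows as x + y - xy.
def AND(x, y): return x * y
def NOT(x):    return 1 - x
def OR(x, y):  return x + y - x * y

# A constrained "natural language" statement rendered as calculable terms:
it_rains, i_carry_an_umbrella = 1, 0
print(AND(it_rains, NOT(i_carry_an_umbrella)))  # 1: it rains and I am umbrella-less
```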


Languages and their Evolution:

Generations of computer languages now exist -- beginning with the earliest

versions from the big old mainframes of the 1940s/1950s -- machine languages

so coded that their workings are in some cases as indecipherable as Indus Valley

script to the contemporary eye -- a fact made poignant by the current,

about-to-be-a-plague situation I refer to as the Millennial Bug: access to this code has been

lost -- death/retirement of original programmers who never wrote it down

anywhere -- so that the problem of debugging the turn-of-the-century meltdown

of 00 digits in all date-dependent databases is truly mind-boggling.
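The mechanics of that meltdown are simple enough to demonstrate. A minimal sketch, assuming the common period practice of storing years as two digits only:

```python
# The two-digit-year convention in miniature: with only "00"-"99" stored,
# a record from 2000 appears to predate one from 1999.
def years_elapsed(start_yy, end_yy):
    return end_yy - start_yy

# An account opened in 1999 ("99") and checked in 2000 ("00"):
print(years_elapsed(99, 0))  # -99 -- a nonsensical negative age
```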

There are several levels of languages, as well as types of languages, in

computer programming: ultimately all computer languages have to translate

into machine language: binaristic sequences which give specific instructions to

data stored in various address locations to perform particular tasks in a

particular sequence. Compiled and interpreted languages each organize the

relation between commands and data according to distinct specifications, but an

assembler is required to translate this program code to the correct

machine addresses so that the data can be located and the functions performed.

Such symbolic assembly languages evolved in the mid-1950s, but it took until

the late 1950s and 1960s for higher-level languages (such as FORTRAN and

COBOL) to evolve. Compiled languages allow for little human intervention

in the course of the carrying out of the program. By contrast, interpreted

languages are not entirely in machine code; they have a front-end interface

which can be manipulated by the user throughout. These higher level languages

allow the user to take advantage of interpretive techniques to build the concepts

as they go (R. McK. Wood, p. 186, in Reichardt). But all of these levels of

accessibility are illusory in the sense that they are all equally constrained. If

today a high-level language contains a simple "Delete" command, then ten years

ago that read as: Execute Command D on Files G, H, and/or something like

del.exe.bat* to *. At that point the combination of syntactics and mnemonics

(that is, sequence and terms) involved is hardly more flexible, even if slightly

more user-friendly, than the assembly level: L 3,x, M 2,y, A 3,w, ST 3,2 or the

machine level: 41 2 0C1A4 3A 2 0C1A8 1A 3 0C1A0 and 50 3 0C1A4.

Even a quick reading of these shows how much this stuff is not really

"language" as we know it -- which is just the point -- machine language, computer

language, programming languages, are all able to contain information -- to

function as a descriptive metalanguage which is not information but rules,

highly constrained and specific, and thus able to describe information and encode it,

but not embody it in material form.
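The layering can be seen directly in any modern interpreted language. The sketch below uses Python's own disassembler to show a high-level expression reduced to a load/multiply/add/return sequence, one rung down the ladder of languages and roughly analogous to the assembly-level sequence quoted above; the function and its names are invented for illustration:

```python
import dis

# A high-level line of code: multiply x by y, then add w.
def combine(x, y, w):
    return x * y + w

# The lower-level instruction sequence the interpreter actually runs --
# loads, a multiply, an add, a return.
dis.dis(combine)
```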

AI: Knowledge-Based (Neural Nets) and Logic-Based (Symbolic Processors):

Data processing on a massive level or rule-boundedness -- how does the

human mind make the leap from experience to generalized conceptualization?

Are rules of logic endemic to the structure of knowledge and operations of

thought, as per some complex symbolic linguistics, or do concepts emerge

through the processing of massive amounts of data, through perception, into

higher levels of pattern recognition, in which case "thinking" cannot be contained

within "logic" but has to let logic emerge from its evolution? This debate, the

basic dialogue in contemporary Artificial Intelligence research, returns us to the

basic language/information problem. For what is being filtered OUT of language

when it enters the systemic logic of the electronic environment?
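The data-driven side of that debate can be sketched in miniature. The toy perceptron below -- a standard textbook exercise, not any particular research system -- is never handed the rule for logical "and"; it arrives at that rule by adjusting its weights over repeated examples:

```python
# A minimal perceptron: the rule (here, logical AND) emerges from
# examples rather than being stated in advance.
def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # learn only from mistakes
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The four "experiences" of AND, as (inputs, expected output):
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(samples)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(*x) for x, _ in samples])  # [0, 0, 0, 1]
```

Nothing in the trained weights is legible as a "rule"; the rule is only a pattern the numbers have settled into, which is precisely the point at issue between the two camps.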

Whipped by the hot lash of illogic, the unneat and indecorous aspects of

language wiggle free from their bonds clamoring for their right to be recognized

within that universe of sense which is not all common sense or intractable

systemic logic but is also stuff and non-sense -- the information of sensation, of

space, of material -- crying in the dull-brained electronic universe to be heard.

While all this historical and descriptive material opens all kinds of

interesting possibilities for investigation, it is not particularly polemical. I would,

however, like to raise two slightly polemical points. The first is that it is

often/only at the expense of much of what is "information" in language -- its

complexity in material, syntactic, poetic, or even vernacular form -- that

language functions in the electronic environment. And secondly, since within

electronic production (even say keyboard to printer) there is no necessary

relation between input and output in the process of encoding the keystroke into

digital form, language is denuded of its material history, which is lost. The

"immaterial" is that gap of transformation -- like that which used to exists for the

typesetter between the reading of a line and its setting into hot type -- and also

exists between the material of text becoming that of sound, of sound to mind, of

eye to voice, of hand to type -- which is a basic characteristic of the way

language is information in electronic form as well. It always precipitates back

into material -- mutated, transformed, rewrit, as it were. Language is not ever an

ideal form. It always exists in some phenomenal form. I've spent a long time

insisting on the value of materiality, but I'm also interested in the freedom from

fixed relations of materiality and what that offers. Ultimately, one of the

intimations of immateriality is the way it promises to change material form --

and as such offers possibilities for reconceptualization of language as information

in the traditional media as well as in hypertext and electronic formats.

We all know that there are certain basic -- irresolvable -- philosophical tensions

in language -- most particularly between its capacity to represent knowledge

systematically and its capacity to be knowledge experientially and perceptually --

and this tension intensifies in an electronic environment as a tension between

machine language and natural language -- since it is a reductive hybrid version of

the one which can be encoded/encrypted in order to serve as the basis of the

other. Ultimately the dilemma of the immaterial/material is unresolvable: you

can't reconcile the need for the machine to work through logic and the capacity

of human language to function through and on account of -- not just in spite of --

its illogic. Wittgenstein's dilemma: the realization that logic lay outside the limit

of philosophy and could only barely get one to the threshold -- it was the material

particularity of language in use which demonstrated the capacity of language to

begin to pose philosophical questions -- and it is here that the

immaterial/material dilemma founders on a final rock, cracking apart the whole

enterprise of translation and record, of language as history, memory, identity.

For language to function as "immaterial" it must give up much of what constitutes

its information -- or at least, allow it to be translated into some alternate code,

some other, surrogate, substitute form, a representation without material

substrate in which to embed, make one with itself, the actual factual stuffness of

lived language. Not a tragedy, just a fact. Having nothing particular at stake in

trying to make machines be/imitate/function as or replace people, I'm not

particularly bothered by this -- but what is worrisome is the constraints on

communicative action which potentially come into play -- the unforgiving

unkindness of the electronic database whose newspeak categories may at some

point simply refuse to recognize those signs of human life, which like Artaud's,

are "signalling through the flames" with a primal urgency which

precedes/excludes/is exterior/anterior to the logic of the code and

which is simultaneous to, interpenetrated with, inseparable from the illogical

materiality of natural language.

Now, Mike Groden did tell me that the one condition attached to this panel was

that the talks had to deal with some aspect of English or American literature --

and I have been searching my files without any success. But I contacted my

friend, Herr Doktor Professor Popov-Opov who is working at Star Lab VI on

Alpha Centauri 5.6 to see if he could find anything relevant in his data banks.

Here's what he sent me:

One final bit of insight I can share with you and your audience comes from a

snippet of material recovered from a mid-20th century source in what was then

termed "popular culture." It is a single fragment from a much longer, possibly

epic, piece -- it deals with a journey, no doubt a heroic quest, long in scope, wide

in breadth, of which only this tiny piece has come to light. According to the

formulations of which you have been speaking, the line takes on a cosmological

significance which you will of course all immediately recognize. Here it is: "Why

oh why oh why oh did I ever leave O-HI-O?" Analyzable into those component

parts it turns out to be an instance of pure code (a vibratory binarism flickering

between the I and O of all such inscriptions) and impure materiality (the

inflections and embedments of its elemental structure within complex phonemic

units whose semantic value as constituent components I can, as yet, without

more research, only hint at here). That the text itself contains suggestions of

impermanence, of eternal return, of longing and displacement -- is hardly

surprising, given its ritual character. The line bears within it the full weight of the

20th century dilemma -- the question of how to encode language as knowledge

without loss, while recognizing the impossibility of this task in philosophical and

even mathematical terms. The answer to the question of course is that one

leaves the "paradise" of "O-HI-O" in order to be able to "write back" -- that is, to

"write to disk" to inscribe oneself perpetually in a relation to that act of coming

into being which is writing, which is scribing, inscribing, in a communicative

dialogue with the mother board, the operating system, the familial network of

soft- and hardware mouthing its happy duck-billed platitudes to an old favorite

electronic tune.