The First Principle of New Media identified in *The Language of New Media* is that the resulting objects of new media practices “are composed of digital code; they are numerical representations.”

When we look at a picture of the Mona Lisa on a computer screen, we do not consciously perceive digital code or numerical representations. An implicit phenomenological distinction is therefore being drawn between two descriptive categories: the operation of the computer (how information behaves within a computer as a formal system), and the conscious perceptions of a person interacting with a new media object through a computer interface (we see “new media objects” rather than digital code or numerical representations).

While these two descriptive categories refer to the same physical behavior (what, in a given instance, a person might be doing with a computer), they also require different vocabularies to adequately describe precisely what physical processes are involved and how these processes affect us. In the case of a person doing something with a computer, these vocabularies differ in terms of “how” something might be happening, agree on “that” which might be happening, and converge on “why” certain events result.

The First Principle of New Media elects to analyze the resulting objects of new media practices in terms of how information behaves within a computer as a mechanistic formal system. The assumption here is that analyzing new media in such a way will lead to a more fundamental or profound account of new media, compared to a study of what people do with new media; the desire behind such an assumption is often a belief that a more fundamental account of a phenomenon might be understood as objectively grounding subsequent inferences about observations of relevant phenomena — in this case, that phenomenon being new media objects.

Contemporary computers are formal machines designed to provide a syntactical specification for a set of formal symbols, the rule-governed means to manipulate these symbols, and the physical means to express these symbols using arbitrarily distinct formal systems.
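As a loose illustration of this three-part definition, consider a toy formal system (the symbols and rules here are invented for the example, not drawn from any real machine): a specified symbol set, a rule-governed manipulation, and arbitrarily distinct ways of physically expressing the same symbols.

```python
# Toy formal system, invented for illustration: two formal symbols,
# one rule-governed manipulation, and two interchangeable "physical"
# expressions of the same symbol sequence.
SYMBOLS = ("x", "y")

def swap_rule(sequence):
    """Rule-governed manipulation: replace each symbol with the other."""
    return tuple("y" if s == "x" else "x" for s in sequence)

# The same symbols expressed in arbitrarily distinct systems:
as_marks  = {"x": "·", "y": "—"}   # e.g. marks on paper
as_states = {"x": 0,   "y": 1}     # e.g. voltage levels

seq = ("x", "y", "y")
assert swap_rule(seq) == ("y", "x", "x")
```

Nothing about the rule depends on which expression (marks, voltages, numerals) is chosen; the rule operates on the symbols themselves.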

The information stored on a computer is stored in a symbolic form, and numbers are just one possible way to represent these symbols. A loose analogy can be drawn with natural languages: when we have something in mind that we would like to express, we may write it down on paper so that it is expressed as chemical pigments and light waves, or we may speak it aloud so that it is expressed as air molecules and sound waves. If we say something aloud, it would not be correct to assume that the spoken expression is more or less fundamental than its written equivalent; if we want to understand natural language communication, it would not be correct to privilege the spoken word over the written word, or to focus strictly on how words exist in minds before they are communicated to others.

When we talk about information stored in a computer, we frequently talk about binary code being composed of 1s and 0s; **the fact that we use the Arabic numerals “1” and “0” to represent this code is simply a matter of convention**. TRUE and FALSE or ON and OFF work just as well as 1 and 0; the convention we use has little effect on how the stored information is structured so long as the convention allows exactly two states in binary opposition.
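This interchangeability of conventions can be sketched directly (the labels below are arbitrary, chosen only to show that any two opposed states serve equally well):

```python
from enum import Enum

# Hypothetical "switch" labels for illustration.
class Bit(Enum):
    OFF = 0
    ON = 1

# The same stored pattern under three interchangeable naming conventions.
pattern_numeric = [1, 0, 1, 1]
pattern_boolean = [True, False, True, True]
pattern_switch  = [Bit.ON, Bit.OFF, Bit.ON, Bit.ON]

# Translating between conventions loses no information, because only
# the two-way opposition matters, not the labels attached to it.
assert [int(b) for b in pattern_boolean] == pattern_numeric
assert [bit.value for bit in pattern_switch] == pattern_numeric
```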

It is certainly possible to treat all the 1s and 0s involved in storing a software application on a computer (along with all the plain-language text unique to that software application, such as online documentation) as a single integer, represented by a very large base-2 number. However, the same 1s and 0s can also be viewed equivalently as instructing the computer hardware to move bits of information from one place to another; or, those 1s and 0s can be viewed as truth-value assertions in a long expression of Boolean logic.
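A minimal sketch of this equivalence, using a few made-up bytes standing in for a stored program (the byte values are arbitrary and illustrative only):

```python
# A tiny, invented "program": four bytes, purely illustrative.
data = bytes([0x4D, 0x5A, 0x90, 0x00])

# View 1: the whole thing as a single integer (one large base-2 number).
as_integer = int.from_bytes(data, byteorder="big")

# View 2: the same bits as a sequence of truth-value assertions,
# most significant bit first within each byte.
as_truth_values = [bool((byte >> i) & 1)
                   for byte in data
                   for i in range(7, -1, -1)]

# Both views describe exactly the same stored bits.
assert int("".join("1" if b else "0" for b in as_truth_values), 2) == as_integer
```

Neither view is more fundamental; each is a different vocabulary for narrating the same stored pattern.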

In his essay, “The Practical Logic of Computer Work,” Philip E. Agre asserts that:

“The design of a computer begins with formalization — an intensive and highly skilled type of work on language. A computer is, in an important sense, a machine that we build so that we can talk about it; it is successful if its operation can be narrated using a certain vocabulary. As the machine is set to running, we too easily overlook the complicated reflexive relationship between the designer and user on which the narratibility of the machine’s operation depends. The corpus of language that was transformed in producing the machine, like any discourse or text, is necessarily embedded in an institutional and historical context, and the machine itself must therefore be understood as being, in some sense, a text.”

To describe a computer program as a long expression in Boolean logic shouldn’t be counter-intuitive: computer programs are, after all, written in artificial “programming languages.”

This means that **computers are just as much logic machines as they are arithmetic machines**. That the information computers store can be represented numerically is very much incidental to how the stored information is structured, and is neither a fundamental nor an axiomatic observation about how computers behave.
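The point that arithmetic rides on top of logic, rather than the other way around, can be made concrete with the standard half-adder construction: binary addition built from nothing but logic gates (the function below is a sketch of that textbook circuit, not of any particular hardware).

```python
# A one-bit "half adder" built purely from Boolean operations,
# illustrating that binary arithmetic reduces to logic.
def half_adder(a: bool, b: bool) -> tuple:
    total = a != b   # XOR gate: the sum bit
    carry = a and b  # AND gate: the carry bit
    return total, carry

# 1 + 1 in binary: sum bit 0, carry bit 1 (binary 10, i.e. decimal 2).
assert half_adder(True, True) == (False, True)
# 1 + 0: sum bit 1, no carry.
assert half_adder(True, False) == (True, False)
```

Chain such adders together and you have the arithmetic unit of a computer: the “arithmetic machine” is a pattern we read into a logic machine.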

Why we so often talk about computers as though they are arithmetic machines (or fancy calculators) is in large part a matter of what sorts of patterns we are culturally taught to be sensitive to. A modern city-dweller well-acquainted with the cycles of the business day and traffic laws might easily see patterns in a contemporary cityscape which a farmer from 1800 might have to struggle to perceive. We are taught a good deal of arithmetic in public schools, but very little formal logic; we are furthermore taught science in the tradition of a Platonic-Pythagorean conception of nature, which holds abstract constructs to be the most fundamental description of the world. Since we know science gave us computers, why wouldn’t arithmetic therefore appear to be essential to the most fundamental description of what computers do?

Given, however, that computers are just as much logic machines as arithmetic machines, it is not at all self-evident why the numerical features of computerized information storage should be privileged in an analysis of how practical computers behave.

Furthermore, **it is not unique to new media objects that they originate in a numerical form: musical notation, too, can be viewed mathematically, as can the design of architecture** (or the motion of planets for that matter). In fact, the information contents of practical computers are hardly unique in their ability to be described mathematically, and their ability to be described mathematically is hardly fundamental to their operation. It is therefore just as problematic to describe computers as arithmetic machines as it is to describe the arithmetic features of new media objects as characteristic of new media.

It is perhaps more relevant to discuss contemporary computers in terms of the relative convenience with which they allow us to perform sophisticated manipulations of symbolic information.