Geek Sublime by Vikram Chandra


  But nobody in the nineteenth century made the connection between the ANDs and ORs of Boolean algebra and the wiring of simple switches in series and in parallel. No mathematician, no electrician, no telegraph operator, nobody. Not even that icon of the computer revolution Charles Babbage (1792–1871), who had corresponded with Boole and knew his work, and who struggled for much of his life designing first a Difference Engine and then an Analytical Engine that a century later would be regarded as the precursors to modern computers …

  Nobody in that century ever realized that Boolean expressions could be directly realized in electrical circuits. This equivalence wasn’t discovered until the 1930s, most notably by Claude Elwood Shannon … whose famous 1938 M.I.T. master’s thesis was entitled “A Symbolic Analysis of Relay and Switching Circuits.”4

  After Shannon, early pioneers of modern computing had no choice but to comprehend that you could build Boolean logic and binary numbers into electrical circuits and work directly with this equivalence to produce computation. That is, early computers required that you wire the logic of your program into the machine. If you needed to solve a different problem, you had to build a whole new computer. General programmable computers, capable of receiving instructions to process varying kinds of logic, were first conceived of by Charles Babbage in 1837, and Lady Ada Byron wrote the first-ever computer program—which computed Bernoulli numbers—for this imaginary machine, but the technology of the era was incapable of building a working model.5 The first electronic programmable computers appeared in the nineteen forties. They required instructions in binary—to talk to a computer, you had to actually understand Boolean logic and binary numbers and the innards of the machine you were driving into action. Since then, decades of effort have constructed layer upon layer of translation between human and machine. The paradox is, quite simply, that modern high-level programming languages hide the internal structures of computers from programmers. This is how Rob P. can acquire an advanced degree in computer science and still be capable of that plaintive, boldfaced cry, “But I Don’t Know How Computers Work!”6

  Computers have not really changed radically in terms of their underlying architecture over the last half-century; what we think of as advancement or progress is really a slowly growing ease of human use, an amenability to human cognition and manipulation that is completely dependent on vast increases in processing power and storage capabilities. As you can tell from our journey down the stack of languages mentioned earlier, the purpose of each layer is to shield the user from the perplexing complexities of the layer just below, and to allow instructions to be phrased in a syntax that is just a bit closer to everyday, spoken language. All this translation from one dialect to a lower one exacts a fearsome cost in processing cycles, which users are never aware of because the chips which do all the work gain astonishing amounts of computing ability every year; in the famous formulation by Intel co-founder Gordon E. Moore, the number of transistors that can be fitted onto an integrated circuit should double approximately every two years. Moore’s Law has held roughly true since Moore first made the observation in 1965. What this means in practical terms is that computers get exponentially more powerful and smaller every decade; a doubling every two years compounds into roughly a thirty-two-fold increase over ten years.

  According to computer scientist Jack Ganssle, your iPad 2 has “about the compute capability of the Cray 2, 1985’s leading supercomputer. The Cray cost $35 million more than the iPad. Apple’s product runs 10 hours on a charge; the Cray needed 150 KW and liquid Fluorinert cooling.”7 He goes on to describe ENIAC—the Electronic Numerical Integrator and Computer—which was the world’s first general-purpose, fully electronic computer capable of being programmed for diverse tasks. It was put into operation in 1945.8 “If we built [an iPhone] using the ENIAC’s active element technology,” Ganssle writes:

  the phone would be about the size of 170 Vertical Assembly Buildings (the largest single-story building in the world) … Weight? 2,500 Nimitz-class aircraft carriers. And what a power hog! Figure over a terawatt, requiring all of the output of 500 Olkiluoto power plants (the largest nuclear plant in the world). An ENIAC-technology iPhone would run a cool $50 trillion, roughly the GDP of the entire world.9

  So that smartphone you carry in your pocket is actually a fully programmable supercomputer; you could break the Enigma code with it, or design nuclear bombs. You can use it to tap out shopping lists because millions of logic gates are churning away to draw that pretty keyboard and all those shadowed checkboxes. And I can write working programs because modern high-level languages like C# protect me from the overwhelming intricacy of the machine as it actually is. When I write code in C#, I work within a regime that has been designed to be “best for human understanding,” far removed from the alien digital idiom of the machine. Until the early fifties, programmers worked in machine code or one of its close variants. As we’ve just seen, instructions passed to the computer’s CPU have to be encoded as binary numbers (“01010101 10001011 …”), which are extremely hard for humans to read and write, or even distinguish from one another. Representing these numbers in a hexadecimal format (“55 8B …”) makes the code more legible, but only slightly so. So assembly language was created; in assembly, each low-level machine-code instruction is represented by a mnemonic. So our earlier hexadecimal representation of “Hello, world!” becomes:
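
  Something like this, for instance, in x86 assembly (a representative sketch in NASM syntax for 32-bit Linux; the exact listing depends on the compiler and the operating system, but the first two mnemonics below are the instructions that the hex bytes “55” and “8B” above encode):

section .data
msg:    db  "Hello, world!", 10     ; the string itself, plus a newline
len:    equ $ - msg                 ; its length, computed at assembly time

section .text
global _start
_start:
        push ebp                    ; the instruction the hex byte 55 encodes
        mov  ebp, esp               ; one encoding of this is hex 8B EC
        mov  eax, 4                 ; Linux system call number for write
        mov  ebx, 1                 ; file descriptor 1, standard output
        mov  ecx, msg               ; address of the string
        mov  edx, len               ; number of bytes to write
        int  0x80                   ; hand control to the kernel
        mov  eax, 1                 ; system call number for exit
        xor  ebx, ebx               ; exit status 0
        int  0x80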

  One line of code in assembly language usually translates into one machine-code instruction. Writing code in assembly is more efficient than writing machine code, but is still difficult and error-prone.

  In 1954, John Backus and a legendary team of IBM programmers began work on a pioneering high-level programming language, FORTRAN (from FORmula TRANslation), intended for use in scientific and numerical applications. FORTRAN offered not only a more English-like vocabulary and syntax, but also economy—a single line of FORTRAN would be translated into many machine-code instructions. “Hello, world!” in FORTRAN is:
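
  Something like the following (a minimal sketch in the later fixed-form dialect of FORTRAN; the exact listing varies with the compiler and the era):

C     "Hello, world!" in fixed-form FORTRAN: a comment line starts with
C     a C in column 1, and statements begin in column 7.
      PROGRAM HELLO
      PRINT *, 'Hello, world!'
      END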

  All modern high-level languages provide the same ease of use. I work inside an orderly, simplified hallucination, a maya that is illusion and not-illusion—the code I write sets off other subterranean incantations which are completely illegible to me, but I can cause objects to move in the real world, and send messages to the other side of the planet.

  4 HISTORIES AND MYTHOLOGIES

  The American novels I found on the shelves of my lending library in Bombay were dense little packets of information and emotion and culture from across the globe. I consumed them and the values and mythologies they incarnated, and was transformed in some very intimate way. Once I was in America, face-to-face with the foreign, I wrote a novel about another Indian encounter with the Other: about colonialism, about the coming together and clash of cultures. Despite my love for American modernism, it turned out I didn’t want to write a modernist novel. I ended up writing a hybrid book, a kind of mongrel construction which used, in one half, the Indian storytelling mode of magical tale-within-tale and all the sacred and profane registers of classical Indian literature; the other half operated more or less within the mode of modern psychological realism. Colonialism exercised its depredations not only within the realms of economics and politics; an essential part of its ideology was the assertion that Indian narrative modes were primitive, or childish, or degenerate, and that Western aesthetic norms were more civilized and sophisticated. History was progress, the colonized were told, and the West was more evolved. The current state of the world was living proof of this developmental teleology. I wanted to write a book that incarnated in its very form a resistance to this Just-So story about culture.

  I understood this intention quite clearly as I wrote, but looking back now I see, also, a very young writer finding a form to contain all his various selves. I was moving between cultures, from India to America and back. I was a wanderer between nation states, I negotiated my way through their rigid borders and bureaucracies, and what could be more modern than that? I was surely a postmodern lover of modernist fiction. Yet, in my creative urges, in the deepest parts of myself, I also remained somehow stubbornly premodern. I didn’t use those premodern forms only for political and polemical reasons; I wasn’t only trying to ironize psychological “realism” by placing it next to the epic and the mythical, or only to create lo real maravilloso as a critique of bourgeois Western imperial notions of the real. No, the impulse was not merely negative. This multiply layered narrative was how I lived within myself, how I knew myself, how I spoke to myself. There was the modern me, and also certain other simultaneous selves who lived on alongside. These “shadow selves”—to follow sociologist Ashis Nandy—responded passionately and instantly to epic tropes, whether in the Mahabharata or in Hindi films; believed implicitly and stubbornly in reincarnation despite a devotion to Enlightenment positivism; insisted on regarding matter and consciousness as one; and experienced the world and oneself as the habitations of devatas, “deities” who simultaneously represent inner realities and cosmic principles. So my book—to speak in my voice—had to contain these selves too.

  This un-modern half of my book tended to confuse my American writing-program peers. In our workshops, the prevailing aesthetic tended toward minimalism; the models were Raymond Carver and Ann Beattie and Bobbie Ann Mason. The winding tales I brought in were judged, at least initially, to be melodramatic, mystical, exotic, strange. I didn’t try to explain what I was trying to do mainly because I didn’t have a vocabulary in which I could articulate the lived sensation of this shadow-world within me. I wrote on.

  My other life as a computer geek was excitingly active and remunerative. As I taught myself about code, I discovered yet another culture on the newsgroups of Usenet and in meetings of the Houston Area League of PC Users (HAL-PC), “the world’s largest PC user group.” Programmers had their own lingo, their own hierarchies of value and respect, their own mythology. Many of these new norms were being created online. By the turn of the twenty-first century, Scott Rosenberg notes, programmers were writing

  personally, intently, and voluminously, pouring out their inspirations and frustrations, their insights and tips and fears and dreams, on Web sites and in blogs. It is a process that began in the earliest days of the Internet, on mailing lists and in newsgroup postings … Not all of this writing is consequential, and not all programmers read it. Yet it is changing the field—creating, if not a canon of the great works of software, at least an informal literature around the day-to-day practice of programming. The Web itself has become a distributed version of that vending-machine-lined common room … an informal and essential place for coders to share their knowledge and kibitz. It is also an open forum in which they continue to ponder, debate, and redefine the nature of the work they do.1

  One of the urtexts in this shared folklore of computing is “The Story of Mel, a Real Programmer.” It first appeared on a Usenet discussion board in May 1983, as a riposte to a recently published article “devoted to the *macho* side of programming [which] made the bald and unvarnished statement: Real Programmers write in FORTRAN.”2 Our Usenet storyteller here, like any chronicler of the days of yore, wants to set the quiche-eating, FORTRAN-writing young ’uns straight. He begins:

  Maybe [real programmers] do [use FORTRAN] now, in this decadent era of Lite beer, hand calculators, and “user-friendly” software but back in the Good Old Days, when the term “software” sounded funny and Real Computers were made out of drums and vacuum tubes, Real Programmers wrote in machine code. Not FORTRAN. Not RATFOR. Not, even, assembly language. Machine Code. Raw, unadorned, inscrutable hexadecimal numbers. Directly.3

  This post was originally written in straightforward prose by Ed Nather, an astronomer, but some anonymous coder responded to its rhythms and elegiac tone and converted it into free verse, and so it has existed on the Web ever since:

  Lest a whole new generation of programmers
  grow up in ignorance of this glorious past,
  I feel duty-bound to describe,
  as best I can through the generation gap,
  how a Real Programmer wrote code.
  I’ll call him Mel,
  because that was his name.4

  Mel, the eponymous protagonist of this epic, is the kind of programmer who is already a rarity in 1983: he understands the machine so well that he can program in machine code. The conveniences afforded by high-level languages like FORTRAN and its successors—which now all seem primitive—have by 1983 already so cushioned the practitioners of computing from the metal, from the mechanics of what they do, that they are hard-pressed to debug Mel’s code. Mel’s understanding of his hardware seems uncanny, mystical, a remnant from a bygone heroic epoch:

  Mel never wrote time-delay loops, either,
  even when the balky Flexowriter
  required a delay between output characters to work right.
  He just located instructions on the drum
  so each successive one was just *past* the read head
  when it was needed;
  the drum had to execute another complete revolution
  to find the next instruction.

  He coined an unforgettable term for this procedure.
  Although “optimum” is an absolute term,
  like “unique,” it became common verbal practice
  to make it relative:
  “not quite optimum” or “less optimum”
  or “not very optimum.”
  Mel called the maximum time-delay locations
  the “most pessimum.”

  …

  I have often felt that programming is an art form,
  whose real value can only be appreciated
  by another versed in the same arcane art;
  there are lovely gems and brilliant coups
  hidden from human view and admiration, sometimes forever,
  by the very nature of the process.

  You can learn a lot about an individual
  just by reading through his code,
  even in hexadecimal.
  Mel was, I think, an unsung genius.5

  Within the division of Microsoft that produces programming tools, a Mel-like programmer is represented by the persona “Einstein,” who is an “expert on both low level bit-twiddling and high-level object oriented architectures.”6 There is also another persona named “Elvis,” a “professional application developer.”7 As described by Eric Lippert, former senior software engineer at Microsoft, both Einstein and Elvis “got their jobs by studying computer science and going into development as a career.”8 And then there is the persona “Mort,” who is “an expert on frobnicating [tweaking, adjusting] widgets, [who] one day realizes that his widget-tracking spreadsheets could benefit from a little [Visual Basic for Applications] magic, so he picks up enough VBA to get by.”9

  The vast majority of programmers in the world today are Morts. Despite my intermittent, fumbling attempts at studying data structures and algorithms—the bricks and mortar of computer science—I most definitely remain on the Mort end of the scale. The ever-receding minority of Mels and Einsteins has observed this democratization of the computer with mixed feelings: on the one hand, the legendary early hackers at MIT and Apple are revered precisely because they took on the bureaucratic priesthood that protected the mainframes, defeated its defenses, and made computing available to all; on the other, the millions of Morts who have benefited from the computer revolution produce awful, bloated, buggy software because they don’t know how the machine really works, and, what’s worse, most Morts don’t want to know. “Mort is a very local programmer—he wants to make a few changes to one subroutine and be done,” writes Lippert.

  Mort does not want to understand how an entire system works in order to tinker with it. And my goodness, Mort hates reading documentation … Mort’s primary job is to frobnicate widgets—code is just a means to that end—so every second spent making the code more elegant takes him away from his primary job.10

  Mort lacks “mechanical sympathy,” that quality possessed by the best race-car drivers, who understand their machines so well that they flow in harmony with them.

  To the Morts of the world, and even to the Elvii, Mel the Real Programmer’s programming is inscrutable and his mystique dazzling. The narrator of our epic is asked to investigate and change the behavior of a program that Mel has written. He reads through Mel’s code, and is baffled by an “innocent loop” which doesn’t have a test within it—as is usual—to break the loop. Code loops normally contain a conditional test of the form “if numberOfLoops > 4 then break”; without such a construct you are trapped in an endless circling repetition. “Common sense said that it had to be a closed loop, / where the program would circle, forever, endlessly.”11 But Mel’s program doesn’t get stuck in the loop, it flows through, it works. It takes the narrator two weeks to comprehend Mel’s uncanny melding of code and machine, which uses the test-less loop and a programmer-forced malfunction in the system’s memory to position the next program instruction in the right location; such is the force of this revelation that “when the light went on it nearly blinded me.” After such knowledge, reverence is the only proper emotion; the narrator tells his Big Boss that he can’t fix the error because he can’t find it.

  I didn’t feel comfortable
  hacking up the code of a Real Programmer.12

  Despite the allusion above to “the *macho* side of programming,” the non-geek may not fully grasp that within the culture of programmers, Mel es muy macho. The Real Programmer squints his eyes, does his work, and rides into the horizon to the whistling notes of Ennio Morricone. To you, Steve Wozniak may be that cuddly penguin who was on a few episodes of Dancing with the Stars, and by all accounts, he really is the good, generous man one sees in interviews. But within the imaginations of programmers, Woz is also a hard man, an Original Gangsta: he wired together his television set and a keyboard and a bunch of chips on a circuit board and so created the Apple I computer. Then he realized he needed a programming language for the microprocessor he’d used, and none existed, so Woz—who had never taken a language-design class—read a couple of books and wrote a programming language called Integer BASIC, interpreter and all, in machine code. And when we say “wrote” this programming language we mean that he wrote the assembly code in a paper notebook on the right side of the pages, and then transcribed it into machine code on the left.13 And he did all this while holding down a full-time job at Hewlett-Packard: “I designed two computers and cassette tape interfaces and printer interfaces and serial ports and I wrote a Basic and all this application software, I wrote demos, and I did all this moonlighting, all in a year.”14

 