BASIC COURSE INFORMATION

Because English 305 is an online course, the writing we do is substantially different from the writing in a face-to-face course. As such, it is imperative that you understand the course style from the start. Nearly all of your work in this course will be posted on the course blog. EACH WEEK YOU WILL HAVE THREE BLOG ASSIGNMENTS:
1. A BLOG ENTRY,
2. A READING, AND
3. A WRITING ABOUT THE READING.

Your reading and writing on the blog must be completed by midnight on the Friday of the week in which the reading falls. You have all week to complete the reading and writing for that week, but no late assignments are accepted, so be disciplined about the work from the start.
Let me restate that point: if you do the assigned work before or during the week it is due, you will receive full credit. If you do the work after the Friday of the week it is assigned, you will receive zero credit for that week.

Saturday, March 31, 2012

WEEK ONE BLOG ENTRY

What is the greatest movie of all time and why?

(REMEMBER, write approximately a paragraph and then come back later and respond to what your classmates have written)



WEEK ONE READING

THE FOLLOWING IS FROM ORWELL'S ESSAY, “Politics and the English Language”:

A scrupulous writer, in every sentence that he writes, will ask himself at least four questions, thus:


  1. What am I trying to say?
  2. What words will express it?
  3. What image or idiom will make it clearer?
  4. Is this image fresh enough to have an effect?

And he will probably ask himself two more:

  1. Could I put it more shortly?
  2. Have I said anything that is avoidably ugly?

One can often be in doubt about the effect of a word or a phrase, and one needs rules that one can rely on when instinct fails. I think the following rules will cover most cases:

  1. Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
  2. Never use a long word where a short one will do.
  3. If it is possible to cut a word out, always cut it out.
  4. Never use the passive where you can use the active.
  5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
  6. Break any of these rules sooner than say anything outright barbarous.

SECONDLY, HERE IS MARK TWAIN ON WRITING:



Twain's Rules of Writing

(from Mark Twain's scathing essay “Fenimore Cooper's Literary Offenses”)

1. A tale shall accomplish something and arrive somewhere.
2. The episodes of a tale shall be necessary parts of the tale, and shall help develop it.
3. The personages in a tale shall be alive, except in the case of corpses, and that always the reader shall be able to tell the corpses from the others.
4. The personages in a tale, both dead and alive, shall exhibit a sufficient excuse for being there.
5. When the personages of a tale deal in conversation, the talk shall sound like human talk, and be talk such as human beings would be likely to talk in the given circumstances, and have a discoverable meaning, also a discoverable purpose, and a show of relevancy, and remain in the neighborhood of the subject in hand, and be interesting to the reader, and help out the tale, and stop when the people cannot think of anything more to say.
6. When the author describes the character of a personage in his tale, the conduct and conversation of that personage shall justify said description.
7. When a personage talks like an illustrated, gilt-edged, tree-calf, hand-tooled, seven-dollar Friendship's Offering in the beginning of a paragraph, he shall not talk like a Negro minstrel at the end of it.
8. Crass stupidities shall not be played upon the reader by either the author or the people in the tale.
9. The personages of a tale shall confine themselves to possibilities and let miracles alone; or, if they venture a miracle, the author must so plausibly set it forth as to make it look possible and reasonable.
10. The author shall make the reader feel a deep interest in the personages of his tale and their fate; and that he shall make the reader love the good people in the tale and hate the bad ones.
11. The characters in a tale shall be so clearly defined that the reader can tell beforehand what each will do in a given emergency.
An author should:
12. _Say_ what he is proposing to say, not merely come near it.
13. Use the right word, not its second cousin.
14. Eschew surplusage.
15. Not omit necessary details.
16. Avoid slovenliness of form.
17. Use good grammar.
18. Employ a simple, straightforward style.

Finally, here are a bunch of quotes about writing:

http://grammar.about.com/od/yourwriting/a/advice.htm
Advice From One Writer to Another
"Real writers are those who want to write, need to write, have to write"
By Richard Nordquist
When faced with a major project, whether it's designing a bridge or laying new tile in the kitchen, most of us like to rely on experts for advice. So why should a writing project be any different? As we'll see, professional writers have a lot to tell us about the writing process.
Some of the advice may be helpful, some of it encouraging, and some may do no more than raise a smile. Here then is some free advice--from one writer to another.
• "There is no rule on how to write. Sometimes it comes easily and perfectly: sometimes it's like drilling rock and then blasting it out with charges."
(Ernest Hemingway)
• "Writing is an adventure."
(Winston Churchill)
• "There are no dull subjects. There are only dull writers."
(H. L. Mencken)
• "Writing is just work--there's no secret. If you dictate or use a pen or type or write with your toes--it's still just work."
(Sinclair Lewis)
• "Nothing you write, if you hope to be any good, will ever come out as you first hoped."
(Lillian Hellman)
• "English usage is sometimes more than mere taste, judgment and education--sometimes it's sheer luck, like getting across the street."
(E. B. White)
• "Many people hear voices when no one is there. Some of them are called mad and are shut up in rooms where they stare at the walls all day. Others are called writers and they do pretty much the same thing."
(Meg Chittenden)
• "I love being a writer. What I can't stand is the paperwork."
(Peter de Vries)
• "When I finish a first draft, it's always just as much of a mess as it's always been. I still make the same mistakes every time."
(Michael Chabon)
• "Writing is like everything else: the more you do it the better you get. Don't try to perfect as you go along, just get to the end of the damn thing. Accept imperfections. Get it finished and then you can go back. If you try to polish every sentence there's a chance you'll never get past the first chapter."
(Iain Banks)
• "The writer learns to write, in the last resort, only by writing. He must get words onto paper even if he is dissatisfied with them. A young writer must cross many psychological barriers to acquire confidence in his capacity to produce good work--especially his first full-length book--and he cannot do this by staring at a piece of blank paper, searching for the perfect sentence."
(Paul Johnson)
• "Real writers are those who want to write, need to write, have to write."
(Robert Penn Warren)
• "Writing is an exploration. You start from nothing and learn as you go. . . . Writing is like driving at night in the fog. You can only see as far as your headlights, but you can make the whole trip that way. . . . Writing is a socially acceptable form of schizophrenia."
(E. L. Doctorow)
• "Writing became such a process of discovery that I couldn't wait to get to work in the morning: I wanted to know what I was going to say."
(Sharon O'Brien)
• "I write to discover what I think. After all, the bars aren't open that early."
(Daniel J. Boorstin)
• "Writing is easy: All you do is sit staring at a blank sheet of paper until drops of blood form on your forehead."
(Gene Fowler)
• "You fail only if you stop writing."
(Ray Bradbury)
• "Writing is not hard. Just get paper and pencil, sit down, and write as it occurs to you. The writing is easy--it's the occurring that's hard."
(Stephen Leacock)
• "I notice that you use plain, simple language, short words and brief sentences. That is the way to write English--it is the modern way and the best way. Stick to it; don't let fluff and flowers and verbosity creep in. When you catch an adjective, kill it. No, I don't mean utterly, but kill most of them--then the rest will be valuable. They weaken when they are close together. They give strength when they are wide apart. An adjective habit, or a wordy, diffuse, flowery habit, once fastened upon a person, is as hard to get rid of as any other vice."
(Mark Twain)
• "Writing is a form of therapy; sometimes I wonder how all those, who do not write, compose, or paint can manage to escape the madness, the melancholia, the panic fear, which is inherent in the human condition."
(Graham Greene)
• "You can be a little ungrammatical if you come from the right part of the country."
(Robert Frost)
• "What this means, in practical terms for the student writer, is that in order to achieve mastery he must read widely and deeply and must write not just carefully but continually, thoughtfully assessing and reassessing what he writes, because practice, for the writer as for the concert pianist, is the heart of the matter."
(John Gardner, The Art of Fiction: Notes on Craft for Young Writers, 1983)
• "A writer is somebody for whom writing is more difficult than it is for other people."
(Thomas Mann)
• "What is written without effort is in general read without pleasure."
(Samuel Johnson)

WEEK TWO BLOG ENTRY:

Write about anything that happened to you in a restaurant. Yes, it is that general. Just think of some time when you were in a restaurant and now have a memory of the experience for whatever reason. For example, I was in a restaurant in Little Italy in New York City once and the only other people in the restaurant were at a table arguing. He looked like Tony Soprano and she looked like his mistress…actually, from overhearing them, she was indeed his mistress but was trying to leave the “relationship.” It was a tense conversation. I did not look their way much. This was in 1996…I wonder if they are still “together.”

WEEK TWO READING

The Find: Taco María truck survives the downturn


Chef Carlos Salgado's mobile restaurant specializes in food that re-imagines tantalizing Mexican traditions.

Special to The Los Angeles Times

January 19, 2012

When food truck fatigue finally set in among the Twitter-equipped some time last year, the mobile movement all but stalled. Gone were the throngs that waited for hours, their attentions shifted instead to newly minted food artisans and itinerant pop-up restaurants. But in a Darwinian twist, only the strongest trucks have survived. And though the thrill of the chase may be gone for some, what remains are by and large the best meals on wheels.

Taco María is a product of that natural selection. The truck is helmed by Carlos Salgado, whose culinary pedigree instantly drove Taco María onto the radar screen of every serious Orange County eater. His is indeed an impressive résumé: Salgado served as pastry chef in some of the Bay Area's top restaurants, including Daniel Patterson's Coi and Oakland's Commis. He returned home to Orange County to help his parents transform the family's taquería. Taco María is what emerged from that reinvention, a truck that's constantly re-imagining lonchera traditions with the techniques and style of Mexican alta cocina.

"My parents' restaurant, La Siesta [in Orange], has been in business for over 25 years," Salgado says. "It was when they started talking about selling a few years ago that I began pointing myself back toward my hometown. Taco María was to be an extension of the restaurant and a flagship for our catering operations.

"Coming to work for a different audience, at a different price point, I've had to simplify my approach and distill the cooking ethics that are most important to me into a method that works within the food truck model. And while I may not have a kitchen full of highly trained,
Michelin-quality cooks, a Pacojet, Cryovac machine or a dozen immersion circulators, I do have my family to support me and keep me grounded. My dad is the best sous-chef I could imagine having."

Those at the truck inevitably start with the arrachera taco, made with grilled hanger steak, a blistered shishito pepper, caramelized onion and bacon's smoky quintessence. The taco has both the humble charm of a backyard barbecue and the finesse of a fine steakhouse.

Yet even the most hard-core carnivores ultimately end up ordering the jardineros taco as well: knobs of roasted pumpkin, black beans, cotija cheese and a pumpkin seed salsa de semillas. There's no need for meat — this is a vegetarian taco built not on the artifice of mock meat or incongruous fusion but on the simple rhythms of the market.

If the arrachera doesn't sway you, there's always the carnitas. The slow-cooked pork shoulder is lashed with a bit of citrus and enlivened by the noticeable warmth of cinnamon. The mole de pollo is even more richly spiced — the mahogany mole is as complex as an Indian curry.

But Taco María's ever-changing specials are its signature. The truck's quesadilla de tuétano triggers Pavlovian devotion. It's a dish already cemented in food truck lore: crisp nuggets of bone marrow, stringy queso Oaxaca and a garlic-and-herb paste pulverized in a molcajete. It's predictably rich but powerfully addictive.

Salgado's rendition of esquites is similarly good, chile- and lime-laced corn sautéed with garlic, thyme and epazote in a butter flavored with blackened corncobs and toasty husks.

"I was telling [my] mom about some of my favorite foods and struggling to find a translation for bone marrow," Salgado explains. "She said something like, 'I think we used to make quesadillas [with that].' I was floored and immediately wrote it into our opening menu. What I assumed would be a fringe dish for the adventurous actually turned out to be incredibly popular. My whole staff has cuts and scrapes on their hands from pushing marrow every day just to meet demand."

It isn't brunch without the truck's excellent chilaquiles: freshly fried tortilla chips enrobed in a cascabel chile sauce and topped with pickled onions, queso fresco and a fried egg. Taco María isn't all about masa, either — any taco can be turned into a burrito. And you've really got to try the beet salad dressed with avocado, orange, almonds and charred scallion vinaigrette.

There may be a melon-lemon grass agua fresca to drink, or perhaps one flush with hibiscus and Concord grape. Salgado's almond horchata, however, is what you'll want a jug of, almond milk perfumed with coriander seeds. It's a brilliant addition: fragrant and floral, the coriander is at once unmistakable and ingeniously subtle.

Whether it's by an obsessive need for completion or sheer force of will, you will find room for dessert. Salgado's sweets are every bit as good as his pastry training portends, like the steamed chocolate bread pudding strewn with fried peanuts and glazed with milky caramel. When there isn't dense rice pudding scented with star anise and cinnamon, there's a glorious ricotta flan of homemade ricotta, caramel and a few sangria-soaked raspberries.

Witness the truck's crowds at Orange County's farmers markets and business parks and you begin to understand Taco María's growing cult, a purveyor of precisely the kind of modern Mexican cooking that's destined not for disposable cardboard containers but fine porcelain.

Salgado hints at that future. "It's still too early for us to share details, but we're excited about creating a unique type of Mexican restaurant here in Orange County, where Mexican food is such a large part of our shared experience. Exactly where and when depend on how far our truck, Frida, can take us. What I can say is that the restaurant will remain local, honest and accessible, with a menu that is recognizably Mexican in soul, in a space that is central, warm and inviting and will hopefully become a fixture in our own community."

source:
http://www.latimes.com/features/food/la-fo-find-20120119,0,3934262.story

WEEK TWO WRITING ABOUT WHAT YOU READ:

For this week, you have to write the restaurant review, so nothing else is due here. You already have the assignment on the blog. Email if you have questions.

WEEK THREE BLOG ENTRY

Tell me about your use of technology. If you can recall a time before cell phones, texting, and maybe even, dare I say it, email, tell us how life was different in those days. Do your best to avoid judgmental statements (better or worse) for now: just describe.

WEEK THREE READING

Is Google Making Us Stupid?
What the Internet is doing to our brains

By Nicholas Carr

“Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial “brain.” “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”

I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson
has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon.
Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”
Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. The authors of the study report:

It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.


Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of
Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.

But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.”
“You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”
The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”
As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic
Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”
The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist
Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.
The Internet promises to have particularly far-reaching effects on cognition. In a
paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that’s what we’re seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.
When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.
The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.

About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.

More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”
Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.”

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.
Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.” In a
2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”
Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there. Why wouldn’t Brin and Page want to be the ones to crack it?
Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

Maybe I’m just a worrywart. Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).

The arrival of Gutenberg’s printing press, in the 15th century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men “less studious” and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, “Most of the arguments made against the printing press were correct, even prescient.” But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture. In a
recent essay, the playwright Richard Foreman eloquently described what’s at stake:

I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the “instantly available.”


As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”
I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
source:
http://www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/6868/