Downstream Deliverables

So much depends upon the language we use, the metaphors we live by.

When an assignment says, “Don’t just tell me what you think. Analyze your passage,” I understand that the assignment is really asking for something other than a superficial response. I’m convinced, though, that some part of the student’s brain reads the instruction literally and draws the obvious conclusions: analysis has nothing to do with thinking (it’s an alien exercise in trying to copy the inexplicable things teachers do), and more sadly, “my thoughts are beside the point, irrelevant.”

My own conclusion: the words we use matter, and they matter greatly. I don’t want superficial, thoughtless, or uncommitted responses, but I do very much want to know what the student thinks (no “just” about it), both because I want the student to think, and because I want the student to have the chance to be surprised by the value of their own thoughts before the rest of the lesson continues. “Don’t just tell me what you think”? I shudder. Someone just walked across the grave of higher education.

I had a similar shudder in an otherwise splendid AAC&U session today when a panelist used the phrase “downstream deliverables.” The phrase denoted the necessary, laudable goal of asking grantees to produce evidence of the results they had gotten from the grant monies. Nothing at all wrong with that–except again, that the words and metaphors matter. In this case, the metaphor brings to mind a barge floating downstream, laden with containers of, well, things–things that are probably products, products that are probably delivered to consumers. A fairly brutal metaphor when it comes to the results of messy, aspirational human processes.

Yet I will shift that metaphor into a different context. This conference has been many things for me: an opportunity to break bread and share ideas with the QEP team at VCU, a chance to learn from extraordinary colleagues from around the world, a season of reflection on what matters most to me as a professor and a leader in higher ed, an opportunity to hear from some wonderfully thoughtful and provocative speakers. It’s been all of that, and more. Some of the most intense moments, however, have been what I will now call “downstream deliverables.” The stream is Time, that ever-rolling stream that in the words of the hymn “bears all its sons [and daughters] away.” What the hymn doesn’t say, however, is that time sometimes bears its sons and daughters back together. During this conference, my own “downstream deliverables,” the people whom the stream of time has borne back to me (and me back to them), include a student from two years ago, a student from twenty-two years ago, and a student from thirty years ago; a colleague whom I knew a little during grad school and suddenly, unexpectedly reconnected with after a business conversation led to “you know, you look kind of familiar to me”; a moment in which I saw out of the corner of my eye a mentor (she walked by too quickly for me to hail her); a moment in which I learned that a huge intellectual influence was seated at the back of the room that housed a panel discussion I was honored to participate in.

My deliverables, years and decades down the stream of time, are the lives I’ve touched, and the lives that have touched mine, the thousand acts of kindness, attention, and love, “the primal sympathy / Which having been must ever be….” Each time these unexpected meetings occurred, I felt my soul expand, extend, enlarge. Each moment arrived downstream, carrying not freight but a fullness of being among souls I am privileged and humbled to know. After many years, we are met. My downstream deliverables become a kind of deliverance, and for that I am grateful.

Word use over time

A lagniappe to yesterday’s post. The Google etymology feature isn’t new–it went live in August, 2013–but it was new to me and I continue to think about it.

I’m struck by how Google continues to work, often in very creative ways, to pique my interest. From the Google Doodle to the What Do You Love? page (go ahead and try it–you’ll find it interesting), they continue to earn my attention, even as I remind myself that they do have a business model and they are not a non-profit. They also manage to reward my intuitions about what they might do next. That’s how I found their etymology affordance last night.

Things get even more interesting when you see the rest of the affordance, portions I didn’t include in last night’s post:

[Charts: “certificate” use over time; “credential” use over time]

 

(Ignore the bits from the Online Etymology Dictionary, as these aren’t Google’s work.)

Google’s used their culturomics data (the Ngrams) to yield the usage stats over time. What’s interesting is that the data prompt thought. If these data provide a representative sample (granted, something I cannot tell), I wonder why the use of “certificate” has declined steadily since its peak, plateauing for now, while the use of “credential” has grown suddenly and sharply over the last 20 years or so. One hypothesis might be that “credential” is being used in places where “certificate” might have been used earlier, perhaps because “credential” implies something more prestigious than “certificate.” That’s a chain of suppositions, so not at all reliable, but still perhaps an interesting inquiry project.
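
For anyone who wants to chase that inquiry, here is a minimal sketch of how one might pull the same usage curves programmatically. It leans on the JSON endpoint the Ngram Viewer itself appears to call (books.google.com/ngrams/json); that endpoint is undocumented, so the URL, the parameter names, and the corpus label below are assumptions that may change without notice, and the little comparison at the end is only illustrative.

```python
# Sketch only: this is the endpoint the Ngram Viewer's own charts appear to
# use. It is undocumented, so treat the URL, parameter names, and corpus
# label as assumptions rather than a stable API.
import requests

NGRAM_URL = "https://books.google.com/ngrams/json"

def ngram_series(phrases, year_start=1900, year_end=2019,
                 corpus="en-2019", smoothing=3):
    """Fetch yearly relative frequencies for each phrase."""
    params = {
        "content": ",".join(phrases),
        "year_start": year_start,
        "year_end": year_end,
        "corpus": corpus,
        "smoothing": smoothing,
    }
    resp = requests.get(NGRAM_URL, params=params, timeout=30)
    resp.raise_for_status()
    # Each entry carries an "ngram" label and a "timeseries" of relative
    # frequencies, one value per year from year_start through year_end.
    return {entry["ngram"]: entry["timeseries"] for entry in resp.json()}

if __name__ == "__main__":
    start, end = 1900, 2019
    series = ngram_series(["certificate", "credential"], start, end)
    years = list(range(start, end + 1))
    for word, freqs in series.items():
        peak_year, peak_value = max(zip(years, freqs), key=lambda pair: pair[1])
        print(f"{word}: peak {peak_value:.2e} around {peak_year}, "
              f"latest ({end}) {freqs[-1]:.2e}")
```

Plotting the two series side by side should reproduce charts like the ones above; the printed peaks are just a quick, rough way to test the hunch that “certificate” crests earlier than “credential” does.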

If “credential” and “certificate” become synonymous, as I hope they do not but fear they may have already, then the added luster of “credential” will be a cruel illusion indeed.

 

Credential or Certificate

I continue to think about what we mean by a “degree.” Or rather, I think about what a degree might or should mean, and what we in higher ed increasingly act as if it means, and how that disjunction (if it is one, which I think it is) plays out across our practices, our assumptions, our mission statements, and our civic life. (I’m sure I’ve left out several crucial areas there.)

My thoughts are spurred by a conversation I had several days ago with a colleague who wanted to know what kind of certificate we might offer as an incentive for open participants to complete a cMOOC. I started thinking about the difference between a certificate and a credential. I talked about credentials many years ago in a presentation I podcast here. At the time, though, I simply urged we recall the root meaning of credential, a word that derives from credence, the mark of believability and the grounds for trust we stipulate as a result of some experience or, perhaps, a formation of character we have collectively witnessed.

I didn’t then have the contrast, though, that would drive the point home. I think now the contrast is between “credential,” a condition of being, and “certificate,” something that is not of a person so much as about some specific competency the person has demonstrated. I grant that I am skeptical of any education that focuses narrowly on “competency,” as if skills could be divorced from contexts, or ideas, or personhood. I grant that my skepticism may lead me to exaggerate the distinction I’m trying to make. Yet the distinction may prove useful in articulating how two views might diverge, and what the consequences might be.

Incorrigible and largely unrepentant English professor that I am, I went on an etymology hunt. R. W. Emerson observed that language is fossil poetry, so it was time for some paleontology. I usually go to the Oxford English Dictionary for my etymologies, for there I will also find a useful set of historical definitions that help chart how early usage changes over time. Tonight, though, I had only my iPad with me at dinner. (I try to travel lighter at conferences when possible–I’m writing this post from the annual meeting of the AAC&U.) I have long known how to use Google to define a word: simply type in the search box “define x” (without quotation marks and with a word where the x goes, of course), and away you go. On a lark, and because Google is always introducing cool new things on the sly (aside from tracking its users, that is), I typed “etymology credential” — and here’s what came up!

[Image: Google’s etymology result for “credential”]

Ah. The word was first an adjective, and only later became a noun. First a descriptor, then the thing it described. Alas, the thing described, a credential document, seems to have skipped the possible middle sense of a quality or virtue. Instead, a credential, a trustworthiness or recommendation, is typically reduced to that piece of paper we call a diploma–in other words, a certificate.

[Image: Google’s result for “certificate”]

As “credential” moves toward “certificate,” “recommendation” becomes “document,” indeed an “official document” attesting to facts, records, achievements, ownership. I’m not arguing that facts, records, achievements, and ownership are unimportant. Not at all. They’re vital. But taken outside the context of trust, of personhood, of recommendation, credentials edge toward a kind of “guarantee,” or a license. Something transformative becomes instead flat and transactional. Get a certificate, get a raise, get a job. Yes, and those are important. But what of the person?

I continue to mull these things over. A small shift in meaning may lead to a large and potentially regrettable shift in civic and cultural practice. I am especially struck by this possibility in the aftermath of the challenging and fascinating opening forum tonight at the AAC&U meeting.

And I think of the words we say at our higher education commencement ceremonies when it comes time to award to–or is it confer upon?–our students their degrees: we deans present our degree candidates to the President, and say that we are doing so upon the “recommendation of the faculty.” In that moment, deep within that phrase and yet still visible if one knows to look, we may still find what is most valuable about a truly credential education.

The Sweet Spots

I have been thinking very hard lately about the nature and value of focused learning, and especially the kinds of focused learning experiences we might explore and craft within school. I greatly admire the DALMOOC George Siemens and his research group at UT-Arlington crafted last fall, but I also worry a little about the binary structure. As a practical matter, the dichotomy makes a great deal of sense. Those of us who are trying to work on modes of openly networked learning continually struggle with the question of how to define, recognize, and reward multiple modes of engagement–or to speak even more precisely, multiple ranges of commitment. Yet I wonder if one can truly read a book, hear a symphony, or watch a movie without being all in. I wonder if being led and being leaders are necessarily always mutually exclusive. That’s not to say that what Tom Woodward calls the “energy inputs” of open participants who come and go during a course of study are of no benefit to the class. Quite the contrary. But I do worry. Are formal structures of what may amount to lesser commitment really a way forward? The opposite extreme, of course, is a formal structure of pedantic insistence–i.e., much or most of what constitutes school-based learning–that can bleach away all the energy of self-directed learning. But these are sad realities of misguided practice, not necessities. I just don’t think that “instructor-led” or “learner-centered” sets up the deeper conceptual framework very well. And if I never again hear the grinding binary of “guide at the side / sage on the stage,” I will weep tears of joy. Even Ivan Illich, the great prophet of deschooling, recognized the role and importance of the genuine pedagogue.

For me the positive vectors are commitment, openness; a willingness to dwell in conjectures and dilemmas and to insist on precision (or the nearest aspirational approximation) when precise information and precise execution are needed to keep the spacecraft from disintegrating. I must also testify that experts lead in many different ways, and many of those different ways are not only important and eminently cherishable but have in fact changed my life. When I watch The Godfather, or read A. S. Byatt, or talk with a gifted and humane practitioner of the healing arts and sciences, I give myself over to the experts, not uncritically, but with commitment and a desire to open myself toward those talents, so long as they are not exercised with cruelty or in mere self-interest.

I too keep looking for the sweet spots.

Oddly, that search has also characterized much of my scholarly work as a Miltonist. How could it not, when one of Milton’s choicest lines is “the sober certainty of waking bliss”?

Here’s an anthology of sweet-spot readings, placed together with minimal commentary: bread crumbs along my wandering way.


 

[Jimmy Henderson] has been compared to Miller as a strict disciplinarian. Certainly he is an excellent leader. Jimmy sees the band as self-disciplined out of pride in themselves as artists and pride in being associated with the Glenn Miller Orchestra. “Discipline we do have,” he affirms, “but regimentation we do not. There is an enormous difference. Regimentation has no place in music.”

Patricia Willard, from her liner notes to The Direct Disc Sound of The Glenn Miller Orchestra, directed by Jimmy Henderson (The Great American Gramophone Company, 1977. GADD-1020).


Orbiting is responsible creativity: vigorously exploring and operating beyond the Hairball of the corporate mind set, beyond accepted models, patterns, or standards — all the while remaining connected to the spirit of the corporate mission.

To find Orbit around a corporate Hairball is to find a place of balance where you benefit from the physical, intellectual and philosophical resources of the organization without becoming entombed in the bureaucracy of the institution.

If you are interested (and it is not for everyone), you can achieve Orbit by finding the personal courage to be genuine and to take the best course of action to get the job done rather than following the pallid path of corporate appropriateness.

To be of optimum value to the corporate endeavor, you must invest enough individuality to counteract the pull of Corporate Gravity, but not so much that you escape that pull altogether. Just enough to stay out of the Hairball.

Through this measured assertion of your own uniqueness, it is possible to establish a dynamic relationship with the Hairball–to Orbit around the institutional mass. If you do this, you make an asset of the gravity in that it becomes a force that keeps you from flying out into the overwhelming nothingness of deep space.

But if you allow that same gravity to suck you into the bureaucratic Hairball, you will find yourself in a different kind of nothingness. The nothingness of a normalcy made stagnant by a compulsion to cling to past successes. The nothingness of the Hairball.

Gordon MacKenzie, Orbiting the Giant Hairball: A Corporate Fool’s Guide to Surviving with Grace (Viking, 1996), p. 33. H/t @marianafunes. NB: I question that “it is not for everyone.” I believe MacKenzie is delivering a strong caution there, not a statement about eligibility or desirability. Or he may simply be trying to forestall objections.


By virtue of a privilege which he shared with the greatest creative artists, the composer [Maurice Ravel] never lost, in his obstinate determination to acquire technical mastery, that fresh sensibility which is the privilege of childhood and is normally lost with advancing years.

Alexis Roland-Manuel, quoted in Richard Freed’s liner notes to the original 1975 Vox Quad recording of Daphnis et Chloe (Ballet Suites Nos. 1&2) and Ma Mere l’Oye, as reproduced in the Mobile Fidelity SACD reissue of that recording in 2005. Ravel is one of my favorite composers, and I cannot imagine a sweeter spot than at the intersection of an “obstinate determination to acquire technical mastery” and “that fresh sensibility which is the privilege of childhood….” How to keep that intersection always in view and always yielding energy? Great teachers have a feel for those tasks.


 

And yes, such commitments are difficult to manage, especially when they take vastly different forms, experiences, and methodologies. Ironically, an obsession with standardization built out of superficial outputs, outcomes, and analytics will appear to ease the learner’s path, only to rob the learner of the very many-mindedness that leads to the deepest, most transferable, most enduring learning of all.

The changing values of the 1960s influenced the CMU [Carnegie Mellon University] drama department’s instructional program. A few faculty members explored innovative techniques, while others adhered to the acting methods they had learned, or passed along specialized skills like mime, dance, and diction. The diversity of their approaches was both helpful and challenging for the undergraduates. Leon Katz remembers, “There was no uniform attitude to the faculty. We had five acting teachers. All of them were tremendously good and they loathed what one another was doing. Each one had a totally different conceptual training. The students were confused. They would go to [department chairman] Earle Gister and say, ‘What are we supposed to believe? We’re totally confused!’ He said, ‘Good, that’s your training. You sort it out and find the thing that’s right for you.’”

Carol De Giere, The Godspell Experience: Inside a Transformative Musical (Bethel, CT: Scene 1 Publishing, 2014), pp. 21-22.

Teachers, Leaders

[Photograph: Nadia Boulanger, 1925]

“May I have the power to exchange my best with your best.” –Nadia Boulanger

I have been mulling over this great and greatly insightful post for a couple of days. What follows is a slightly modified version of my comment there. Please go read it and share your own thoughts however and wherever you like.

I have many tangled responses that are a little painful to contemplate, so I’ll just leave this marker here for now: I think part of the subject here is leadership. I have had many spirited disagreements with a leader named Jim Groom about the role, necessity, and ethics of leadership. For me, a teacher is also a kind of leader. Ivan Illich, no fan of schooling or authoritarian structures of any kind, writes movingly about the role of the true, deep teacher. So does George Steiner, using language of “master” and “disciple” that would make many open-web folks cringe–or worse. Yet even the great and greatly democratic poet Walt Whitman salutes his “eleves” at one point. And I have experienced and been very grateful for the wisdom of those teacher-leaders who brought me into a fuller experience and understanding of my own responsibilities as a leader. What is “self-directed learning” if not an act of leadership?

One of the books that’s affected me most profoundly this year is Leadership on the Line: Staying Alive Through the Dangers of Leading by Ronald A. Heifetz and Marty Linsky. In it, I find this wisdom:

And every day you must decide whether to put your contribution out there, or keep it to yourself to avoid upsetting anyone, and get through another day. You are right to be cautious. Prudence is a virtue. You disturb people when you take unpopular initiatives in your community, put provocative new ideas on the table in your organization, question the gap between colleagues’ values and behavior, or ask friends and relatives to face up to tough realities. You risk people’s ire and make yourself vulnerable. Exercising leadership can get you into a lot of trouble. To lead is to live dangerously because when leadership counts, when you lead people through difficult change, you challenge what people hold dear—their daily habits, tools, loyalties, and ways of thinking—with nothing more to offer perhaps than a possibility. Moreover, leadership often means exceeding the authority you are given to tackle the challenge at hand. People push back when you disturb the personal and institutional equilibrium they know. And people resist in all kinds of creative and unexpected ways that can get you taken out of the game: pushed aside, undermined, or eliminated. It is no wonder that when the myriad opportunities to exercise leadership call, you often hesitate. Anyone who has stepped out on the line, leading part or all of an organization, a community, or a family, knows the personal and professional vulnerabilities. However gentle your style, however careful your strategy, however sure you may be that you are on the right track, leading is risky business.

Perhaps everyone is called to some form of leadership as an ethical imperative. Perhaps for everyone, a moment or occasion of leadership will emerge, reveal itself, and call to us with the painful, necessary task of speaking up, patiently asking for alternatives, insistently rocking the boat … and lovingly organizing the celebrations and rites of passage. Not to mention keeping the tribe alert to the value and splendor of newcomers, and to the persistent value of encountering other tribes to work together in building the commons.

I think that leadership may be mostly a commitment to the constant mediation and care required by love, that place where both individuality and relationship must assert themselves and somehow walk and dance together.

 

Why I Teach

(Who knows how this will turn out. An impossible topic allows some latitude in the exploration, yes? I mean, what do I have to lose?)

To try to explain why I teach seems impossible to me for several reasons. I never set out to be a teacher. They told me (you know, those folks who tell you things) that teachers were patient. I didn’t know whether my teachers were truly patient. (Looking back, of course, it seems to me they must have been in order to put up with me as a student.) I did know, without a second’s hesitation or an iota of doubt, that I was not patient. Nor am I now.

I don’t teach because I like to manage learning, though I suppose there is some kind of management that does foster learning. I love to imagine and help build interesting experiences that conduce to learning, but unless one says that Abbey Road was the result of “management,” I don’t think I like to manage learning. I’m not even sure that’s really possible.

The terrible truth is that I never set out to be a teacher. If you had told me at age 12, or 16, or even at age 21 that I’d end up being a teacher, I would probably have laughed at you. The weird thing about my laughter is that the teachers I loved imprinted themselves indelibly on my entire being. To this day, I can imagine them so vividly that I can almost believe myself back in their presence. I guess I didn’t think of those teachers I loved as part of school, and thus I probably didn’t think of them as teachers, though I knew very well that’s what they were. Instead, I thought of them as extraordinary human beings who were deeply inquisitive and thus deeply knowledgeable in ways that seemed to me to amplify one’s being far past any degree I could imagine. And the particular mode of the extraordinary had to do with the intellect, somehow, even if the visible result seemed to be a “skill” of some sort.

Perhaps I could see they were teachers, but I could never catch them “teaching.”

One approached knowledge in the spirit of making it accessible to the problem-solving learner by modes of thinking that he already possessed or that he could, so to speak, assemble by combining natural ways of thinking that he had not previously combined. (Jerome Bruner, The Process of Education)

 The teachers I loved did their work, as far as I could tell and as nearly as I can recall, by doing that. They weren’t covering or delivering content. They weren’t specifying learning outcomes on their syllabi. They weren’t prepping me for a high-stakes standardized test. They were doing that. And they seemed to be doing that because it was the precondition for the enlargement of being in this world full of people who live, talk, and work together and want to do that better.

So much of learning depends upon the need to achieve joint attention, to conduct enterprises jointly, to honor the social relationship that exists between learner and tutor, to generate possible worlds in which given propositions may be true or appropriate or even felicitous: to overlook this functional setting of learning–whatever its content–is to dry it to a mummy. (Bruner, op. cit.)

The first inkling I had that I might be a teacher, even if I generally disliked visible “teaching” in most of my classes, came in graduate school, when I led a small discussion (“recitation”) group in a large undergraduate class. I was reading some of the books for the first time myself. I didn’t think I was teaching anything. I thought I was asking interesting questions to which I was pretty sure I did not have the answers. The students responded very warmly. They said they had learned a lot from me. I found that puzzling, truly deeply puzzling, until much later when I read the second Bruner quotation above and realized that I apparently had a talent for fostering joint attention. I also realized along the way that “joint attention” meant much more than making sure all the students were paying attention to me. In fact, it probably didn’t mean that at all, though sometimes that kind of attention is warranted and handy. It meant, I think, that I was able to focus and make visible the purposeful attention any of us might bring to the learning moment, and with that focus and visibility strengthen and amplify its power and efficacy for all of us.

But it felt like being alight with delight. Together. And while I catalyzed it, it didn’t belong to me–which meant I could have it, too.


 

In the New Yorker‘s issue of May 19, 2014, there’s a strangely wonderful essay by Alec Wilkinson titled “A Voice From The Past.” In it, Wilkinson tells the story of a physicist who figured out a way to take very old traces of sound waves–traces predating phonograph records or even wax cylinders–and by scanning their visible marks, convert them back into sounds. By doing so, this physicist, Carl Haber, heard voices from farther back in time than anyone else had up to that date. (Yes, I’m messing with chronotopes again.) As Wilkinson tells Haber’s story, he veers into an uncanny moment in which the implications of Haber’s work–or I should say, the curiosity driving his work–suddenly grow very large indeed.

Silence is imaginary, because the world never stops making noise. A sound is a disruption of the air, and it doesn’t so much die as recede until it subsides beneath the level of the world’s random noise and can no longer be recovered, like a face that is lost in a crowd. In past times, people sometimes thought that all sounds that ever existed were still present, hovering like ghosts. Guglielmo Marconi, who sent the first radio message, in 1902, believed that with a microphone that was sufficiently sensitive he could hear Jesus delivering the Sermon on the Mount, and in 1925 a writer for the Washington Post speculated that a radio was capable of broadcasting the voices of the dead. A radio transmits vibrations, he wrote, and the voices of the dead “simply vibrate at a lower rate.”

“Teaching” might well address this conjecture with dispatch, so as to cover more content: the thing is impossible, the expression is fanciful, the conjecture is worthless. “In past times, people sometimes thought”: isn’t an essential part of critical thinking the way learners are schooled in the swift, efficient recognition that if people thought it in the past, it’s probably wrong? And if it involves metaphor or imagination in the hands of a non-expert, it’s almost certainly a naive mistake, at best. Yet that kind of critical thinking (yes, there are others) dramatically reduces the scope of one’s curiosity, one’s drive, the sense of possibility, the wild surmise that may lead nowhere but may also bring into being the very thing we all “knew” (because we were “taught” it) was impossible.

I teach not only because I am thrilled to participate in most kinds of joint attention, but because I love the kind of idea Marconi had about the microphone, and I recognize that my love for that kind of idea is a love of enlarged forms and horizons of inquiry, and the energy released by that enlargement. I want that enlargement and that energy to be available to anyone who wants it. And I know from my own experience that this kind of idea is the most fragile of all, yet also one of the most valuable kinds of ideas we can have, because it can bring good new things into the world.

[Haber] said that what intrigued him about recovering relic sounds was the period and the figures who inhabited it. “Roughly toward the end of the nineteenth century, there were these early guys—I like to call them the heroic inventors,” he said. “Edison, Bell, Muybridge with his time studies, Marconi. They were not particularly well established academically; they were not trained as engineers, mathematicians, or scientists; they were very creative; and they did intuitive, seat-of-the-pants, trial-and-error experiments, whereas once you get into the twentieth century, and you have an understanding of the physics and chemistry involved in these original scientific gestures, you get engineers and academics doing this kind of work. They’re more cautious. No scientist would have thought you could hear Jesus. It violates the Second Law of Thermodynamics.”

He shook his head.

“Anyway, they were the first to record the world as it was actually happening,” he continued.

To encourage others–and thus myself as well–to be creative, intuitive, heroic inventors who record the world as it is actually happening, and thus to build a world of incautious love for the possible good we have not yet imagined: this, too, is why I teach.

 

Homophone trouble as a parable of learning

Klein Bottle: obliquity unbounded. Image cc Wikimedia Commons

This one is quite oblique, so if you’re not patient or inquisitive or morbidly curious, this one may not be for you. 

Here’s the backstory. Students in my section of “Living The Dreams: Digital Investigation and Unfettered Minds” are completing their inquiry projects. I’ve been thinking a lot, and emailing a few folks with some frequency, about their drafts. I’ve been thinking about writing and language and how the inquiry’s focus and articulation chart a course for the creator. Along the way, I had the opportunity to interact with one of the students about one of the most difficult homophone (or near-homophone) combos of all: “affect” and “effect.” My email turned into something like one of Vi Hart’s famous “math doodles” (or it seemed so to me). By the end, it felt as if I’d arrived at something near the core of teaching and learning.

I’m confident that what follows is not original with me. The ending, though, is where I think there may be a small contribution. It’s the paragraph that begins “now the interesting thing educationally….”

Unfortunately, the contribution needs to have the homophone discussion for context.

That small contribution draws on Bruner’s “The Will To Learn” as well as Pirsig’s Zen and the Art of Motorcycle Maintenance, the latter of which I am re-reading, as I do about every two years. The whole idea is to find the place where one must stand to start building. The will to learn. The idea or intuition of “quality” that precedes analysis or even conscious experience. The place that makes all possible, and without which we try to run our marathons on broken legs.


Now for the homophone affect-effect problem. You’ve actually got it wrong in your sentence (sorry). Here’s a thumbnail sketch of the differences:
affect n. means “emotion.” Example: I could not read his affect from his face.
affect v. means “to adopt in a fake or pretentious way.” Example: She affects a scholarly air.
But now it gets harder.
affected v. *past tense* means “had an effect on” and is often used with the passive voice. (English is a hard language for these and many other reasons. I blame humanity.) Example: “Computing was affected for the better by Engelbart.” It’s a vague verb so I usually try to find a different way to put it. (My typical workaround for linguistic indecision.)
affected adj. means “fake.” Example: He had a very affected manner of speaking.

effect n. means “impact.” An effect is the thing that results from a cause, the thing the cause brings into being. Example: “Engelbart had a great effect on civilization, particularly with regard to computing.”
effect v. means “to bring into being.” Example: “Engelbart effected great change in society.” A vague sentence and not very useful for that reason, but the word is correct. :)

So: “The weather affects my mood” and “the weather has an effect on my mood” mean roughly the same thing. “I affect a moody disposition when it rains” means “I put on, or fake, a moody disposition when it rains.” “The weather effects my mood” means “the weather brings my mood into being,” which is not the same thing as “the weather changes my mood.”

Now the interesting thing educationally, for me, is that a) you knew there was a difference and b) you were aware that you may not have understood the difference. The fact that you used “affect” incorrectly is of secondary importance. If those first two conditions didn’t obtain, the incorrect usage would be hard or impossible to address. For decades I have continued to work on instilling the metacognitive loop of a) and b) in students without having that loop paralyze their writing. It’s no good trying to get one’s messy thoughts into a rough draft if one gets blocked by linguistic uncertainty. The key is to write and keep writing, then go back and revise later. Eventually one becomes more fluent and can therefore get a lot of this stuff right on the fly, but even then careful proofreading is helpful. I still get my its and it’s wrong in rough drafts, just because I’m going quickly. Same for typos, etc.

Apostrophes? Don’t get me started…. I was laughing with an English friend in Herefordshire several years ago about a sign he saw regarding “Christma’s toys.”

In the end, it’s all conventions, but the conventions matter even though they’re arbitrary. They represent a set of agreements. It’s also kind of fun, in a geeky way, to get it right and to know why–like being able to play a favorite lick on one’s guitar, or being able to draw a recognizable face, or being able to tell a joke well.

Sorry to natter on at such length, but I thought it might be fun to stroll along and think aloud.

All best,
Dr. C.

J. C. R. Icarus: Nugget for “Man-Computer Symbiosis”

Set the controls for the heart of the sun.


No one knows what it would do to a creative brain to think creatively continuously. Perhaps the brain, like the heart, must devote most of its time to rest between beats. But I doubt that this is true. I hope it is not, because [interactive computers] can give us our first look at unfettered thought.
J. C. R. Licklider, “Computers in the University,” in Computers and the World of the Future (1962)

Ah, but a man’s reach should exceed his grasp,
Or what’s a heaven for?
Robert Browning, “Andrea del Sarto”

“I’ve never been certain whether the moral of the Icarus story should only be, as is generally accepted, ‘don’t try to fly too high,’ or whether it might also be thought of as ‘forget the wax and feathers, and do a better job on the wings.’”
Stanley Kubrick

Yes, it’s all about human intellect, that strange product of a decisive moment in our evolution in which we got just enough working memory to begin to generate counterfactuals, to imagine that things could be different, and then to invent symbols–language, of course, but also art, math, music–to be able to share those imaginings with each other. We even came up with a word for the concept of “symbol” itself, a way of talking about how we were thinking, and thus generated what Douglas Hofstadter calls “an infinitely extensible symbol set.” As we shared our symbols, we began to be intentional about what we could build together, and what might persist beyond our own deaths.

So much extraordinary capacity, and we take most of it for granted.

And now come computers, with the promise of helping us generate, store, retrieve, share, and more fully understand the rich symbols that form the record of our species.

In “Man-Computer Symbiosis,” Licklider talks about all of this symbol-use in pretty straightforward ways. The essay reads very much like a project outline, at times almost bureaucratically so. There’s the research (on himself), the conclusion, and thus the problem description. There’s the itemized analysis of what will be needed to realize the vision of intellectual augmentation he imagines. By the end of the essay, he’s outlined all the engineering and seems well on his way to putting a budget together. I’ve always found the end very abrupt, oddly so, given the highly metaphorical way the essay begins. It’s very different from Vannevar Bush’s poignant, almost plaintive ending to “As We May Think.”

The passage I’ve chosen exhibits both the project-oriented Licklider (he preferred to go by “Lick”) and the dreamer Lick. It’s interesting to see how they don’t quite go together:

In short, it seems worthwhile to avoid argument with (other) enthusiasts for artificial intelligence by conceding dominance in the distant future of cerebration to machines alone. There will nevertheless be a fairly long interim during which the main intellectual advances will be made by men and computers working together in intimate association. A multidisciplinary study group, examining future research and development problems of the Air Force, estimated that it would be 1980 before developments in artificial intelligence make it possible for machines alone to do much thinking or problem solving of military significance. That would leave, say, five years to develop man-computer symbiosis and 15 years to use it. The 15 may be 10 or 500, but those years should be intellectually the most creative and exciting in the history of mankind.

Lick simply assumes that AI will be developed, and that when it is developed, “dominance” in “cerebration” (meaning “thought,” I believe) will belong to “machines alone.” We will invent our own obsolescence. Yet the 15, or 10, or 500 years during which we invent our obsolescence “should be intellectually the most creative and exciting in the history of mankind.” The note of excitement is familiar and thrilling. And we are living in that time as I type these words on my computer’s keyboard, which makes Lick’s pronouncement doubly thrilling.

Yet I hesitate to say with Lick, “I, for one, welcome our new robot overlords.” Moreover, I don’t think he’s being very careful with his own argument. In the essay, he distinguishes “formulated” thinking from “formulative” thinking. The latter is more about problem-finding, about using the algorithmic powers of the computer in concert with the goal-setting and meaning-making activity of the human being to refine the human’s questions and enrich the scale and depth of the human’s powers of imagination and analysis. Does Lick believe that computers will eventually become superior meaning-makers? (Does the Netflix recommendation engine create meaning, or simply reveal it?) Does Lick believe that computers will identify problems for us to work on, optimizing the work for our messy associative brains? Does he believe that creativity itself will take on a new meaning independent of human input or judgment? Hard to say. I don’t think he’s consistent in the essay. And in truth, as John Markoff notes in What The Dormouse Said, the split between the AI researchers and those who, like Doug Engelbart, imagined that computers would augment human intellect, not replace it, was eventually unbridgeable.

And yet the dreams were similar, which brings me back to the epigraphs. From cave paintings to epic poetry, there’s strong evidence that ever since human beings became symbol users and symbol sharers (really two aspects of the same thing), we have found our minds to be spooky, paradoxical, oddly free, and strangely limited. And in the midst of that feeling, we aspire to greater heights of ingenuity and invention. It is our very minds that drive us to enlarge our minds, since somewhere in our minds we find we have not reached the end of what we can imagine grasping.

That’s a strange thought, a troubling thought, an exhilarating thought. Many cautionary tales have sprung up around this thought. Many dreams have emerged from it as well. Given the nature of our ingenuity, I’m not sure we have much hope of stalling this thought.

Might as well see what we can build with it.

As Pete Townshend once sang, “No easy way to be free.”

Reviewing our shady past: Nugget #1

Satan Overlooking Paradise

Above is a many-layered photograph of Gustave Doré’s illustration of an episode from Book IV of John Milton’s Paradise Lost. I say “many-layered” because it’s a photograph of a blacklight poster that uses the Doré image as a starting-point for a psychedelic vision of a psychedelic poem published by a blind genius in 1667.

The poster is in fact illuminated by a blacklight in this photograph, which is why the colors give off a strange and compelling glow. I took the photo during an all-night Paradise Lost readathon at the University of Mary Washington March 23-24, 2007. The photo has been viewed on Flickr 537 times since then. I did fourteen of those readathons from 1980, when I did my own first reading of Paradise Lost all the way through, overnight, in one sitting, to the summer of 2008, when I taught the last Milton seminar of my tenure at Mary Washington. One of the students who came to that final readathon had done the readathon ten times over the years.

The second readathon I did, and the first as a professor, was at the University of San Diego. During the later books, the sky began to rain. Because rain was an unusual occurrence in San Diego, the students got very excited. They ran outside during the break so they could play in the rain. I found this strange and weirdly engaging, too. Rain is to SoCal as snow is to Virginia, or so it seemed to me at the time.

I bought this poster from a head shop in Bristol, Tennessee in the fall of 1970, ten years before I would read the poem it came from. I had no idea what it was a picture of. The caption said “Overlooking Paradise,” and the picture looked to me like an angel overlooking a landscape of such extraordinary beauty that I hoped one day I might be able to climb inside the picture and explore that country. I didn’t know that the angel was really a fallen angel, and that the figure was that of Satan as he overlooked the Paradise that would break his heart and harden it, too, as he resolved to bring ruin and destruction, war and hatred and death, to a country in which, to quote Milton, “spring and fall danced hand in hand.” A country with a garden at the center, one that was “wild above rule or art,” a country of “enormous bliss.” That’s what this angel wrecked.

In 1970 I was building an associative trail without any knowledge that I was doing so. Maybe I was continuing it. I had no idea at all that the blacklight poster I bought in one of the coolest stores I’ve ever seen (the house was painted purple, and it was called “The Spirit House”) was a picture from my future commitment to a poem from centuries past. I did not know that I was at that moment making good on Milton’s words that “books are not absolutely dead things.” All I knew was that the poster was shopworn and a little tattered, so the shop had it on sale for a dollar. And I had a dollar to spend, so I bought it.

In “As We May Think,” Vannevar Bush writes,

Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems. He has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory. His excursions may be more enjoyable if he can reacquire the privilege of forgetting the manifold things he does not need to have immediately at hand, with some assurance that he can find them again if they prove important.

Reviewing our shady past better, analyzing our present problems more completely and objectively: is this a way forward, or a moment in which we will overlook the Paradise from which we are now excluded, the Paradise to which we can bring only destruction? We have indeed built a complex civilization. The building we have done together makes the words I type at this very moment able to appear and move on a screen as I push tiny switches on a keyboard. The complexity allows me to turn every bit of nearly everything into a bit, thus to send it at the speed of light around the world, discoverable by that world, reviewed perhaps by some part of it. What do we see as we look over this complexity in this civilization we have built? What do we overlook and what do we overlook? Vannevar Bush writes of pushing our experiment–presumably, the experiment of “civilization”–to its logical conclusion. Will the machines we have invented, these personal interactive networked Memexes, bring us a happy conclusion for our experiment? How will we know the experiment has ended?

These are important questions. At this moment, however, I write not of them, but of a moment from my own past, shaded by years but lighted by memory, that proved important on a scale I could not then imagine, but which now I can share, with a whole heart, and with you.

How does it feel when I think?

I’ve always associated thinking with feeling. I’ve always known that thinking makes me feel a certain way. I used to wonder if other people thought that way, felt that way. One of the great pleasures of getting to know the world and my fellow human beings a little better over the years has been learning that however idiosyncratic I may feel (or be) at any given time about my thinking and the emotions it brings, I am never really alone.

Those times I feel that the way I feel when I think is not unique … those are good times. Sometimes those times last a while. Sometimes they come in flashes. Either way, those times are truly meetings. Each of those moments is what Richard Linklater’s characters in Waking Life call a “holy moment.”

I remember those moments. I remember the moment Dr. Roman read us T. S. Eliot’s words describing the way the Renaissance poet John Donne felt when he thought (or what Eliot believed was true, given Donne’s poetry and other writings): “For Donne, a thought was an experience; it modified his sensibility.” Those words are just as thrilling (ah, there’s one feeling I’ll come back to at the end) today as they were when I first heard them, age 19. They suggest that thinking is not just detached, ethereal, or impractical. Thinking is an experience. It changes you. Thinking changes your mind, which means that thinking changes the way you think. As one neurologist put it in the title of his recent book, we human beings have a Brain That Changes Itself. It’s like that great Escher drawing of the two hands drawing themselves. It’s recursive, and gloriously so.

And then it turns out that interest, which is the brain reaching in and out to experience the glorious trails of wonder and puzzlement within and without its hard-boned boundaries, is an emotion, a “knowledge emotion” like confusion and awe. Just thinking about that idea fills me with joy. I’m listening to Pandora (The Buckinghams channel, if you’re curious) as I type these words, just to keep my brain in the happy state that allows the joy of these ideas to permeate every axon, jump every synapse.

Thinking doesn’t always feel great, of course, even for someone like me who’s frankly besotted with it. Sometimes thinking is unpleasant, hard, regretful. When I worry, for example, I’m thinking about things that either a) make me anxious or b) consume me with an urgent set of problems, the way we say a dog “worries” a bone. Actually, now that I think about it (heh), “b” can feel unpleasantly focused sometimes, but other times it feels like good exercise. By contrast, I try to avoid the “a” variety of worry, sometimes with more success than others.

But leaving aside that unpleasantness, I believe there are other varieties of thinking that are pretty much unalloyed pleasure, though of different kinds. (Anyone who tells you that all pleasures are really the same needs to get out more.) Here’s a partial listing of my thinking pleasures.

Musing is a great relaxed pleasure as long as “a” worries don’t intrude. Musing is that state when the mind floats free, playing with associative trails the way a child plays with soap bubbles, balloons, or sticks in a brook. When musing gets very intense, it gets even dreamier, at least for me; it becomes a reverie, which to my ears is one of the loveliest words in the language, though obviously we borrowed it from the French. Not that there’s anything wrong with that.

Mulling is a somewhat more arduous pleasure. “Mulling” means thinking hard but without any single goal in mind. Mulling is like a great conversation that grows more intense by the moment, but without any agenda or “takeaway” that has to be agreed upon or accomplished. I once did a talk on mulling (the thinking kind, not the vintner variety), and because I had to do the talk, I ended up learning what it was I wanted to say. It came to me in the course of my research (see, there’s inquiry for you). I learned that the words “mull,” “meal,” and “muddle” were all related. Irresistible words. Alliterative, and nicely balanced between two monosyllables and one disyllable. Best of all, they gave me the grand finale that I hoped would also give the audience something to mull over as they thought about my talk on thinking: I concluded that “mulling” was what we did to make a “muddle” into a “meal.” Ok, two disyllables there, but I’m an amateur poet only.

Worrying (“b” type), musing, mulling: all are pleasures, though all feel different. But the tip-top pleasure, the one that keeps me moving through uncertainty and courting more (heaven knows–for a fact–that there’s no lack of uncertainty in life and in this weary world), is the feeling I get when an idea comes to me. When that idea arrives, it sometimes feels like moving through a door to see a splendid sunlit landscape on the other side. Sometimes it feels like I’ve spotted a long-lost friend while music plays in the most exquisite foreign land I’ve ever visited (the choice of which, music and land and friend, would depend on the day). Sometimes it feels the way I felt when I saw my newborn children and the exhausted joy in their mother’s eyes. Sometimes I get the feeling and I don’t know why, because some part of my brain has registered the insight, has felt the charge of the connection, before my prefrontal cortex has had a chance to say to itself, “Whoa! I see!”

Roger Penrose describes this last sensation so perfectly that I leave the last word in this post to him, as given to us by the superb immortal filmmaker Errol Morris: