Hospitality

From The Oxford English Dictionary:
1. a. The act or practice of being hospitable; the reception and entertainment of guests, visitors, or strangers, with liberality and goodwill.

A quick dive into the etymology of “hospitality” reveals that it grows from the Latin word meaning “host.” A host, the OED helpfully explains, is “the correlative to guest.”

From The New Testament, Hebrews 13:2, slightly altered:
Be not forgetful to entertain [i.e., show hospitality to] strangers: for thereby some have entertained seekers unawares.

I’d have quoted a more recent translation–I’m no “AV-olater”–but the prose was so clumsy in the versions I consulted that I just couldn’t bring myself to do it.

From Elizabeth A. Wilson, Affect & Artificial Intelligence (University of Washington Press, 2010; with thanks to Christine Labuski at Virginia Tech for alerting me to the book):

The following chapters present an empathic engagement with these eccentric tendencies toward artificial feeling. Readers may find themselves disappointed by a methodology that is not critical enough…. I want to be hospitable to my material. This orientation derives, in part, from my immersion in the archives: the marginalia of these researchers’ lives and the density of their intellectual and emotional interconnections have generated an attachment to them governed more by curiosity and care than by cynicism. This orientation also derives from a very strongly felt intellectual conviction that engagements of an empathic kind can be immensely, uniquely effective. If that makes my analysis seem too credulous, it’s a small price to pay to divert myself and my readers from the dogged approaches to critique that have become our stock in trade. On occasion I will insert a footnote to direct readers to more conventional critiques of these projects; and at times I will examine in some depth where I think AI [artificial intelligence] researchers have gone astray. In general, however, it is my intention to build a critical engagement with these men and their ambitions and their moods and their machines that is based in secure, rather than avoidant or ambivalent, attachment.

Wilson then quotes what she calls Bruno Latour’s “fiery rhetoric about the dissipation of critique.” Latour writes,

The critic is not the one who debunks, but the one who assembles. The critic is not the one who lifts the rug from under the feet of the naive believers, but the one who offers the participants arenas in which to gather. The critic is not the one who alternates haphazardly between antifetishism and positivism like the drunk iconoclast drawn by Goya, but the one for whom, if something is constructed, then it means it is fragile and thus in great need of care and caution. (Bruno Latour, “Why Has Critique Run Out of Steam? From Matters of Fact to Matters of Concern,” Critical Inquiry 30 [Winter 2004])

After which Wilson notes, “Latour pleads for a mode of critique that trades in multiplication rather than subtraction and scorn.”

The word scorn is exactly right, I think. “[A] common Romanic word of Germanic origin,” the OED says, tracing the word from 1200 to the present across many languages, all of them using this word–harsh to the point of onomatopoeia–to indicate mockery, derision, contempt. As a verb, scorn carries an even stronger sense of an attitude toward an inferior: “[t]o feel it beneath one, to disdain indignantly to do something,” the dictionary relates. “Jeer” is a synonym, as is “despise.”

When Jon Udell went to his alma mater, the University of Michigan, to urge academics there to narrate their work by blogging on the open web, he met with considerable resistance from those who couldn’t imagine putting what they called “half-baked ideas” onto the Web for all to see. Compare that response with Michael Nielsen’s poignant story in his Reinventing Discovery: The New Era of Networked Science:

A friend of mine who was fortunate enough to attend Princeton University once told me that the best thing about attending Princeton wasn’t the classes, or even the classmates he met. Rather, it was meeting some of the extraordinarily accomplished professors, and realizing that they were just people–people who sometimes got upset over trivial things, or who made silly jokes, or who made boneheaded mistakes, or who had faced great challenges in their life, and who somehow, despite their faults and challenges, very occasionally managed to do something extraordinary. “If they can do it, I can do it too” was the most important lesson my friend learned.

What’s important then is that blogs make it possible for anyone with an internet connection to get an informal, rapid-fire glimpse into the minds of many of the world’s scientists…. It’s not just the scientific content that matters, it’s the culture that is revealed, a particular way of viewing the world. This view of the world can take many forms…. The content ranges widely, but as you read, a pattern starts to take shape: you start to understand at least a little about how an experimental physicist views the world, what [he or she] thinks is funny, what [he or she] thinks is important, what [he or she] finds irritating. You may not necessarily agree with this view of the world, or completely understand it, but it’s interesting and transformative nonetheless. Exposure to this view of the world has always been possible if you live in one of the world’s intellectual capitals, places such as Boston, Cambridge, and Paris. Many blog readers no doubt live in such intellectual centers. But you also routinely see comments on the blog from people who live outside the intellectual centers. I grew up in a big city (Brisbane) in Australia. Compared to most of the world’s population, I had a youth of intellectual privilege. And yet the first time in my life that I heard a scientist speaking informally was when I was 16. It changed my life.

Who are the academics willing to be present in this way? Academia reserves a particular scorn for the informal expression of one’s thoughts, which has, among many other dreadful results, a terrible chilling effect upon many professors’ willingness to reveal themselves as learners, let alone seekers, in front of their students.

Who will dare to be a learner, a seeker, a yearner in the face of such hostility? Many academics insist that the withering, contemptuous varieties of critique are like a refiner’s fire, purging dross from gold, making our arguments more rigorous and our conclusions more sound. I wonder. Others wonder as well: see, in addition to Wilson and Latour, Alex Reid’s notes on the ethics and limitations of “critique.” Be sure to read the comment stream that follows, in which hospitality emerges after a great and admirable struggle to walk away from the more usual paths of academic exchange.

One of the things I enjoyed most about my first days in the world of “edtech” (for want of a better word) was the hospitality I was shown by folks who’d been blogging, using wikis, exploring mobile technologies, and working within virtual worlds for a long time before I arrived on the scene (conference debut as an attender: MIT, fall 2003, AAC&U topical meeting). That hospitality was challenged rather severely when the conversation became fragmented over edupunk. My own capacity for hospitality shrank, which I truly regret, but hospitality and punk seemed so antithetical (still do, to me) that I was mostly just torn up inside. I recognized the urgent need for reform, but I couldn’t see my way to these means, though of course I worried that my reluctance, too, might be a failure, either of hospitality or of outrage. Hospitality has also been strained in the current storm over MOOCs and the future of higher education. Ironically, this is the very moment in which we need to grow and employ more hospitality than ever, if only to demonstrate the value of carving out a space for college, the place where colleagues share an intellectual devotion that is hospitable even to sharply defined, but collegially (even cordially) expressed, disagreements.

I keep hearing Elizabeth Wilson’s voice: I want to be hospitable to my material. More curiosity and care, less cynicism. It’s a poignant insistence, given the challenges she evidently anticipates in response. It’s a necessary insistence, I believe, given the openness and zestful curiosity that we need to entertain each other, and our conjectures and dilemmas, in community. Henri Nouwen reminds us that “the time has come to realize that neither parents nor teachers nor counsellors can do much more than offer a free and friendly place where one has to discover his own lonely way” (Reaching Out, special edition, 2006). Not much more than that, perhaps, but not any less than that, either.

In response to my last post, my friend Louis sent me a link to a marvelous exploration of hospitality by Leonard Cohen. Louis is patiently opening the world of this artist to me–and doing so very hospitably. In the comment below the clip, Louis writes,

Here’s a transcript of Cohen’s interview comment on the song, so it’s easier to follow: “I think that kind of imagery can be discovered all through the literature. The Persian poet Rumi [13th century] uses the idea of the guests a lot. The festival, the feast and the guests. It’s almost impossible to talk about that seed moment of when a song begins. It could be the soul comes into the world. There is some notion the soul has that there is a feast, that there is a festival, that there is a banquet. It strives to experience the hospitality of the world. It doesn’t achieve it. It feels lonely, this is everybody’s experience. It feels lost. It stumbles around on the outskirts of the party. If the striving is deep enough or if the Grace of the host is turned towards the seeking guest, then suddenly the inner door flies open and he finds himself, or the soul finds himself, at that banquet table. Although no one knows where the night is going, no one knows why the wine is flowing. No one actually understands the mechanics of this grace except that we experience it from time to time.” It’s a clip from a 1979 documentary called The Song of Leonard Cohen.

It’s a good place to pause for now.

Conjectures, Dilemmas, Hospitality, Humility

I’m going to try to get started on some thoughts that have been weighing pretty heavily on me over the last couple of months. The four nouns in this post’s title are markers of the general direction. In fact, here’s the executive summary, a genre I’ve been getting more practice in of late:

The process of discovery thrives not only in establishing facts, but in lively, interest-provoking encounters with the conjectures and dilemmas that inform human inquiry.

Each academic discipline changes and reframes itself not only on the basis of new knowledge, but on the basis of those conjectures and dilemmas, themselves the result of a process of discovery not unlike that which uncovers, discovers, and constructs knowledge.

Problem finding, every bit as important as problem solving, requires respect and eager openness toward conjectures and dilemmas above all, since here we find the thinker, the researcher, the experimenter, the practitioner most vulnerable, most tentative (i.e., most engaged in trying, in the essay of the thing).

Our necessary practice of problem finding, of uncovering and sharing conjectures and dilemmas, thus requires a sense of hospitality toward real questions and toward those who advance them. While claims advanced as fact benefit from critique and challenge, the core of conjectures and dilemmas in our shared endeavors suggests that hospitality must also be present if we want to encourage inquiry.

This hospitality also requires a certain humility, which is not self-abasement so much as a zestful yearning for problem-finding in community, a shared appetite for conjectures and dilemmas–not so that action will not be taken (far from it) but so that neither argument nor action ossifies into authoritarianism. (Authority and authoritarianism are two different things, I believe.)

I’ll start with Jerome Bruner, from whom I have learned so much over the last fifteen years, ever since the Roving Librarian brought home a book called The Culture of Education. I’ve returned recently to one of my favorite Bruner volumes, Toward A Theory Of Instruction, because of a lovely moment in Bret Victor’s notes on his “Future of Programming” talk. At the very end, Bret refers to his own reading of Bruner, and to Alan Kay’s advice:

Lastly, here’s some advice Alan Kay gave me (as I was going through a small personal crisis as a result of reading Jerome Bruner’s “Toward a Theory of Instruction”):

I think the trick with knowledge is to “acquire it, and forget all except the perfume” — because it is noisy and sometimes drowns out one’s own “brain voices”. The perfume part is important because it will help find the knowledge again to help get to the destinations the inner urges pick.

So following my nose and my inner urges, I returned to Bruner and found again these words:

For the question opens up the deep issues of what might be and why it isn’t…. It is such conjecture … that produces rational, self-consciously problem-finding behavior so crucial to the growth of intellectual power…. I would say that while a body of knowledge is given life and direction by the conjectures and dilemmas that brought it into being and sustained its growth, pupils who are being taught often do not have a corresponding sense of conjecture and dilemma. The task of the curriculum maker and teacher is to provide exercises and occasions for its nurturing. If one only thinks of materials and content, one can all too easily overlook the problem…. The answer is the design of exercises in conjecture, in ways of inquiry, in problem finding. It is something that the good teacher does naturally at least some of the time. (From “A Retrospect On Making And Judgment,” in Toward A Theory Of Instruction)

This of course is the very opposite of what I call the bad Sunday School technique of education, in which the teacher poses an essentially rhetorical question and waits until the good student produces the foregone conclusion: “because he loves us.”

Reading Bruner’s words again, I am reminded of what feeds curiosity and interest, according to researchers: novelty (things that fall between categories), uncertainty, conflict, complexity. I note that “clarity” doesn’t make the cut. Of course clarity is important, but without conjectures and dilemmas, “clarity” can easily become “certainty,” interest will become compliance, and as surely as night follows day, students will do what the rubrics say to do, for that is what the teacher wants.

I think with particular unease of the moment at the recent AAC&U meeting in which a panelist, understandably frustrated by the thin quality of student reflections in one of his classes, spoke out forcefully on his plans to double down on the instructions and tell the students exactly what he meant by reflection. I believe he was sincere in his desire to improve their learning, but I worry that his plan of action will result in compliance of the most persuasive sort, the kind that allows everyone to say that the mission has been accomplished when exactly the opposite has happened.

For it seems to me that we are tempted to imagine reflection as a process of discovering and affirming lessons learned and problems solved, when anyone who has spent a moment in reflection will realize, I believe, that the depths of that practice awaken conjectures and dilemmas. (Reflection is neither capstone nor cornerstone.) Too much of school teaches learners to fear or mask conjectures and dilemmas. My students tell me they raise their hands when they have answers, not when they have questions.

I continue to think about hospitality as the great and charitable soil of conjectures and dilemmas. Those who entertain conjectures and dilemmas might also be called seekers.

Happy New Year: Here’s Our Challenge

On December 9, 2013, Doug Engelbart and his work were honored in a memorial gathering at the Computer History Museum in Mountain View, California. The tributes, with a panel discussion following, are up on YouTube. If you’re at all curious about Doug’s vision and the legacy it offers us, I urge you to watch the video. It’s two hours very well spent.

There are several challenges in the video, not just one, but the one I want to highlight here comes from the tribute by Elizabeth “Jake” Feinler, who worked with Doug for several years at the Augmentation Research Center (ARC) and went on to become the founding director of ARPANET’s Network Information Center. Her tribute starts at 30:12 into the video.

Ms. Feinler had many great memories of Doug and ARC, but the part that resonated most deeply with me came in words she quoted from Doug himself. The words illustrated Feinler’s experience of ARC as well as her intense admiration for Doug’s vision and humanity. Here’s what Feinler read, from an essay titled “Working Together,” written by Doug and Harvey Lehtman (also of ARC) and published in the December 1988 issue of Byte magazine.

We thought that success in tools for collaborative knowledge work was essential to the necessary evolution of work groups in increasingly knowledge-rich societies and to increasing organizational effectiveness. Until the recent growing interest in CSCW [computer supported collaborative work], most developers limited their analyses to technical issues and ignored the social and organizational implications of the introduction of their tools; such considerations were, however, key to our work.

There is growing recognition that some of the barriers to acceptance of fully integrated systems for augmenting groups of knowledge workers may be more significantly social, not solely technical. The availability of rapidly evolving new technologies implies the need for concomitant evolution in the ways in which work is done in local and geographically distributed groups.

ARC [the Augmentation Research Center] experienced this phenomenon continuously. The bootstrapping approach, so important to the continuing evolution of the system, caused us to constantly undercut our world: As soon as we became used to ways of doing things, we replaced platforms to which we were just becoming accustomed. We needed to learn new roles, change attitudes, and adopt different methods because of growth in the technical system we ourselves produced.

We brought in psychologists and social scientists to serve as observers and facilitators. They were as important to our team as the hardware and software developers. The resistance to change, which we soon realized was an essential part of introducing new technologies into established organizational settings, and the psychological and organizational tensions created by that resistance were apparent in ourselves. We were required to observe ourselves in order to create appropriate methodologies and procedures to go along with our evolving computer technologies. [my emphases]

This language, shifted only slightly, applies equally well to the process of education itself. True learning generates both increasing complexity and, at a meta level, an increasing awareness of the nature and potential uses of that complexity–i.e., strategies of mindfulness. A university is an augmentation research center, is it not? And yet how much time do we spend in fruitful self-observation, in bootstrapping ourselves into higher levels of mindfulness and invention, despite the fact that by doing so we inevitably, constantly “undercut our world”? “World” here means not the planet or civilization, but the structures and organizations that inevitably choke new growth. These “worlds” must serve the values we profess, not the other way around. These “worlds,” and ourselves as their architects and inhabitants, must evolve and grow even as we struggle to keep up with the change we have set in motion. “Undercut” is one way to acknowledge the struggle–but “reinvent” is also apt, for it points to the essential goals of learning.

What is the alternative? Bureaucratic self-defense? Does the world need more lessons in that?

2014 is likely to be a full-on year of Engelbart activity for me. The cMOOC I’ll be teaching with Jon Becker (with the able disruptive ingenuity of Tom Woodward) will explore topics central to Doug’s vision and work. There will be at least two Engelbart Scholars among the VCU students who take that course (more details coming soon). And as always, I will be doing my level best to bring Doug’s ideas and the ongoing work of the Engelbart Institute to the conversation about networked learning, wherever I can find it (or it can find me).

Two birthdays, an anniversary, and a brief lament

First, the birthdays.

Happy birthday to the author I’ve studied and delighted over for the last thirty-three years: John Milton, born 1608.

John Milton, busted.

I never imagined I’d spend my life reading and thinking and writing about this writer. Just goes to show. (Show what? I’ll leave that as an exercise for the reader.)

Happy birthday wishes also go out to Rear Admiral Grace Hopper, the mother of COBOL, a fountain of wit and wisdom, and a pioneering genius of computer science. I first learned about Admiral Hopper from Dr. David Evans’ Udacity course CS101. (Yes, Udacity. It just goes to show.) Dr. Evans linked to her famous interview with David Letterman, and I was an instant fan.

“Grace Murray Hopper at the UNIVAC keyboard, c. 1960.” From Wikipedia.

The anniversary: 45 years ago today, Dr. Douglas Engelbart sat on a stage in San Francisco and, according to one awestruck observer, “dealt lightning with both hands.” The event has come to be known as “the mother of all demos.” There’s a very nice remembrance of Doug and his demo in The Atlantic today. I know there’s also a memorial happening right about now in Mountain View, as his daughter Christina and many of Doug’s family, friends, and admirers are gathered to remember the demo and Doug, who passed away this year on July 5.

I talked to Doug for about an hour, back in 2006. I met him and shook his hand in 2008 on the night before the 40th anniversary of the mother of all demos. I am humbled to be working with Christina on a project for this summer and beyond. I am so very grateful to be linked in spirit and work with Doug’s vision. When there are dark or confusing days, I try to remember how lucky I am to have found that vision, and to have thanked that visionary, while he was still with us.

Here’s the first part of the mother of all demos:

And here’s a version my wonderful student Phillip Heinrich did for a final project in my second-ever “Introduction to New Media Studies” class, which eventually became “From Memex To YouTube: Cognition, Learning, and the Internet” (and will have another morphing this summer at VCU and worldwide–watch this space):

Phillip’s work is a conceptual mashup of Doug’s demo and Michael Wesch’s “The Machine Is Us/ing Us.” Even four years later, Phillip’s work still dazzles. Apparently Doug himself saw it at one point, which makes me very joyful.


I write these words from a hotel room in Atlanta, where I’m attending the annual meeting of the Southern Association of Colleges and Schools Commission on Colleges. I’ve heard some inspiring speakers and learned a great deal about more of the vast machinery of higher education. At the same time, I’ve seen many folks whose eyes are on fire with a passionate devotion to learning and teaching. I honor them, and salute their survival despite the vast machinery that exists, in part and sometimes ironically, to support them and their vocations.

The lament is for the ways in which the notion of “technology” that surrounds me here is untouched by the vision of either Grace Hopper or Doug Engelbart. When I hear a presenter say that a survey couldn’t include questions about “technology” as part of its core because “technology changes so rapidly,” I groan inwardly. In addition to the (typically) underthought use of the word “technology,” the speaker obviously has confused computing devices with computing. In the latter sense, “technology” has not changed substantially since the introduction of networked, interactive, personal computing, with the possible exception of mobile computing. But the confusion here keeps “technology” questions in a different survey “module,” and keeps educators from learning or even asking what they don’t know. (And eventually we all suffer.)

Similarly, when I hear another presenter say “we didn’t know technology would eliminate jobs the way it has,” then offer a list of “technology improvements” for the organization that include new computers and monitors, new office software, etc., I have to gnash my teeth (quietly, but still). How can we be in 2013 and still be so far removed from even the outer edges of the bright light shed by the visions of Hopper and Engelbart, among many others? How can we call ourselves educators and be content not only to remain in darkness, but to spread it through inaction and (I’m sorry, but it must be said) ignorance?

More than once at this conference I’ve heard presenters talk about “technology” in the same breath that they lament how old they are and how strange youth culture seems to them. Sometimes the lament is mingled with a little of that “kids, get off my lawn” curmudgeonliness. We all get to be a little prickly as we age, I guess, but methinks we do protest too much. Doug Engelbart and Grace Hopper didn’t surrender their visions as age overtook them. We do ourselves and our students no good service to remain in the shallows we have created for ourselves, the shallows we continue to excuse and extend. As Janet Murray writes, “When will we recognize the gift for what it is…?” Or as Doug Engelbart asked on that San Francisco stage forty-five years ago today:

If, in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive to every action you have, how much value could you derive from that?

Both Murray’s and Engelbart’s questions remain unanswered, and that itself is worth lamenting. The real grief comes, for me, because the questions are almost never asked, even among those who pride themselves on the arts of inquiry.

A sad case, but there is still hope. My students have taught me that.

Thomas Merton on Education

 

Thomas Merton’s hermitage.

Very salutary readings for a rainy Sunday morning at the SACS-COC conference in Atlanta, Georgia. This is the first time I’ve attended this annual meeting. Higher education is my vocation, so you wouldn’t think I’d have culture shock here–but I find I do. Perhaps that’s a first-timer’s gift. I must practice gratitude!

Here are some of Merton’s thoughts. These come from a man who had been educated in France, England (where he spent a year at Cambridge), and the US (graduating with an MA from Columbia University). For a short time, he was a professor of English at St. Bonaventure. So he knows whereof he speaks.

“The danger of education, I have found, is that it so easily confuses means with ends. Worse than that, it quite easily forgets both and devotes itself merely to the mass production of uneducated graduates–people literally unfit for anything except to take part in an elaborate and completely artificial charade which they and their contemporaries have conspired to call ‘life’.”

“The least of the work of learning is done in classrooms.”

“Anyone who regards love as a deal made on the basis of ‘needs’ is in danger of falling into a purely quantitative ethic. If love is a deal, then who is to say that you should not make as many deals as possible?” [One can substitute “learning” for “love” and reach the same conclusion.]

“[A publisher asked me to write something on ‘The Secret of Success,’ and I refused.] If I had a message to my contemporaries, I said, it was surely this: Be anything you like, be madmen, drunks, and bastards of every shape and form, but at all costs avoid one thing: success. … If you have learned only how to be a success, your life has probably been wasted. If a university concentrates on producing successful people, it is lamentably failing in its obligation to society and to the students themselves.” [Particularly bracing words given the buzz here–and in my own title at work!–regarding “student success.” Who would wish that our students would fail? Yet too narrow a view of success may be the most insidious route to failure of them all.]

And finally, in words that I would love to see above every classroom door and on the cover of every learning-related conference (my editorial material is clumsy but I want to present Merton generously):

“The purpose of education is to show a person how to define himself [or herself] authentically and spontaneously in relation to his [or her] world–not to impose a prefabricated definition of the world, still less an arbitrary definition of the individual himself [or herself].”

Source: Love and Living.
h/t @rovinglibrarian, @graceiseverywhere

Doug Engelbart, transcontextualist


I’ve been mulling over this next post for far too long, and the results will be brief and rushed (such bad food, and such small portions!). You have been warned.

The three strands, or claims, I’m engaging with (EDIT: I’ve tried to make things clearer and more parallel in the list below):

1. The computer is “just a tool.” This part’s in partial response to the comments on my previous post.

2. Doug Engelbart’s “Augmenting Human Intellect: A Conceptual Framework” is “difficult to understand” or “poorly written.” This one’s a perpetual reply. :) It was most recently triggered by an especially perplexing Twitter exchange shared with me by Jon Becker.

3. Engelbart’s ideas regarding the augmentation of human intellect aim for an inhuman and inhumane parsing of thought and imagination, an “efficiency expert” reduction of the richness of human cognition. This one tries to think about some points raised in the VCU New Media Seminar this fall.

These are the strands. The weave will be loose. (Food, textiles, textures, text.)

1. There is no such thing as “just a tool.” McLuhan wisely notes that tools are not inert things to be used by human beings, but extensions of human capabilities that redefine both the tool and the user. A “tooler” results, or perhaps a “tuser” (pronounced “TOO-zer”). I believe those two words are neologisms but I’ll leave the googling as an exercise for the tuser. The way I used to explain this in my new media classes was to ask students to imagine a hammer lying on the ground and a person standing above the hammer. The person picks up the hammer. What results? The usual answers are something like “a person with a hammer in his or her hand.” I don’t hold much with the elicit-a-wrong-answer-then-spring-the-right-one-on-them school of “Socratic” instruction, but in this case it was irresistible and I tried to make a game of it so folks would feel excited, not tricked. “No!” I would cry. “The result is a HammerHand!” This answer was particularly easy to imagine inside Second Life, where metaphors become real within the irreality of a virtual landscape. In fact, I first came up with the game while leading a class in Second Life–but that’s for another time.

So no “just a tool,” since a HammerHand is something quite different from a hammer or a hand, or a hammer in a hand. It’s one of those small but powerful points that can make one see the designed built world, a world full of builders and designers (i.e., human beings), as something much less inert and “external” than it might otherwise appear. It can also make one feel slightly deranged, perhaps usefully so, when one proceeds through the quotidian details (so-called) of a life full of tasks and taskings.

To complicate matters further, the computer is an unusual tool, a meta-tool, a machine that simulates any other machine, a universal machine with properties unlike any other machine. Earlier in the seminar this semester a sentence popped out of my mouth as we talked about one of the essays–”As We May Think”? I can’t remember now: “This is your brain on brain.” What Papert and Turkle refer to as computers’ “holding power” is not just the addictive cat videos (not that there’s anything wrong with that, I imagine), but something weirdly mindlike and reflective about the computer-human symbiosis. One of my goals continues to be to raise that uncanny holding power into a fuller (and freer) (and more metaphorical) (and more practical in the sense of able-to-be-practiced) mode of awareness so that we can be more mindful of the environment’s potential for good and, yes, for ill. (Some days, it seems to me that the “for ill” part is almost as poorly understood as the “for good” part, pace Morozov.)

George Dyson writes, “The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same” (Turing’s Cathedral: The Origins of the Digital Universe). This is a very bold statement. I’ve connected it with everything from the myth of Orpheus to synaesthetic environments like the one @rovinglibrarian shared with me in which one can listen to, and visualize, Wikipedia being edited. Thought vectors in concept space, indeed. The closest analogies I can find are with language itself, particularly the phonetic alphabet.
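Dyson’s point can be made concrete in a few lines. Here is a minimal sketch of my own (purely illustrative; nothing Dyson, Turing, or von Neumann wrote): a toy stored-program machine in which one flat memory of integers holds both instructions and data, so that whether a number “means” something or “does” something depends only on where the instruction pointer lands.

```python
# A toy stored-program machine: one flat memory of integers.
# Whether a number "means" something or "does" something depends
# only on whether the instruction pointer ever reaches it.
def run(memory):
    ip = 0                               # instruction pointer
    while True:
        op = memory[ip]
        if op == 0:                      # HALT
            return memory
        elif op == 1:                    # ADD a b dest: mem[dest] = mem[a] + mem[b]
            a, b, dest = memory[ip + 1:ip + 4]
            memory[dest] = memory[a] + memory[b]
            ip += 4
        else:
            raise ValueError(f"unknown opcode {op} at cell {ip}")

# Cell 0 holds an instruction; cells 5-7 hold data. Nothing but
# position distinguishes them, so a program could just as well ADD
# into its own instruction cells: numbers that do things, rewriting
# the numbers that do things.
program = [1, 5, 6, 7,    # ADD mem[5] + mem[6] -> mem[7]
           0,             # HALT
           20, 22, 0]     # data: 20, 22, and a cell for the result
print(run(program)[7])    # 42
```

Nothing in the memory itself marks the difference between meaning and doing; the distinction Dyson describes lives only in how the machine traverses it.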

The larger point is now at the ready: in fullest practice and perhaps even for best results, particularly when it comes to deeper learning, it may well be that nothing is just anything. Bateson describes the moment in which “just a” thing becomes far more than “just a” thing as a “double take.” For Bateson, the double take bears a thrilling and uneasy relationship to the double bind, as well as to some kinds of derangement that are not at all beneficial. (This is the double-edged sword of human intellect, a sword that sometimes has ten edges or more–but I digress.) This double take (the kids call it, or used to call it, “wait what?”) indicates a moment of what Bateson calls “transcontextualism,” a paradoxical level-crossing moment (micro to macro, instance to meta, territory to map, or vice-versa) that initiates or indicates (hard to tell) deeper learning.

It seems that both those whose life is enriched by transcontextual gifts and those who are impoverished by transcontextual confusions are alike in one respect: for them there is always or often a “double take.” A falling leaf, the greeting of a friend, or a “primrose by the river’s brim” is not “just that and nothing more.” Exogenous experience may be framed in the contexts of dream, and internal thought may be projected into the contexts of the external world. And so on. For all this, we seek a partial explanation in learning and experience. (“Double Bind, 1969,” in Steps to an Ecology of Mind, U Chicago Press, 2000, p. 272). (EDIT: I had originally typed “eternal world,” but Bateson writes “external.” It’s an interesting typo, though, so I remember it here.)

It does seem to me, very often, that we do our best to purge our learning environments of opportunities for transcontextual gifts to emerge. This is understandable, given how bad and indeed “unproductive” (by certain lights) the transcontextual confusions can be. No one enjoys the feeling of falling, unless there are environments and guides that can make the falling feel like flying–more matter for another conversation, and a difficult art indeed, and one that like all art has no guarantees (pace Madame Tussaud).

2. So now the second strand, regarding Engelbart’s “Augmenting Human Intellect: A Conceptual Framework.” Much of this essay, it seems to me, is about identifying and fostering transcontextualism (transcontextualization?) as a networked activity in which both the individual and the networked community recognize the potential for “bootstrapping” themselves into greater learning through the kind of level-crossing Bateson imagines (Douglas Hofstadter explores these ideas too, particularly in I Am A Strange Loop and, it appears, in a book Tom Woodward is exploring and brought to my attention yesterday, Surfaces and Essences: Analogy as the Fuel and Fire of Thinking. That title alone makes the recursive point very neatly). So when Engelbart switches modes from engineering-style-specification to the story of bricks-on-pens to the dialogue with “Joe,” he seems to me not to be willful or even prohibitively difficult (though some of the ideas are undeniably complex). He seems to me to be experimenting with transcontextualism as an expressive device, an analytical strategy, and a kind of self-directed learning, a true essay: an attempt:

And by “complex situations” we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers–whether the problem situation exists for twenty minutes or twenty years.

A list worthy of Walt Whitman, and one that explicitly (and for me, thrillingly) crosses levels and enacts transcontextualism.

Here’s another list, one in which Engelbart tallies the range of “thought kernels” he wants to track in his formulative thinking (one might also say, his “research”):

The “unit records” here, unlike those in the Memex example, are generally scraps of typed or handwritten text on IBM-card-sized edge-notchable cards. These represent little “kernels” of data, thought, fact, consideration, concepts, ideas, worries, etc., that are relevant to a given problem area in my professional life.

Again, the listing enacts a principle: we map a problem space, a sphere of inquiry, along many dimensions–or we should. Those dimensions cross contexts–or they should. To think about this in terms of language for a moment, Engelbart’s idea seems to be that we should track our “kernels” across the indicative, the imperative, the subjunctive, the interrogative. To put it another way, we should be mindful of, and somehow make available for mindful building, many varieties of cognitive activity, including affect (which can be distinguished but not divided from cognition).
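Engelbart’s edge-notched cards amount, in effect, to a little multi-dimensional index. As a purely hypothetical sketch (my illustration; Engelbart describes physical cards, not code), here is the idea in Python: each “kernel” carries tags along several dimensions, and retrieval is simply a filter on any combination of notches.

```python
# A hypothetical rendering of Engelbart's edge-notched "thought kernels":
# each card is a scrap of text tagged along several dimensions, and
# pulling a needle through a notch is just a filter over those tags.
from dataclasses import dataclass, field

@dataclass
class Kernel:
    text: str
    tags: set = field(default_factory=set)

cards = [
    Kernel("worry: will the tool reshape the group?", {"worry", "social"}),
    Kernel("fact: memory costs keep falling", {"fact", "technical"}),
    Kernel("idea: let the system record its own use", {"idea", "technical", "social"}),
]

def notch(cards, *dimensions):
    """Return every kernel notched on all of the given dimensions."""
    return [c for c in cards if set(dimensions) <= c.tags]

for card in notch(cards, "social"):
    print(card.text)
```

The point of the sketch is the crossing itself: a worry and an idea, different moods of thought entirely, turn up in the same “social” retrieval.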

3. I don’t think this activity increases efficiency, if efficiency means “getting more done in less time.” (A “cognitive Taylorism,” as one seminarian put it.) More what is always the question. For me, Engelbart’s transcontextual gifts (and I’ll concede that there are likely transcontextual confusions in there too–it’s the price of transcontextualism, clearly) are such that the emphasis lands squarely on effectiveness, which in his essay means more work with positive potential (understanding there’s some disagreement but not total disagreement about what “positive” means).

It’s an attempt to tell more of the whole truth about experience, and to build a better world out of those double takes. Together.

Is Engelbart’s essay a flawless attempt? Of course not. But for me, Bateson’s idea of transcontextualism helps to explain the character of the attempt, and to indicate how brave and necessary it is, especially within a world we can and must (and do, yet often willy nilly) build together.

Not perfect; just miraculous.

More on this anon!

Understanding the machine

Last week, VCU’s New Media Faculty-Staff Development Seminar took up two related but also quite distinct essays: Norbert Wiener’s “Men, Machines, and the World About” and J.C.R. Licklider’s “Man-Computer Symbiosis.” Aside from the regrettable (but understandable) androcentric language, both essays are forward-looking, yet in different ways. Each of them understands that human history moves in the direction of greater complexity, especially in the accelerating streams of technological innovation and invention. (Wiener wrote a whole book on the subject of invention, one well worth reading, though it was not published until years after his death.) Both writers write about machines, systems, and human-machine interaction. Both writers emphasize that the computer is a new kind of machine. Wiener writes of a “logical machine” with feedback loops, and Licklider emphasizes the “routinizable, clerical” capabilities of the computer. Although neither one uses the magical phrase “universal machine” that Alan Turing uses, they both seem to understand that a difference in degree (speed, memory) can mean a difference in kind. Wiener also writes of “the machine whose taping [i.e., programming] is continually being modified by experience” and concludes that this kind of a machine “can, in some sense, learn.” Such machine learning, and research into its possibilities, is going on all around us today, and that pace too is accelerating. (Google Translate is but one example. Notice that it keeps getting better?)
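For readers who want Wiener’s learning machine in miniature, here is a sketch of my own devising (an anachronistic illustration, not anything Wiener built): a tiny online perceptron whose weights–its “taping”–are nudged by a feedback loop after every example it encounters.

```python
# A minimal machine "whose taping is continually being modified by
# experience": an online perceptron that adjusts its weights after
# every example, a feedback loop in Wiener's sense.
def train(examples, lr=0.1, passes=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(passes):
        for (x1, x2), target in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # feedback: outcome versus goal
            w[0] += lr * err * x1        # experience modifies the "taping"
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learning the logical AND function from experience alone.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
for (x1, x2), target in data:
    print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
```

The function it learns (logical AND) is trivial; the shape of the thing is the point. The program’s behavior is fixed not by its text alone but by the history of inputs it has seen–which is just what Wiener meant by a machine that “can, in some sense, learn.”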

Part of the experience computers learn from, of course, is our experience–that is, computers can be made and programmed so that they adapt to (learn from) our uses of them. It was hard to see this happening in the pre-Internet era. We could customize various things in DOS, and on the Macintosh, and on Windows (yes, even on Windows), but we didn’t have the feeling of the computer adapting to our uses. For that phenomenon to become truly visible, we needed the World Wide Web and cloud computing. (If you see an unidiomatic translation in Google Translate, click on the word, and Google Translate gives you the opportunity to teach it something.) The computer that learns from us most visibly is the computer formed of the decentralized, open, ubiquitous Internet, as that medium is harnessed by various entities. The most powerful application ever deployed on the Internet, the platform that enabled the macro-computer of the Internet to become visible and self-stimulating, is the World Wide Web.

Which leads me to my point, one already made more elegantly by Michael Wesch (see “The Machine is Us/ing Us”), Kevin Kelly, and Jon Udell, among many others. As we publish to the Web, purposefully and variously and creatively, we also make the Web. This is also true on the micro scale of personal computing, deeply considered, but we see the effects most powerfully at the macro scale of networked, interactive, personal computing enabled by the World Wide Web. The Web, freely given to the world by Tim Berners-Lee, is a metaplatform with the peculiar recursive phenomenon of unrolling before your eyes as you walk forward upon it. It is a world that appears in the very making–assuming, of course, that you are indeed a web maker and not simply a web user.

Wiener writes, “If we want to live with the machine, we must understand the machine, we must not worship the machine…. It is going to be a difficult time. If we can live through it and keep our heads, and if we are not annihilated by war itself and our other problems, there is a great chance of turning the machine to human advantage, but the machine itself has no particular favor for humanity.” If the machine is us, however, as Michael Wesch argues (and in the case of the machine of networked, interactive, personal computing on the World Wide Web, I agree), then Wiener’s statement reads like this:

If we want to live with ourselves, we must understand ourselves, we must not worship ourselves…. It is going to be a difficult time. If we can live through it and keep our heads, and if we are not annihilated by war itself and our other problems, there is a great chance of turning ourselves to human advantage, but we ourselves have no particular favor for humanity.

The idea of enlarging human capabilities should make us nervous, I suppose, but it’s a step forward to understand that that is what we’re thinking about, and that is what’s uniquely empowered and enlarged by interactive, networked, personal computing. From art to medicine to engineering to business and beyond, one capability we have and share, to an alarming and exhilarating extent, is a capability for enlarging our capabilities. Computers are an interesting manifestation of that capability, and a powerful means of using (exploiting, unleashing) that capability. As is education. (Schooling? Depends on the day and the school and the teacher.)

Once we understand that, deeply, we may turn to Poincaré’s observation, quoted by Licklider: “The question is not, ‘What is the answer?’ The question is, ‘What is the question?’”

Licklider dreamed of using computers to help humans “through an intuitively guided trial-and-error procedure” to formulate better questions. I am hopeful that awakening our digital imaginations will lead us to formulate better questions about our species’ inquiring nature and our very quest for understanding itself.

No way out but through

A-Bomb group leaders, via NY Times/Bettmann/Corbis

Last week’s NMFS here at Virginia Commonwealth University discussed Vannevar Bush’s epochal (and, in its way, epic) “As We May Think.” The essay truly marks a profound shift, appearing just as WWII was about to conclude with a display of horrific invention that still has the power to make one’s mind go blank with fear. From Resnais’ Hiroshima Mon Amour to a film that can still give me nightmares, The Day After, the mushroom cloud that signifies this invention hung over my childhood and adolescence–and I don’t expect it will ever go away. Now that we know how, there is no unknowing unless civilization erases itself.

But as myth, fiction, and science continue to demonstrate, each in its own way, there are thousands of demonstrations of the real problem to hand every day: human ingenuity. It’s easy to get distracted by the name “technology,” as if it’s what we make, rather than our role as makers, that’s to blame. But no, it’s the makers we should lament. Or celebrate. Or watchfully, painfully love.

The state of man does change and vary,
Now sound, now sick, now blyth, now sary,
Now dansand mirry, now like to die:
Timor mortis conturbat me.

William Dunbar, “Lament for the Makers”

What shall we do with these vexing, alarming, exhilarating abilities? We learn, we know, we symbolize. Sometimes we believe we understand. We find a huddling place. We explore, and share our stories.

Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems.

Vannevar Bush, “As We May Think”

For several iterations through the seminar, that word “presumably” leapt out at me, signalling a poignant, wary hope as well as a frank admission that all hope is a working assumption and can be nothing more. This time, however, the word “review” glows on the page. Re-view. Why look again? How can repetition make the blind see? Ever tried to find something hiding in plain sight? Ever felt the frustration of re-viewing with greater intensity, while feeling deep down that the fiercer looking merely amplifies the darkness? (Ever tried to proofread a paper?)

We console ourselves with the joke, attributed to Einstein, that the definition of insanity is to do the same thing again and again while expecting different results. Yet we hope that thinking, mindfully undertaken, may contradict that wry observation. We hope that thinking again can also mean thinking differently, that a re-view strengthened by a meta-view can yield more insight and bring us a better result than the initial view did. Look again. Think again. And, in Vannevar Bush’s dream of a future, a dream that empowered epochal making, looking again and thinking again would be enriched, not encumbered, by a memory extender, a “memex”:

[Man] has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory.

What is this experiment? When exactly did we sign the papers giving our informed consent to any such thing?

Our ingenuity is the experiment, the problem, the hope. Our birthright may also be our death warrant. Is that the logical conclusion?

Yet, in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome.

The word “science” signifies more than simply the methodological revolutions emerging in Renaissance Europe. For me, it signifies knowing. We in the humanities enact our own experiments in knowing, exerting our own ingenuity both constructively and destructively. We too are makers.

Re-view. Analyze more completely. “Encompass the great record and … grow in the wisdom of race [i.e., species] experience.” As we may think, and create and share “momentary stays against confusion.”

No way out but through.

Optimism

3. Hopefulness and confidence about the future or the successful outcome of something; a tendency to take a favourable or hopeful view. Contrasted with pessimism n. 2.

So the Oxford English Dictionary. I picked sense 3 because it seems most resilient in the face of abundant evidence that this is in fact NOT the best of all possible worlds (pace Leibniz, at least as he’s pilloried by Voltaire).

It seems to me that educators, no matter how skeptical their views (skepticism is necessary but not sufficient for an inquiring mind), are implicitly committed to optimism. Otherwise, why learn? and why teach?

Satan Overlooking Paradise

I think of this as I begin another semester thinking with faculty and staff across the university (last term Virginia Tech, this term Virginia Commonwealth University) about the possible good we could co-create, and derive, from interactive, networked, personal computing. To be pessimistic (not skeptical, pessimistic–they are not synonyms) about personal, networked, interactive computing is to be pessimistic not about an invention, but about invention itself–that is, about one of our most powerful distinctions as a species.

Computers have become woven into our lives in ways we can barely imagine, but the best dreams about the texture of such a world are hopeful, and stimulate hope. Are we there yet? Of course not. But to be pessimistic about computers is to be pessimistic about humanity. And while that’s certainly a defensible position generally speaking, it seems to me that education is an activity, a co-creation, a calling, that runs clean counter to pessimism.

Last week in the seminar we read Janet Murray’s stirring introduction to The New Media Reader. Around the table: a colleague from the School of Dentistry. A colleague from the library. A colleague from the Center for Teaching Excellence. Colleagues from University College. And more. Once again, I read these words:

We are drawn to a new medium of representation because we are pattern makers who are thinking beyond our old tools. We cannot rewind our collective cognitive effort, since the digital medium is as much a pattern of thinking and perceiving as it is a pattern of making things.

Indeed–yet this is not to deny the meta level at which we consider our consideration, and think about our blind spots so we can find more light:

We are drawn to this medium because we need to understand the world and our place in it.

Yes–and now the world we need to understand is also a world transformed, for good and for ill but potentially for good, why not?, by the medium itself. Recursive, yes–but more deeply, a paradox, not an infinite regress. That’s the hope, anyway. And educators are committed to hope.

To return to [Vannevar] Bush’s speculations: now that we have shaped this new medium of expression, how may we think? We may, if we are lucky and mindful enough, learn to think together by building shared structures of meaning.

That mindfulness is the meta level. I am optimistic about that meta level. As a learner, I have to be. If mindfulness is impossible, then it’s truly turtles all the way down, and who would care?

How will we escape the labyrinth of deconstructed ideologies and self-reflective signs? We will, if we are lucky enough and mindful enough, invent communities of communication at the widest possible bandwidth and smallest possible granularity.

Lucky, and mindful. Chance favors the mindful mind.

We need not imagine ourselves stranded somewhere over the evolutionary horizon, separated from our species by the power of our own thinking.

Or separated from our history, or from our loved ones–though clearly Hamlet (to name only one) demonstrates that mindfulness alone is no guarantee of anything. But what is on the other side of the horizon? What do we find when we return to the place we left and see it for the first time?

The machine like the book and the painting and the symphony and the photograph is made in our own image, and reflects it back again.

To which I would add: the syntax and punctuation in Murray’s sentence above enact the pulses of ways we may think. Those pulses and the ways they enact are poetry. What more complex shared structure of meaning is there?–unless it’s true that all art aspires to the condition of music. Poetry “begins in delight and ends in wisdom,” Frost writes. He continues: “the figure is the same as for love.” Can the shared structures of meaning emerging from our species’ collective cognitive effort begin in delight and end in wisdom, too? Can the figure our collective cognitive efforts make be the same as for love? I think: I hope so. I think: it better be. I think: how can I try to help? The seminar is one answer, a crux of hopes, the discovery of an invisible republic of optimism.

The task is the same now as it ever has been, familiar, thrilling, unavoidable: we work with all our myriad talents to expand our media of expression to the full measure of our humanity.

And by doing so, that measure increases. May we use that abundance wisely, fairly, and lovingly within this mean old brave new world.

With luck and mindfulness, I am hopeful that we can.

for my Jewish mother, Dr. Janet Murray, with love and deepest gratitude

So let’s recap

Soaring into the eye of the gods

In mid-March I got an email telling me I was nominated in the search for a senior leadership position at Virginia Commonwealth University: Vice Provost for Learning Innovation and Student Success. I was intrigued. I looked at the leadership profile. I was mightily interested. Can’t hurt to apply, I said to myself. So I did.

The hectic, rewarding pace of life went on. Janet Murray came to VT as the third Distinguished Innovator in Residence. (Very exciting.) The Center for Innovation in Learning prepared its first call for Innovation Grant proposals. (Ditto above.) Learning Technologies began its metamorphosis into Technology-enhanced Learning and Online Strategies. I traveled to Richmond, Boston, Bethlehem (Pennsylvania), and in June, to Rome (Italy) for conference presentations and faculty seminars. And to my wonder and delight, my candidacy continued to advance in the VCU search.

On June 7, as I sat in my lodgings in Barcelona, I spoke with VCU’s Provost, Dr. Beverly Warren, who offered me the job. As a literature scholar, I am of course duty-bound to tell you that the last time I was in Barcelona, in October of 2010, I was offered the Virginia Tech job. I guess Barcelona is my lucky town in the narrative of my professional life. (No one who’s been there will be in the least surprised.)

On June 24, back in the States, I signed the contract.

On August 1, I formally began my work, though I’d been ramping up at VCU and ramping down at VT since my return to the US. On this same day, my wife and I closed on our new home in Richmond.

Oh, and the conference in Rome was wonderful, far beyond my already-high expectations. The city and country were also pretty stupendous (litotes alert). As was Spain the week before, as was England the week before that. A summer of summers.

And sadly, the cloud over the trip was the death of my beloved mother-in-law on the same day that her youngest daughter, my wife, arrived in Madrid to join me in my travels. That grieving continues. If my experience with my parents at their passing is any guide, one learns to live with death, but one never gets over it.

I guess I’m a little behind in my blogging. Perhaps you can see why? The problem seems to be time, but it isn’t really. Time has become extremely compressed, yes, and spare time has become a vanishing commodity. My perception of time many days borders on the surreal as I adjust to the scale, scope, pace, and challenges of the new job–all very exciting, all very welcome, and all very demanding. Yet the real problem is, as ever, too much to say.

Time to write anyway. Not that I’ve been idle in that department, but I have been silent in this space, and I miss it. I did get 6000+ words done in an article on temptation in Paradise Lost, however–turns out I miss that kind of writing, too. Yes, Gardner writes, even if you haven’t seen it here for several months. Time to write anyway.