A first try at reading T. S. Eliot’s Ash Wednesday. I’ll try to do better next time, but for now this will have to do.
From The Oxford English Dictionary:
1. a. The act or practice of being hospitable; the reception and entertainment of guests, visitors, or strangers, with liberality and goodwill.
A quick dive into the etymology of “hospitality” reveals that it grows from the Latin word meaning “host.” A host, the OED helpfully explains, is “the correlative to guest.”
From The New Testament, Hebrews 13:2, slightly altered:
Be not forgetful to entertain [i.e., show hospitality to] strangers: for thereby some have entertained seekers unawares.
I’d have quoted a more recent translation–I’m no “AV-olater”–but the prose was so clumsy in the versions I consulted that I just couldn’t bring myself to do it.
From Elizabeth A. Wilson, Affect & Artificial Intelligence (University of Washington Press, 2010 — with thanks to Christine Labuski at Virginia Tech for alerting me to the book):
The following chapters present an empathic engagement with these eccentric tendencies toward artificial feeling. Readers may find themselves disappointed by a methodology that is not critical enough…. I want to be hospitable to my material. This orientation derives, in part from my immersion in the archives: the marginalia of these researchers’ lives and the density of their intellectual and emotional interconnections have generated an attachment to them governed more by curiosity and care than by cynicism. This orientation also derives from a very strongly felt intellectual conviction that engagements of an empathic kind can be immensely, uniquely effective. If that makes my analysis seem too credulous it’s a small price to pay to divert myself and my readers from the dogged approaches to critique that have become our stock in trade. On occasion I will insert a footnote to direct readers to more conventional critiques of these projects; and at times I will examine in some depth where I think AI [artificial intelligence] researchers have gone astray. In general, however, it is my intention to build a critical engagement with these men and their ambitions and their moods and their machines that is based in secure, rather than avoidant or ambivalent, attachment.
Wilson then quotes what she calls Bruno Latour’s “fiery rhetoric about the dissipation of critique.” Latour writes,
The critic is not the one who debunks, but the one who assembles. The critic is not the one who lifts the rug from under the feet of the naive believers, but the one who offers the participants arenas in which to gather. The critic is not the one who alternates haphazardly between antifetishism and positivism like the drunk iconoclast drawn by Goya, but the one for whom, if something is constructed, then it means it is fragile and thus in great need of care and caution. (Bruno Latour, “Why Has Critique Run Out Of Steam? From Matters Of Fact To Matters Of Concern,” Critical Inquiry 30, Winter 2004)
After which Wilson notes, “Latour pleads for a mode of critique that trades in multiplication rather than subtraction and scorn.”
The word scorn is exactly right, I think. “[A] Common Romanic word of Germanic origin,” the OED says, tracing the word from 1200 to the present across many languages, all of them using this word–harsh to the point of onomatopoeia–to indicate mockery, derision, contempt. As a verb, scorn carries an even stronger sense of an attitude toward an inferior: “[t]o feel it beneath one, to disdain indignantly to do something,” the dictionary relates. “Jeer” is a synonym, as is “despise.”
When Jon Udell went to his alma mater, the University of Michigan, to urge academics there to narrate their work by blogging on the open web, he met with considerable resistance from those who couldn’t imagine putting what they called “half-baked ideas” onto the Web for all to see. Compare that response with Michael Nielsen’s poignant story in his Reinventing Discovery: The Rise Of Networked Science:
A friend of mine who was fortunate enough to attend Princeton University once told me that the best thing about attending Princeton wasn’t the classes, or even the classmates he met. Rather, it was meeting some of the extraordinarily accomplished professors, and realizing that they were just people–people who sometimes got upset over trivial things, or who made silly jokes, or who made boneheaded mistakes, or who had faced great challenges in their life, and who somehow, despite their faults and challenges, very occasionally managed to do something extraordinary. “If they can do it, I can do it too” was the most important lesson my friend learned.
What’s important then is that blogs make it possible for anyone with an internet connection to get an informal, rapid-fire glimpse into the minds of many of the world’s scientists…. It’s not just the scientific content that matters, it’s the culture that is revealed, a particular way of viewing the world. This view of the world can take many forms…. The content ranges widely, but as you read, a pattern starts to take shape: you start to understand at least a little about how an experimental physicist views the world, what [he or she] thinks is funny, what [he or she] thinks is important, what [he or she] finds irritating. You may not necessarily agree with this view of the world, or completely understand it, but it’s interesting and transformative nonetheless. Exposure to this view of the world has always been possible if you live in one of the world’s intellectual capitals, places such as Boston, Cambridge, and Paris. Many blog readers no doubt live in such intellectual centers. But you also routinely see comments on the blog from people who live outside the intellectual centers. I grew up in a big city (Brisbane) in Australia. Compared to most of the world’s population, I had a youth of intellectual privilege. And yet the first time in my life that I heard a scientist speaking informally was when I was 16. It changed my life.
Who are the academics willing to be present in this way? Academia reserves a particular scorn for the informal expression of one’s thoughts, which has among many other dreadful results a terrible chilling effect upon many professors’ willingness to reveal themselves as learners, let alone seekers, in front of their students.
Who will dare to be a learner, a seeker, a yearner in the face of such hostility? Many academics insist that the withering, contemptuous varieties of critique are like a refiner’s fire, purging dross from gold, making our arguments more rigorous and our conclusions more sound. I wonder. Others wonder as well: see, in addition to Wilson and Latour, Alex Reid’s notes on the ethics and limitations of “critique.” Be sure to read the comment stream that follows, in which hospitality emerges after a great and admirable struggle to walk away from the more usual paths of academic exchange.
One of the things I enjoyed most about my first days in the world of “edtech” (for want of a better word) was the hospitality I was shown by folks who’d been blogging, using wikis, exploring mobile technologies, and working within virtual worlds for a long time before I arrived on the scene (conference debut as an attender: MIT, fall 2003, AAC&U topical meeting). That hospitality was challenged rather severely when the conversation became fragmented over edupunk. My own capacity for hospitality shrank, which I truly regret, but hospitality and punk seemed so antithetical (still do, to me) that I was mostly just torn up inside. I recognized the urgent need for reform, but I couldn’t see my way to these means, though of course I worried that too might be a failure, either of hospitality or of outrage. Hospitality has also been strained in the current storm over MOOCs and the future of higher education. Ironically, this is the very moment in which we need to grow and employ more hospitality than ever, if only to demonstrate the value of carving out a space for college, the place where colleagues share an intellectual devotion that is hospitable even to sharply defined, but collegially (even cordially) expressed disagreements.
I keep hearing Elizabeth Wilson’s voice: I want to be hospitable to my material. More curiosity and care, less cynicism. It’s a poignant insistence, given the challenges she evidently anticipates in response. It’s a necessary insistence, I believe, given the openness and zestful curiosity that we need to entertain each other, and our conjectures and dilemmas, in community. Henri Nouwen reminds us that “the time has come to realize that neither parents nor teachers nor counsellors can do much more than offer a free and friendly place where one has to discover his own lonely way” (Reaching Out, special edition, 2006). Not much more than that, perhaps, but not any less than that, either.
In response to my last post, my friend Louis sent me a link to a marvelous exploration of hospitality by Leonard Cohen. Louis is patiently opening the world of this artist to me–and doing so very hospitably. In the comment below the clip, Louis writes,
Here’s a transcript of Cohen’s interview comment on the song, so it’s easier to follow: “I think that kind of imagery can be discovered all through the literature. The Persian poet Rumi [13th century] uses the idea of the guests a lot. The festival, the feast and the guests. It’s almost impossible to talk about that seed moment of when a song begins. It could be the soul comes into world. There is some notion the soul has that there is a feast, that there is a festival, that there is a banquet. It strives to experience the hospitality of the world. It doesn’t achieve it. It feels lonely, this is everybody’s experience. It feels lost. It stumbles around on the outskirts of the party. If the striving is deep enough or if the Grace of the host is turned towards the seeking guest, then suddenly the inner door flies open and he finds himself, or the soul finds himself, at that banquet table. Although no one knows where the night is going, no one knows why the wine is flowing. No one actually understands the mechanics of this grace except that we experience it from time to time.” It’s a clip from a 1979 documentary called The Song of Leonard Cohen.
It’s a good place to pause for now.
I’m going to try to get started on some thoughts that have been weighing pretty heavily on me over the last couple of months. The four nouns in this post’s title are markers of the general direction. In fact, here’s the executive summary, a genre I’ve been getting more practice in of late:
The process of discovery thrives not only in establishing facts, but in lively, interest-provoking encounters with the conjectures and dilemmas that inform human inquiry.
Each academic discipline changes and reframes itself not only on the basis of new knowledge, but on the basis of those conjectures and dilemmas, themselves the result of a process of discovery not unlike that which uncovers, discovers, and constructs knowledge. Problem finding, every bit as important as problem solving, requires respect and eager openness toward conjectures and dilemmas above all, since here we find the thinker, the researcher, the experimenter, the practitioner most vulnerable, most tentative (i.e., most engaged in trying, in the essay of the thing). Our necessary practice of problem finding, of uncovering and sharing conjectures and dilemmas, thus requires a sense of hospitality toward real questions and toward those who advance them. While claims advanced as fact benefit from critique and challenge, the core of conjectures and dilemmas in our shared endeavors suggests that hospitality must also be present if we want to encourage inquiry. This hospitality also requires a certain humility, which is not self-abasement so much as a zestful yearning for problem-finding in community, a shared appetite for conjectures and dilemmas, not so that action will not be taken–far from it–but so that neither argument nor action ossifies into authoritarianism. (Authority and authoritarianism are two different things, I believe.)
I’ll start with Jerome Bruner, from whom I have learned so much over the last fifteen years, ever since the Roving Librarian brought home a book called The Culture of Education. I’ve returned recently to one of my favorite Bruner volumes, Toward A Theory Of Instruction, because of a lovely moment in Bret Victor’s notes on his “Future of Programming” talk. At the very end, Bret refers to his own reading of Bruner, and to Alan Kay’s advice:
Lastly, here’s some advice Alan Kay gave me (as I was going through a small personal crisis as a result of reading Jerome Bruner’s “Toward a Theory of Instruction”):
I think the trick with knowledge is to “acquire it, and forget all except the perfume” — because it is noisy and sometimes drowns out one’s own “brain voices”. The perfume part is important because it will help find the knowledge again to help get to the destinations the inner urges pick.
So following my nose and my inner urges, I returned to Bruner and found again these words:
For the question opens up the deep issues of what might be and why it isn’t…. It is such conjecture … that produces rational, self consciously problem-finding behavior so crucial to the growth of intellectual power…. I would say that while a body of knowledge is given life and direction by the conjectures and dilemmas that brought it into being and sustained its growth, pupils who are being taught often do not have a corresponding sense of conjecture and dilemma. The task of the curriculum maker and teacher is to provide exercises and occasions for its nurturing. If one only thinks of materials and content, one can all too easily overlook the problem…. The answer is the design of exercises in conjecture, in ways of inquiry, in problem finding. It is something that the good teacher does naturally at least some of the time. (From “A Retrospect On Making And Judgment,” in Toward A Theory Of Instruction).
This of course is the very opposite of what I call the bad Sunday School technique of education, in which the teacher poses an essentially rhetorical question and waits until the good student produces the foregone conclusion: “because he loves us.”
Reading Bruner’s words again, I am reminded of what feeds curiosity and interest, according to researchers: novelty (things that fall between categories), uncertainty, conflict, complexity. I note that “clarity” doesn’t make the cut. Of course clarity is important, but without conjectures and dilemmas, “clarity” can easily become “certainty,” interest will become compliance, and as surely as night follows day, students will do what the rubrics say to do, for that is what the teacher wants.
I think with particular unease of the moment at the recent AAC&U meeting in which a panelist, understandably frustrated by the thin quality of student reflections in one of his classes, spoke out forcefully on his plans to double down on the instructions and tell the students exactly what he meant by reflection. I believe he was sincere in his desire to improve their learning, but I worry that his plan of action will result in compliance of the most persuasive sort, the kind that allows everyone to say that the mission has been accomplished when exactly the opposite has happened.
For it seems to me that we are tempted to imagine reflection as a process of discovering and affirming lessons learned and problems solved, when anyone who has spent a moment in reflection will realize, I believe, that the depths of that practice awaken conjectures and dilemmas. (Reflection is neither capstone nor cornerstone.) Too much of school teaches learners to fear or mask conjectures and dilemmas. My students tell me they raise their hands when they have answers, not when they have questions.
I continue to think about hospitality as the great and charitable soil of conjectures and dilemmas. Those who entertain conjectures and dilemmas might also be called seekers.
On December 9, 2013, Doug Engelbart and his work were honored in a memorial gathering at the Computer History Museum in Mountain View, California. The tributes, with a panel discussion following, are up on YouTube. If you’re at all curious about Doug’s vision and the legacy it offers us, I urge you to watch the video. It’s two hours very well spent.
There are several challenges in the video, not just one, but the one I want to highlight here comes from the tribute by Elizabeth “Jake” Feinler, who worked with Doug for several years at the Augmentation Research Center (ARC) and went on to become the founding director of ARPANET’s Network Information Center. Her tribute starts at 30:12 into the video.
Ms. Feinler had many great memories of Doug and ARC, but the part that resonated most deeply with me came in words she quoted from Doug himself. The words illustrated Feinler’s experience of ARC as well as her intense admiration for Doug’s vision and humanity. Here’s what Feinler read, from an essay titled “Working Together,” written by Doug and Harvey Lehtman (also of ARC) and published in the December 1988 issue of Byte magazine.
We thought that success in tools for collaborative knowledge work was essential to the necessary evolution of work groups in increasingly knowledge-rich societies and to increasing organizational effectiveness. Until the recent growing interest in CSCW [computer supported collaborative work], most developers limited their analyses to technical issues and ignored the social and organizational implications of the introduction of their tools; such considerations were, however, key to our work.
There is growing recognition that some of the barriers to acceptance of fully integrated systems for augmenting groups of knowledge workers may be more significantly social, not solely technical. The availability of rapidly evolving new technologies implies the need for concomitant evolution in the ways in which work is done in local and geographically distributed groups.
ARC [the Augmentation Research Center] experienced this phenomenon continuously. The bootstrapping approach, so important to the continuing evolution of the system, caused us to constantly undercut our world: As soon as we became used to ways of doing things, we replaced platforms to which we were just becoming accustomed. We needed to learn new roles, change attitudes, and adopt different methods because of growth in the technical system we ourselves produced.
We brought in psychologists and social scientists to serve as observers and facilitators. They were as important to our team as the hardware and software developers. The resistance to change, which we soon realized was an essential part of introducing new technologies into established organizational settings, and the psychological and organizational tensions created by that resistance were apparent in ourselves. We were required to observe ourselves in order to create appropriate methodologies and procedures to go along with our evolving computer technologies. [my emphases]
This language, shifted only slightly, applies equally well to the process of education itself. True learning generates both increasing complexity and, at a meta level, an increasing awareness of the nature and potential uses of that complexity–i.e. strategies of mindfulness. A university is an augmentation research center, is it not? And yet how much time do we spend in fruitful self-observation, in bootstrapping ourselves into higher levels of mindfulness and invention despite the fact that by doing so we inevitably, constantly “undercut our world”? “World” means not the planet or civilization, but the structures and organizations that inevitably choke new growth. These “worlds” must serve the values we profess, not the other way around. These “worlds” and ourselves as their architects and inhabitants must evolve and grow even as we struggle to keep up with the change we have set in motion. “Undercut” is one way to acknowledge the struggle–but “reinvent” is also apt, for it points to the essential goals of learning.
What is the alternative? Bureaucratic self-defense? Does the world need more lessons in that?
2014 is likely to be a full-on year of Engelbart activity for me. The cMOOC I’ll be teaching with Jon Becker (with the able disruptive ingenuity of Tom Woodward) will explore topics central to Doug’s vision and work. There will be at least two Engelbart Scholars among the VCU students who take that course (more details coming soon). And as always, I will be doing my level best to bring Doug’s ideas and the ongoing work of the Engelbart Institute to the conversation about networked learning, wherever I can find it (or it can find me).
First, the birthdays.
Happy birthday to the author I’ve studied and delighted over for the last thirty-three years: John Milton, born 1608.
I never imagined I’d spend my life reading and thinking and writing about this writer. Just goes to show. (Show what? I’ll leave that as an exercise for the reader.)
Happy birthday wishes also go out to Rear Admiral Grace Hopper, the mother of COBOL, a fountain of wit and wisdom, and a pioneering genius of computer science. I first learned about Admiral Hopper from Dr. David Evans’ Udacity course CS101. (Yes, Udacity. It just goes to show.) Dr. Evans linked to her famous interview with David Letterman, and I was an instant fan.
The anniversary: 45 years ago today, Dr. Douglas Engelbart sat on a stage in San Francisco and, according to one awestruck observer, “dealt lightning with both hands.” The event has come to be known as “the mother of all demos.” There’s a very nice remembrance of Doug and his demo in The Atlantic today. I know there’s also a memorial happening right about now in Mountain View, as his daughter Christina and many of Doug’s family, friends, and admirers are gathered to remember the demo and Doug, who passed away this year on July 5.
I talked to Doug for about an hour, back in 2006. I met him and shook his hand in 2008 on the night before the 40th anniversary of the mother of all demos. I am humbled to be working with Christina on a project for this summer and beyond. I am so very grateful to be linked in spirit and work with Doug’s vision. When there are dark or confusing days, I try to remember how lucky I am to have found that vision, and to have thanked that visionary, while he was still with us.
Here’s the first part of the mother of all demos:
And here’s a version my wonderful student Phillip Heinrich did for a final project in my second-ever “Introduction to New Media Studies” class, what eventually became “From Memex To YouTube: Cognition, Learning, and the Internet” (and will have another morphing this summer at VCU and worldwide–watch this space):
Phillip’s work is a conceptual mashup of Doug’s demo and Michael Wesch’s “The Machine Is Us/ing Us.” Even four years later, Phillip’s work still dazzles. Apparently Doug himself saw it at one point, which makes me very joyful.
I write these words from a hotel room in Atlanta, where I’m attending the annual meeting of the Southern Association of Colleges and Schools Commission on Colleges. I’ve heard some inspiring speakers and learned a great deal more about the vast machinery of higher education. At the same time, I’ve seen many folks whose eyes are on fire with a passionate devotion to learning and teaching. I honor them, and salute their survival despite the vast machinery that exists, in part and sometimes ironically, to support them and their vocations.
The lament is for the ways in which the notion of “technology” that surrounds me here is untouched by the vision of either Grace Hopper or Doug Engelbart. When I hear a presenter say that a survey couldn’t include questions about “technology” as part of its core because “technology changes so rapidly,” I groan inwardly. In addition to the (typically) underthought use of the word “technology,” the speaker obviously has confused computing devices with computing. In the latter sense, “technology” has not changed substantially since the introduction of networked, interactive, personal computing, with the possible exception of mobile computing. But the confusion here keeps “technology” questions in a different survey “module,” and keeps educators from learning or even asking what they don’t know. (And eventually we all suffer.)
Similarly, when I hear another presenter say “we didn’t know technology would eliminate jobs the way it has,” then offer a list of “technology improvements” for the organization that include new computers and monitors, new office software, etc., I have to gnash my teeth (quietly, but still). How can we be in 2013 and still be so far removed from even the outer edges of the bright light shed by the visions of Hopper and Engelbart, among many others? How can we call ourselves educators and be content not only to remain in darkness, but to spread it through inaction and (I’m sorry, but it must be said) ignorance?
More than once at this conference I’ve heard presenters talk about “technology” in the same breath that they lament how old they are and how strange youth culture seems to them. Sometimes the lament is mingled with a little of that “kids, get off my lawn” curmudgeonliness. We all get to be a little prickly as we age, I guess, but methinks we do protest too much. Doug Engelbart and Grace Hopper didn’t surrender their visions as age overtook them. We do ourselves and our students no good service to remain in the shallows we have created for ourselves, the shallows we continue to excuse and extend. As Janet Murray writes, “When will we recognize the gift for what it is…?” Or as Doug Engelbart asked on that San Francisco stage forty-five years ago today:
If, in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive to every action you had, how much value could you derive from that?
Both Murray’s and Engelbart’s questions remain unanswered, and that itself is worth lamenting. The real grief comes, for me, because the questions are almost never asked, even among those who pride themselves on the arts of inquiry.
A sad case, but there is still hope. My students have taught me that.
Very salutary readings for a rainy Sunday morning at the SACS-COC conference in Atlanta, Georgia. This is the first time I’ve attended this annual meeting. Higher education is my vocation, so you wouldn’t think I’d have culture shock here–but I find I do. Perhaps that’s a first-timer’s gift. I must practice gratitude!
Here are some of Merton’s thoughts. These come from a man who had been educated in France, England (where he studied at Cambridge), and the US (graduating with an MA from Columbia University). For a short time, he was a professor of English at St. Bonaventure. So he knows whereof he speaks.
“The danger of education, I have found, is that it so easily confuses means with ends. Worse than that, it quite easily forgets both and devotes itself merely to the mass production of uneducated graduates–people literally unfit for anything except to take part in an elaborate and completely artificial charade which they and their contemporaries have conspired to call ‘life’.”
“The least of the work of learning is done in classrooms.”
“Anyone who regards love as a deal made on the basis of ‘needs’ is in danger of falling into a purely quantitative ethic. If love is a deal, then who is to say that you should not make as many deals as possible?” [One can substitute "learning" for "love" and reach the same conclusion.]
“[A publisher asked me to write something on 'The Secret of Success,' and I refused.] If I had a message to my contemporaries, I said, it was surely this: Be anything you like, be madmen, drunks, and bastards of every shape and form, but at all costs avoid one thing: success. … If you have learned only how to be a success, your life has probably been wasted. If a university concentrates on producing successful people, it is lamentably failing in its obligation to society and to the students themselves.” [Particularly bracing words given the buzz here--and in my own title at work!--regarding "student success." Who would wish that our students would fail? Yet too narrow a view of success may be the most insidious route to failure of them all.]
And finally, in words that I would love to see above every classroom door and on the cover of every learning-related conference (my editorial material is clumsy but I want to present Merton generously):
“The purpose of education is to show a person how to define himself [or herself] authentically and spontaneously in relation to his [or her] world–not to impose a prefabricated definition of the world, still less an arbitrary definition of the individual himself [or herself].”
Source: Love and Living.
h/t @rovinglibrarian, @graceiseverywhere
Last week’s NMFS here at Virginia Commonwealth University discussed Vannevar Bush’s epochal (and, in its way, epic) “As We May Think.” The essay truly marks a profound shift, appearing just as WWII was about to conclude with a display of horrific invention that still has the power to make one’s mind go blank with fear. From Resnais’ Hiroshima Mon Amour to a film that can still give me nightmares, The Day After, the mushroom cloud that signifies this invention hung over my childhood and adolescence–and I don’t expect it will ever go away. Now that we know how, there is no unknowing unless civilization erases itself.
But as myth, fiction, and science continue to demonstrate, each in its own way, there are thousands of demonstrations of the real problem to hand every day: human ingenuity. It’s easy to get distracted by the name “technology,” as if it’s what we make, rather than our role as makers, that’s to blame. But no, it’s the makers we should lament. Or celebrate. Or watchfully, painfully love.
The state of man does change and vary,
Now sound, now sick, now blyth, now sary,
Now dansand mirry, now like to die:
Timor mortis conturbat me.
William Dunbar, “Lament for the Makers”
What shall we do with these vexing, alarming, exhilarating abilities? We learn, we know, we symbolize. Sometimes we believe we understand. We find a huddling place. We explore, and share our stories.
Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems.
Vannevar Bush, “As We May Think”
For several iterations through the seminar, that word “presumably” leapt out at me, signalling a poignant, wary hope as well as a frank admission that all hope is a working assumption and can be nothing more. This time, however, the word “review” glows on the page. Re-view. Why look again? How can repetition make the blind to see? Ever tried to find something hiding in plain sight? Ever felt the frustration of re-viewing with greater intensity, while feeling deep down that the fiercer looking merely amplifies the darkness? (Ever tried to proofread a paper?)
We console ourselves with the joke, attributed to Einstein, that the definition of insanity is to do the same thing again and again while expecting different results. Yet we hope that thinking, mindfully undertaken, may contradict that wry observation. We hope that thinking again can also mean thinking differently, that a re-view strengthened by a meta-view can yield more insight and bring us a better result than the initial view did. Look again. Think again. And, in Vannevar Bush’s dream of a future, a dream that empowered epochal making, looking again and thinking again would be enriched, not encumbered, by a memory extender, a “memex”:
[Man] has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory.
What is this experiment? When exactly did we sign the papers giving our informed consent to any such thing?
Our ingenuity is the experiment, the problem, the hope. Our birthright may also be our death warrant. Is that the logical conclusion?
Yet, in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome.
The word “science” signifies more than simply the methodological revolutions emerging in Renaissance Europe. For me, it signifies knowing. We in the humanities enact our own experiments in knowing, exerting our own ingenuity both constructively and destructively. We too are makers.
Re-view. Analyze more completely. “Encompass the great record and … grow in the wisdom of race [i.e., species] experience.” As we may think, and create and share “momentary stays against confusion.”
No way out but through.
3. Hopefulness and confidence about the future or the successful outcome of something; a tendency to take a favourable or hopeful view. Contrasted with pessimism n. 2.
So the Oxford English Dictionary. I picked sense 3 because it seems most resilient in the face of abundant evidence that this is in fact NOT the best of all possible worlds (pace Leibniz, at least as he’s pilloried by Voltaire).
It seems to me that educators, no matter how skeptical their views (skepticism is necessary but not sufficient for an inquiring mind), are implicitly committed to optimism. Otherwise, why learn? And why teach?
I think of this as I begin another semester thinking with faculty and staff across the university (last term Virginia Tech, this term Virginia Commonwealth University) about the possible good we could co-create, and derive, from interactive, networked, personal computing. To be pessimistic (not skeptical, pessimistic–they are not synonyms) about personal, networked, interactive computing is to be pessimistic not about an invention, but about invention itself–that is, about one of our most powerful distinctions as a species.
Computers have become woven into our lives in ways we can barely imagine, but the best dreams about the texture of such a world are hopeful, and stimulate hope. Are we there yet? Of course not. But to be pessimistic about computers is to be pessimistic about humanity. And while that’s certainly a defensible position generally speaking, it seems to me that education is an activity, a co-creation, a calling, that runs clean counter to pessimism.
Last week in the seminar we read Janet Murray’s stirring introduction to The New Media Reader. A colleague from the School of Dentistry. A colleague from the library. A colleague from the Center for Teaching Excellence. Colleagues from University College. And more. Once again, I read these words:
We are drawn to a new medium of representation because we are pattern makers who are thinking beyond our old tools. We cannot rewind our collective cognitive effort, since the digital medium is as much a pattern of thinking and perceiving as it is a pattern of making things.
Indeed–yet this is not to deny the meta level at which we consider our consideration, and think about our blind spots so we can find more light:
We are drawn to this medium because we need to understand the world and our place in it.
Yes–and now the world we need to understand is also a world transformed, for good and for ill but potentially for good, why not?, by the medium itself. Recursive, yes–but more deeply, a paradox, not an infinite regress. That’s the hope, anyway. And educators are committed to hope.
To return to [Vannevar] Bush’s speculations: now that we have shaped this new medium of expression, how may we think? We may, if we are lucky and mindful enough, learn to think together by building shared structures of meaning.
That mindfulness is the meta level. I am optimistic about that meta level. As a learner, I have to be. If mindfulness is impossible, then it’s truly turtles all the way down, and who would care?
How will we escape the labyrinth of deconstructed ideologies and self-reflective signs? We will, if we are lucky enough and mindful enough, invent communities of communication at the widest possible bandwidth and smallest possible granularity.
Lucky, and mindful. Chance favors the mindful mind.
We need not imagine ourselves stranded somewhere over the evolutionary horizon, separated from our species by the power of our own thinking.
Or separated from our history, or from our loved ones–though clearly Hamlet (to name only one) demonstrates that mindfulness alone is no guarantee of anything. But what is on the other side of the horizon? What do we find when we return to the place we left and see it for the first time?
The machine like the book and the painting and the symphony and the photograph is made in our own image, and reflects it back again.
To which I would add: the syntax and punctuation in Murray’s sentence above enact the pulses of ways we may think. Those pulses and the ways they enact are poetry. What more complex shared structure of meaning is there?–unless it’s true that all art aspires to the condition of music. Poetry “begins in delight and ends in wisdom,” Frost writes. He continues: “the figure is the same as for love.” Can the shared structures of meaning emerging from our species’ collective cognitive effort begin in delight and end in wisdom, too? Can the figure our collective cognitive efforts make be the same as for love? I think: I hope so. I think: it better be. I think: how can I try to help? The seminar is one answer, a crux of hopes, the discovery of an invisible republic of optimism.
The task is the same now as it ever has been, familiar, thrilling, unavoidable: we work with all our myriad talents to expand our media of expression to the full measure of our humanity.
And by doing so, that measure increases. May we use that abundance wisely, fairly, and lovingly within this mean old brave new world.
With luck and mindfulness, I am hopeful that we can.
for my Jewish mother, Dr. Janet Murray, with love and deepest gratitude
In mid-March I got an email telling me I was nominated in the search for a senior leadership position at Virginia Commonwealth University: Vice Provost for Learning Innovation and Student Success. I was intrigued. I looked at the leadership profile. I was mightily interested. Can’t hurt to apply, I said to myself. So I did.
The hectic, rewarding pace of life went on. Janet Murray came to VT as the third Distinguished Innovator in Residence. (Very exciting.) The Center for Innovation in Learning prepared its first call for Innovation Grant proposals. (Ditto above.) Learning Technologies began its metamorphosis into Technology-enhanced Learning and Online Strategies. I traveled to Richmond, Boston, Bethlehem (Pennsylvania), and in June, to Rome (Italy) for conference presentations and faculty seminars. And to my wonder and delight, my candidacy continued to advance in the VCU search.
On June 7, as I sat in my lodgings in Barcelona, I spoke with VCU’s Provost, Dr. Beverly Warren, who offered me the job. As a literature scholar, I am of course duty-bound to tell you that the last time I was in Barcelona, in October of 2010, I was offered the Virginia Tech job. I guess Barcelona is my lucky town in the narrative of my professional life. (No one who’s been there will be in the least surprised.)
On June 24, back in the States, I signed the contract.
On August 1, I formally began my work, though I’d been ramping up at VCU and ramping down at VT since my return to the US. On this same day, my wife and I closed on our new home in Richmond.
Oh, and the conference in Rome was wonderful, far beyond my already-high expectations. The city and country were also pretty stupendous (litotes alert). As was Spain the week before, as was England the week before that. A summer of summers.
And sadly, the cloud over the trip was the death of my beloved mother-in-law on the same day that her youngest daughter, my wife, arrived in Madrid to join me in my travels. That grieving continues. If my experience with my parents at their passing is any guide, one learns to live with death, but one never gets over it.
I guess I’m a little behind in my blogging. Perhaps you can see why? The problem seems to be time, but it isn’t really. Time has become extremely compressed, yes, and spare time has become a vanishing commodity. My perception of time many days borders on the surreal as I adjust to the scale, scope, pace, and challenges of the new job–all very exciting, all very welcome, and all very demanding. Yet the real problem is, as ever, too much to say.
Time to write anyway. Not that I’ve been idle in that department, but I have been silent in this space, and I miss it. I did get 6000+ words done in an article on temptation in Paradise Lost, however–turns out I miss that kind of writing, too. Yes, Gardner writes, even if you haven’t seen it here for several months. Time to write anyway.
“Plight” is an interesting word. We are in a plight, meaning we’re in a tangle, a mess, a terrible fix, with “fix” itself an ironic noun in this context. Yet we also plight our troth, meaning “pledge our truth.” Plight-as-peril and plight-as-pledge both come from an earlier word meaning “care” or “responsibility” or (my favorite from the Oxford English Dictionary) “to be in the habit of doing.” Along a different etymological path, we arrive at the word meaning to braid or weave together. The word “plait” is a variant that makes this meaning more explicit. It’s not too far into poet’s corner before weaving, promising, and care-as-a-plight become entangled, at least in my mind, and perhaps usefully so.
The first McLuhan reading in the New Media Faculty-Staff Development Seminar is from The Gutenberg Galaxy, specifically the chapter called “The Galaxy Reconfigured or the Plight of Mass Man in an Individualist Society.” I don’t know if McLuhan is punning here, but it’s not implausible that the man who coined the term “the global village” and paid special attention to the role of mediation in human affairs–mediation considered as extensions of humanity–might think not only about the plight we find ourselves in but also the plighting of troth we might explore or co-create or braid.
The trick (and McLuhan is nothing if not a trickster, as others have noted) is that the plighting cannot be straightforward or “lineal,” lest it not be a genuine pledge or an authentic weaving. His very writing is obviously a plight for many readers, but it’s also a brave (and sometimes wacky) attempt to do a plighting of the plaiting kind as a sort of pledge of responsibility. He writes these stirring words for our consideration:
For myth is the mode of simultaneous awareness of a complex group of causes and effects. In an age of fragmented lineal awareness, such as produced and was in turn greatly exaggerated by Gutenberg technology, mythological vision remains quite opaque. The Romantic poets fell far short of Blake’s mythical or simultaneous vision. They were faithful to Newton’s single vision and perfected the picturesque outer landscape as a means of isolating single states of the inner life.
From which I draw these conclusions regarding McLuhan’s argument (or plighting):
1. “Lineal” does not mean “synthesized” or “unified.” The straight path or bounded area leads only to fragmentation and reduction. It is not a weaving and cannot be. The lineal and the fragmented are perilously broken promises.
2. Mythological vision is a technology for enlarging awareness of complexity. Mythological vision is both plighted-woven and a means for plighting-weaving.
3. Fragmented, lineal awareness invents technologies of self-propagation that reinforce more lineality, more fragmentation, while giving the illusion of doing quite the opposite. Single-point perspective is not the same as a unifying vision or a simultaneous awareness of a complex group of causes and effects. It is, instead, reductive while pretending to be unified.
4. Even self-consciously or self-proclaimed liberatory movements such as Romantic poetry (or any number of other such apparently radical departures) may quail before the complexity and simply reinscribe a slightly shifted set of boundaries, thus perpetuating a reduction of complexity and a lack of awareness that dooms our technologies to reproducing our failures.
What technologies might reveal, restore, or help us co-construct a mythological vision, a species-wide simultaneous awareness of a complex group of causes and effects? It’s a political question that reaches into the realm of complexity science, art, and potentially even philosophy or (gasp) theology. Does Doug Engelbart’s idea of “augmentation” and complex symbolic innovation answer such a call? Does Bill Viola’s anti-condominium campaign? Is there an eternal golden braid to be had, or woven? What loom should we choose, or make?