Posts Tagged: John Maxwell

A Bird in the Hand: Index Cards and the Handcraft of Creative Thinking

Haig Armen and I presented this paper (actually, we wrote a lot of index cards and stuck them on the wall in front of a projector!) at Congress 2013 in Victoria this past June. The talk was part of a session called Mediating Creative Practice that was put together by Frederik Lesage and Ben Woo. I don’t know why it took so long for me to put it online, but here it is.



Reading Devices – A Personal Note

A couple of months ago I traded in my old first generation Samsung Galaxy–it had been the fanciest smartphone around for a few weeks in the fall of 2010. I went big: my new phone is a Galaxy Note II, which is close to twice the size of my old phone. Funny to even think of it as a phone; I almost never talk on the phone. Rather, it’s a handheld Internet device. I won’t go into the neologisms, but really, it’s a small tablet–a much better size to carry and hold (and look at) than my first-generation iPad, for instance.

When I got the Note II, Rogers had a promotion on where they’d throw in a Kindle Paperwhite with a new signup. I qualified, so I now have a Kindle as well, which is nice because the last e-ink reading device I had was a first-generation Kobo that was completely useless. So, having spent a little bit of time with these two new toys, I can make some comments.


Software Preservation and the Special Problem of Unix?

The Library of Congress’ Preservation.exe meeting was last week… two days of fascinating presentations on the challenges of preserving historical software, digital media, and computing environments. I followed the #presoft hashtag (the meeting had no website, so Twitter was it), and saw in the whole program not a single mention of Unix.

Is it, I wondered, because Unix does not need to be preserved–at least not in the same way as old MS-DOS applications, or old Amiga OS disks, or a favourite videogame for the Apple II?

Unix, unlike these systems, is still running. Its history is immanent. Unix rose to dominance in computer research back in the 1970s, weathered a decade or two of the emergence of the computer industry as we know it today, and went through a renaissance with the advent of the Internet, the Web, and the free software movement. Today, Unix and Unix-derived systems are everywhere. It runs most of the Internet. It’s in your MacBook. It’s in your iPhone. And your Android. And your BlackBerry. It’s probably in your car, and your microwave oven too. Furthermore, Unix provides the key architectural ideas of modern computing: the very ideas of files, streams, and network interfaces were all either invented or perfected in Unix. Even Microsoft, which positioned itself as the anti-Unix for decades, is hugely influenced by its architecture.

In contemporary Linux and Mac OS X systems, you can directly see fragments and traces of Unix that go back to the early 1970s. In the source code alone, strictly speaking, there are fragments of BSD Unix that go back to the 1980s. And because of open interoperability standards, even where the code itself has changed (due to licensing issues), the way the software works and presents itself (and its documentation) has remained a continuous flow since the 80s, with roots reaching back to the 1970s. Unix is not so much like a text as it is like a discourse.
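If you want to see one of those traces for yourself, here is a minimal sketch in Python (my own illustration, not part of the original argument), assuming a BSD-derived system such as macOS, where the stock man pages still tend to carry BSD attribution in their headers; whether anything turns up depends entirely on your system:

    # A minimal sketch, not from the original post: on a BSD-derived system such as
    # macOS, the stock man pages usually announce their BSD lineage. This runs `man`
    # for a command and pulls out any lines that mention BSD. The choice of `cat`
    # and what (if anything) it prints are assumptions about your system.
    import subprocess

    def bsd_traces(command="cat"):
        """Return the lines of a man page that mention BSD."""
        page = subprocess.run(
            ["man", command], capture_output=True, text=True, check=True
        ).stdout
        return [line.strip() for line in page.splitlines() if "BSD" in line]

    if __name__ == "__main__":
        for line in bsd_traces("cat"):
            print(line)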

Well, consider this:

Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic.

The rejoinder is by Neal Stephenson, not at the #presoft meeting, but rather in his 1999 essay, “In the Beginning was the Command Line.”

Stephenson went on:

What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again–making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.

Unix is self-archiving. It is not an object or an artifact so much as an ongoing discourse, a culture. It preserves itself by being continually re-implemented and re-created. The important parts of Unix are not the lines of code–licensing issues alone prevent the code from serving as the continuous thread–but the system of interconnected ideas and the universe of tools and systems built with it.

Contrast this with the kind of software preservation that #presoft addressed. If archiving, like the writing of history, only makes sense when the objects of study are in some sense done, or complete, or finished, then the study of software history is already compromised. It brings to mind the old story of the man who searched for his keys under the lamppost because the light was better there. It is methodologically much simpler to preserve—and study—the pieces that present themselves neatly as artifacts, by being packaged and sold, shrink-wrapped.

But I would argue that the mainstream of living software history—not exclusively the Unix tradition by any means, though Unix is a good example of this—does not lend itself to traditional archival approaches, because it has not only evolved over a long period of time but continues to do so, into the future. It has no boundaries, no edges, and therefore eludes being captured, defined, preserved.

There’s a certain irony in contemporary projects to build emulation environments that will run old software. Clearly being able to run and interact with old software is vastly better than merely collecting the bits (or the floppy disks). But in creating the emulation environment—in all likelihood running on Linux or some other Unix variant, and connected to the Internet (because what good would it be if it weren’t)—you gain a vastly better infrastructure than the software had originally. In the emulator, you give that old software life… and a network connection, the lack of which is, to be blunt, why it died in the first place.

So I fear that the preservation projects end up like a Cabinet of Curiosities, filled with interesting things that aren’t necessarily well connected to one another or anything else. Does this invalidate or fatally question the software preservation project? No, it doesn’t. But it does, I think, put it in brackets… it shows that this particular history is not connected, except indirectly, to the larger currents of time. Because there is no network. Bruno Latour wrote, “747s do not fly; airlines fly.” It’s the same critique.

I don’t want to dump on the software preservationists. I appreciate and share their impulse. But I do think that there’s a methodological problem in preserving only the easily preservable, in treating software “titles” much in the same way we treat books and the documentary record: as so many discrete items, whose provenance is more or less unproblematic. While all software artifacts have history and historicity, Unix is on some level nothing but its historicity. I would go further and suggest that in the age of the network, Unix is the model, not the exception. Software history needs ethnography more than it does archival preservation.


MOOCs, Big Data, and the Open Web

or, Why the MOOC hype is giving me hives.

Please note, the following is not a ‘balanced’ assessment of the MOOC phenomenon; I’ve read several of those (here’s a good one), and to my eye, they always miss some of the most important elements of this new movement. I’ve been paying attention to the rise of the MOOC phenomenon for some time, and I wanted to get at what bothers me so much about it.

First up, let me establish some personal background: I worked for several years in the 1990s in online distance education. At BC’s Open Learning Agency, I worked on several iterations of early online course models, developing instructional design methods, writing and developing course material, teaching and moderating online courses, and doing a lot of experimental R&D work. Given that experience, the first thing I thought about the MOOC phenom was, “What, this again?”

Second, I did my PhD work in a faculty of education (UBC), in a department of curriculum and instruction, with a research focus on the history and theory of digital educational technology. My PhD research came after my distance ed experiences, and I was specifically interested in ways of going deeper – much deeper – pedagogically than the “access” paradigm of most open/distance ed.

So what’s new? Well, for reasons that have yet to be made entirely clear, the business of online courses has had an absolutely stunning resurgence in popularity in the past year and a half. Why? There certainly isn’t anything new here from a pedagogical or even curriculum development perspective. What is new about MOOCs is hidden in the M (which nominally stands for “massive”). What the M actually stands for is a related contemporary trend in online business known as “big data.”

Big data refers to the emerging business model of superstar online businesses like Google, Amazon, Facebook, and the companies that wish they were. The simple idea in big data is that if you can manage to insinuate yourself into the middle of zillions of day-to-day transactions (as Google, Amazon, Facebook, and others do), then you are in a position not only to aggregate tiny, tiny transactions into billions of dollars of revenue, but also–much more importantly–to collect and capitalize on information about your millions of customers. Big data aggregation is actually what accounts for the immense wealth and power of companies like Google and Amazon; it’s not what they sell, it’s what they collect.

So, in the MOOC phenomenon, we take an old idea–putting course materials online and letting students work through them at their own pace, on their own motivation–scale that up to a very large number of students, and it becomes an attractive business proposition in the world of big data. Now, this point seems to be lost on lots of people, not least the university committees that are getting all hot and bothered about joining the MOOC trend. Somehow, the hype around MOOCs has led us to the point where all critical sensibilities about learning, pedagogy, curriculum, student experience, privacy, research, and the role of universities in democratic society have been thrown out the window, in favour of this fabulous bandwagon.

Could we stop for a moment and ask the old question, who stands to gain? Do universities stand to gain? Apparently they think that joining the MOOC movement helps them look like they’re part of the future instead of just a collection of stuffy old ivy-covered buildings. Do students stand to gain? Potentially, I guess, since in the new mythology, you don’t have to be able to afford university tuition or manage entrance qualifications in order to access these learning opportunities… but then, how is that different than what the public library has always offered, exactly? Do ministries of education and funding bodies stand to gain? Definitely, at least in short-term political rhetoric, because they can appear to be opening up educational opportunities on a vast scale and for almost no investment.

The real advantage, however, goes to the providers and platforms who, as in the big data model, act as the platform for every (trans)action, becoming the obligatory passage point that everyone must encounter in order to do anything. So as with Google, Amazon, Facebook, and their kin, we should look not to the overt revenue-generating model (e.g., so many dollars per student per course) but to the longer-range data aggregation opportunities that becoming a popular platform provides.

It is a perfect neoliberal storm, really. Here we have an apparent ‘free market’ for education unencumbered by interference like entrance exams, tuition costs, and exclusivity. Instead, here is a market where every course is potentially available to every willing student. The environment provides an absolute minimum of coddling or instructional support, leaving the success of the venture entirely up to the individual student’s Ayn-Randian personal initiative. All that knowledge, just there for the taking, right? It must look pretty appealing, especially to administrators, bureaucrats, and anyone who sees the world through the input/output logic of budget spreadsheets.

Of course, it makes just about no sense to anyone who cares about, or who knows anything about, education and learning. Most of the MOOC phenomenon, like much of the distance learning that preceded it, is based on a straight instructional model: open up the student’s head, pour in information, then test for absorption. It is a model that, as a graduate student in education, I had figured was completely debunked; we know so much more about how people encounter information, how minds work, how learning is social and situated and practical and distributed and constructionist and connectivist. A whole literature—generations of research and scholarship from cognitive science, phenomenology, linguistics, and psychology—pointed to an active, socially situated, ongoing model of minds engaged in the world. And a generation of scholarship on educational technology looked to how we might use software and networks and digital media to work into these spaces. The advent of a ubiquitous Internet and social media looked like it might provide opportunities to re-imagine schooling and education in some powerfully new ways.

But the good old instructional model hangs on, just as it has done for a hundred years since John Dewey wrote a couple of books about why it was already obsolete. The reasons for the tenacity of the instructional model are simple: it is economically efficient to deliver education as simple instruction. It scales. You nail down curriculum, formalize it, and then mass-produce it for students. It’s an industrial-age model, and it has been the basis of the school system and most of undergraduate education for hundreds of years. Even though we know better.

So MOOCs play straight into that. From an administrator’s perspective, the opportunity to scale up the delivery of education, and in doing so achieve economies of scale (the basic industrial era formula for success), is irresistible. And then, along comes a private-sector partner that wants to underwrite the whole procedure and make it easy to scale up and buy in, and you have a perfect recipe for a big deal.

Of course, it makes no sense to anyone who cares… David Noble’s 1998 essay Digital Diploma Mills provided a straightforward evisceration of the business model of both old-school correspondence learning and a newer, 1990s generation of online distance courses. In Noble’s analysis, the model works like this: you charge students tuition when they enroll in a distance course. They start work on it, find it unengaging, and, save for a small percentage who tough it out (and will complete the course no matter how terrible it is), a majority of students drop out before completing. The genius of the model is that you get paid up front, and then realize decreasing student support costs as the course goes along. Essentially, the worse the course, the more you profit, at least as long as you can keep enrollments up.

So how are MOOCs any different? They’re a lot bigger, that’s how. So now apparently, if you have thousands and thousands of enrollments, you can get away with absurdly low completion rates (often in the single digits), and you still wind up with a nice-looking absolute number of completions.
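To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. Every number in it (tuition, per-student support costs, weekly dropout, enrollment figures) is invented purely for illustration; none of them come from Noble or from any actual MOOC.

    # Back-of-the-envelope sketch of the "paid up front, support costs taper off" model.
    # Every number here is hypothetical, chosen only to show the shape of the argument.

    def course_margin(enrollments, tuition, support_cost_per_active_week,
                      weekly_retention, weeks=12):
        """Revenue arrives at enrollment; support cost tracks whoever is still active each week."""
        revenue = enrollments * tuition
        active = float(enrollments)
        support = 0.0
        for _ in range(weeks):
            support += active * support_cost_per_active_week
            active *= weekly_retention  # a fixed share of students drops out every week
        return {"revenue": revenue, "support": support,
                "margin": revenue - support, "completions": int(active)}

    # A "sticky" course versus a leaky one: same up-front revenue, but the leaky
    # course costs far less to support, so it comes out more profitable.
    print(course_margin(1000, 500.0, 5.0, weekly_retention=0.95))
    print(course_margin(1000, 500.0, 5.0, weekly_retention=0.70))

    # The MOOC twist: free tuition, huge enrollment, single-digit completion rate --
    # 100,000 sign-ups at roughly 5% completion still yields about 5,000 completions.
    print(course_margin(100_000, 0.0, 0.0, weekly_retention=0.78)["completions"])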

But that’s just on the face of it. What’s less obvious is the ‘big data’ element. The platform people are in a position to collect personal information and longitudinal activity data about everyone who enrolls, and everything they do in the course afterwards. At a scale of a few hundred students, that information isn’t very valuable, but if you can get tens or hundreds of thousands of enrollments – or if you can become the big-dog platform to many universities – you start to have some valuable property on your hands. Remember, course completions aren’t the goal; enrollments are… and the data profiling of all those people. The audience is the product.

Is there nothing good about the MOOC phenomenon? I didn’t say that. I do believe that the idea of opening up access to curriculum materials online and encouraging whole networks of students to work with them (or even create them themselves) is a very interesting idea. Lots of new good things could come of that. But remember that this particular ideal is not what’s selling the MOOC movement right now; what’s selling it is the administrative sweetness of scaling up the delivery of education to larger numbers of students.

You want to see a real MOOC? Look at the Internet itself. What I consider to be the greatest educational achievement of our time is how society has rallied around the open web to create a whole new set of literacies, making learning opportunities for millions of people who are motivated simply to make the network work. Look at free & open source software, a massive, revolutionary movement populated by a huge proportion of self-taught individuals. Look at the vast communities online who have become educated – and politically active – about Internet governance, copyright, and legislative threats to network freedom. Look at YouTube and a whole emergent culture of mashups and DIY production, all accomplished by individuals learning stuff on their own by engaging with free resources and–more importantly–free discussion online. The Internet is curriculum.

What’s the difference between that and the MOOC bandwagon? None of those people needed a Coursera or Udacity to learn. They just needed the Net itself.


John’s (very early) Spring Tour of the UK

I flew to London on Valentine’s Day, immediately after Tools of Change in New York (luckily, my sweetie was on the plane with me). In pursuit of my sabbatical goal of building my academic network, I had planned a tour of Publishing programs in the UK. You see, in contrast to North America, the UK has a lot of Publishing programs (including twelve graduate programs!); over two weeks in February, I visited six of them.

A good part of the fun was travelling around to find these people at their institutions. I visited City University London (just north of the Barbican) and University College London (in Bloomsbury). At UCL they took me to see Jeremy Bentham, which was a shocking experience not because his long-dead self is sitting there for all to see, but because I had always thought his mummified remains were tucked out of the way at Cambridge or some place where you wouldn’t just suddenly bump into him. At UCL, Jeremy Bentham sees all.

I spent a glorious early spring day in Oxford at Oxford Brookes University, up the hill via a crocus-strewn park from the old University. I travelled to Stirling University in the old Scottish capital on a day that wasn’t so spring-ish (it was bloody chilly). I took the city bus well outside the normal tourist districts in windy, cold Edinburgh to visit Edinburgh Napier University. And I didn’t, sadly, visit the lovely Bath Spa University, because they met me in London.

In talking with the many interesting people I met at these places, I came away with the sense that Publishing Studies programs are very much alike; we have the same kinds of students, the same kinds of connections with industry, and we face very similar challenges. I gave a talk to the students at Stirling, and it felt a lot like talking with the MPub students here at home.

That feeling was comforting, but it also points to a troubling structural issue: there are a dozen or more programs in the UK (and some of them, like Oxford Brookes, are big programs, with 70+ students). The total number of graduates being turned out is in the hundreds every year. Can the publishing industry as we know it possibly absorb that many new people? Most of the programs (ours is no different) point to industry jobs as the payoff for doing the degree.

And if the short answer to that question is No, then why do students continue to enroll in publishing programs? By all accounts, application numbers and intakes are healthy across the board (our experience is no different). What are all these students after?

To answer that seriously, we have to look past industry training and job placement. In my experience, most students don’t go for a Master’s degree in publishing because they want to get a specific job in the industry; what they really want is a career in the world of books (or magazines) and literature. They love books, and what books represent, and they want to have a life in that world. That’s a vocational identity rather than a job search or even a career plan. It’s deeper than our common rhetoric about jobs and placements lets on.

But if that’s the case–that publishing students are really after a deep, well-cultivated engagement with the world of books and literature–are publishing studies programs really giving them what they need? In another age (say, a decade ago, even), when the publishing industry was stable and profitable and predictable, job training might have been the straightest line to a life in books or magazines. I don’t think we live in that world anymore. So what are we (all) doing as publishing programs to address this deeper (and frankly, much more reliable) desire that lies beneath the easy rhetoric about industry placements?

I think the elements are evident enough: we all lead with “practical skills” in editorial, design and production, marketing, and digital. But we also ask our students to do heaps of creative work. We are guided by industry perspectives, but we also demand the kind of research and critical analysis that makes for a serious graduate program. We indoctrinate the individual student in the culture of publishing, but students also work collectively, developing ideas and perspectives, generating new culture amongst themselves. We teach ‘publishing,’ but hopefully we also transcend it. These are themes that came up in nearly all the discussions with my UK colleagues. But how to rebalance these forward-looking, generative modes against the traditional rationales for publishing studies? That, I think, is the challenge for this generation.


Pressbooks, Monographs, and the Essence of the Book

Earlier this week I took part in a panel discussion at UBC on “Why Do We Need Academic Publishing in the Digital Age,” organized by the smart folks at UBC Press. The discussion touched on a variety of topics in scholarly communication, peer production, and the role of editorial, and we ran out of time before we really got into the meaty part of this really enormous conversation.

One of the meatiest questions that was proposed, which we only barely got started on, was about whether we are “inexorably moving to a post-book world.” I think this issue is of foundational importance to scholarly presses, who have for many decades organized themselves around the exacting demands of the scholarly monograph—a paragon of bookish essence which, like the literary novel, is difficult to imagine as anything other than what it already is.


Amazon and the Engagement Economy (repost)

I posted this last May on our old site. While looking at it again, I thought it was worth bringing it over to the new CCSP website, since it speaks to an issue that keeps coming up in the popular media: that Amazon has it in for the book industry. Saying so, I think, misses the larger point.

Earlier this month (May 2012), Farhad Manjoo blogged over at Pandodaily that “Nobody Seems to Understand What Jeff Bezos is Doing. Does He?” I think it’s fair to say that Jeff Bezos and the rest of the decision makers at Amazon have a very good idea of what they’re doing. The fact that it is not clear to many others—particularly in the publishing industry—is also apparent. The amount of ink spilled in recent months about Amazon’s seemingly predatory practices, their disregard for traditional industry practices and values, and even their penchant for “evil,” speaks to an environment where some very divergent things are afoot.


Books in Browsers 2012 – A Watershed?

I wanted this post—which is my reconstruction of what I saw and thought while at Books in Browsers last month in San Francisco—to be short and pithy and thematic. But it isn’t… my attempts to get it all down in writing have instead produced this rather long and meandering narrative. For posterity’s sake then, if not literature, here’s my report.


Some Coach House history at Alcuin Awards in Toronto, Oct 1

On Monday, October 1st, the Alcuin Society will be honouring the winners of its 2011 Awards for Excellence in Book Design in Canada, at the Arts & Letters Hall in Toronto.

I’ll be there to cheer for the book arts, but also to help honour Stan Bevington, this year’s recipient of the society’s Robert R. Reid Award and Medal. I’ll be giving a brief talk on the Coach House Press as a Digital Pioneer, based on my research into the fascinating story of Do-It-Yourself technical innovation that has been a key feature of the Coach House since its beginnings in the late 60s and early 70s.


The Internet is a Friendly Place

I was making a tuna melt the other day, and I went online—as I often do these days—to see what people were suggesting about how to make it.

This little gem was the #2 result for “tuna melt tomatoes”:
http://thepescetarianandthepig.com/2012/04/27/breadless-tuna-melt-in-a-tomato/

