Not if you're a tenured professor in the research division of a university-operated museum, and not if that university is the University of Nebraska.
The Chronicle of Higher Education reports that in response to serious budgetary pressures, the University of Nebraska has decided to eliminate the research division of the University of Nebraska State Museum, which involves the elimination of eight tenured professorships. While the university has found alternative positions for four of the eight professors, if its latest budgetary proposal is approved the other four professors will soon find themselves out of work.
The vulnerability of these professors stems from the fact that though they teach in other departments they do not have their own undergraduate programme: "The chancellor, Harvey Perlman, contends that the elimination of the museum's research division represents a 'programmatic decision' that won't harm the core mission of undergraduate teaching."
There can be little question that university administrators are in a bind. Their state funding was recently cut by 10 percent, which represents a significant cutback (apparently on the order of about $21 million), in response to which something must be cut. In fact, something already has been. The Chronicle notes that "while budget concerns had led to the layoffs of more than 150 staff and non-tenure-track faculty members since 2001, this was the first slice at tenured professors."
One question on the minds of tenured professors at the University of Nebraska: is this the first in a series of slices? Another question, raised by Susan W. Fisher of Ohio State University, which recently suffered a $28 million cut in state funding, is: "'If the precedent is set that an entire program can be eliminated and the tenured people did not have to be accommodated, then where does it stop?'"
A couple of other questions might also be posed. For example, how many mid- to top-level administrative positions have been cut? And since tenured professors at the University of Nebraska are understandably and, if this article is any indication, quite vocally upset about this first slice at the tenured, we might ask what, if anything, was their response to the elimination of the 150 staff and non-tenured positions?
I don't think this decision by one state-funded university should be taken as a sign that the institution of tenure is now in grave and imminent danger of abolition. This is, after all, not the first time that an entire programme has been eliminated. Moreover, the research division's lack of an undergraduate programme does seem relevant: regardless of the value of the museum (which is surely valuable in any number of ways), I think an argument can be made that it is not as central to the basic mission of the university as an undergraduate teaching programme.
Still, if the significance of this cut should not be exaggerated, I think it must be acknowledged that it is not a good sign. A number of state legislatures are proposing and implementing steep funding cuts for higher education. A number of state taxpayers are under the impression that tenured professors are at best lazy and at worst up to no good and to worse than no good. Indeed, a recent nationwide poll by the Chronicle suggests that while Americans are generally "more than satisfied with the quality of education that American colleges provide," they are "highly skeptical" of some of the practices of the academy, including that of tenure: apparently some two-thirds of the poll's respondents agreed that "experienced professors should not be granted jobs for life." Is this how the question was framed, I wonder? If so, it would surely skew the response. But then, this is almost certainly how the question would be framed in the public sphere if tenure became a political issue.

I don't think it would take much to run a successful campaign (say, for a state-wide ballot initiative) urging voters to eliminate tenure as a way of making state-funded higher education more cost-effective and more responsive to the needs of the public (I don't think the abolition of tenure would do anything of the sort, but I believe it would be easy enough to convince enough voters that it would). I'm not suggesting that such a campaign is on the immediate horizon. But I also think it would be naive to rule out this, or similar, attacks on tenure: they are very real possibilities over the next decade or two.
"The Separation of the Learned from the conversible World seems to have been the great Defect of the last Age, and must have had a very bad Influence both on Books and Company...
....[Learning] has been...[a] great Loser by being shut up in Colleges and Cells, and secluded from the World and good Company. By that Means, every Thing of what we call Belles Lettres became totally barbarous, being cultivated by Men without any Taste of Life or Manners, and without that Liberty and Facility of Thought and Expression, which can only be acquir'd by Conversation. Even Philosophy went to Wrack by this moaping recluse Method of Study, and became as chimerical in her Conclusions as she was unintelligible in her Stile and Manner of Delivery."
-- David Hume, "Of Essay-Writing" (1742), Essays Moral, Political and Literary
"We should be more concerned with our quality of mind and less concerned with our production of scholarship, and place greater value by far on one good conversation about the nature of a good society than the publication of five journal articles. That's how we get to a new academy humming with passion for ideas and a generosity of spirit, where academics treat each other with the same tender pedagogical regard that professors at a college like Swarthmore now reserve for their brightest undergraduates, where the excitement of discussion and debate replaces the damp silence that nestles over the academic calendar like a fog."
-- Timothy Burke, "We never talk anymore"
Lack of time, writes Timothy Burke, is "the alibi that everyone uses to lightly explain away the puzzling vacuum at the heart of academic life." He's not buying it.
In a nice dissection of the heartlessness at the heart of the academy, Burke places his emphasis on the significance of fear. There is, for example, the "internalization of shame" and "paranoid wariness" that is too often the accompaniment of graduate school training. And there is also "the massive saturation of the intellectual marketplace with published knowledge and academic performances of knowledge at conferences, workshops and events," which makes academics "fear exposure of ignorance, because in truth, most of us are ignorant."
In place of "the ceaseless overproduction of derivative, second-order knowledge," Burke calls for a renewed embrace of the teaching mission and a revaluation of the vanishing art of scholarly conversation. Following Hume, we might call this a return to the ideal of conversibility, where the scholar does not work in "monkish seclusion" but rather engages with the problems and concerns of common life, mediating between the world of learning and the rest of the world. In any case, go read Burke's essay.
MORE:
Re: "the ceaseless overproduction of derivative, second-order knowledge." I have to agree that it would be far better to have less production of what could then be more valuable books and articles. But I'm not optimistic.
Less production of better publications would require a much greater emphasis on qualitative rather than quantitative measurements of value. But where would the criteria of evaluation be found? As Burke notes in his essay, since the canon no longer has any authority, there is no longer a "compass to point the way towards what we ought to know." One response to this loss of compass is the move toward ever greater degrees of specialization: the narrower the field, the easier it is to find evaluative criteria that specialists can agree upon. But this specialization is of course not a solution but rather a major part of the problem of an overproduction of work that will never be read by more than a handful of like-minded specialists.
And then there is the problem of the academic job "market" (the dismal state of which is also related to a loss of authority): an oversupply of candidates has intensified the pressure to publish, which now begins in graduate school.
The other day I posted a link to Michele Tepper's "Doctor Outsider." She updates her account here, assuring us that, so far, her story has a happy ending. Further evidence in support of the shocking and heretical notion that there is life outside the academy. This blog entry entitled "The Droves of Academe" is also worth reading.
"The world's high- and middle-income countries should not imagine that the relatively rich can fence themselves off indefinitely from poverty and misery in the poorest countries. Nationalism has long been a powerful cause of political violence. Nothing is more likely to strengthen nationalism and turn it to violence than a sense that one's own homeland is being exploited-kept poor and powerless-by other nations to satisfy their own selfish interests. The world today is too small for any of us to be able to afford for any corner of it to be left out of the conquest of Malthusianism."
-- J. Bradford DeLong, "The Final Defeat of Thomas Malthus?"
Arts & Letters Daily has linked to this recent article by Brad DeLong (scroll down to Goodbye, Mr. Malthus). Interestingly, he suggests that "population may well decline" after it reaches a projected 9-10 billion around 2050-2100. I know absolutely nothing about this topic, but when did that ever stop me? I have a couple of queries about this statement:
"Literate, well-educated women with many social and economic options in today's rich countries have pulled fertility below the natural replacement rate. The problem is not that such women on average want fewer than two children; in fact, on average they wish to have a bit more than two. But because many of them delay childbearing until their thirties, actual fertility falls short of what they desire."
First, how representative are "literate, well-educated women" with lots of options? Do they make up even half of the women in the more affluent countries, and aren't they a distinct minority in the less affluent nations? Second, how many of these women really do want more than 2 children? The vast majority of "literate, well-educated women" I know (ok, not exactly a scientific sample, but speaking anecdotally here) want between 0 and 2 children. Many of them want 0 children. Which brings me to the third question: is it really the case that delayed childbearing has a significant impact on fertility rates overall? Again, what percentage of women are delaying childbearing until their thirties, and of this number, what percentage are unsuccessful in their attempts to have the number of children they desire? Or, to put it another way, how much of the decline in birthrates in western, industrialized nations can be attributed to delayed childbearing, how much to the fact that some women don't have children at all (which is not necessarily the result of delayed childbearing: some women decide to opt out of childbearing altogether), and how much to smaller family sizes on the part of those who don't delay until their thirties (who have children in their twenties, say, but who only have 1 or 2 children)? Uh, I guess that's more than a couple of queries, but they're all related.
DeLong is not only an expert on Malthusianism but also a prolific blogger with the most complex and comprehensive blog archiving system I have come across. In his spare time, he's a Professor of Economics at Berkeley.
"I shift my weight and launch what is, by now, a practiced speech. 'Well, rather than stay here and do a lectureship next year, I'm going to try my luck in new media you know, Internet stuff'....
She puts a comforting hand on my shoulder and looks deep into my eyes. 'Don't worry. You have to keep plugging away, but I know you'll find that academic job eventually!' And with one more reassuring shoulder pat, she is gone. I gaze after her in disbelief, and I feel another headache coming on."
-- Michele Tepper, "Doctor Outsider"
Here is a very interesting and provocative essay by an English Ph.D. (you can visit her blog here) who made the very wise decision to leave the academy rather than spend years "plugging away" in the hope that she might "eventually" land that tenure-track job. Not surprisingly, Tepper found that her attempt "to build a meaningful professional and intellectual life outside the academy" was "consistently denied, denigrated, or ignored" by those who chose to remain within. Her essay examines the "unexamined elitism and unwarranted defensiveness" behind such opposition.
This is well worth reading, and I'll have more to say about it later, but for now, must attend to that other dimension otherwise known as my real life...
A graduate student in philosophy writes: "I find Invisible Adjunct depressing but I read it anyway."
I'm glad he's reading it, but then kind of sorry he finds it depressing. And it is depressing, isn't it? This is not an upbeat and cheerful sort of blog. Maybe not the kind of blog you'd want to have lunch with. You might think, 'Sure, it makes a couple of good points, but then it's rather relentless in pursuit of those points, and altogether too negative. It's really kind of draining, and geez, I just want to have a nice, relaxing lunch and I don't know if I have the energy.' No, this is not a "let's-do-lunch" blog. It's more a late-night, perhaps even a 3 a.m. phone call, sort of blog. In short, it's a bit of a downer.
Well, adjunctification is depressing, and in more ways than I have time to name. There's the depression of wages, of course (and not only for adjuncts: reliance on part-timers exerts a downward pressure on the salaries of the tenure-tracked and the tenured, too). And then there's the depression of status (again, a downward pressure on the status of the relevant disciplines overall. Take English literature, for example: surely this is the near-perfect case. Some attribute the degradation of English to the tendency of its professors to jump on the latest theoretical bandwagons. And they do seem to go in for the latest fads and fashions over in the English department...well, let's face it, they seem to have deconstructed and undermined the very basis of their own discipline, and to have done so from within: this is curious and probably worrisome for anyone who cares about the long-term prospects of English literature as an academic discipline. But English literature is also one of the main offenders in the reliance on casual, part-time teaching staff, and surely there is a connection? Perhaps there is even a relationship between reliance on contingent labour and a theoretical concern with the marginal and the contingent: is the adjunct instructor meant to serve as some sort of experiment in the decentering of the subject?) And then there is that other kind of depression that can result from the depression of wages and status. Unemployment and underemployment are depressing topics all around.
Someone asked me the other day, "You said the purpose of your blog was partly therapeutic. Is it working?" Ack. How embarrassing. Welcome to my me-zine: it's all about me and I'm all about therapy. How cheesy does that sound? This is one of the main reasons why I write this stuff under cover of a pseudonym. It's not that I'm worried about losing tenure: I don't have tenure to lose. It's more that, in pursuit of those couple of points that are the main focus of this blog, I'm also throwing out bits and pieces of myself, and who knows where they will land and how they will be received?
Anyway, the funny thing is, I think it's actually working. The fact is, I'm feeling much better these days. Well, maybe it's partly the weather, but I'm pretty sure it's also the blog. It helps to say the things that I say on this blog. I suppose it helps to "get it out of my system," as the saying goes. Of course, there's a fine line here. I want to get it out of my system and then move on. I don't want to spend the rest of my life obsessing over these cheerless themes, embittered and angry and what have you. But then I can't seem to move on until I get it out of my system. However, I'm beginning to see the exit: I think I can cut a path and find my way out. I'm not quite finished with getting it out of my system; there are still a few more things I want to say. I will say them, and then move on.
But meanwhile, maybe I'm bringing others down? Damn. There's always a catch, isn't there? Well look: I'm only one person offering one account of one side of the story. It's a side that doesn't get enough attention, I think, but still it's only one side. There are many other sides, too. Just take this side and place it alongside the others, not as a replacement but as an accompaniment to those other sides. And for heaven's sake, don't read this stuff if you are nearing your comps or your dissertation defense: I think you should know about the side that I cover in this blog, but I don't think you need to know it as you are approaching any of these hurdles.
"The university had hired Starkwell in the late 1960s, when faculty jobs were plentiful. The story goes that the dean at the time drove around to the big graduate schools in the Midwest and the Northeast, offering jobs to anyone with a pulse and a dissertation nearing completion.
Of course that's not what actually happened: There were real faculty searches then, just as there are now."
-- Dennis Baron, "Promoting Late Bloomers"
Virgil Starkwell was hired back in those halcyon days when there was a chicken in every pot and a tenure-track position for every Ph.D. who wanted one. Ok, so the dean didn't really drive around to graduate schools handing out tenure-track jobs to anyone whose vital signs were good and whose dissertation was three-quarters finished. And of course there never actually was full employment for Ph.D.s within the academy: the golden age was never that golden, golden ages never were. Still, once upon a time, during the boom of the late 1960s, "faculty jobs were plentiful" and one of those jobs went to Starkwell.
Virgil Starkwell is the subject of the latest in a series of columns on tenure review that Dennis Baron is writing for the Chronicle of Higher Education. (I blogged about Baron's Alison Porchnik case here; we now learn that Porchnik did win tenure despite her failure to "shift the paradigms" of her area of specialty.) This is an interesting and even, in its way, an entertaining series. Baron offers an inside view of the many (the very many) perils and pitfalls of the tenure review process, and he does so with wry humour. He also displays a sense of decency that some of us worry is in rather short supply in today's academy. In other words, Baron is not only the chair of an English department, he's also a mensch.
Starkwell managed to get tenure, Baron explains, just about the time that standards for tenure were tightening up (they are now much tighter still: though I don't know firsthand, of course, my numerous sources inform me that the standards for tenure now hold junior faculty members in a vice-like grip). But he seemed destined to become one of the "'lifetime' associate professors, stuck at that rank for the rest of their careers." He was, Baron writes, "one of several department members hired in the golden years whose research seemed to stall out after he got tenure." In short, Starkwell didn't publish. But neither did he perish. Instead, he became "a solid department citizen." Though he was, Baron concedes, "a little retro perhaps, when it came to new turns in the curriculum," he was "always a beacon of personal integrity," who could be relied on to "deal fairly and efficiently with colleagues and students" and who had "managed to head just about every department committee as well as serve in a number of key administrative roles."
One day, "after 20 years of scholarly doldrums," Baron recounts, "like Rip van Winkle awakened, Starkwell burst through his writer's block." He started publishing: peer-reviewed articles, a scholarly reprint of a poet's work, a monograph with "a second-tier university press." And then, "with a service record that was legendary, a revived scholarly career, and retirement not too far off," Starkwell decided to try for full professor.
Not surprisingly, objections were raised. Among the questions asked by the review committee were: "Why now?... Why not wait to see whether this burst of energy would be sustained? Did we really want to promote a consistent underperformer?" While some were willing to reward Starkwell's "new productivity" with "a handsome raise" that might spur him on to continue, they worried that "promoting him would cheapen the title of full professor."
With the help of a sympathetic dean and a couple of external reviewers, Baron managed to push through Starkwell's promotion. Virgil Starkwell is now a full professor.
Should he be?
Not that I have any say in this or in any other matter relating to faculty hiring and promotion (if I did, you can be sure I would vote to give myself a tenure-track job), but I'm inclined to say Yes, they made the right decision.
Of course I am well aware (only too well aware: welcome to my blog) of some of the problems this type of case raises. And of course I am bothered by just such problems. A Starkwell hired a generation later, it's fair to assume, would not be coming up for promotion to full professor, because a Starkwell hired a generation later would never have made it to the level of associate. Publish or perish? Yes, and what's more, in today's academy it is also possible to publish and perish. There are academics on the margins who have published more than Starkwell published during those twenty years, and who have even published more than Starkwell came out with after his scholarly awakening. I personally know a few of them, perhaps you do too.
Is it fair that someone who didn't publish for twenty years now makes it to full professor while others who publish -- and who do so without the very real and material and psychological aids of an institutional home, an office, the help of support staff -- cannot even make it onto the tenure track? No, it is not. It is patently unfair: the two-tiered academic labour system that is now firmly entrenched within the university is an unfair and an unjust system.
But there are systems and then there are individual cases within those systems. And it hardly seems fair to penalize an individual for these vast systemic problems. Not when he has been a solid citizen and a "beacon of personal integrity" for twenty years, and has then topped it all off with a late but no less impressive flurry of publication. Moreover, not giving Starkwell a full professorship probably wouldn't do much to redress the job problem: since he would still be there as an associate, refusing him a full professorship wouldn't open up a tenure-track job for a junior scholar.
Then again, I have to wonder: are there adjuncts teaching in Starkwell's department? Well, it looks as though there might be. It's hard to know for certain, of course, when dealing with such thorny and delicate issues. Many departments are rather shy about their reliance on part-timers; they like to hide their "extra" faculty, which is where we get our invisible adjuncts. But certainly there are quite a few "lecturers and instructors" (look under "People," then look under "Lecturers and Instructors") in addition to the regular faculty (in addition? well yes, adjuncts -- though it looks as though a good deal of teaching is done by these "lecturers and instructors": at what point should we say that the regular faculty are in addition to the adjuncts?). Anyway, some of these additions are obviously graduate students, so we won't count them. Marc Bousquet would say we should count them ("The myth is you work for four or five years as an apprentice, then find a full-time job," Bousquet said. "The reality is that you work 10 to 12 years as a part-timer, then find another line of work.") I'm pretty sure Bousquet is right about this. Cary Nelson, too, would undoubtedly say that we should count them, and Nelson is in fact a regular faculty member of Starkwell's department. But for the moment, let's not count them: let's agree to the agreeable fiction (it is highly agreeable, though alas! largely a fiction) that these Ph.D. candidates who teach so many courses are serving an apprenticeship that will allow them to move up through the ranks from the lowly grind of graduate student life to the lofty heights of full professorship. Still, even taking out the graduate students, it looks like there might be a few, perhaps more than a few, post-Ph.D. adjuncts teaching in this department. And what are they publishing, I wonder? And how much are they being paid to teach those courses? (Actually, I don't wonder about this one; I already know the answer.)
So it's all a bit of a muddle. Virgil Starkwell was hired under one set of rules, and then the rules changed, and though he spent years not following the new rules, he did make a 20-year contribution of another sort before making a real and apparently successful attempt to catch up with the new rules. And then there are the adjuncts, and another set of rules altogether. I'm still inclined to think they did the right thing. But I can't help thinking about the adjuncts.
Jennifer ("I'm Just Jenny from the Block") Lopez and Ben Affleck "have secured a deal to remake the classic movie Casablanca." An unnamed sycophant friend reports that the Hollywood couple are "overjoyed" at the chance "to show how much they love each other through their on-screen chemistry." Don't play it, Sam, please don't.
"One short interaction with such an administrator prompted me to get out of teaching for good. Near the parking lot one day, I introduced myself to the president of our community college. 'My name's Matt Hall,' I said. 'I teach English here part time.' Our president looked at me and said, 'Thanks for helping out.'
Helping out? I watched as he got into his brand-new Lexus and drove away..."
-- Matt Hall, "Why I Quit Adjunct Teaching"
I sometimes complain that the Chronicle of Higher Education's coverage of academic labour issues is altogether too optimistic. But here they have published an essay that speaks to some of the grim realities -- and that captures some of the fundamental absurdities -- of adjunctification, and with wit and humour. A must-read for adjuncts.
Be still my Jacobite heart.
Yeah, I'm part Jacobin, part Jacobite. I will not speak of a "Glorious Revolution," the principles of which I basically support. I call it the "Revolution of 1688." It's my Irish Catholic upbringing: the iron entered the soul.
Via Rebecca Goetz (Tuesday, April 22, permalink bloggered), Scott Sowerby, a history graduate student at Harvard, finds evidence suggesting that James II really did support religious toleration. The Whig narrative: the Catholic James II spoke of toleration for Quakers and Catholics, but would have shown his true colours -- his true, continentally Catholic and absolutist and "divine right of kings" autocratic colours -- had he been allowed to remain on the throne. For what it's worth, Sowerby finds an entry in James II's personal diary that reads as follows:
"Suppose there should be a law made that all black men should be imprisoned, it would be unreasonable. We have as little reason to quarrel with other men for being of different opinions than as for being of different complexions."
Like Rebecca, I look forward to Sowerby's book.
How long can I be a wall around my green property?
How long can my hands
Be a bandage to his hurt, and my words
Bright birds in the sky, consoling, consoling?
It is a terrible thing
To be so open: it is as if my heart
Put on a face and walked into the world.
-- Sylvia Plath, "Three Women"
"She is very clever, and has learn'd all her letters. She dances very prettily to her own shadow."
-- Jane, Duchess of Atholl, describing her 3-year-old daughter in a letter to her sister, the Honourable Mary Graham (c. 1770s)
When the Duchess of Atholl lost her young daughter (lost her to who knows which of the many illnesses we now immunize our children against), she comforted herself with the thought of Heaven. The heavenly Father, she believed, had reached down and taken the child to His home because she was too bright, too pure, and too innocent for this world. Her sister encouraged her in the thought, and pointed out that the child would now be spared "much suffering," the suffering that was our lot in this our earthly home.
It is easy enough to smile with indulgence, or less kindly to sneer with contempt, at the credulity of past believers. And I am grateful enough, I suppose, to have been rescued from the idiocy of rural life. But when belief in God was just the air that people breathed, so too was a rate of infant and child mortality that we would now find unbelievable and just shocking.
And what other comfort could the Duchess have found? She had a child who "danced very prettily to her own shadow." The child danced; and then she fell ill. The child died.
It's been quite a while now since I have even looked at any of the parenting books. They lie heaped in a stack, collecting dust on a shelf in the bedroom.
A few months ago, on a Friday night (late January? early February?), 4 teenage boys, aged 16 and 17, stole an 8-foot dinghy and tried to row out to an island in Long Island Sound. Something horrible happened. Maybe someone rocked the boat, maybe the boat had a hole in it. Toward the end, one of the boys made a frantic 911 call from his cell phone and said the boat was filling with water. "My God, we're all going to die" were his last recorded words.
I am haunted by the picture of one of the fathers, taken on the day of his son's funeral and published in the local papers. How sad and crumpled he looks, how truly stricken with grief. What happened? Where is his boy? How could this happen? He is dull-eyed with pain and loss and bewilderment. He has been stricken.
It's crazy, of course, it's just silly and stupid, but when I read about it in the papers, I wished I could turn back the clock and let those boys have another chance: Please (oh please) don't do that stupid thing; just talk about doing it, joke about how much fun it would be to row out to Hart Island on a Friday night at midnight; and then think better of the idea, and go home safe and sound to your parents. And when I thought of the parents, in addition to something we can call sympathy, I felt a prickle of fear (This could happen to anyone. Will this happen to us?) and even a momentary and very guilty sense of relief (It wasn't us that this happened to. Well, not this time. And please, oh please, not ever).
I sometimes wonder what the parenting books have to do with this mixture of sympathy and fear and guilty relief and hopelessly, stupidly wishing that the clock could be turned back so that the catastrophe would not happen. And not just books. In addition to the books, there are websites and videos and products galore: an entire industry built around parental anxiety. The parenting industry trades on the notion of parenting as a massive and highly specialized project, a goal-oriented endeavour with inputs and outputs and an intricate and extensive set of rules and regulations that range from the banal to the bizarre. It is not enough, for example, to feed your infant: you must devote yourself to the rather daunting task of "building a brighter brain." Play itself is a serious business, apparently involving, at last count, "5 main developmental play stages" (can you name them? I confess I cannot, I would need a crib sheet: bad mummy me). Through threats and promises, flattery, cajolery and downright bullying, the industry experts encourage you to micromanage not just the basic care and feeding of your child, but the whole range of emotional and intellectual and psycho-sexual developmental issues for which, they exhort you, you are uniquely responsible.
As parents, we are to understand our task as a kind of quest for perfection. Perfect Parenting is the title of one manual (and no, this one is not collecting dust on my bookshelf: even before I had become intimately familiar with this genre, I knew enough to draw the line at anything that would dare to give itself such a title). We are to adopt and to internalize an unrealizable ideal of perfection. (I yelled at my kid this morning: will she end up in therapy? I gave him a jar of Gerber's instead of homemade organic: is he destined for Type II diabetes?). Martyrdom is an obvious subtext to this literature (so glaringly obvious that it is hardly subtextual, it is just barely hidden), and so too is a thinly veiled kind of narcissism: the suggestion that we can redo and thus relive our own childhoods, make it all better this time, make it just perfect, make things turn out right.
Critics of the genre point out that this literature pretends to offer reassurance while serving up an unhealthy dose of anxiety. I can only agree. But I'm wondering whether this anxiety isn't actually the point? What if he dies? Well, of course he will die, we all die, that is our human condition. But what if he dies before me, dies before his time? Unthinkable. And yet it could happen. An illness, a car crash, a crazy teenage prank...The parenting books say, Do this, that and the other, and you are in control.
But it doesn't matter what they say. Sometimes it just does not matter. Things happen. It doesn't matter how good a parent you are, how conscientious, how loving, how much you care. Horrible things can and do happen, and to me or to you or to anyone. 16-year-olds will sometimes do stupid things, and sometimes even fatally stupid things, despite the breastfeeding and the Montessori and the cultural enrichment. We are not in control. Look, you can be the best parent imaginable, but you are not in control. So maybe the point is to dwell obsessively on all manner of small detail in order to distract oneself from the thoughts that lurk, from the awful thoughts that lie hidden in the dark recesses of the brain? To think of the small stuff in the hope of fending off the big catastrophe? The thing that might happen, the thing that you hardly dare imagine might happen? If I worry about baby food, he won't get meningitis. If I focus on vocabulary acquisition, he won't be hit by a car.
I'm not the first person to note the basic religiosity of the parenting books, the way they seem meant to serve as manuals of devotion. The parenting guru as high priest of a new cult of child perfection, the parents as eager, desperately and confusedly eager, acolytes. But I wonder if it doesn't cut deeper than we care to realize? I wonder whether, for all their obsessive focus on the minutiae, these books have something to do with something much larger, the something awful and terrifying that we do know but do not want to know? A way of coping, in other words, with parenthood as a confrontation with mortality -- and as strange a way, in its way, as the thought of a heavenly Father who would reach down and take a child to His home.
We who are middle-class westerners are so lucky! so much luckier than we often realize. We worry about play stages, vocabulary acquisition, psycho-sexual development. These worries are luxuries. These worries are a function of an unprecedented level of affluence that we can even afford to take for granted. And the odds are very much in our favour. If you bring a child into this our earthly home, the world of the middle-class westerner, the odds are very much in your favour that you will live to see that child grow to adulthood. We have immunizations, vitamin-D enriched milk, indoor plumbing, a surplus of food. We can worry about developmentally appropriate playthings because we don't (no, not once, not ever, if we are middle-class westerners) have to worry about food. We have modern medicine. (Yes, modern medicine: please spare me your critique, at least for the moment. I have read the critiques, and I will go halfway, but no more than halfway: my son had surgery at 6 months old for something potentially life-threatening that might have taken the life of a child of the Duchess of Atholl. She might have been thinking of her child in heaven, whereas I sit on the couch with my son reading Go Dog Go.) We are so much better off than was the Duchess of Atholl: a nobody in the middle-class west of today is so much better off than was a duchess in the eighteenth century.
Unless the thing happens. Unless and until the awful, unthinkable catastrophe happens. In which case, I suspect (and I hope I never need to know, I hope this will always be nothing more -- please, oh please, may this never be anything more -- than a suspicion), we are not better but rather a good deal worse off than was the Duchess of Atholl. The Duchess could find comfort in the thought of heaven. But where would the comfort be for the parents of those 4 boys?
A number of people have emailed me recently with harrowing tales of that fundamentally absurd and unjust form of trial by ordeal known as the "academic job market." Much as I'd love to quote some of this stuff, I will not do so without explicit permission. Since I didn't have a clear policy on citation when these people emailed me their stories, I've decided to err on the side of caution rather than risk making them feel uncomfortable. I really appreciate the emails, and don't want anyone to regret having taken the time and trouble to write to me.
But I've decided to set a new policy, which reads as follows:
"Unless you indicate otherwise, I will assume that I can cite and post quotations from your email, referring to you by whatever name, pseudonym or initials you choose. If you inform me that you do not want me to make reference to your email on this blog, I will not do so."
This policy is effective immediately but does not apply to any email I have already received. Again, if you have already emailed me, I'm going to assume that you don't want me to cite or quote from your message -- unless you email me with the go-ahead, in which case I may cite your tale as yet another case study in the adjunctification and deprofessionalization of academic teaching.
Via Matthew Yglesias (who now has comments, along with a new photo), Amitai Etzioni decrees that "anonymity is anti-Communitarian." Anonymity, Etzioni writes, "makes for much poorer conversations, meager relationships and impoverished communities. People are free to disregard the feelings of others, to deceive, and to prevent the formation of the true connections that result from gradually getting to know more and more about a person." And "above all," he adds, "they are able to avoid assuming responsibility for what they are saying."
Since I'm not a communitarian (though I hasten to add, not a libertarian either, and probably closer to the communitarian end of the spectrum on many issues) I'm not overly concerned about the anti-Communitarian charge. Still, the notion that those who use aliases "don't dare to show their true colors" does hit a nerve. The fact is, I don't feel entirely comfortable using a pseudonym for this site. There is a certain kind of anonymous internet character that I think of as the AnonyMouse: the person who is often, I suspect, rather timid and nonconfrontational in real life but who uses the internet's cloak of anonymity to flame and bait and provoke while hiding from responsibility. Well, ok, I really don't see myself as timid and certainly not as non-confrontational in real life, and I don't believe I'm a flame-and-bait type on the internet (though I do love a good argument). Still, I worry that there is an AnonyMouse quality to the decision to go anonymous.
But if I don't feel entirely comfortable using a pseudonym, I would feel even less comfortable using my real name (which I briefly considered doing, before deciding against it, or perhaps, before mousing out). I gave a rather tongue-in-cheek explanation for my anonymity when I first started this blog. I suppose the very fact that I felt the need to address this issue is a measure of my anxiety over it. This is not about the expression of political dissent, obviously, but about the expression of a kind of dissent concerning the academy. A far less serious business than politics, admittedly, perhaps even a rather silly business altogether. But serious enough for me at the moment. Since I don't enjoy academic freedom or job security, I don't feel safe making certain kinds of statements. I see my options as either adopting a pseudonym or remaining silent on the very issues that are the main focus of this blog.
This concern over anonymity intersects with the issue of blogging and truth (and/or perhaps blogging and authenticity), which is the subject of an interesting discussion over at Liz Lawley's. As I see it, I tell the truth on this blog (the truth as I see it, needless to say), but not the whole truth. I sometimes talk about my personal life, for example, but there's only so far I will go in that direction, even under cover of a pseudonym. I have my limits, most of us do (the people who don't have limits are the ones who really worry me). I believe (though I could be wrong about this) most of the people who read my stuff would agree they are encountering a more or less coherent identity: I'm not trying on different masks, self-consciously experimenting with a dizzying array of contingent identities, or doing anything at all theoretically glamorous or sexy. Grant the decentring of the subject, the death of the author (which I actually won't grant, but that's another topic), and so on and so forth: the person who writes the entries on this blog is as unitary a subject as a person can be, and that person is me using a pseudonym.
In a related vein, Henry Farrell picks up on the criticism of anonymity to propose a rather different model from Etzioni's communitarian society (er, communitarian community? or is that redundant?): that of the eighteenth-century coffeehouse as a sphere of sociability. I find this very attractive as an idea/ideal of the kind of Habermasian public sphere that the blogosphere might provide.
I'm about to descend into the ninth circle of the hell that is grading. I won't bore you with all of the details, they are exceedingly tedious and tiresome. Just a few of the highlights (or rather, lowlights):
Term Paper-Related:
The B paper that will never be an A. The A papers are easy. Not because they are "easy," of course, but precisely because they're not: they show evidence of complexity and a sense of difficulty, they do something interesting that makes them worthwhile to read. The C papers are depressing and dispiriting, but also easy enough to grade in their own way: it's a matter of pointing out all of the ways in which the paper fails to meet the minimum standard (the minimum is now a B to B-) in style, substance and mechanical execution. But there's a certain type of B paper that always gives me trouble. Not the B paper that could have been an A if only the student had done X, Y or Z, but the B paper that will never be anything but a B. There's nothing really wrong with it, there's just nothing about it that makes it more than good enough. In other words, it's average (and once upon a time would have been a C, which is now the punishment grade for work that falls below the average). It's really hard to explain average to many of today's students: the expectation is that everyone is above average, and many students now see an A- as the default. Though I have to say, the students I have this semester are not grade-grubbers, which is a refreshing change from what I have come to expect.
The paper that reads as though it were hastily scrawled on the back of a brown paper bag. Of course it doesn't look as if it were quickly scratched out on scrap paper: word-processing and laser printing make everyone's paper look clean and tidy and letter-perfect. Looks can be deceptive. I feel cheated.
Exam-Related:
Handwriting that resembles some ancient hieroglyphic code for which I lack the keys to decipherment. I don't care about the trees: if your writing is hard to make out, skip lines. Skip two lines. This is not a waste of paper. Have another blue book.
The essay question "answer dump." Here the student responds to an essay question not by writing an essay but by dumping as much material as possible onto as many pages as he or she can churn out before the buzzer goes. Sifting through the extraneous and unrelated detail is like going on an archaeological dig. I get nervous when a student asks for another, and then yet another, blue book. That is too many blue books. Don't you care about the trees?
The cut-and-paste. Here it seems the student has attempted to perform by hand what word processors now do for us automatically. It doesn't really work to do this by hand. Answer A is running along nicely for 4-5 pages, but now here are several pages scratched out with instructions to see end of Book 2, following Answer C. But no, that's wrong, because the student wrote the instructions before completing Answer C, completion of which required another blue book, so now we have Answer A, part 2 following Answer C in book 3, and what happened to Answer B?... I'm getting dizzy.
Yes, I am feeling a little bit cranky. In fact, there's a part of me that looks forward to reading my students' work to see what they have done and discover what they can come up with. But at the moment I am dreading the descent into grading hell.
My fellow adjuncts,
Are you in disgrace with fortune and men's eyes? It sure feels like it sometimes, doesn't it? You meant to be a tenure-track professor; instead you are an adjunct. The worst is trying to explain it to your parents...all that booklearning, all those years, and all for naught?...I want to write about this, I am working on an entry, but honestly, it's so painful, it is so mind-****ingly painful, that I keep putting it aside. We need to leave the academy. You know that, don't you? Yes, and I know it too. Somehow it's tough to find the exit: it is not very well-marked, and it's hard to see clearly when the lights have been dimmed. But find it I will, and I hope you find it too. In the meantime: Listen, I know all about the sense of disgrace, and here is my modest proposal: Don't trouble deaf heaven with your bootless cries. Get yourself a blog and trouble the blogosphere. You never know, someone might actually hear you. Strapped for cash? (yeah, I know). Try Blogger: they will host your blog for free.
Yours in fellowship,
Invisible Adjunct
"My own path to Ivy League employment, by contrast, was ridiculously easy. One day in 1962 the chairman of the history department at Princeton phoned my Hopkins adviser, C. Vann Woodward, and asked him if he had a 'young man' to recommend for an instructorship (then the first rung on the tenure-track ladder). Woodward recommended me -- I don't know if he even had to put it in writing -- and Princeton offered me the job, without a real interview and without having seen any dissertation chapters. This was the infamous 'old boy network,' surely the most powerful instrument of affirmative action ever devised."
-- James McPherson, "Deconstructing Affirmative Action," Perspectives, April 2003
Damn. We used to joke about this kind of thing in graduate school. But I thought the stories were fictions, or at least grossly exaggerated and highly embellished accounts that did express a fundamental truth about the differences between the generations, and more specifically about the diminished expectations of our own. Turns out at least one of the stories that circulated was factually true.
McPherson is a history professor at Princeton and current president of the American Historical Association. His reflections on affirmative action are worth reading.
Dorothea offers clear and readable and user-friendly explanations of preferences, placeholders, angle brackets and more. MT newbies should check this out. Make friends with your templates!
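To give a flavour of what she's explaining (a minimal sketch from memory, so trust her write-up over my syntax): a Movable Type template is just ordinary HTML sprinkled with MT tags in angle brackets, which act as placeholders that MT fills in when it rebuilds your pages. Something like

<MTEntries lastn="5">
<h3><$MTEntryTitle$></h3>
<$MTEntryBody$>
</MTEntries>

should, if I have the tag names right, loop over your five most recent entries and drop each one's title and body into the page. Once you see that the scary angle brackets are just fill-in-the-blanks, the templates really do become friendlier.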
"The man of system, on the contrary, is apt to be very wise in his own conceit; and is often so enamoured with the supposed beauty of his own ideal plan of government, that he cannot suffer the smallest deviation from any part of it. He goes on to establish it completely and in all its parts, without any regard either to the great interests, or to the strong prejudices which may oppose it. He seems to imagine that he can arrange the different pieces of a great society with as much ease as the hand arranges the different pieces on a chessboard. He does not consider that the pieces upon the chess-board have no other principle of motion besides that which the hand impresses upon them; but that, in the great chess-board of human society, every single piece has a principle of motion of its own, altogether different from that which the legislature might chuse to impress upon it. It those two principles coincide and act in the same direction, the game of human society will go on easily and harmoniously, and is very likely to be happy and successful. If they are opposite or different, the game will go on miserably, and the society must be at all times in the highest degree of disorder."
-- Adam Smith, Theory of Moral Sentiments, VI.ii.2.17
Many conservatives like to claim Adam Smith as a founding father and intellectual predecessor. I suspect many of them have not read much beyond the famous "BBB" passage ("It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest," An Inquiry into the Nature and Causes of the Wealth of Nations, I.2). Perhaps they should delve a little more deeply.
And I don't mean this in some silly and snooty intellectually snobbish kind of way (you may only invoke or comment on a text if you've devoted your life to its explication). It's just that, with all this talk of regime change, and remaking and reordering the world, and drawing up constitutions from scratch, and free people having the freedom to commit crime, and so on, I'm beginning to wonder what exactly conservatives now mean when they call themselves conservatives.
(Btw, and just for the record: within the context of his own times, Adam Smith was most emphatically not a conservative).
"Henry the 4th ascended the throne of England much to his own satisfaction in the year 1399, after having prevailed on his cousin & predecessor Richard the 2nd to resign it to him, & to retire for the rest of his Life to Pomfret Castle, where he happened to be murdered. It is to be supposed that Henry was married, since he had certainly four sons, but it is not in my power to inform the Reader who was his wife."
-- Jane Austen, The History of England, from the reign of Henry the 4th to the death of Charles the 1st. By a partial, prejudiced, & ignorant Historian
Precariously placed as I am just inside the gates of the Ivory Tower, every now and then I am reminded that in the world beyond the narrow confines of academic history, the coinage of "herstory" still circulates as valuable currency. I greet each reminder with impatience and dismay.
The purpose of this blog entry is to argue that herstory should be history.
1. It should be history, first of all, for reasons of etymology. Quite simply, the term "history" does not derive from a running together of "his" and "story." Rather, it derives from the Greek word for "knowing by inquiry," which the Romans rendered as the Latin historia. The his in the Latin from which our term history emerged (via the French histoire) did not and does not denote the third person masculine possessive pronoun. Not in Latin, not in French (where "his history" or "his story" is son histoire), and not in English either.
Having said this, I have of course said very little, which is to say, I have merely stated the obvious. Problem is, the term herstory works to obscure the obvious, and seems to encourage people in the misguided belief that the word "history" really can be broken down etymologically into "his" and "story."
This is no mere exercise in pedantry. As I see it, there is both a defensiveness and a defiance to the term herstory. It is a term that boldly announces something, a term that intends to make a statement. This it does through a pretended play on etymology. It is worth asking whether this play is effective, and whether indeed this play is even very playful.
2. It should be history, secondly, for reasons of historiography (or, if you will -- though I hope you won't -- herstoriography: but can anyone say herstoriography with a straight face?). The fact is, professional historians of women and gender (and even deprofessionalized historians of women and gender such as myself) do not use the term herstory, do not call themselves herstorians, do not talk of herstorical trends, do not contribute to the herstoriography of women and gender, and so on. Are we merely dupes of "the patriarchy" (another term that needs to go, but I'll take this one on in a later entry)? Or do we have some good reasons to call ourselves historians and to view our work as history? I think we have some pretty good reasons, which I'll briefly explain as follows:
First, though it's quite true that women are largely excluded from traditional history, it is not at all accurate to characterize traditional history as "his" story. From Thucydides on, traditional (or classical) history addressed itself to an elite male audience, narrating the great deeds of great men in order to instruct a ruling class in the art of politics. The authors and readers of this history had as little interest in the great mass of men as they had in women. Indeed, while the occasional female ruler such as Elizabeth I could certainly figure in the traditional or classical account, you will search its pages in vain for the story of Tom, Dick or Harry.
Second, the term herstory smacks of a special pleading that is no longer warranted. The history of women and gender is now a well-established field that is firmly entrenched within the academy. The historiography of women and gender, moreover, is both enormous and enormously rich and complex. Anyone who claims that women have yet to figure in the historical record has simply not done her homework. Are the past thirty-odd years of careful and creative historical research and writing by hundreds of historians of women to be dismissed out of hand, to be accounted as nothing at all?
Third, the term sets up a kind of "separate spheres" approach that can only contribute to the marginalization of women as historical actors and historical subjects. In my opinion, the point is not to create two separate (different but equal?) streams of history -- blue for the boys, pink for the girls, or what have you -- but rather to integrate our knowledge of previously excluded groups (including women, but I would hope not to the exclusion of other previously neglected historical subjects and historical actors) into the main lines of mainstream historiography.
3. Finally, I want to conclude with my own personal and admittedly idiosyncratic reasons for objecting to herstory. I am of course well aware of the fact that my own distaste for the term hardly constitutes a valid reason for others to abandon its use. For such valid reasons, please see points 1 and 2.
If I am not mistaken, the term was first coined by Robin Morgan in her Sisterhood is Powerful (1970). Now, I will readily give Morgan credit for the mythopoetic impulse that inspired her coining of the term. It was inspired, it was perhaps even a brilliant flash of insight. But it belonged to an historical (no, not herstorical, but historical) moment, and that moment has passed (see point 2).
When I hear herstory, I think of patchwork skirts, Moosewood broccoli forests and macrame plant holders. What I don't think of is a valuable research agenda that would make a meaningful contribution to our ever-increasing knowledge of the history of women and gender. The term evokes the issues and concerns of an earlier era (which makes it of interest historically), but without translating into the issues and concerns of the present (which makes it ill-suited to serve as a designation for present and future historical practice).
And lastly (again, this is my own idiosyncratic opinion): for all its "play" on male-oriented history, the term is remarkably devoid of wit and humour. It's just not funny.
If we are looking for playful criticism of male-oriented history, we could do worse than consult the young Jane Austen's brilliant send-up of the dull, plodding history that she was exhorted to read as a schoolgirl (though please note that Jane Austen did not object to all male-authored history, and had the very good taste to appreciate the works of William Robertson and David Hume). Among the characteristics that Austen satirized were the pretence of impartial omniscience (hers was a history by "a partial, prejudiced and ignorant historian"), and the absence of women from its pages ("It is to be supposed that Henry was married, since he had certainly four sons, but it is not in my power to inform the Reader who was his wife."). For me, this translates. Some two hundred years later, it is still fresh, and it is still funny. Will the same be said of herstory in two hundred years' time?
I could have been someone
Well so could anyone
You took my dreams
From me when I first found you
-- The Pogues, Fairytale of New York
Sometimes, the rhetoric of economic self-sacrifice that prevails among teachers reflects what the old Marxists called 'false consciousness.' There is an artificially constructed 'supply-demand imbalance' between faculty positions and qualified candidates. So, in the desperate competition for academic jobs, wages can be lowered and benefits eliminated (with the questionable promise of a 'real' job later). Seeking to preserve their dignity and enhance their status, many teachers come to believe that their unrequited toil is a form of good citizenship or even spiritual devotion. The intensity of their rhetoric is often in direct proportion to the degree of their exploitation.
-- Thomas H. Benton, "Should We Stop Fooling Ourselves about Money?"
Ah, the old Marxists. I miss the old bastards. I really do.
My introduction to the old Marxism was the "Introduction to Political Science" course that I took during my first year at university. The professor (who, I now realize, was either an advanced graduate student or an adjunct lecturer, but at the time I knew nothing of academic rank and hierarchy...would that I had remained in this state of innocence!), the professor was a young, but not too young -- say early thirtysomething -- Marxist from the old school. He was smart, he was articulate, he displayed flashes of wit and occasionally of brilliance, and he spoke with withering scorn of the mystifications of bourgeois ideology. And all of this in a German accent. Be still my heart. Of course I had a crush on him. I can still see the diagrams that he furiously sketched out on the blackboard: base and superstructure, the Canadian class system, the whole rotten-borough system that was rotting to the core. He went at that board with anger and eloquence, chalk to chalkboard as though launching a campaign: the words rang, the chalk dust flew, and my heart thrilled to the attack. Talk about sublimation.
But I digress.
On to part II of my ranty-flavoured "Thinking about Graduate School in the Humanities?," in which I betray my bourgeois heart. Petit bourgeois am I in upbringing (a topic about which I may blog in future), bourgeois am I to the core.
So then:
In my opinion, the application forms for humanities Ph.D. programmes should carry the warning: "Enter at your own risk." The fine print should read: "The risks include poverty, shame, humiliation, and clinical depression." You will of course find no such warning on the graduate-school application forms. And incredibly enough, even at this stage in the game, you may still encounter tenured faculty members in said programmes who refuse to even consider the very sensible proposal of limiting graduate-school admissions in order to address the problem of an oversupply of academic job candidates, and who justify their position with such nuggets as, "Well, nobody's forcing them to go to graduate school." The more fools they. And the more fool you if you don't ask yourself some pretty tough questions before you sign on with them.
Now, when I entered grad school 9 years ago, the tenured faculty members who were actively recruiting and encouraging new entrants should have known that many of these aspiring members of the profession would not find jobs. But let's cut them some slack. Let's grant them the somewhat dubious claim that they didn't realize what was going on in their own profession right under their very noses. Today they do know. Nobody can now claim not to know how dismal the prospects for employment in the humanities are. And yet there are many humanities faculty who still refuse to limit entrance rates to their graduate programmes. What does this suggest about these "professions"? To me it suggests an impulse toward professional suicide. A course here, a conference there, another grant proposal due today, another article due tomorrow, the games must go on, the life of the mind must run its course...But make no mistake: we are in sudden death overtime, with only ten minutes remaining.
Now let me try to explain just why it is I think you should think twice (no, thrice) before embarking on a Ph.D. in the humanities.
A humanities Ph.D. takes many, many years to complete. The pursuit of this degree involves an enormous investment: not just financial (e.g., salary foregone) but mental, psychological and emotional. And entry and/or attempted entry into the profession places you in a peculiar situation, wherein you experience a strange combination of the conditions of both alienated and unalienated labour. The conditions of alienation are bleak enough, and they are real: low wages, unemployment, under- or sub-employment, genteel poverty, exploitation, and ramen noodles. For more on this theme, I recommend you spend some time perusing the pages of workplace: the journal for academic labor.
At the same time, if you have the passion and the interest to stick it out and finish the degree, you will probably also experience a kind of unalienated labour. You're not punching a time clock and putting in X number of hours to earn X number of dollars. No, no, you have your "work," and your work becomes an important part of who you are. You will develop and deeply internalize an identity as someone who does/as someone who is this work. You are your work, and your work is who you are. Well: I've punched a time-clock, and I've completed a Ph.D., and I'd like to let you in on a dirty little secret: unalienated labour is not all that it was cracked up to be by the old Marxists. A bit of fishing, a bit of criticism,... well, it's just not like that. You don't go fishing. Or at least, you don't go fishing very often. And when you do go fishing, you can't really fish, because you're too busy fretting about the criticism you should be doing instead: "Why am I fishing?! I should be criticizing!"
And if upon completion of your work, you fail to find a position (and this is a very real risk), you will experience it as a personal failure, and you will view your own person as an abject failure -- and this, no matter how much you know about the structure of the job system, the ratio of candidates to jobs and so on.
Now, if you were looking at a programme that took one or two or even three years to complete, it wouldn't be such a very bad thing if upon completion you couldn't find employment in your field. Granted, it wouldn't be a great thing; it would probably feel awful for a while and you would no doubt have some regrets about having spent those one to three years in pursuit of a degree that would not actually give you a reasonable shot at employment.
But how much worse to spend five or six or seven years! At a time when you could be building a viable career, and also creating a life for yourself (which might involve marriage, maybe having a child or two, possibly even buying a home or at least moving into a half-decent rental), you are toiling away in relative poverty, perhaps accumulating debt, and living under conditions of massive anxiety and insecurity. You must delay and defer so much of what many people (perhaps including yourself? be honest, now) would consider a decent, liveable life, and without even a reasonable chance that it will all be worth it in the end. And I'm not talking about fame and fortune, the pursuit of filthy lucre and lots of it, but just the basics of a modest middle-class life: say, a living wage with health insurance. Be still my bourgeois heart.
Does my desire for a modestly middle-class life betray a lack of real passion for my subject and field? Perhaps so. Certainly, I didn't always see things this way. Alas, I now shudder at the mixture of naïveté and arrogance that motivated my decision to pursue a doctoral degree in history. When I first entered graduate school, I was fully committed to what I thought of as "the life of the mind" and didn't pay much attention to such sordid practical concerns. Or at least, I tended to repress all nagging doubts and questions. But I gradually came to realize that this wasn't enough, that this would not do. That though I had no interest in becoming rich, I simply didn't want to spend the next 20 years eating ramen noodles and living in a one-room apartment.
And where I had once rather looked down on those who were busily pursuing jobs/careers/marriages/family out there in the real world while I engaged in something loftier and more pure... Well, let me conclude this overlong entry by saying two things: First, if I had to do it over again, I would not go to graduate school; and second, I try hard, really hard, not to hold it against those undergraduate professors of mine who encouraged me to go to graduate school and who actively discouraged me from going to law school because I was "too smart" for a legal career. Ach. I was smart enough, I suppose, in the booksmarts way, but it turns out I was actually rather stupid: not smart enough, that is, to not listen to such silly advice.
Matthew Yglesias has an interesting post on the Canadian Constitution. Since his comments are not working, I want to make a couple of comments here.
First, in citing the Canadian constitution as an example of the broad range and variety of constitutional arrangements found in Anglo-American liberal democracies, Yglesias repeats a common misperception (and one that is actually shared by many Canadians). "Canada's Constitution Act, 1982," he writes, "is quite long and is supplemented by the earlier Constitution Act, 1867 which is also long." In fact, the Constitution Act of 1982 supplements not only the Constitution Act of 1867 but also a number of other written documents and unwritten conventions which still have the force of law. These written documents include the all-important Statute of Westminster of 1931, through which Canada basically and rather quietly went from colony to nation (section 4 of which was repealed by the Act of 1982, but the rest of which is still in force), the Newfoundland Act of 1949, and many more. For more on this point, see William F. Maton's Canadian Constitutional Documents: A Legal History, which includes links to all of the relevant documents. As Maton explains,
"Unlike the majority of countries whose basic law derives from one document, Canada's basic law derives not only from a set of documents known as Constitution Acts, but also a set of unwritten laws and conventions. This comprises of all the acts passed since 1867 up to and including 1998. As a result, all constitutional documents during that time period have the force of law."
Second, while Yglesias's characterization of the "notwithstanding clause" is certainly a valid interpretation, it is by no means the only interpretation available.
The "notwithstanding clause" refers to section 33 of the Canadian Charter of Rights and Freedoms set forth in Part I of the Constitution Act of 1982. It reads as follows:
"33. (1) Parliament or the legislature of a province may expressly declare in an Act of Parliament or of the legislature, as the case may be, that the Act or a provision thereof shall operate notwithstanding a provision included in section 2 or section 7 to 15 of this Charter.
(2) An Act or a provision of an Act in respect of which a declaration made under this section is in effect shall have such operation as it would have but for the provision of this Charter referred to in the declaration.
(3) A declaration made under subsection (1) shall cease to have effect five years after it comes into force or on such earlier date as may be specified in the declaration.
(4) Parliament or the legislature of a province may re-enact a declaration made under subsection (1).
(5) Subsection (3) applies in respect of a re-enactment made under subsection (4)."
In other words, if they are willing to pay the political price, a legislature may override (declare an Act notwithstanding) sections 2 and 7-15 of the Charter of Rights and Freedoms for a period of 5 years. "This essentially gives parliament," writes Yglesias, "the power to override the constitution by simple majority vote."
Well, yes and no. I think it is important to note that it gives a legislature the power to override a section of the constitution as that section has been interpreted by a court. A written constitution is never a straightforward and transparent document, the meaning of which is clear and self-evident to all who have a stake in its proclamations. Instead, it is subject to competing and conflicting interpretations. The question is, Who gets to have the final say in its interpretation?
The issue is not only, and perhaps not even primarily, a question of majority vote versus constitutional protection, but also a question of legislative versus judicial authority. This is not to deny that the "notwithstanding clause" raises the troubling possibility of the will of the majority trampling on the rights or interests of a minority. However, a perhaps equally troubling possibility is that of a small body of unelected judges imposing their will on a constituency that lacks any means of countering the power of the judiciary (an especially troubling possibility, I would suggest, in the case of a court that is quite clearly divided into political/ideological factions). If you don't agree with a legislature's invocation of the notwithstanding clause, at least you can hold them accountable for its invocation and attempt to throw them out of power at the first available opportunity (ie, at the next election). If you don't agree with a court's interpretation of a section of the constitution, your options are at best limited.
In the interminably long process through which the Constitution Act of 1982 was hammered out, critics of the Charter were quite explicit in their opposition to an American-style constitution. This is not because they were tyrants who opposed the guarantee of fundamental rights and liberties. Rather, it is because they objected to what they defined as an American notion of judicial supremacy, against which they sought to uphold what they considered the more democratic principle of parliamentary supremacy. The "notwithstanding clause" represents a compromise between an American-style constitution, with an attendant American-style principle of judicial review and judicial supremacy, and a British-style constitution (ie constitution not as one document but as many documents, acts, and unwritten conventions) with its attendant principle of parliamentary supremacy, through which acts of parliament are the supreme law of the land. Like many such compromises, it seeks to please everybody and ends by pleasing nobody.
UPDATE: The still commentless (is there no techie out there who can help him out with this?) Matthew Yglesias has a comment. He notes, quite correctly, that I had incorrectly interpreted his commentary on the "notwithstanding clause" as a kind of attack on said clause. He explains that he cited the clause as evidence of the very different forms that entirely legitimate constitutions can take. In so doing, he reveals a spirit of compromise that marks him as worthy (if, indeed, that is the term) of honorary Canadian citizenship. I disagree with his suggestion re: a referendum (imho, referenda are problematic at best). As for supermajority: given the very infrequent but nevertheless very real possibility of defeat by a vote of non-confidence, I think it's safe to assume that no legislature would invoke the notwithstanding clause without a very solid parliamentary majority.
"If you call Columbia University's main switchboard and ask for Nicholas De
Genova," writes Thomas Bartlett, "you will not be connected to his office. Instead, you will hear a recording of a statement by the university's president, Lee C. Bollinger, saying he is 'appalled' by the anthropology professor's 'outrageous comments.'"
Bartlett interviewed De Genova for the Chronicle of Higher Education. De Genova apparently answered all but one of the questions posed to him: he wouldn't comment on whether he thought the controversy over his "million Mogadishus" comment would weaken his chances at tenure. You can read the interview here.
You seek new monsters from the world new-found?
New ways of life, drawing on different springs?
The source of human virtue? The profound
Evil abyss? The void beneath all things?
Read here what's traced by More's ingenious pen,
More, London's pride, and Britain's first of men.
-- To the Reader [of Utopia] by Cornelis De Schrijver
Adjunct + Dystopia = Adjunctopia
My fellow adjuncts,
I'm not trying to bring you down. I'm just trying to bring you and me and all of us around to a keener sense of an underlying reality. "The void beneath all things."
Behold the void that fills the void:
"For Adjunct Professors,
Adjunctopia is changing the way experienced professionals enhance their careers...
Adjunctopia + you = less effort + better career"
Well, it's some sort of job listing clearinghouse designed to match up potential employers with potential adjunct employees through the use of a "database." I suppose they might have some basis for the "less effort" claim, but a "better career"? I guess "better" in the sense that it would allow you to view yourself in the light of an adjunct-entrepreneur, a dynamic career-enhancing professional who knows how to make use of the new e-tools that define synergetic overachievement?
But not too dynamic, not too overachieving. Check this out:
"At Adjunctopia, we know that our human capital is our most valuable asset. As a result, we focus our efforts on working only with the best and brightest.
If you are a highly talented individual with an overachieving personality, get riled when your boss says, No. It can't be done, and you truly want to change the world, then we are interested in you!
Email your resume to jobs@adjunctopia.com.
Important Note: If you're interested in opportunities at colleges and universities, this page isn't for you. Instead, you should register as a candidate by clicking here.[emphasis mine]"
Got that? If you are an adjunct, this page is not for the likes of you. But never mind. Your own page is a click away. Click here and register now.
But I'm wondering: who is their target audience? Who amongst present and future adjunct faculty would need to be reminded that most institutions "require you to have at least one advanced degree:"
"You might have no experience teaching in a college or university setting, or you might be a leading researcher [!?]. It doesn't matter. Please note: Most institutions require you to have at least one advanced degree."
"It doesn't matter," says Adjunctopia. They promise to "eliminate the pain":
"Over the past 30 years, colleges and universities have increased their reliance on part-time and adjunct faculty instruction. The use of adjunct faculty in higher education continues to grow as the number of people looking to further their education increases. Life-long learning may be considered merely a buzzword today, but it is quickly becoming an imperative."
I like that bit about going from buzzword to imperative. They come so close to admitting they are the void beneath the void. But there's something else here that deserves a moment of scrutiny, a half-truth that constitutes a lie. "The use of adjunct faculty in higher education continues to grow as the number of people looking to further their education increases." This suggests an inevitable causal link: more students leads to increased reliance on adjunct faculty. But this leaves out an important part of the equation: more students plus lack of funding and decreased support for education leads to increased reliance on adjunct faculty.
Adjunctopia also offers "new" and "unique" services for employers (and "at pricing plans that fit [their] budgets"):
"We're tailored specifically to your industry to eliminate the static of unrelated professionals. Our profiles are targeted specifically to professionals interested in teaching at colleges and universities, so you won't get phone calls for insurance jobs."
Well, you know, if *I* were looking for a job in insurance (and, uh, maybe I should be), I know the first move I'd make would be to call up the chairs of some academic departments to ask if they had any openings. I bet academic chairs get a lot of these phone calls. It's about time someone did something to eliminate this "static."
You too, my fellow adjunct, can become an Adjunctopian. But I'm curious: Can a company really make any money with this scheme?
"When you purchase pursue your degree through AIU's online campus, your classroom is as close as your Internet connected computer. No need to rush to class right after work, spend your weekends inside a classroom, or put your life on hold. At AIU Online, class starts whenever and wherever you log in. You can participate at any time of the day or night and have the same great experience...
....Whatever your situation, AIU can take you to the next level of education and opportunity, at your convenience."
-- "Virtual Class," an unsolicited email sent to me by American InterContinental University Online
When I received this spam from the American InterContinental University Online, my finger paused over the delete button. Oh, I knew I should trash the message immediately, but (it's like picking at a sore, I suppose), I just had to read on. So I clicked on the link that shouted "Visit For Your NO COST information."
I have visited the online university, which bills itself as "A Modern Virtual Campus with a Brick & Mortar Pedigree," and I now offer the following traveller's account for the edification of my fellow adjuncts. Never let it be said that I haven't done my best to bring you up to (hyper)speed on the latest teaching opportunities for those who wish to join a team which "[creates] incredible synergies from which remarkable success and overachievement is produced."
Do you want to become an online faculty member of the AIU Online? You are but a mouseclick away from submitting your application, which should include the usual elements -- cv; cover letter; transcripts; references; statement of teaching philosophy (philosophy? that has rather an antiquated ring, but I suppose it is part of the "brick and mortar pedigree" to pay homage to the dead) -- along with something they call "datasheets," to be obtained how and where and why I cannot say. There's no mention of salary rates on the website, but they boast an impressive line of benefits: Flexibility (again, the early-modern putting-out system); Add to Your Experience; Networking with Others. The description of the requirements for employment is just the standard blather, though with the additional requirement of willingness and ability to master the principles of Fourth Dimension Learning (TM). If you are selected for one of their exciting employment opportunities, you will be required to "attend an online orientation, hosted on our Virtual Campus," for which you will almost certainly not be paid. After successful completion of this virtual (dis)orientation, you will then "teach" courses designed by their own "internal course development staff, [who] work with subject matter experts to develop all courses in-house, [so that] online faculty have more time to teach, coach and interact with students." In other words, prefab syllabi to ensure a standardized course delivery system.
Teaching I know. But "coaching" and "interacting" in a virtual dimension? This they did not tell me about in graduate school. Ah well, there's a lot they didn't tell me about in graduate school.
But let's stay positive, shall we?
If the AIU Online website is offering an accurate representation of its student body, you will also have the satisfaction of knowing that your students look like Banana Republic models. You know the look: casual elegance with a slight hipster edge, the "new neutral" tones with clean, uncomplicated lines. Not that you will ever meet any of these beautiful young people in the flesh. But you can imagine them in your virtual classroom. In so doing, please conceive of at least one exceptionally good-looking female student who is very, very blonde and at least one exceptionally good-looking male student who is a very dark-skinned African American. Imagine the two of them sitting next to one another in order to highlight the harmonious contrast. Very chic, very cool, very Banana Republic. Of course these two students wouldn't really be sitting next to one another, each would be attending class from the privacy of his or her own home, perhaps thousands of miles away from one another. But you may fantasize and synergize as you please: the virtual university is everywhere and nowhere.
There is more. For example, among the amenities it lists is something the AIU Online calls "an award-winning Cybrary." But I find that I am having trouble with my synergy. (I am redundant. Aren't I too young to be so very redundant?). So I will have to leave you to conduct your own online exploration, but not without offering this final word of warning: The Webmercial is a must-see, but please don't watch it if you are in a "down-and-out-in-the-academy" mood.
Via F∗ck That Job!, someone has put himself up for bid on eBay. Or, to be more accurate, since "eBay rules as well as US Federal law forbid the buying and selling of humans,...the person Brendan Grant is not for sale, however his time and services are." Bidding started at $1.00; the current high bid is $107.50.
Job advertisements supply direct and concrete evidence of the current state of the profession and also offer hints and clues concerning the profession's future (or perhaps lack thereof).
Here's a recent advertisement for a position as Visiting Assistant Professor:
"The History Department at Miami University invites applications for a visiting assistant professor in American History for the 2003-04 academic year. Any specialty of U.S. history is acceptable, but preference will be given to 19th century U.S. and/or women's history and/or race and ethnicity. Duties include teaching both halves of the U.S. history survey and upper-division specialty courses. Ph.D. in hand by date of appointment (August 19, 2003). Send letter of application, c.v., evidence of teaching experience, and letters of reference to the address below. Screening begins April 15 and continues until position is filled."
Ok, that's straightforward enough. For whatever reason, this department wants to hire a one-year replacement. Perhaps their regular 19th-c. American historian is going on sabbatical; perhaps it is a female faculty member who has been "irresponsible" enough to contribute to the continuation of the species; perhaps the 19th-c. American historian recently retired or is about to retire, and the department does not yet have the go-ahead (approval plus funding) to conduct a tenure-track search, but still has to meet student demand or curriculum requirements for courses in 19th-century American history. They will hire a recent Ph.D. who is currently unemployed or underemployed to do the work of an assistant professor. They will pay this person much less money than they pay a tenure-track assistant professor, but this amount will still be far more than the current rate for "adjunct" teaching, and they will probably throw in a few of those benefits (eg health insurance) of which adjuncts can only dream.
Given the dismal state of the academic job "market" in history, this is not a bad gig. Certainly, it is far better than adjunct teaching. By the way, this department's decision to hire a "visiting assistant professor" rather than an adjunct assistant professor should not necessarily be attributed to their kind hearts and keen sense of responsibility to the profession. Such decisions generally have a lot to do with access and availability. In an urban area, where unemployed and underemployed PhDs are a dime a dozen, a department will usually hire adjuncts to fill in their teaching gaps, because this saves them a lot of money. But the department of an undergraduate college in a small town that is isolated from PhD-producing research universities will not have access to a pool of cheap teaching labor. And they couldn't possibly induce people to move to the town for the purpose of adjunct teaching: the pay rates are too abysmally low. Instead, they must rely on a "visiting assistant professor," which is to say, someone who will hold a full-time but limited term (most often a one-year) contract.
Now here's another advertisement for a position as Visiting Assistant Professor:
"The University of North Florida seeks to hire a Visiting Assistant Professor with a specialty in modern history, to begin in academic year 2003-2004. This is a non-tenure earning position. The person hired must be able to teach both small and large-lecture sections of the second semester of our western civilization course, as well as upper level courses in the area of specialization. Candidates must have completed requirements for the Ph.D. before the contract begins. Teaching experience and evidence of potential for teaching excellence are required. Send complete dossier, including a letter of application, curriculum vitae, graduate transcripts, and three letters of recommendation to the address below. The search committee will begin considering applications on March 31. The search will remain open until the position is filled. For more information visit our Web Site. UNF is an Equal Opportunity/Equal Access/Affirmative Action Institution."
Notice anything different? Of course you do. This department is not looking for a one-year replacement, someone to fill in for the year 2003-2004. The position begins the year 2003-2004. When does it end? Impossible to know from this advertisement, but I find it interesting that they need to emphasize that "this is a non-tenure earning position." Everybody (at least everybody who would be reading this advertisement) already knows that a "visiting assistant professorship" is a non-tenure track position. A visiting assistant professorship is a full-time but temporary (usually one-year, sometimes two-year, occasionally three-year) position. Kind of makes me wonder. For how long, exactly, is this department planning to play host to its "visiting" assistant professor? Longer than one year, clearly, for it begins but does not end in 2003-2004. For two years? Three? Five? Permanently? I have heard of a new phenomenon whereby the term "visiting assistant professor" is used as a euphemism for "semi-permanent to permanent full-time non-tenure track position at half the salary of a regular tenure-track professorship." I may be wrong, of course, but I think this advertisement may offer a concrete example of this new variation on the ongoing indirect attack on tenure.
My husband is after me to go to law school. My husband is probably right. Off to practice logic games for the LSAT.
"Having gone through the transformation, over the past 2-3 years, from a demoralized, depressed adjunct barely hanging on to his career to an assertive, angry activist who's won some respect from both administrators and tenured faculty (along with higher pay and benefits, thanks to collective, union action), and knowing that some of you are just beginning this process, I'd like to offer my summative perspective and advice."
-- Larry Kaye, "How to Become a Successful Activist"
A few weeks ago, I posted some of my thoughts on the "adjunct as entrepreneur" model. Arguing that the adjunct instructor is in no position to act as pedagogical entrepreneur, I suggested that the attempt to recast part-time, low-wage work as a form of entrepreneurship is little more than a compensatory fiction. I might have added, I suppose, that, though regrettable, it is not at all surprising to see the development of a new déformation professionnelle in response to the deprofessionalization of academic teaching. For the moment I don't have much more to say about this deformation, which strikes me as both ineffably sad and almost comically silly.
Next up on my agenda is the "adjunct as activist" model, which I think deserves more serious and sustained attention. But though I don't dismiss it outright, I have to say that I view the "adjunct as activist" ideal with a good deal of ambivalence and a certain degree of scepticism.
On second thought, I guess I do have one more thing to add about the "adjunct as entrepreneur" idea. The idea -- to put it bluntly and perhaps not entirely fairly -- gives me the creeps. It makes me think of creeping through the corridors of academe, Uriah Heep-like, in a posture of sham humility, mouthing the pieties of one's betters, perhaps even going them one better ('who wants a full-time job in the academy? all that committee work and those endless dreary faculty meetings! no, no, I'm fine, thanks for asking, why no, I don't need health insurance, never sick a day in my life, why yes, I have read the university's mission statement on the pursuit of excellence in research, teaching, faculty development, student retention, alumni relations, and parking lot maintenance, excellent plan, excellence all around and I for one am fully committed to the pursuit of all-round excellence, and no, I don't need an office, I'll hold office hours in my car, no need for old-fashioned amenities like offices and support staff when you're on the cutting edge of efficiency, it's all about economies of scale, by the way, in the production of your commodity, in short, these are exciting times for educational entrepreneurs and I'm just darn lucky to be here'), all the while keeping a sharp eye on the main chance...except that there is no main chance, no Agnes Wickfield, no partnership in the firm, nothing but more part-time low-wage teaching, unless one tries to carve out a niche as adjunct "coach" and consultant, or perhaps turns one's sights toward a career in college administration. But if I don't want to be an 'umble charity school pupil, I'm not so sure I want to be an angry young man either. Did I spend so many years, first in graduate school, then in postdoctoral research, and devote so much of myself to teaching and research and writing, only to be told that I must now mount the barricades and storm the edifice that for so long I have considered my second home? "False consciousness," replies the adjunct-activist, and not without good reason. What kind of home is it, after all, that could make me feel so little at home? Anyway, I'm pretty sure I'm not alone in this ambivalence, and I'm almost certain such ambivalence is a major obstacle to activism.
It will probably take me two or three postings over the next couple of weeks to work through the sources of my scepticism and ambivalence and try and figure out whether or not I should be this sceptical and this ambivalent. I plan to start with Marc Bousquet's "The Waste Product of Graduate Education: Toward a Dictatorship of the Flexible" (Social Text 20.1 [2002], 81-104; available online through Project Muse, but access requires subscription), which will be the focus of my next "adjunct as activist" entry. Bousquet's article is smart and provocative and a little bit infuriating, and it challenges me to think harder about the complex issues surrounding academic employment and academic restructuring. As such, I think it will serve as an excellent point of entry.
"....We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.--That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, --That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security..." (more)
I'm a newbie, a blunderer, an ignoramus, call me what you will. I don't mind admitting that what I need is a dummies guide for Movable Type. I'm sure the documentation is very thorough, but it presupposes a basic understanding of the basics, and a basic understanding is what I do not have. Some of it flies right over my head.
I've only just discovered, for example, that the "date posted" at the bottom of a blog entry is in fact its permalink. We're talking basics here, a basic knowledge of which I lack.
Ignorance breeds fear. I want to make some changes, but I'm afraid to tinker for fear of messing things up. If something goes awry, I will automatically assume that I've done something awful and irrevocable.
Categories, for example. It looks easy and straightforward, but this apparent simplicity makes me nervous. There is a Categories button under "Manage" in the main menu. I see how I can enter up to 5 categories. But is that all there is to it? Or is that just an initial step, before going into the templates to do something more? And are there any perils and pitfalls of which I should be aware?
Trackback I still don't quite get, even after having read Mena and Ben Trott's new Beginner's Guide to Trackback (yeah, I know, it's kind of sad, isn't it?). Liz Lawley posted something useful the other day, but I'm still not sure I understand. I've now turned on the autodiscovery, so I suppose I can leave things at that.
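Though for what it's worth, here is what I think is actually going on under the hood, as best I can make out from the Trotts' guide: a trackback "ping" is nothing more mysterious than an ordinary HTTP POST of a few form-encoded fields (url, title, excerpt, blog_name) to the trackback URL of the entry you're responding to. A minimal sketch in Python -- every URL and name below is made up for illustration, so treat this as my study notes rather than anything authoritative:

    # A hypothetical TrackBack ping, as I understand the spec:
    # an HTTP POST of form-encoded fields to the target entry's TrackBack URL.
    import urllib.parse
    import urllib.request

    ping_url = "http://example.com/mt/mt-tb.cgi/42"  # made-up TrackBack URL

    fields = {
        "url": "http://example.net/archives/000123.html",  # my own entry (required)
        "title": "My response to your post",
        "excerpt": "A short excerpt of my entry...",
        "blog_name": "My Weblog",
    }

    data = urllib.parse.urlencode(fields).encode("utf-8")
    request = urllib.request.Request(
        ping_url,
        data=data,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

    # The reply is a tiny XML document; <error>0</error> means the ping got through.
    with urllib.request.urlopen(request) as response:
        print(response.read().decode("utf-8"))

If that's right, it would also explain what the autodiscovery does: it scans the pages I link to for their trackback URLs and sends the ping on my behalf.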
Three columns instead of two. I've seen this on many blogs and I really like it. But then I've come across comments suggesting that three columns can cause problems? Again, I don't know enough about any of this to understand how or why this might be problematic, though I assume it has something to do with floating/alignment issues?
Backing up one's blog content. Is there a best way to do this, and if so, what is it?
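In the meantime, the crudest answer I've been able to come up with on my own is a little script that stuffs everything into a dated archive. A sketch, with made-up paths (and note that this would only capture the published files; for the entries themselves I gather Movable Type's export feature is the proper tool):

    # A crude backup sketch: bundle the blog directory into a dated tarball.
    # All paths are hypothetical placeholders.
    import tarfile
    from datetime import date
    from pathlib import Path

    blog_dir = Path("/home/me/public_html/blog")  # wherever the blog is published
    backup_dir = Path("/home/me/backups")         # where the archives should go
    backup_dir.mkdir(parents=True, exist_ok=True)

    archive = backup_dir / f"blog-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(blog_dir, arcname=blog_dir.name)

    print(f"Backed up {blog_dir} to {archive}")

But surely there's a better way; hence the question.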
Well, sooner or later I suppose I will figure this stuff out. But I welcome any hints or suggestions...
"Where are the novelists and poets of the daily grind of the war," asks Timothy Burke in yet another thought-provoking blogpiece, "the people who call us to some deeper meditations about the meaning of it all, who bring us together in a contemplative pause where the lion lays down with the lamb and the warblogger sighs heavily in sympathetic unison with the critic of the war?" Where, he continues, "is the general humility in the face of events vastly larger than ourselves, the reflective pause?"
Well, this is an age of immodesty. And it is also an age of lightning-fast reaction and response.
There are those among us who would speak truth to power. And there are those among us who would speak the language of power as that of the unvarnished truth. But what we all share is a perspective that is at best partial and incomplete. It is from this shared vantage point that we view the unfolding of events that are, as Burke reminds, "vastly larger than ourselves."
We are distant spectators. But we don't much like to acknowledge this distance. What we want is a sense of immediacy, a sense of being, in some small way, a part of it all. For some, let's face it, this desire for immediacy translates into an exaggerated sense of the significance of anti-war statements: if I say X, then I am part of a world-historical struggle, an unofficial opposition to the powers-that-be. For others, let's face it, this desire for immediacy translates into an exaggerated sense of the significance of pro-war statements: if I say Y, then I am hobnobbing with the power structure and cosying up to those insiders who will soon prove themselves the forces of victory. I think we do (or at least most of us do) realize that events are vastly larger than ourselves. But our desire to "only connect," combined with our ability to read and comment at the speed of light, argues against the humility that the distance and the vastness should recommend.
Not only the individual advances from infancy to manhood, but the species itself from rudeness to civilization.
-- Adam Ferguson, An Essay on the History of Civil Society
Where dips the rocky highland
Of Sleuth Wood in the lake,
There lies a leafy island
Where flapping herons wake
The drowsy water-rats;
There we've hid our faery vats,
Full of berries
And of reddest stolen cherries.
Come away, O human child!
To the waters and the wild
With a faery, hand in hand,
For the world's more full of weeping than you can understand.
-- William Butler Yeats, The Stolen Child
Where is childhood?
The other day I saw a CN (Canadian National Rail) freight train on the Hell Gate Bridge. I wasn't expecting to see it, though indeed there's no reason why it should have surprised me. But it did surprise me, and it stopped me in my tracks. All of a sudden I was transported back to childhood...wheat fields, prairies, blue skies, vast empty spaces, loneliness ... the sight of a CN freight train that I didn't expect to see somehow unleashed a flood of memories, or of those bits and fragments that we call memory.
But transported how, and to where? I don't know wheat fields, prairies, vast open spaces. That was not my childhood. Well, not directly, at any rate.
We Canadians do not have a history. Oh, sure, Canada has a history, and there are of course many histories of Canada. But Canadians do not have a national history, a grand narrative the meaning of which is revealed through the unfolding of significant events in accordance with a unifying theme which gives them their meaning. We just don't. We make fun of Americans' historical mythology, and partly we are right, but partly we are just jealous. Anyway: want to stir up the patriotism that lies dormant in the Canadian soul? Forget history (Laura Secord? that's a box of chocolates; Confederation? hit the snooze button; the Statute of Westminster? well, there is something impressive about the move from colony to nation without bloodshed, yes, it is a blessing to live in uninteresting times, but it hardly makes for compelling narrative). No, if you want to call forth Canadian patriotism, history won't do. Instead you must turn to geography. A mari usque ad mare is our official motto. In geography we find our mythos.
The sight of that CN freight train called forth memories of something already once removed. Memories not of a direct experience of prairies, wheat fields and vastness, but of the invocation of prairies, wheat fields, vastness, and freight trains running from sea to sea as the essence of Canada. The grade four geography textbook, the childhood experience of watching countless CN freight trains go by, and thinking of the pictures in that textbook, and imagining the vastness, and somehow feeling both small and insignificant and also part of something large and lonely and magnificent.
I often wonder what my son will remember of his early years, and how he will remember it. I take it as a given that he has, or will have, "that within which passeth show." Though some people tell me differently: "A boy? Well boys are easier to handle: they're more transparent." I am just a little bit irked by this, it sounds like saying boys are just a little bit stupid. Not my boy, I think.
How complex is the modern self! We have inward depths and multiple layers; we possess a rich interiority. It doesn't matter whether or not we really "have" this self, whether or not there really is this "self" that we can have. If we think we have it, then we do. It's not a question of whether or not this sense of self is accurate or true. This sense of self is very obviously a culturally and historically specific construction (quite simply, people have not always and everywhere conceived of themselves as having inward depths that they could plumb), but the fact is, we either are or else we do possess the sense of ourselves that has been constructed along these lines, and I doubt very much that we could deconstruct this construction of self, and in any case, I, for one, would not be much interested in trying.
This has all been brilliantly and beautifully explicated by Charles Taylor in his Sources of the Self: The Making of Modern Identity. But what Taylor doesn't deal with is our idea of childhood, and our notion of the child as the source of the adult self. This is the subject of Carolyn Steedman's Strange Dislocations: Childhood and the Idea of Human Interiority, which argues that the adult's interior self is perceived as the product of its own unique and personal history, the internalization of childhood memory as that which is "lost and gone." We all, or at least most of us, believe this, and not only those of us who pay our devotions to the secular religion that is psychotherapy. We think we can excavate this memory, and that the further we dig, the closer we come to the truth about ourselves that lies buried in childhood. I don't think we really can recover this truer, more authentic, and prelapsarian self, which is one reason, I suppose, why I find the idea of nurturing one's "inner child" so cheap and tawdry: it is both a shortcut and a dead end.
We think of childhood in terms that are at once developmental and historical. At its most simplistic, the child as our future, as a repository of our hopes and dreams, or, more social-scientifically, as a kind of "human resource" (ghastly term) to be carefully studied, managed, and marshalled in support of some socially useful goal or another. But more often, I believe, we think of childhood in terms of time past. And not in terms of a past time that can be represented as a series of discrete events which we then connect in accordance with a grand narrative, but in terms of time past as the unfathomable depths of our being: when it comes to the history of self that we apprehend through the memory of childhood, we are all historians of the Annales school, committed to a notion of the longue durée. We long for a past which we would, but cannot, retrieve and recover. And of course we romanticize the pre-modern self as the self that we ourselves once had in the past that we call childhood. We think of this self as simple, not complex: as a self with integrity (wholeness) instead of a self made up of those layers and divisions that hide "that within which passeth show."