Sunday, June 26, 2022

Change for the Better

“Everyone always pays with plastic now”—you hear it constantly. But of course it’s not true. For one thing, there are still a good many Luddite oldsters like me who use cash for the odd purchase. But there are also a lot of people who don’t carry plastic—people who are very poor, people who are homeless, people who don’t even have a bank account. It’s largely for that reason that we should make a point of always carrying some cash—all of us, that is, who don’t always say no to those asking for “spare change,” or to those who entertain us by busking on our streets. Imagine New Orleans or Nashville without street musicians; that vision could well become a reality if we give up carrying cash.

But it’s not just homeless people and street musicians who need us to be able to offer them small amounts of cash; it’s also a great many people who have traditionally depended on tips for a significant part of their income. Our tendency has long been to think of tips mainly in connection with dining out—and especially in connection with dining out in sit-down restaurants. Over the course of the pandemic that perception shifted somewhat, as take-out and home delivery from restaurants became more and more popular. We grew much more used to adding 10% or 15% or 20% as a tip to the bill when paying with plastic in those situations. But there’s often no option to add a percentage when we buy fast food; the only way to tip those low-paid workers is usually with cash. And the same is true of the low-paid workers who clean your hotel room, or who do a range of other low-paid jobs. In an ideal world those people would simply be paid a living wage, and tips wouldn’t be needed. Sadly, we’re a long way from that sort of transformation; without the few dollars left with a hastily scrawled “thank you” on the bedside table for the hotel housekeeping worker, and without the fifty cents or dollar thrown in the tip jar for the fast-food worker who just served us, those people will struggle even more.

Let’s make their lives a little easier instead of a little harder. And let’s save street music in our cities. Let’s always keep a few coins in our pockets, and a few small bills in our wallets and purses. Change for the better.

Monday, April 25, 2022

The Protagonist of My Novel Is a Young-ish Working-Class Mother; I’m Not. Is That OK?

One of the first questions people may ask about Lucy and Bonbon is this: as a matter of ethical principle, should someone such as me be writing such a book? The leading character, Lucinda Gerson, is a woman; I’m not. Lucy is working class; I’m not. She’s thirty-ish, and I am a very long way from that.

Another significant character, Ashley Rouleau, is a woman in her early twenties, from an economically privileged background. I come from an economically privileged background, though not quite as economically privileged as Ashley’s. But I am even farther away in age from Ashley than I am from Lucy. And one other thing: she’s black; I’m white.

Is it OK to write about people who are in numerous respects so different from the author? Or should writers, as the expression goes, “stay in their lane,” and write about people like them?

In a special category is Lucy and Bonbon’s second title character. Is it OK for someone such as me to write in the voice of someone who is half human and half of another great ape species? That is of course an absurd question. Imagine the alternatives: it would be as ridiculous to say let’s leave it to the hybrids to tell their own stories as it would be to say Mary Shelley should have left it to the creature to tell his own story. But it’s not absurd to ask such questions where characters such as Lucy and Ashley are concerned. Working-class thirty-ish mothers and young women from economically privileged Toronto backgrounds exist in abundance in real life and are quite able to write their own stories, or stories about people like themselves. They certainly don’t need a sixty-eight-year-old white guy to try to do it for them.

So what’s my justification for writing characters of that sort into Lucy and Bonbon?

To answer that question, it may be helpful to try to disentangle the issue of “staying in one’s lane” from two related issues. One is the issue of representation in publishers’ lists. When I was young, the people whose fiction was published in North America (and in much of the rest of the world too) were overwhelmingly white—and disproportionately male. Thankfully, that has changed dramatically. There is surely still room for further change in some areas.* Overall, though, there can be no question that women, people of color, and people from minority backgrounds of almost every sort are far better represented on publishers’ lists than they were when I was young. Conversely, there are fewer “spots available” on publishers’ lists for people like me. It’s much harder than it used to be for a privileged, straight, white, aging male to get published—and that is a change entirely to be welcomed!

But that’s a separate issue from the question of who it should be considered OK to write about.

So too is the issue of appropriation of story material a separate question. When I was young it was often seen as entirely unproblematic if a white author “borrowed” an Indigenous myth or story as raw material for fiction—and there was rarely if ever any thought given to consulting anyone from the relevant Indigenous group. Thankfully, those days are long gone!

But that sort of appropriation is also a separate issue from the question of whether or not it should be considered OK to write about people other than those in the group(s) to which an author belongs.

So why is it these days widely considered to be a highly dubious practice to write “outside your lane”? I think the explanation is in part tied in with the way in which the issue of representation of authors from different backgrounds on publishers’ lists has been approached. As publishers have signed more and more fiction writers from minority backgrounds, publishers (and readers) have tended to develop expectations that these authors will be “telling their stories”—writing thinly veiled autobiography or, more broadly, telling stories about people from their own communities. And from that has developed a broader expectation that literature itself is fundamentally rooted in humans telling their own stories.

There is of course nothing wrong with people telling their own stories through fiction; many of the finest works of fiction unquestionably fall into that category. But a great many others do not. Most writers of fiction have considered one important function of literature to be imagining the lives of others—trying to understand the thoughts and feelings of others, and trying in doing so to craft characters and plots that will excite the sympathetic imagination of readers. So it is that our literary heritage includes characters such as Shakespeare’s Othello and Austen’s Mr. Bennet and Fitzwilliam Darcy and Shelley’s Victor Frankenstein and Eliot’s Edward Casaubon and Tertius Lydgate and Wharton’s Newland Archer. So it is that the literary descriptions of war that have been most highly praised as realistic include works such as Tolstoy’s War and Peace and Crane’s The Red Badge of Courage—Crane never himself saw a battlefield, and Tolstoy was writing of a war fought before he was born. So it is that the most moving fictional depictions of poverty include works such as Mary Barton by Elizabeth Gaskell—who was herself always comfortably middle class. So it is that the character whose thoughts and actions portray the workings of racial prejudice among white people perhaps more persuasively than any other—Dr. Melville in Paul Laurence Dunbar’s “The Lynching of Jube Benson”—is the imaginative creation of a black writer. So it is that one of the most moving portrayals of a woman in an abusive heterosexual relationship, Roddy Doyle’s The Woman Who Walked into Doors, is by a man. So it is that the most memorable fictional representative of the tragic stuffiness of mid-twentieth-century British notions of one’s proper place in society—the butler Stevens, in Kazuo Ishiguro’s The Remains of the Day—is the creation not of an author raised in the stuffiness of mid-century British class consciousness but of one raised first in Japan and then in a Japanese family in the UK.

Those are all classics of previous centuries. It’s in the twenty-first century that the “stay in your lane” ethos has truly taken root in literary communities in the Western world, but even in this century some of the most impressive works have been by authors who have been following whatever path their imagination blazed rather than staying in their lane and writing about people such as themselves. I can’t think of two finer twenty-first century novels than Jo Baker’s Longbourn and Andre Alexis’s Fifteen Dogs. Baker—a university-educated woman who grew up in comfortable circumstances in the late twentieth century—has given us an extraordinarily persuasive fictional depiction of nineteenth-century servant life. Alexis—a black novelist whose family immigrated to Canada from Trinidad and Tobago when he was four—has given us an extraordinarily persuasive fictional depiction of non-human animals and their relationship to humans of any color (which along the way manages to provide wonderfully illuminating non-canine perspectives on life, love, and death). Authors such as these are emphatically not telling their own stories and “staying in their lane”; they are, above all, imagining the lives of others.**

Among twenty-first century dramatists, I can’t think of any writer more accomplished than Lynn Nottage; a black woman raised in comfortable middle-class circumstances, she has vividly brought to life the lives of working-class people—white as well as black, male at least as often as female.

In citing these examples, I don’t for a moment want to suggest that imagining worlds different from those one knows best is innately superior to bringing imaginative life to the world one does know best. A great deal of outstanding twenty-first century fiction writing is unquestionably by writers who have given imaginative life to characters from backgrounds similar to their own. (Examples that come immediately to mind include NoViolet Bulawayo’s We Need New Names, David Chariandy’s Soucouyant and Brother, Douglas Stuart’s Shuggie Bain, and Jesmyn Ward’s Salvage the Bones.) In no way do I want to disparage great writing of that sort; my aim is merely to point out that a great deal of fine writing has also come from writers who have not “stayed in their lane.” In the end, authors should surely be judged not on whether or not they have “stayed in their lane” and written about people like themselves, but on whether or not they have succeeded in creating an imaginative world (whether a realistic imaginative world or a fanciful one)—an imaginative world in which the characters feel believable, an imaginative world that engages readers’ attention, an imaginative world that leads readers to think, and to feel. Often that will be a world very like the one the author inhabits, but often it will be quite different.

I should make clear that, in writing Lucy and Bonbon, I did not set out to write a piece of fiction about a working-class mother and a privileged young black woman. I set out to write a novel about a child who is half human and half of another great ape species—and to explore what the life of such a person might say about how humans relate to other animals. My imagination then took me to a place where the characters Lucy and Ashley took shape. Looking back on it now, it seems natural that I would have been led, in writing about the prejudice Bonbon is subjected to on the basis of his biological background, to create characters who are subjected to other sorts of prejudice (whether it be on the grounds of class or race or gender). I hope that my imagination has been able to bring Lucy and Ashley to life successfully, just as I hope my imagination has been able to bring Bonbon successfully to life. And I think there’s a good chance that may indeed be the case—as I know it would certainly not have been the case had I tried to do the same when I was young, and had far less experience than I do now of a wide range of people and their circumstances. But that’s just me. Some authors are able even when young to write brilliantly and persuasively across gaps of gender and age and class and race and culture. Some are comfortable at any age only in writing about people like themselves, and do that brilliantly—whereas I can only imagine what a hash of it I would make if I tried to write an autobiographical novel. We’re all different, in short—and that’s a good thing.

Let me give the last word to Henry Louis Gates, whose fine short essay on this topic (based on an address he gave at a PEN America gathering) appeared last October in The New York Times Magazine: “Whenever we treat an identity as something to be fenced off from those of another identity, we sell short the human imagination. … Social identities can connect us in multiple and overlapping ways; they are not protected but betrayed when we turn them into silos with sentries.”

*This seems to me to be particularly the case when it comes to representing working-class points of view. There are precious few working-class fiction writers being published—indeed, there may be fewer being published today than there were in the mid-twentieth century, when issues of class were on more editors’ radar screens than were issues of race or gender or sexual orientation.
**It is so often expected that novels by black authors will focus largely on race that I feel I should perhaps provide a gloss here; in asserting that Fifteen Dogs is not an example of an author telling his own story, I do not mean to suggest that there is nothing in the book to do with race. Interestingly, race is not a significant issue with the human characters in the novel; we are never even told the skin color of Nira, Miguel, or other humans. (Worth mentioning, however, are asides such as the interesting reference to skin color in this description of a bathroom, as observed by Benjy, one of the dogs:
And then there was the room where the humans bathed and applied chemicals to themselves. The bathroom was fascinating, it being astonishing to watch the already pale beings applying creams to make themselves paler still. Was there something about white that bought status? If so, what was the point of drawing black circles around their eyes or red ones around their mouths?)
It is with the dogs themselves that serious issues of color briefly arise in the story, as Alpha dog Atticus declares that “the black dog” is “not one of us,” and his ally Max opines that, in that case, “it would be better to kill him.”

Open Book Interview about Lucy and Bonbon

It's been too long since I posted on this blog; with Lucy and Bonbon soon to be published, I'll try to post several times over the coming weeks.

I was interviewed about the novel recently by Open Book; that interview is now posted on the Open Book website. Why did I dedicate Lucy and Bonbon to a parrot? It's all in the interview.

Sunday, March 7, 2021

The Case for Individual Reparations: The Privileged Need to Do More than “Stand in Solidarity”

What should those of us who are privileged do about the unfairnesses of the world?

At a minimum, we can and should offer political support to those who are trying through political means to improve things; we can do that through voting, and we can ourselves devote time and donate money to political causes. We can write letters to the editor, we can join demonstrations, we can speak up on social media.

But political action may or may not ever bring results. (Some of us have been arguing for a guaranteed annual income--aka basic income, or universal basic income--for fifty years, and there’s no end in sight.) Time and money can also be spent in ways that don’t depend for their success on one’s cause being taken on by a party in power, or coming out on top in a referendum. Can and should we act in other ways to help bring about change?

Yes, is the short answer. We can give both our time and our money. Perhaps we can volunteer as a primary school teacher in a remote Indigenous community, for example, as one friend of mine did for several years. Or volunteer many hours in order to help a disadvantaged youth through difficult times--as my son Dominic did for a number of years. Or volunteer as a nurse in an out-of-the-way sub-Saharan community, in a hospital where there are so few beds that the patients often have to sleep outside, as my daughter Naomi did for several months not long ago. I did something of the sort myself for three years when I was young (volunteering through an aid agency to teach at a high school in rural Zimbabwe), but at this point in my life doing anything of that sort again probably isn’t realistic. To volunteer for a few hours a week as I start to approach retirement, on the other hand (whether at a nearby farm sanctuary, at our local food bank, at a local homeless shelter, at our local literacy center--there are so many good candidates!) certainly is realistic—and certainly it’s realistic for someone in my position (with an income of over $70,000 a year, and no mortgage or other debt load) to commit to donating somewhere between 5% and 10% of my annual income to appropriate charities.

I want as well to suggest one other form of giving that seems to me to be appropriate—wealth-related individual reparations payments.

What is the case for making reparations payments? For going beyond ordinary charitable giving? To understand that case, we privileged folk need to understand the ways in which North America’s legacy of racism and oppression has conferred benefits on us. It behooves us to learn the broad strokes of history—and it behooves us as well to ask questions about the histories of our own families. If the stories my mother told me are correct, one ancestral connection of ours built the beginnings of his fortune by acquiring “unowned” land on the Canadian prairie—land that he knew would be in the path of a transcontinental railroad. When he sold that land to the railroad he profited immensely and directly, in other words, by taking land that had been occupied by the Indigenous peoples of the plains. His wife was an aunt of my mother’s, and eventually my grandmother benefitted considerably from the largesse of his son. Further back in time, that side of the family also benefitted directly from slavery; some of my ancestors are recorded as having been owners of enslaved people in New York State in the early nineteenth century (slavery was not abolished in the state until 1827). On my father’s side, my great grandfather, emigrating from Ireland, is said to have spent the 1840s in New Orleans before he moved to Canada; though he may not have himself been an enslaver, it is unimaginable that a white person in that city at that time would not have benefitted directly during that decade from the labor of enslaved people.

None of this resulted in my family becoming fabulously rich—but there can be no question that the degree to which I’m now modestly well-off results at least in part from a direct legacy of oppression. And, quite aside from the degree to which I’m able to trace direct personal connections of this sort, of course, I have inevitably benefitted substantially from our society’s collective theft of the continent’s land from its Indigenous peoples, as well as from numerous other forms of collective oppression (oppression of Black people; oppression of Chinese, Japanese, and other Asian people; for many generations oppression of Quebecois and other French Canadians—the list goes on). We privileged whites whose families came generations ago from Europe have all benefitted from such collective theft and oppression—and many who have arrived more recently to join the privileged classes (not all white, by any means) have also benefitted from it to a considerable extent.

The disadvantaged are disadvantaged in many ways, but disparities in wealth are perhaps the most egregious. Whereas privileged white North Americans (and others of privilege) have more often than not been able to pass wealth on to their children, generation after generation, Black people, Indigenous people and others who have been disadvantaged have been particularly heavily disadvantaged in terms of wealth. For the most part shut out from the sorts of well-paid employment opportunities that help to build savings, they have too often also been prevented from acquiring real estate wealth; even where redlining and other discriminatory laws have not been in effect, racist covenants and unspoken understandings have often been just as effective in keeping wealth out of the hands of the disadvantaged. Privileged whites such as myself, then, who have benefitted from differential treatment through the educational system and through the law enforcement and judicial systems, have also benefitted from favorable economic treatment—higher pay, on average, but also much greater opportunities for building wealth.

That’s why it’s not enough for us to say we “stand together” with demonstrators protesting against the treatment meted out to George Floyd, or Neil Stonechild, or so many others. We have an obligation to act in tangible ways to level the playing field, and to make amends.

Until I read Ta-Nehisi Coates’ now-classic 2014 article “The Case for Reparations,” I hadn’t given much thought to the idea that beneficiaries of slavery and other forms of exploitation should pay reparations to the victims and their descendants. When I read Coates’ essay I was immediately persuaded of the merits of reparations paid through governments.

Government-funded reparations seem for the moment to be politically impossible in the United States. In Britain too—where Amandla Thomas-Johnson has made a persuasive case for reparations in the Guardian—government-sponsored reparations are clearly for the moment politically impossible. In Canada, the government has paid several billion dollars in reparation payments to the survivors of the residential school system, but there has been little or no thought given to the possibility of paying reparations to Indigenous people in consideration of the larger history of oppression. Nor has thought been given to government-paid reparations for slavery—yes, it existed here as well. Government-funded reparations, then, seem unlikely to happen anywhere anytime soon.

But does that mean that nothing can happen right now to move reparations forward? Not at all. We can act as individuals to make a contribution. As Michael Eric Dyson and others have pointed out, individuals can keep their own “individual reparations” accounts by making appropriate donations—over and above whatever charitable donations we make ordinarily.

Given the importance of wealth-related disparities, it is perhaps especially appropriate to think of individual reparations payments in the context of wealth. Moments when we are fortunate enough to see our wealth increase are, it seems to me, appropriate moments to give particular thought to sharing that wealth.

I first put this idea into practice in 2018. Ten years earlier I had bought a small house in New Orleans, thinking I’d one day live in the little back unit for at least part of the year. I rented both units out, and the years went by. By 2017 my partner and I had become quite happy on Vancouver Island—a very long way from New Orleans. When I finally sold the New Orleans house, it had appreciated a fair bit in value. On reflection it seemed to me that about a quarter of the capital gain was an amount I felt comfortable paying in reparations; I sent that sum to a non-profit dedicated to increasing educational opportunities for African Americans.

Over the past year my investments on the stock market resulted in a substantial gain. Last Friday I sold the stocks I owned—and decided to devote roughly a quarter of the gain to a charity that focuses on improving the lives of Indigenous schoolchildren.

Is one quarter of any increase in wealth the most appropriate amount? Some might plausibly argue that a higher percentage would be more appropriate--and in the other direction some might well feel that even a quarter of such amounts would be more than they could afford to give. But regardless of the precise amount, it seems to me difficult to argue in principle against making such contributions. I certainly expect to make more payments of this sort in the future; I hope others who are similarly privileged will consider doing the same.

I should emphasize that arguments about making voluntary reparations should apply only to those with the means to do so (many people have of course never been privileged recipients of a capital gain from any source). And individual reparations shouldn't preclude reparations payments by corporations and other organizations—let alone a more general plan of reparations through government action. Far from it. But for the moment, individual reparations are much better than nothing—and privileged white folks like me who have the means to take such action shouldn't hesitate. If you’re in any doubt as to why, I urge you to read Coates’ extraordinary article.
NB Parts of the above first appeared in an earlier blogpost on this topic: "The Case for Individual Reparations," January 12, 2019.

Friday, March 5, 2021

The Language of Genocide

The word “genocide” is a difficult case when it comes to defining and classifying. Since the term was coined in 1944, the way in which it has been most widely used is to refer (as the Oxford English Dictionary puts it) to “the deliberate and systematic extermination of an ethnic or national group.” The United Nations, though, adopted a more elaborate definition in its 1948 Convention on the Prevention and Punishment of the Crime of Genocide. As a result of negotiations involving many nations, it defined “genocide” as
… any of the following acts committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group, as such: killing members of the group; causing serious bodily or mental harm to members of the group; deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part; imposing measures intended to prevent births within the group; forcibly transferring children of the group to another group.
In some respects, the UN definition seems vague and far-reaching; exactly what should the phrase “serious mental harm” be taken to refer to? (This phrase was added at the suggestion of China, which had in mind the use of narcotics to alter the mental state of large populations.) In other respects, though, the UN definition is not as far-reaching as many would have liked it to be; most notably, the UN membership decided, after much debate, not to include any mention of cultural genocide.

What exactly is “cultural genocide”? The Canadian case is relatively clear. In 1879 John A. Macdonald declared that “Indian children should be withdrawn as much as possible from the parental influence, and the only way to do that would be to put them in central training industrial schools where they will acquire the habits and modes of thought of white men.” Macdonald openly desired, in other words, to destroy Indigenous culture—to commit, as we have come to term it, cultural genocide.

Under Justin Trudeau the Canadian government has balked at implementing many of the recommendations of the National Inquiry into Missing and Murdered Indigenous Women and Girls—but it has accepted the inquiry’s conclusion that the Canadian record of mistreatment of Indigenous peoples constituted a form of genocide. “This was genocide,” declared Trudeau in 2019 (thereby unwittingly making things far more difficult for himself on the China file in 2021). Many have disagreed on this point both with Trudeau and with that inquiry. They have argued that, however badly Canada has treated its Indigenous peoples—and all agree that the record has often been appalling—the country’s record is not commensurate with the practice of physical genocide (with Nazi Germany’s extermination of six million Jews, for example, or with the extermination by the Hutu majority in Rwanda of close to a million of the minority Tutsi group in 1994).
The critics argue that we should acknowledge the difference between cultural and physical genocide—and that we should not use the plain term “genocide” when we are speaking of cultural genocide.

But if Canada has committed cultural genocide, has it not, by definition, committed genocide? Surely cultural genocide must be a form of genocide—just as domestic violence is a form of violence, just as religious freedom is a form of freedom. Don’t the very words make this clear?

No, is the short answer. Domestic violence is indeed a form of violence—a subset, if you will, of the broad category “violence.” But let’s look at some other examples of grammatical compounds. Is “political suicide” a form of suicide—a subset of the broad category “suicide”? Not at all; that’s a compound that involves a metaphorical use of the noun “suicide.” What about “online sex”? Is that a form of sex? People have argued both sides of that one.

The point with compounds is that the relationship between the elements that make them up does not follow a single pattern. Even if it’s agreed that Canada has committed cultural genocide, it does not automatically follow by any self-evident rules of English grammar that we have committed genocide. In a case such as this—as in many others—definition and classification turn out to be anything but straightforward.

That leaves plenty of room for discussion and disagreement over the words we use. But one point above all should not be lost sight of in such discussions—the importance of taking substantial action now both to atone for past injustices (however we name them) and to bring real improvement in the present to the lives of Indigenous peoples. I'll post on that topic shortly.

Tuesday, August 25, 2020

Eating for a Greener Planet

I've been very quiet on the blog the past few months--largely because a lot of my spare time has been taken up with working as part of a group trying to bring about change within the Green Party of Canada. I can now report that five of the nine candidates running for the party leadership (Judy Green, Meryam Haddad, Amita Kuttner, Dimitri Lascaris, and David Merner) have declared their support for a new approach to animal agriculture—phasing out subsidies to animal agriculture (and increasing support for plant-based alternatives), as well as including animal agriculture GHG emissions under the provisions of the Greenhouse Gas Pollution Pricing Act. In other words, there’s now a very good chance that we can elect a party leader who will make the Green Party of Canada the country’s first major party committed to taking seriously the harms caused by animal agriculture—and by eating animals.

Anyone who would like to make that happen can help by joining the Green Party of Canada before 11:59 pm on September 3, and taking part in the voting later that month. Membership in the Green Party of Canada costs just $10; any Canadian or permanent resident aged 14 or older can join.

I hope you’ll join us! I’ve just posted more information on this website.

Wednesday, April 1, 2020

Rhyme of New Orleans

Maureen and I were thinking the other night that there are too few poems written that have something to them of joy. As you may have experienced recently, there's been a move in some circles to circulate poems--poems that people have found engaging and affecting, particularly in a joyous or uplifting sort of way. There is of course no shortage of engaging and affecting poetry in the world, but E.E. Cummings and a few others excepted, there has been a dearth of poets who frequently write poetry expressive of joy--whether in some pure form or admixed with other emotions.

One such poem that comes to mind is Alice Oswald's "Wedding"; another is Carol Ann Duffy's "John Barleycorn." The latter is a poem very largely composed of the names of English pubs; re-reading it the other day (before I passed it along as a reading recommendation to one of these "Poem Exchange" groups) made me think that one could perhaps do something of the same sort with the names of bars in New Orleans. The poem below is the result of my effort to do just that (drawing on some of the material I'd included a few years back in a poem for Maureen). It didn't end up with as many bar-names as I'd thought it would, and the joy is certainly admixed with a few other emotions, but joy there is. For what it's worth:

Rhyme of New Orleans

New Orleans don’t rhyme with beans, or with means

That’s what they’ll tell you, uptown or downstream:

New Orleans don’t rhyme with beans.

But Satchmo did it—you know what it means...:

When it’s music it all becomes different, it seems,

The notes and the words flow like water, like dreams—

Like the dark and the deep of the river’s wide dreams

As it curls in the sparkle of night through New Orleans.

Rhyme New Orleans. Rhyme New Orleans and the music begins,

With full rhymes, fat rhymes, light rhymes, slant rhymes,

With high notes, low notes, bank notes. Light sins.

Rhyme thick air. Rhyme black and white and good times,

Rhyme Abita and amber, and rhyme good health,

Rhyme night and stomping, rhyme black and blue,

Rhyme like the river, turned back on itself

And stretching, aching, thrusting round, surging through,

And once or twice a lifetime, swamping

The city that once was the place where they sold the enslaved,

City of graves, city of cotton,

Time stretched, time lost, but nothing forgotten,

'Cept some days let’s pretend last night never happened.

The night is warm, the beer is cool,

There’s jazz, there’s blues, there’s someone rappin,

DBA, Hi Ho, they’re passing the hat,

Vaughan’s, the Mother-in-Law, Spotted Cat,

Blue Nile, Maple Leaf, the Candlelight Lounge,

Spare a dollar? I tell ya, I just gotta scrounge

A few bucks, buy a coffee, a meal;

I can tell ya, no one here really wants to steal.

Tipitina’s, Lost Love, the Friendly Bar--

All open late, and the door’s ajar:

That’s Chris Kohl’s clarinet, smooth as a knife;

Eight to the bar, hold a note like forever. Like life.

That tune? You can’t lose it; Time? You can’t choose it.

It’s time like the always and never of music,

Of everything music, of mockingbird music,

Like the always and never of

Living, of loving. Of love.

Wednesday, March 11, 2020

Cross-Cultural Scholarship: A Cautionary Tale

My original intention was to call my first book The Birth of Expectation: A Cognitive Revolution in Western Culture. But when I told the publishers, Macmillan, of my tentative plans for a follow-up volume they decided they wanted a grander title; when it came out in 1989 the book had acquired a definite rather than an indefinite article at the start, and bore the title The Cognitive Revolution in Western Culture, volume 1: The Birth of Expectation. In a quiet way it’s received a good deal of attention over the years—though, in recent years, not of a sort that makes me at all happy.

The monograph originated in the Master's thesis I had completed at Sussex more than a decade earlier (working under the brilliantly wide-ranging scholar and wonderfully warm-hearted human being A.D. Nuttall), the central insight of which had been that there was a yawning divide between the plotting techniques employed by Shakespeare and Marlowe and the plotting techniques employed by their medieval (and even their immediate Tudor) predecessors. Shakespeare and Marlowe crafted plots that facilitate the formation of expectations in the minds of the audience members; typically, characters’ intentions are revealed before they act on those intentions. That became the most common template for plots through to our own day; audiences (or readers, or viewers) are provided with raw material that encourages them to form expectations of what is likely to happen as the action moves forward. But almost all plots from the medieval period operate on a very different basis; one thing happens, and then another thing happens, and then another thing happens after that, without our being given information that would allow us to form expectations as to what is likely to happen.

The question I was left with when I had completed the short Master’s thesis was why. Why would such a drastic change in dramatic plots have occurred? And the conclusion I eventually came to was that most people in pre-Shakespearian England had not developed the sorts of temporal and causal thought processes that, for educated individuals in technological societies, have become sufficiently ingrained to make the formation of expectations as to what is likely to happen a deep-rooted habit.

It was a conclusion reached largely through a comparison of the evidence from medieval English (and medieval European) culture with evidence from a wide range of other pre-literate societies. But it was a conclusion carefully qualified in a number of ways. First, I made clear that the generalizations I was making concerned the great majority but not every individual; I was not suggesting that highly educated individuals such as Thomas Aquinas or Geoffrey Chaucer had not developed the ability to form expectations of this sort. Second, I made clear that the differences I was postulating were the result of environmental factors and subject to change; they were not innate. “Indeed,” I suggested, “it seems self-evident that a baby born into [San society on the Kalahari] but brought up from infancy and educated in Toronto will grow up with modern Western habits of thought, and that the reverse is also true.”

Though my focus in looking at developed societies was on those of the Shakespearian and post-Shakespearian Western cultures, I in no way suggested that non-Western literate cultures had not developed causal and temporal cognitive habits of very much the same sort as those developed in the literate West. I said very little about non-Western literate cultures, either in earlier eras or today. Then as now, it seemed obvious to me that even a glance in the direction of the history of China should be enough to make clear that, over much of the past 2,000 years and more, Chinese culture evidenced causal and temporal thought processes at least as sophisticated as those found anywhere in the West at the same time.

It seemed clear to me too that many “developing” countries were indeed developing not just economically, but also in terms of people developing more complex patterns of temporal and causal thought. Of Zimbabwe, for example—a country where I lived for three years in the early 1980s, teaching at a rural high school—I observed that generalizations about causal and temporal thought processes which “still hold for the bulk of the rural population, most of whom are untravelled and (despite the massive developments in the years since Independence) only semi-educated, are manifestly untrue of the growing number of Zimbabweans who are possessed not only of a high level of formal education but also what one can only refer to as urban sophistication.”

Most important of all, I repeatedly made clear that my argument should not be taken to suggest an over-arching superiority of any sophisticated culture over any less sophisticated one. Indeed, my contention was that even where most people in pre-literate cultures may tend to think in quite different—and even less logical—ways than most people in more highly educated and sophisticated cultures, they may still be equal or superior to more “developed” peoples in spheres such as the moral and the aesthetic. I also suggested that pre-literate cultures often possess a poetic vitality that has been largely lost in the developed West, and argued strenuously that a highly educated society in which the majority of people possess highly complex temporal or causal thought processes is no more likely to be a wise or a morally good society than is the most undeveloped of pre-literate societies.

The one thing I regret about the way I expressed the argument of The Birth of Expectation: A Cognitive Revolution in Western Culture (as I much prefer to call the book) is that, instead of referring to “pre-literate societies” or “elemental ways of thought” I used a term that, although controversial, was in the 1980s still fairly common in reputable scholarly discourse; I referred to “primitive societies” and “primitive ways of thought.” I soon realized that I had made a mistake. The 1995 paperback edition of the book includes the following note, which bears repeating:
I have become convinced that the frequent use in the text of the word “primitive” was ill-advised. In the book’s first long note and at many other points I am at pains to point out that I see the proper use of the term as being purely descriptive (to mean “original; primary”) rather than pejorative. But as others have now persuaded me, … one doesn’t get to make the language; once a word such as “primitive” has been corrupted by prolonged pejorative use, it may not be enough to argue that it should not carry negative connotations.
The text [of the paperback edition] has not been reset, and the troublesome word thus remains. (Nor am I sure of the best substitute; I suspect “elemental” might serve better than any other.) But at least the paperback may carry a prominently placed apologia for my having used a word that I should have recognized carried with it the risk of tainting for many people the thesis of the entire book, and of allowing it to be suspected of perpetrating the very sorts of preconceptions that it was written largely to challenge.
As it happened, the book was not attacked for its use of the word "primitive," or for its thesis; the scholarly reviews were not numerous but they were on the whole very favorable.

Nevertheless, I am deeply saddened by its reception. It has had virtually no impact in the field of serious literary studies, and I make no complaint about that; it’s the fate of most scholarly monographs, particularly when they put forward arguments that go nowhere near the currents of a discipline’s main stream. What saddens me is where the book has made an impact. Despite all the disclaimers, despite all the careful qualifiers, the book has been cited again and again by those whose goal is to paint the West as superior to the rest.

Typical is an article by Ricardo Duchesne, posted on the “Counter-Currents” website and entitled “Jean Piaget and the Superior Psychogenetic Cognition of Europeans.” Counter-Currents Publishing—a self-declared voice of “the North American New Right”—dedicates itself to principles such as this: “We live in a Dark Age, in which decadence reigns and all natural and healthy values are inverted.” It declares that it “aims to promote the survival of essential ideas and texts into Golden Age to come.” Those “essential” ideas, it is made very clear, are European ideas—evidently code for “white people’s ideas,” given that white North Americans are surely included in the "North American New Right."

Duchesne tries to use my work to support what to me are entirely misguided claims for the supposed “superior intellectual powers and superior creative impulses” of Western culture. He refers to “the uniqueness of the West,” “the higher fluidity of the Western mind, the multiple intelligences of Europeans”—repeatedly suggesting that “Europeans” are innately superior to other peoples.

Duchesne does acknowledge at one point in his discussion that “LePan carefully distances himself from any claim that Europeans were genetically wired for higher levels of cognition.” But he suggests that the sorts of evidence I present can and should lead one to such a conclusion. More, he ascribes to writers of my ilk a fear of confronting the truths my work supposedly points to:
The uniqueness of the West frightens academics. They have concocted every imaginable explanation to avoid coming to terms with the fact that Europeans could not have produced so many transformations, innovations, renaissances, original thinkers, and the entire modern world, without having superior intellectual powers and superior creative impulses.

To draw any such conclusions about the world’s various groups of humans is to my mind not only wrong in point of fact; it is also morally repugnant.
It is telling that Duchesne dismisses the final chapter of The Birth of Expectation (“Postscript: Zimbabwe, 1995”—to my mind perhaps the best-written part of the book) as simply “strange.” He observes archly that “LePan praises the cultural ‘vitality’ of this African country,” as if that were all I praised about Zimbabwean culture—and as if any praise at all for an “African country” were to be wondered at.

Anyone who knows anything of the culture of Zimbabwe—the engaging and intelligent fiction of such writers as Charles Mungoshi, Tsitsi Dangarembga, and NoViolet Bulawayo, for example, or the unforgettable music of such songwriters and musicians as Thomas Mapfumo, Oliver Mtukudzi, and Leonard Zhakata—will be aware of its rich complexity and its wide-ranging intelligence as well as its vitality. But writers such as Duchesne evidently have no interest in non-European cultures in and of themselves; such cultures seem to be of interest only insofar as information about them can be twisted so as to suggest that they are inferior to “European” cultures.

It is in the Postscript that I argue most powerfully that “the Shakespearean moment” in our own culture occurred when new cognitive processes were emerging among the majority of the population, but the poetic vitality of pre-literate culture was also still very much alive in the mainstream of society. Given the degree to which such vitality has now been blunted by the post-Renaissance emphasis on rationalism, I argue that another such moment has become impossible in our own society—and that “if a new Shakespeare is to emerge,” it is far more likely to be “from the valleys of the Niger or the Zambezi than the skyscrapers of New York or London.” I believed that then; I still do.

Saturday, March 7, 2020

Ag Gag Laws Continue to Spread Across the US—and Now They’ve Come to Canada

In many areas of the world trespassing is a relatively minor offence under the law, and offenders are liable to relatively minor punishments. The maximum fine for a first offence in the province of Alberta, for example, was until recently $2,000; for a second offence the maximum was $10,000. (Penalties for trespassing are typically the same regardless of whether the premises are a private individual’s home and yard, a business owner’s warehouse and parking lot, or a farmer’s fields and farm buildings.)

But under the provisions of a bill rushed through the Alberta legislature by Premier Jason Kenney’s Conservative government late last year, an individual in that province who has been found guilty of trespassing is now subject to a fine of up to $10,000 for a first offence—plus six months in jail. A second offence is now subject to a fine of up to $25,000, plus a further six months in jail. An organization involved in sponsoring or directing an act of trespass is subject to a fine of up to $200,000. If one is deemed to have gained access under “false pretenses” (for example, by falsely stating, when starting a job at a pig farm, that one has no intention of taking photographs of any animals being abused), one is subject to the same penalties.

Why prescribe such harsh punishments for such a minor offence? The key is in another part of the bill, where it is specified that such penalties apply even when property is not fenced off and no notices forbidding trespassing have been posted—if the offence occurs on farmland or “on land that is used for the raising of and maintenance of animals.”

Alberta’s Bill 27 is legislation of a sort familiar to many Americans as “ag gag” legislation—legislation intended to gag those who would inform the general public of what goes on behind the closed doors of the agricultural operations where the 10 billion or so mammals and birds killed every year in North America for human food live out their brief, unhappy lives.

As in many other North American jurisdictions, such operations in Alberta are in practice exempt from almost all provisions of animal cruelty legislation. Typically, such legislation prohibits only the causing of “unnecessary” pain, suffering or injury to an animal, and in many jurisdictions any practice is allowed if it can be classed as part of “generally accepted” practices of animal management or animal husbandry. Given that those have for decades included such practices as confining sows in crates so small that the animals can never turn around, and confining egg-laying hens to cages in which they each have no more than 67 square inches of living space, a phrase such as “generally accepted practices” leaves a lot of room for cruelty. But not enough room to satisfy the animal agriculture lobby. Over the past decade and more, undercover operations at animal agriculture facilities across North America have revealed horrific examples both of what constitutes “generally accepted practice” and of abuses that exceed anything that could possibly be described as “necessary cruelty” or “generally accepted practice.” As Camille Labchuk, executive director of the organization Animal Justice, has pointed out, “whistleblowing employees are often the only way the public has to monitor the conditions animals endure on modern farms.”

Alberta is not the only province that has moved to criminalize such whistleblowing; in the province of Ontario Premier Doug Ford’s Conservative government has just introduced similar legislation. In doing so, the two provinces are following in the footsteps of the many American states that have passed such legislation (in three of which the legislation has been ruled unconstitutional—legal battles continue elsewhere).

At stake is not only the treatment of non-human animals, important though that is; it is also freedom of speech. A society in which whistleblowers are prevented from drawing the attention of the public to horrific abuses is a society that gives license to the powerful to do anything they please.

But surely, many may say, property owners have a right to do as they please on their own property; should we not do everything we can to protect that right? Perhaps the best response to such arguments is to imagine a situation in which those being abused are not calves and piglets and chicks but puppies and kittens—or human children. In any such scenario it becomes clear that we instinctively feel private property rights to be far outweighed by those of the general public. The public in such cases has a right to know—and the public has as well a responsibility to do everything possible to stop the abuse. (That’s particularly the case given that agricultural operations enjoy the benefits of considerable government subsidies in almost every North American jurisdiction.) Yet governments in Alberta and Ontario—just like governments in many American states—are doing everything possible to keep the public from knowing what’s going on—and nothing whatsoever to stop the cruelty.

Animal activist and free speech groups are likely to challenge the constitutionality of the Canadian laws, just as such legislation has been challenged in America. But it is hard to keep up; though the issue receives little coverage in the mainstream media, in America the problem has been getting worse rather than better. Ag gag laws have been attempted in 28 states; they have now become law in at least 11. Even in Iowa, where the state ag gag law was ruled unconstitutional in January 2019, the legislature within two months managed to pass a new ag gag bill. And in some states with ag gag laws—notably, North Carolina—the legislation is worded so broadly as to deter whistleblowing in almost any industry in almost any context.

Today’s media give a good deal of attention to alleged threats to freedom of speech coming from progressives on university campuses; it is time more attention was paid to these far more serious threats to freedom of speech that are coming from industry and from government.

Tuesday, January 21, 2020

Baseball Hall of Fame Criteria

It's too seldom noticed that, of the six criteria to be considered when deciding who should be admitted to the baseball Hall of Fame, three have nothing to do with anything that can be measured by WAR (wins above replacement):
Voting shall be based upon the player's record, playing ability, integrity, sportsmanship, character, and contributions to the team(s) on which the player played.
Far from being unreasonable, keeping the likes of Curt Schilling out of the Hall of Fame is standing up for what's most important--in sports, and also in life.

Larry Walker, we celebrate your entry into baseball's Hall of Fame!