Category Archives: Notes

Just how extensively will jobs be lost to automation?

New Scientist recently interviewed Andrew McAfee, one of the authors of The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. McAfee and co-author Erik Brynjolfsson are experts in digital technologies and economics (their previous book was called Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy). A basic question raised in the interview (and addressed in the book) is whether advances in technology will greatly reduce the need for human labor, leaving a large segment of the population unemployed. Their answer, basically, is ‘yes.’

In the interview, McAfee describes three possible scenarios. One, the disruptions could be more or less temporary. Technology has been eliminating the need for human labor in various segments of the workforce for hundreds of years now. But technology also creates new jobs. If it can do that fast enough, workers can be retrained, avoiding extensive periods of high unemployment.

Two, there could be successive waves of automation that have a much larger impact than anything we’ve experienced in the past. Automated driving, for example, will eliminate jobs that require a human driver. While this is but one of many examples of automation, it is by no means insignificant. According to the latest US census estimates, the largest occupation category among men was truck driver, employing 3.2 million people. The point is, in this scenario it will be difficult for the economy to adjust simply by retraining workers.

Three, the need for labor could be dramatically reduced. This is the scenario McAfee believes is most likely. There will still be a need for those entrepreneurs who create and perfect even more automation, but the rest of us will not be needed.

Economic losers of the world unite! When work becomes robotic

Amazon Warehouse. Swansea, Wales
Back in 1776, just as the Industrial Revolution was getting under way, Adam Smith published The Wealth of Nations. There he observed that while the division of labor in a pin factory would greatly increase worker output, it would also make workers as “stupid and ignorant as it is possible for a human creature to become.”

And how are things today? An undercover worker in an Amazon warehouse describes his experience:

“We are machines, we are robots, we plug our scanner in, we’re holding it, but we might as well be plugging it into ourselves”, he said.

“We don’t think for ourselves, maybe they don’t trust us to think for ourselves as human beings, I don’t know.”

Mindless

Robert Skidelsky begins his review of Mindless: Why Smarter Machines Are Making Dumber Humans (by Simon Head) with a reminder of Adam Smith’s observation. Skidelsky is a scholar of Keynesian economics and the author (with his son Edward) of last year’s How Much is Enough?: Money and the Good Life. The review goes on to discuss the various ways in which smarter machines are indeed making us dumber humans. (emphasis added in the following quotations)

It’s no longer just assembly lines that are automated

In his latest book [Head] claims that computer programming is now applied to all the principal sectors of the manufacturing and service economy.

Do what you love: The obligation to find one’s true calling

My favorite chapter in Alain de Botton’s The Pleasures and Sorrows of Work is “Career Counselling.” Here he discusses the modern idea that work should make us happy, along with the assumption that work defines our identity and the belief that it is work that makes our existence meaningful.

De Botton arranged to observe a career counsellor, Robert Symons, as he interacted with his clients (after obtaining the clients’ permission). (You can get a sense of Symons, who is also a psychologist, from the title of his unpublished book: The Real Me: Career as an Act of Selfhood.) Here are some of de Botton’s observations. (emphasis added)

On missing one’s true calling

[Symons] remarked that the most common and unhelpful illusion plaguing those who came to see him was the idea that they ought somehow, in the normal course of events, to have intuited – long before they had finished their degrees, started families, bought houses and risen to the top of law firms – what they should properly be doing with their lives. They were tormented by a residual notion of having through some error or stupidity on their part missed out on their true ‘calling’.

Would we be better off if we took ourselves less seriously as selves?

Gary Gutting (G.G.), a philosophy professor at Notre Dame, has been publishing a series of interviews on religion in the New York Times “blog” The Stone, which features the writing of “contemporary philosophers and other thinkers on issues both timely and timeless.” Recently he interviewed Jay L. Garfield (J.G.) on the subject of Buddhism (Garfield is a philosopher, currently at Yale-NUS College in Singapore). What follows is the concluding question and answer in this fairly long and quite interesting interview.

G.G.: Won’t the fundamental denial of a self be hard to maintain in the face of the modern emphasis on individuality?

J.G.: I don’t think so. For one thing, note that the view that there is no substantial self has a history in the West as well, in the thought of Hume, and of Nietzsche. For another, note that many contemporary cognitive scientists and philosophers have either rejected the view that there is such a self, or have defended some variety of a minimalist conception of the self. So the doctrine isn’t as secure in the non-Buddhist world as one might think.

Saying the things it is not possible to say

Been thinking lately about my usual preoccupations. Vanity, death and the self. Healthism and neoliberalism. Blogging and social media. Feels like it’s time to blog a little differently.

Here’s a lovely passage from Rebecca Solnit’s recent memoir/essay/narrative The Faraway Nearby.

Writing is saying to no one and to everyone the things it is not possible to say to someone. Or rather writing is saying to the no one who may eventually be the reader those things one has no someone to whom to say them. Matters that are so subtle, so personal, so obscure, that I ordinarily can’t imagine saying them to the people to whom I’m closest. Every once in a while I try to say them aloud and find that what turns to mush in my mouth or falls short of their ears can be written down for total strangers. Said to total strangers in the silence of writing that is recuperated and heard in the solitude of reading. Is it the shared solitude of writing, is it that separately we all reside in a place deeper than society, even the society of two? Is it that the tongue fails where the fingers succeed, in telling truths so lengthy and nuanced that they are almost impossible aloud?


Death, Afterlife, and Immortality of Bodies and Data

In May of 2013 the journal The Information Society published a special issue called Death, Afterlife, and Immortality of Bodies and Data. I just discovered this, thanks to a post by the Centre for Medical Humanities (@mdiclhumanities). The post announced an upcoming research symposium on "the digital mediation of dying, death, mourning and personal legacy." The intent of the symposium: to discuss "how online connectivity is changing how, when and where we engage with death." When I followed a link to the Death Online Research site, I found the special issue, plus a fairly substantial bibliography of publications on the subject of death online.

What follows are the abstracts from that special issue. Note that all content is behind a paywall, with the exception of the article Beyond the Grave: Facebook as a Site for the Expansion of Death and Mourning. I have added the emphasis that appears in the abstracts (out of consideration for those who have “only so much time”).

Introduction

Introduction to the Special Issue on the Death, Afterlife, and Immortality of Bodies and Data

Connor Graham, Martin Gibbs & Lanfranco Aceti

This special issue poses questions concerning death, afterlife and immortality in the age of the Internet. It extends previous work by examining current and emerging practices of grieving and memorializing supported by new media. It suggests that people’s lives today are extended, prolonged, and ultimately transformed through the new circulations, repetitions, and recontextualizations on the Internet and other platforms. It also shows that publics are being formed and connected with in new ways, and new practices and rituals are emerging, as the traditional notions of the body are being challenged. We argue that these developments have implications for how people will be discovered and conceived of in the future. We consider possible extensions to the research presented here in terms of people, practices, and data. First, some sections of the population, in particular those who are the dying and populations in developing countries and the Global South, have largely been neglected to date. Second, practices such as (online) suicide and sacrilegious or profane behaviors remain largely uninvestigated. Third, the discussion of the management of the digital self after death has only begun. We conclude by posing further questions concerning the prospect of emerging cities of the dead.


Alive to death

Opening paragraph of Karl Ove Knausgaard’s My Struggle Book One:

For the heart, life is simple: it beats for as long as it can. Then it stops. Sooner or later, one day, this pounding action will cease of its own accord, and the blood will begin to run toward the body’s lowest point, where it will collect in a small pool, visible from outside as a dark, soft patch on ever whitening skin, as the temperature sinks, the limbs stiffen and the intestines drain. These changes in the first hours occur so slowly and take place with such inexorability that there is something almost ritualistic about them, as though life capitulates according to specific rules, a kind of gentleman’s agreement to which the representatives of death also adhere, inasmuch as they always wait until life has retreated before they launch their invasion of the new landscape. By which point, however, the invasion is irrevocable. The enormous hordes of bacteria that begin to infiltrate the body’s innards cannot be halted. Had they but tried a few hours earlier, they would have met with immediate resistance; however everything around them is quiet now, as they delve deeper and deeper into the moist darkness. They advance on the Havers Channels, the Crypts of Lieberkühn, the Isles of Langerhans. They proceed to Bowman’s Capsule in the Renes, Clark’s Column in the Spinalis, the black substance in the Mesencephalon. And they arrive at the heart. As yet, it is intact, but deprived of the activity to which end its whole construction has been designed, there is something strangely desolate about it, like a production plant that workers have been forced to flee in haste, or so it appears, the stationary vehicles shining yellow against the darkness of the forest, the huts deserted, a line of fully loaded cable-buckets stretching up the hillside.


Subjectivity: The journal

I discovered the journal Subjectivity thanks to a link on Dennis Fox’s Critical Psychology website. Here’s a brief, subjective self-description provided by the journal:

Subjectivity is an exciting and innovative transdisciplinary journal in the social sciences. Re-launched by Palgrave Macmillan in 2008, it examines the socio-political, cultural, historical and material processes, dynamics and structures of human experience.

Here’s a somewhat lengthier description from the same page, emphasizing "transdisciplinarity". (To distinguish transdisciplinarity from interdisciplinarity and multidisciplinarity, see here.) (Emphasis in the last sentence below is mine.)

Subjectivity has been an important concept for academic research as well as for intervening in social and political life since the 1960s and 1970s. The idea of subjectivity had a catalytic impact in changing the terms of the debate in the social sciences: in anthropology, geography, psychology, sociology, post colonial theory, gender studies, cultural and media studies, social theory as well as the humanities.

Subjectivity attempts to capture ongoing debates and activities and to foster a discourse on subjectivity which goes beyond traditional dichotomies between the various disciplines.

The journal aims at a re-prioritization of subjectivity as a primary category of social, cultural, psychological, historical and political analysis. It wishes to encourage a variety of transdisciplinary engagements with this topic in theory as well as empirical research, and, accordingly, to advance the potential of engagement with subjectivity/subjectivities as a locus of social change and a means of political intervention.

Can academic papers inspire social change?

That certainly strikes me as a worthy goal, but social change and political intervention are extremely difficult, no? I’m not sure they can be accomplished by disseminating academic articles. Perhaps it’s a start. Perhaps it’s a way for those who share that goal to find each other. But if we find each other, are we then doing nothing more than simply talking to each other (preaching to the choir, as the cliché has it)? And if we are talking to each other, shouldn’t we at least be doing that in public, like on Twitter, where Subjectivity does not appear to have a presence? (There is a Facebook page, but it contains only announcements, not discussion.)

The philosophical value of a no-self perspective

I went looking for interesting reading material on the Buddhist concept of no-self and found a book that sounded promising: Self, No Self?: Perspectives from Analytical, Phenomenological, and Indian Traditions. When I started reading it, however, my first impression was that the subject matter was over my head. Within the first few pages I was looking up the definitions of soteriological and diachronic (a word I’ve repeatedly looked up (diachronically), maybe now for the last time). A book that assumes I’m familiar with the distinction between thetic and non-thetic awareness – interesting as that may be – suggests I should be more philosophically informed before proceeding.

My interest was piqued, however, by a suggestion in the introductory chapter that the narrative self (the self as the author and central character in one’s life-story) might be – in effect – a cop-out. And that it is precisely the no-self philosophical view that allows us to see this. In fact, it appears that the no-self view occupies the rational high ground when it comes to conceptions of the self. Or so the editors of this collection of essays (Mark Siderits, Evan Thompson, and Dan Zahavi) would argue. So I decided to take it more slowly, try a little harder, and give the introductory chapter another read-through.

No self vs the narrative self

I had recently read Kenneth Gergen’s An Invitation to Social Construction, which included a discussion of the narrative self. Gergen advocates using a social construction approach to practical life problems, including its use by practitioners of narrative therapy. Narrative therapists, he writes, should help people “escape the imprisoning grasp of the dominant discourses of the culture, to create an ‘insurrection’ against injurious but prevailing assumptions.”

Learning in public

As mentioned in the last post, what follows is something I wrote over a year ago. I decided not to publish it then because … I don’t know, I guess because the subject made me uncomfortable. It asks the question: Is it OK for me to be a complete amateur in public? The answer when I originally wrote this was no, but I might want to do it anyway. The answer now is yes, let’s get on with it.

Sitting in lecture

How students learn best

Back in 1990, a physics professor at Harvard (Eric Mazur) noticed that his students were learning “next to nothing.” After studying physics for an entire semester, their erroneous conceptions of how the physical world actually worked were firmly intact. So one day Mazur tried an experiment. After several unsuccessful attempts to clarify a concept, he suggested to the class (of 150 students) that they discuss the matter among themselves. It worked. He reports: “within three minutes, they had figured it out.”