Blogging and my Twitter vacation


I wanted to write about my Twitter vacation because I wanted to think about the interaction between Twitter and my blogging activity (and I’m someone who needs to write in order to think). In an earlier post, I described how, after a few years of blogging, I’d come to think of my posts as falling into two categories. One I called hey-look-at-this posts — short, quick references to interesting things I’d recently read. The other type was longer and, ideally, was more reflective and — dare I say — substantive.

I felt that writing quick posts was distracting me from writing longer ones. It occurred to me (this was in 2011) that I could simply tweet the items that interested me rather than blog them. What I hadn’t anticipated was that, once I started tweeting frequently, I practically stopped blogging. After a few months, I felt a need to explain my absence and wrote a post called On sabbatical. I assume my decline in blogging was due to both the time I spent on Twitter and Twitter’s ability to satisfy my desire to communicate. Read more


My Twitter vacation


On December 22 I pinned a tweet to the top of my tweet list saying I was taking a vacation from Twitter. I didn’t speculate, publicly or privately, on how long this would last. It was an experiment. I linked to a nice post by Adam Brault (@adambrault) called I quit Twitter for a month and it completely changed my thinking about mostly everything. My own vacation was less earth-shaking, but it did get me thinking about my digital habits.

On January 8 I came across a review of a book whose author I follow on Twitter (A History of Lung Cancer: The Recalcitrant Disease by Carsten Timmermann). Since I wanted to share this with my history-of-medicine colleagues on Twitter, I broke my Twitter fast and unpinned my vacation tweet.

So it turns out I went seventeen days without Twitter. Subsequently I’ve found myself only gradually resuming my typical Twitter behavior. It’s too soon to say whether this vacation permanently altered either how active I am on Twitter or the nature of my tweets. That’s a definite possibility. The main reason I haven’t resumed my prior behavior is that seventeen days seems to have been enough time to acquire a new set of habits. Read more


Positive thinking as social control

I found the following video in a blog post written by Zoë Siobhan Baillie (@Zoe_Baillie). The audio is an excerpt from a speech by Barbara Ehrenreich, where she elaborates on ideas from her book Bright-sided: How the Relentless Promotion of Positive Thinking Has Undermined America.

Ehrenreich’s observations deserve to be understood and appreciated. Unfortunately, there’s a huge commercial market in positive thinking, so her insights face an uphill battle. Judging by the comments left on YouTube (e.g., “This seems like an incredibly shallow and nonsensical analysis”), the hill is quite steep. Read more


Death of philosophy greatly exaggerated

Sleep medicine categorizes different types of insomnia by the part of the sleep cycle that’s “troublesome.” For example, there’s difficulty falling asleep, difficulty staying asleep, and waking up too early. I can fall asleep in five minutes, but I wake up at four AM. I don’t consider this insomnia, but a normal expression of two-part or segmented sleep (also known as bimodal, bifurcated or divided sleep). This is the way we used to sleep before industrialization.

Four AM is when I listen to what used to be called “books on tape.” Recently I listened to a series of lectures by philosophy professor Lawrence Cahoone called The Modern Intellectual Tradition: From Descartes to Derrida. The presentation of this potentially difficult subject matter — Kant, Hegel, Heidegger, Wittgenstein — was excellent. Cahoone made philosophical ideas interesting and (relatively) easy to understand. The lectures kept me awake rather than putting me to sleep, and I came away thinking I’d be happy to spend the rest of my life reading nothing but philosophy. Read more


Technology and New Challenges for Privacy: Journal of Social Philosophy Special Issue


The good news: The new issue of the Journal of Social Philosophy is a special issue on “Technology and New Challenges for Privacy.” The less good news is that it’s entirely behind a paywall.

There are no abstracts per se, but the first page of each of the seven articles (including the introduction by editor Leslie P. Francis) is available. I used my snipping tool to capture the text that appears below. (Note that the emphasis has been added by me.)

What looks especially interesting here:

  • The use of large-scale sets of health data raises questions of social justice that are often obscured by the way they are framed. (Privacy, Confidentiality, and Justice)
  • Continuous surveillance can place individuals at risk of physical, economic, political, or other damage. Just being aware of how susceptible we are to objectification by anonymous watchers can feel belittling. (Continuous Surveillance of Persons with Disabilities: Conflicts and Compatibilities of Personal and Public Goods)
  • The interests aligned against privacy are often defined in terms of their larger social value, and the protection of privacy often has lower political priority than other social interests. (Privacy and the Integrity of Liberal Politics: The Case of Governmental Internet Searches)
  • Weighing the value and the harm of anonymity (The Ties That Blind: Conceptualizing Anonymity)

Read more


A self-indulgent account of my journey from ‘Vanity’ to the nature of the contemporary self

I started this blog because I was interested in understanding the history of the self. Based on the reading I’ve done so far, I can see that various academic disciplines (philosophy, sociology, critical and cultural psychology, cultural and intellectual history) have substantially different “explanations” for how and why the self has changed. I can identify explanations that seem compatible with my intuitive preconceptions (not the most objective approach, I know), but it’s clear that the underlying assumptions of any one explanation are open to legitimate criticisms — most of which I’m not even aware of.

As a result, I hesitate. I am a complete novice with respect to this particular subject matter, and my background in these disciplines is the result of an incomplete and haphazard self-education.

Nevertheless, I seem to have arrived at a strong preference for the explanations offered by Nikolas Rose. Many (though certainly not all) of Rose’s fundamental ideas are indebted to Foucault. I am not a student of Foucault, but — thanks to Rose — I have come to appreciate such concepts as problematization, governmentality, responsibilization (an extension by Rose of governmentality), normalization, techniques of the self, and the conduct of conduct.

I would much prefer to avoid using terms such as these in what I write. I appreciate the efficiency of communication that academic terminology provides, but unfortunately it limits one’s audience. The nature of the self is potentially of interest to anyone, not just to those who engage in academic investigation and debate. Fortunately, Rose writes with the intention of being understood by a broad audience. Read more


Just how extensively will jobs be lost to automation?

New Scientist recently interviewed Andrew McAfee, one of the authors of The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. McAfee and co-author Erik Brynjolfsson are experts in digital technologies and economics (their previous book was called Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy). A basic question raised in the interview (and addressed in the book) is whether advances in technology will greatly reduce the need for human labor, leaving a large segment of the population unemployed. Their answer, basically, is ‘yes.’

In the interview, McAfee describes three possible scenarios. One, the disruptions could be more or less temporary. Technology has been eliminating the need for human labor in various segments of the workforce for hundreds of years now. But technology also creates new jobs. If it can do that fast enough, workers can be retrained, avoiding extensive periods of high unemployment.

Two, there could be successive waves of automation that have a much larger impact than anything we’ve experienced in the past. Automated driving, for example, will eliminate jobs that require a human driver. While this is but one of many examples of automation, it is by no means insignificant. According to the latest US census estimates, the largest occupation category among men was truck driver, employing 3.2 million people. The point is, in this scenario it will be difficult for the economy to adjust simply by retraining workers.

Three, the need for labor could be dramatically reduced. This is the scenario McAfee believes is most likely. There will still be a need for those entrepreneurs who create and perfect even more automation, but the rest of us will not be needed. Read more


Work, leisure and the self

As I anticipate writing yet another post on technology and the robotizing of the workforce, it occurs to me that I should perhaps say something about what this has to do with the self and why this subject interests me.

One very obvious observation is that — for almost everyone — work (employment) not only occupies most of our waking hours, but is a major component of our identity. And identity, of course, is a significant aspect of the self. Being involuntarily unemployed, for example, has a negative impact on our sense of self. Being a successful career professional (doctor, lawyer, architect, academic, hedge fund manager), as opposed to working at McDonald’s for a minimum wage, is one of the ways we Americans segregate people into social classes. Social class — which relates to social status — affects how we feel about ourselves when we compare ourselves to others, as we inevitably do. (Social class is highly correlated with, but not identical to, social status. You can be from an elite social class, but if you murder your wife, your social status will decline.) I don’t think I need to say any more than this to make the case that work is relevant to issues of the self.

I’m interested in how our sense of self — how we regard ourselves — has changed over the course of the 20th century. Ever since I read Vanity: 21st Century Selves last year — where I was relentlessly confronted with what’s involved in being a self these days — I’ve been seeking explanations for how we ended up with the type of self we have today. I’d characterize this self as being so thoroughly psychologized that we can’t imagine not being preoccupied with (and this is the subject matter of Vanity) our self-esteem, our social status, the attractiveness of our bodies, the youthfulness of our appearance, and how many Facebook friends we have. Read more


Economic losers of the world unite! When work becomes robotic

Amazon Warehouse. Swansea, Wales
Back in 1776, in the early years of the Industrial Revolution, Adam Smith published The Wealth of Nations. There he observed that while the division of labor in a pin factory would greatly increase worker output, it would also make workers as “stupid and ignorant as it is possible to become.”

And how are things today? An undercover worker in an Amazon warehouse describes his experience:

“We are machines, we are robots, we plug our scanner in, we’re holding it, but we might as well be plugging it into ourselves”, he said.

“We don’t think for ourselves, maybe they don’t trust us to think for ourselves as human beings, I don’t know.”

Mindless

Robert Skidelsky begins his review of Mindless: Why Smarter Machines Are Making Dumber Humans (by Simon Head) with a reminder of Adam Smith’s observation. Skidelsky is a scholar of Keynesian economics and the author (with his son Edward) of last year’s How Much is Enough?: Money and the Good Life. The review goes on to discuss the various ways in which smarter machines are indeed making us dumber humans. (emphasis added in the following quotations)

It’s no longer just assembly lines that are automated

In his latest book [Head] claims that computer programming is now applied to all the principal sectors of the manufacturing and service economy. Read more


Do what you love: The obligation to find one’s true calling

My favorite chapter in Alain de Botton’s The Pleasures and Sorrows of Work is “Career Counselling.” Here he discusses the modern idea that work should make us happy, along with the assumption that work defines our identity and the belief that it is work that makes our existence meaningful.

De Botton arranged to observe a career counsellor, Robert Symons, as he interacted with his clients (after obtaining the clients’ permission). (You can get a sense of Symons, who is also a psychologist, from the title of his unpublished book: The Real Me: Career as an Act of Selfhood.) Here are some of de Botton’s observations. (emphasis added)

On missing one’s true calling

[Symons] remarked that the most common and unhelpful illusion plaguing those who came to see him was the idea that they ought somehow, in the normal course of events, to have intuited – long before they had finished their degrees, started families, bought houses and risen to the top of law firms – what they should properly be doing with their lives. They were tormented by a residual notion of having through some error or stupidity on their part missed out on their true ‘calling’. Read more


Would we be better off if we took ourselves less seriously as selves?

Gary Gutting (G.G.), a philosophy professor at Notre Dame, has been publishing a series of interviews on religion in the New York Times “blog” The Stone, which features the writing of “contemporary philosophers and other thinkers on issues both timely and timeless.” Recently he interviewed Jay L. Garfield (J.G.) on the subject of Buddhism (Garfield is a philosopher, currently at Yale-NUS College in Singapore). What follows is the concluding question and answer in this fairly long and quite interesting interview.

G.G.: Won’t the fundamental denial of a self be hard to maintain in the face of the modern emphasis on individuality?

J.G.: I don’t think so. For one thing, note that the view that there is no substantial self has a history in the West as well, in the thought of Hume, and of Nietzsche. For another, note that many contemporary cognitive scientists and philosophers have either rejected the view that there is such a self, or have defended some variety of a minimalist conception of the self. So the doctrine isn’t as secure in the non-Buddhist world as one might think. Read more


Saying the things it is not possible to say


Been thinking lately about my usual preoccupations. Vanity, death and the self. Healthism and neoliberalism. Blogging and social media. Feels like it’s time to blog a little differently.

Here’s a lovely passage from Rebecca Solnit’s recent memoir/essay/narrative The Faraway Nearby.

Writing is saying to no one and to everyone the things it is not possible to say to someone. Or rather writing is saying to the no one who may eventually be the reader those things one has no someone to whom to say them. Matters that are so subtle, so personal, so obscure, that I ordinarily can’t imagine saying them to the people to whom I’m closest. Every once in a while I try to say them aloud and find that what turns to mush in my mouth or falls short of their ears can be written down for total strangers. Said to total strangers in the silence of writing that is recuperated and heard in the solitude of reading. Is it the shared solitude of writing, is it that separately we all reside in a place deeper than society, even the society of two? Is it that the tongue fails where the fingers succeed, in telling truths so lengthy and nuanced that they are almost impossible aloud?

Read more


Death, Afterlife, and Immortality of Bodies and Data

In May of 2013 the journal The Information Society published a special issue called Death, Afterlife, and Immortality of Bodies and Data. I just discovered this, thanks to a post by the Centre for Medical Humanities (@mdiclhumanities). The post announced an upcoming research symposium on “the digital mediation of dying, death, mourning and personal legacy.” The intent of the symposium: to discuss “how online connectivity is changing how, when and where we engage with death.” When I followed a link to the Death Online Research site, I found the special issue, plus a fairly substantial bibliography of publications on the subject of death online.

What follows are the abstracts from that special issue. Note that all content is behind a paywall, with the exception of the article Beyond the Grave: Facebook as a Site for the Expansion of Death and Mourning. I have added the emphasis that appears in the abstracts (out of consideration for those who have “only so much time”).

Introduction

Introduction to the Special Issue on the Death, Afterlife, and Immortality of Bodies and Data

Connor Graham, Martin Gibbs & Lanfranco Aceti

This special issue poses questions concerning death, afterlife and immortality in the age of the Internet. It extends previous work by examining current and emerging practices of grieving and memorializing supported by new media. It suggests that people’s lives today are extended, prolonged, and ultimately transformed through the new circulations, repetitions, and recontextualizations on the Internet and other platforms. It also shows that publics are being formed and connected with in new ways, and new practices and rituals are emerging, as the traditional notions of the body are being challenged. We argue that these developments have implications for how people will be discovered and conceived of in the future. We consider possible extensions to the research presented here in terms of people, practices, and data. First, some sections of the population, in particular those who are the dying and populations in developing countries and the Global South, have largely been neglected to date. Second, practices such as (online) suicide and sacrilegious or profane behaviors remain largely uninvestigated. Third, the discussion of the management of the digital self after death has only begun. We conclude by posing further questions concerning the prospect of emerging cities of the dead.

Read more


Anti-intellectualism, pornography, and a communal sense of the sacred

I was just reading the introduction to the 2006 (fortieth-anniversary) edition of Philip Rieff’s The Triumph of the Therapeutic. 2006, as it happens, was the year that Rieff died (at age 83). The introduction was written by social/cultural/intellectual historian Elisabeth Lasch-Quinn and contained some passages I thought worth quoting.

Capitalism and the self

Rieff’s book is about the cultural transformation of the twentieth century — from widely held religious/communal values to the prominence of psychology as supreme arbiter of interests and values. According to Lasch-Quinn, Rieff sees a connection between this transformation and the “advances and excesses of capitalism, with its radically destructive gospel of greed.”

[H]e makes a clear link between modern wealth accumulation and the “symbolic impoverishment” of the therapeutic age. The wealthy attempt to compensate for the shortfall with money and its accoutrements, making both art and science into forms of self-analysis and self-worship.

Self-interest becomes “the only principle of action or judgment.”

The book’s implicit connection between consumerism and the cult of impulse release, the nihilism of which Rieff captures so persuasively, represents a searing indictment of the status quo, a clear condemnation of a society “technologically loaded with bribes.”

Nice phrase, nice insight that last bit, and so much more characteristic of society 47 years later. And “gospel of greed” turns out to be even more descriptive of the 1980s (“greed is good”) than the 1960s. In a preface to the 20th anniversary edition of his book (1987), Rieff remarks: “This book stands as it first appeared. To change the text of a ‘prophetic’ character would be to write another book.” Read more


Alive to death

Opening paragraph of Karl Ove Knausgaard’s My Struggle Book One:

For the heart, life is simple: it beats for as long as it can. Then it stops. Sooner or later, one day, this pounding action will cease of its own accord, and the blood will begin to run toward the body’s lowest point, where it will collect in a small pool, visible from outside as a dark, soft patch on ever whitening skin, as the temperature sinks, the limbs stiffen and the intestines drain. These changes in the first hours occur so slowly and take place with such inexorability that there is something almost ritualistic about them, as though life capitulates according to specific rules, a kind of gentleman’s agreement to which the representatives of death also adhere, inasmuch as they always wait until life has retreated before they launch their invasion of the new landscape. By which point, however, the invasion is irrevocable. The enormous hordes of bacteria that begin to infiltrate the body’s innards cannot be halted. Had they but tried a few hours earlier, they would have met with immediate resistance; however everything around them is quiet now, as they delve deeper and deeper into the moist darkness. They advance on the Havers Channels, the Crypts of Lieberkühn, the Isles of Langerhans. They proceed to Bowman’s Capsule in the Renes, Clark’s Column in the Spinalis, the black substance in the Mesencephalon. And they arrive at the heart. As yet, it is intact, but deprived of the activity to which end its whole construction has been designed, there is something strangely desolate about it, like a production plant that workers have been forced to flee in haste, or so it appears, the stationary vehicles shining yellow against the darkness of the forest, the huts deserted, a line of fully loaded cable-buckets stretching up the hillside.

Read more


Harvesting your intentions: Tumblr’s David Karp meets Zygmunt Bauman

Shortly after Yahoo confirmed its plans to purchase Tumblr for $1.1 billion, Charlie Rose interviewed 26-year-old Tumblr founder and CEO David Karp. Karp was wearing his signature gray hoodie. Rose sported a purple tie. I was struck by what Karp had to say about advertising.

Here’s the set-up:

Rose: What excites you the most? The building of the business or creating the product?

Karp: So, we have this … ah … look. The product is why I got into this. I have to tell you that the business end of this has become such an interesting, exciting, fun challenge for us, because we’ve got this thesis that we can build a business that not only does not compromise everything that is special about Tumblr – makes it such an incredible home for these incredibly talented people – but actually makes Tumblr a better place. In the same way that, you know, if you ripped all the ads out of Vogue, one, it would be half the magazine, but two, it would actually lose a lot of the great content. The way we’ve approached advertising doesn’t look anything like advertising across the rest of the Internet today. So much of …. There’s a lot of nuance here, but, you know, so much of …

Rose: But explain it to me,

Karp: Sure, sure, sure.

Rose: … because it’s the essence of what you’re trying to do.

But here’s where it gets really interesting (my emphasis added) – where Karp expands on what will make Tumblr a “better place.” Read more


Subjectivity: The journal

I discovered the journal Subjectivity thanks to a link on Dennis Fox’s Critical Psychology website. Here’s a brief, subjective self-description provided by the journal:

Subjectivity is an exciting and innovative transdisciplinary journal in the social sciences. Re-launched by Palgrave Macmillan in 2008, it examines the socio-political, cultural, historical and material processes, dynamics and structures of human experience.

Here’s a somewhat lengthier description from the same page, emphasizing “transdisciplinarity”. (To distinguish transdisciplinarity from interdisciplinarity and multidisciplinarity, see here.) (Emphasis in the last sentence below is mine.)

Subjectivity has been an important concept for academic research as well as for intervening in social and political life since the 1960s and 1970s. The idea of subjectivity had a catalytic impact in changing the terms of the debate in the social sciences: in anthropology, geography, psychology, sociology, post colonial theory, gender studies, cultural and media studies, social theory as well as the humanities.

Subjectivity attempts to capture ongoing debates and activities and to foster a discourse on subjectivity which goes beyond traditional dichotomies between the various disciplines.

The journal aims at a re-prioritization of subjectivity as a primary category of social, cultural, psychological, historical and political analysis. It wishes to encourage a variety of transdisciplinary engagements with this topic in theory as well as empirical research, and, accordingly, to advance the potential of engagement with subjectivity/subjectivities as a locus of social change and a means of political intervention.

Can academic papers inspire social change?

That certainly strikes me as a worthy goal, but social change and political intervention are extremely difficult, no? I’m not sure they can be accomplished by disseminating academic articles. Perhaps it’s a start. Perhaps it’s a way for those who share that goal to find each other. But if we find each other, are we then doing nothing more than simply talking to each other (preaching to the choir, as the cliché has it)? And if we are talking to each other, shouldn’t we at least be doing that in public, like on Twitter, where Subjectivity does not appear to have a presence? (There is a Facebook page, but it contains only announcements, not discussion.) Read more


And in conclusion …

When I decided to write some introductory posts that explained my personal interest in the subject matter of this blog, I didn’t anticipate that I would write 24 of them and not post them until I’d finished the last one. But once I started down that path, I decided to follow it to the end. This is the end.


So now I ask: What have I learned by writing these posts?

One thing I was a bit surprised to find was that, when I reread what I’d written over 30 years ago, I still identified strongly with that author. I still have the same questions, the same aspirations, the same intellectual and emotional responses to a certain set of ideas. Given that I lack a ‘motive force’ that tells me what to do next — which means I find myself moving on from one thing to the next without any obvious theme or reason — I would have expected more change in my life than continuity. It seems I’m still attracted to what the sociology of knowledge/social construction can tell me about everyday life and the taken-for-granted, which may mean I’m still trying to understand what they can tell me about my own life.

Critical psychology and the social determinants of health

I also hadn’t anticipated that I would find a strong connection between the social determinants of health – which I’d been writing about at The Health Culture — and critical psychology. Why is it so difficult to get the medical profession and public policy makers to acknowledge the importance of the social determinants of health and take appropriate action? I know that fundamentally there’s an economic explanation, but I don’t find that sufficient to explain why we care so little about inequality and social disparities. Read more


Bibliography 1.0: Can I escape the judgment of psychology?

I decided to make a list of the books I’ve recently read, browsed, or added to my reading list. This turned out to be a thought-provoking process. Although this may sound naïve, when I first imagined this blog, I didn’t anticipate that psychology would be such a major category in my bibliography. My main interest, after all, was the social and cultural history of the self. But of course the self is a subject of considerable interest to academic psychologists these days. The ‘psy’ disciplines – psychology, psychiatry, psychotherapy, psychoanalysis – have been incredibly influential in how we think of ourselves. That’s something I’m now beginning to appreciate more fully.

Recurring questions from my Chinese horoscope

The actual process of making the list was probably more valuable for me than the list itself. And the list may not be particularly valuable for anyone else, since I can’t recommend these books the way I recommended books on the history of self-help. That’s because I’m not sufficiently familiar with most of them. Plus, the categories turned out to be imprecise and unsatisfying: Should Jerrold Seigel’s The Idea of the Self: Thought and experience in Western Europe since the 17th century go under Self, Philosophy, or History? Read more


The history of self-help: Some books to read

I’ve wanted to read more about the history of self-help for years now. I’ve started Micki McGee’s Self-Help Inc. several times and always been distracted by something that seemed more pressing. I knew that if I read about self-help I would want to write about it, and I wasn’t quite sure I wanted to do that at The Health Culture.

I have written there about happiness and the positive psychology movement. I wrote several posts on a book I really enjoyed: Pascal Bruckner’s Perpetual Euphoria: On the Duty to Be Happy. Although I never got around to writing about it, I’ve repeatedly recommended a great article by William Davies called The Political Economy of Unhappiness. It’s about the responsibility of Britain’s National Health Service to keep workers happy, not for the benefit of employees, but to improve corporate efficiency. While these were not directly on the history of self-help, they were on the fringes.

Below I’ve compiled a list of books that I’ve either read, want to read, or want to refer to (even if they’re not worth a close reading). I’ve divided them into two parts. This first group contains books I feel confident recommending. Read more


Self-help from Norman Vincent Peale to the new Oprah

The quotation at the start of the last post — “[W]e are in a new era of mass self-help, wherein the laboratory and the writer work together to teach us how to change ourselves, rather than our world” — is from an excellent article in New York Magazine. Boris Kachka describes what self-help has become. Though he writes mainly about how self-help has changed the publishing industry, his analysis of how this relates to cultural history — the shift from pragmatism and self-reliance to being personally responsible for self-regulation — is spot on.

Kachka refers to a “new kind of self-help,” by which he means: “These days, self-help is unembarrassed, out of the bedside drawer and up on the coffee table, wholly transformed from a disreputable publishing category to a category killer, having remade most of nonfiction in its own inspirational image along the way.”

Here are some passages from the article that I particularly enjoyed (emphasis added):

This new kind of self-help could never thrive in a vacuum. Or rather, it thrives in a particular vacuum—the one left behind by the disappearance of certain public values that once fulfilled our lives. Strains of self-help culture — entrepreneurship, pragmatism, fierce self-reliance, gauzy spirituality — have been embedded in the national DNA since Poor Richard’s Almanack. But in the past there was always a countervailing force, an American stew of shame and pride and citizenship that kept these impulses walled off, sublimating private anxiety to the demands of an optimistic meritocracy. That force has gradually been weakened by the erosion of all sorts of structures, from the corporate career track to the extended family and the social safety net. Instead of regulation, we have that new buzzword, self-regulation; instead of an ambivalence over “selling out,” we have the millennial drive to “monetize”; and instead of seeking to build better institutions, we mine them in order to build better selves. Read more


Self-help as psychological healthism

Photo: Paul Ruscha/© Ed Ruscha/Courtesy of Ed Ruscha and Gagosian Gallery (“Me”, 2001)

[W]e are in a new era of mass self-help, wherein the laboratory and the writer work together to teach us how to change ourselves, rather than our world. (Boris Kachka)

I’m interested in self-help for the same reasons I’m interested in healthism. “Self-help is the psychiatric equivalent of healthism,” I once wrote. Healthism is an anxious preoccupation with one’s physical health, encouraged by those who profit financially from inducing anxiety. Self-help is an anxious preoccupation with one’s psychological self, encouraged by an abundance of self-help literature, personal seminars, and tell-all TV shows. (More fundamentally, of course, the proliferation of self-help advice is the result of a profound twentieth-century change in how we understand ourselves, which is the subject of this blog.)

Both healthism and self-help assume that individuals are ultimately responsible for their problems, whether medical or psychological. Personal responsibility relieves society of the expense and inconvenience of creating healthier, more equitable lives for its members. Robert Crawford pointed this out in 1980. That early glimmer of a potential trend has done nothing but escalate. Read more


Critical psychology – a new home?

Anthropology, sociology and history – disciplines that consider a variety of cultures, social conventions, and historical times – make a valuable contribution to understanding the self (along with the many other valuable contributions they make, of course). They force us to acknowledge that what is true for one specific culture, society, or historical time is not universally true.

Because psychology considers itself a science, with theories based on empirically validated findings, it lacks the benefit of the self-reflective qualities intrinsic to humanist disciplines. This leaves psychology open to criticism.

For example, there’s the charge that psychology has been guilty of assuming that what it observes in Western (North American and European) cultures must be true for other cultures, as well as for our ancestors. In an excellent article on this point, The Weirdest People in the World (PDF), the authors point out that those who live in Western societies differ psychologically from people in the rest of the world: WEIRD (Western, Educated, Industrialized, Rich, and Democratic) people are in fact quite exceptional compared to other cultures and to their ancestors. Americans, in particular, are so unusual that they stand out as “outliers among outliers.” (See also on this: Ethan Watters, We Aren’t the World.)


The joy of bibliographies

One of my graduate school professors, Derek J. de Solla Price (Little Science, Big Science), used to say that if you started with the references in the bibliography of one journal article, looked up each of those references, and followed their bibliographies to yet another set of articles, and so on – like the branches of a tree – you would eventually locate all the important publications in a given field.
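Price’s advice amounts, in effect, to a breadth-first walk of a citation graph. Purely as an illustration, and assuming a hypothetical get_references lookup that returns the works cited in an article’s bibliography (no such service is named here), a minimal Python sketch of the idea might look like this:

```python
from collections import deque

def chase_bibliographies(seed, get_references, max_depth=3):
    """Follow bibliographies outward from a seed article, breadth-first.

    get_references(article) is a hypothetical lookup (not a real service)
    that returns the works cited in that article's bibliography. Returns
    every work reachable within max_depth bibliographic "hops" of the seed.
    """
    found = {seed}
    queue = deque([(seed, 0)])
    while queue:
        article, depth = queue.popleft()
        if depth == max_depth:
            continue  # stop following bibliographies past this depth
        for cited in get_references(article):
            if cited not in found:
                found.add(cited)
                queue.append((cited, depth + 1))
    return found

# A toy citation "tree" standing in for real bibliographies.
toy = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"]}
print(sorted(chase_bibliographies("A", lambda a: toy.get(a, []))))
# ['A', 'B', 'C', 'D', 'E']
```

In Price’s terms, max_depth is simply how many branchings of the tree you are willing to follow before stopping.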

Bibliographies are often the first thing I read in a book. If an academic book has only footnotes and no bibliography, I’m disappointed. It’s so much easier to scan a bibliography for interesting titles than to ferret them out from wordy footnotes. An annotated bibliography is rare, but ideal.

On the Internet, Amazon has a feature called “Customers who viewed this item also viewed,” which is somewhat useful. Not that long ago Amazon used to have something much better: Titles that reference this book. It’s now gone, but you can accomplish the same thing these days using Google Books.

Mark Leary and the psychology of the self

It was a bibliography that recently started me reading about the psychology (as opposed to the history, philosophy, sociology or anthropology) of the self. A few months ago I saw a full-page magazine ad by The Teaching Company for a lecture series called Understanding the Mysteries of Human Behavior. The DVD version was available in my local library and, since I had some free time, I ended up watching all 24 of the half-hour lectures. Read more


The philosophical value of a no-self perspective

I went looking for interesting reading material on the Buddhist concept of no-self and found a book that sounded promising: Self, No Self?: Perspectives from Analytical, Phenomenological, and Indian Traditions. When I started reading it, however, my first impression was that the subject matter was over my head. Within the first few pages I was looking up the definitions of soteriological and diachronic (a word I’ve repeatedly looked up (diachronically), maybe now for the last time). A book that assumes I’m familiar with the distinction between thetic and non-thetic awareness – interesting as that may be – suggests I should be more philosophically informed before proceeding.

My interest was piqued, however, by a suggestion in the introductory chapter that the narrative self (the self as the author and central character in one’s life-story) might be – in effect – a cop out. And that it is precisely the no-self philosophical view that allows us to see this. In fact, it appears that the no-self view occupies the rational high ground when it comes to conceptions of the self. Or so the editors of this collection of essays (Mark Siderits, Evan Thompson, and Dan Zahavi) would argue. So I decided to take it more slowly, try a little harder, and give the introductory chapter another read through.

No self vs the narrative self

I had recently read Kenneth Gergen’s An Invitation to Social Construction, which included a discussion of the narrative self. Gergen advocates using a social construction approach to practical life problems, including its use by practitioners of narrative therapy. Narrative therapists, he writes, should help people “escape the imprisoning grasp of the dominant discourses of the culture, to create an ‘insurrection’ against injurious but prevailing assumptions.” Read more


Basic research on the self

The name of this blog, Basic research on the self, comes from an essay by Phillip Lopate. The essay is an introduction to his edited collection of essays, The Art of the Personal Essay. A number of qualities characterize the personal essay, according to Lopate. These include the author’s willingness to write in the first… Read more


Can we think outside our culture: My Chinese horoscope

I have been preoccupied with my Chinese horoscope for over a decade. One of its revelations touched on something I instinctively felt was true, but to this day I continue to resist believing it. Why is that? The story starts with my sense of being a little different, odd, abnormal (I so dislike that last… Read more


Philosophers ask: What do we mean by “self”

Psychology is a relatively recent discipline (late 19th century). With a few notable exceptions (William James, neo-Freudians, humanists), psychologists largely ignored the self until the late 20th century. Only with the decline of behaviorism and psychoanalysis did the self emerge as a topic worthy of consideration. Philosophy, on the other hand, has a long history… Read more
