Q&A

What Determines Whether We Tend to Use Singular or Plural First-Person Pronouns to Refer to Ourselves?

First-person pronouns are ubiquitous in human communication. We use them whenever we refer to ourselves—in conversations, letters, blogs, therapy sessions, and even in scientific articles. Moreover, they indicate whether we refer to ourselves as individuals or as parts of dyads or groups: Using first-person singular pronouns (e.g., I, me) highlights the self as a distinct entity, whereas using first-person plural pronouns (e.g., we, us) emphasizes its embeddedness in social relationships. However, we usually do not pay much attention to first-person pronouns. For example, consider your last conversation: You will (hopefully) have no problem remembering what you were talking about, but you will have a hard time guessing how many, and which kinds of, first-person pronouns you used.

During the last two decades, James Pennebaker and his colleagues initiated an intriguing body of research showing that people differ in the frequency with which they use first-person pronouns, that these individual differences are relatively stable across context and time, and that they are meaningfully related to underlying psychological processes. For example, frequent use of first-person singular pronouns has been interpreted as an implicit measure of self-focused attention and found to be positively correlated with depressive symptoms.

Recently, we conducted our own study that builds on and extends these findings. Specifically, we assessed pronoun use in clinical interviews. These were relatively standardized situations in which people spoke about their lives, their problems, and their relationships. Participants also responded to questionnaires assessing their depressive symptoms and interpersonal problems. Our results were quite clear: The more participants used pronouns such as “I” or “me” during the interview, the more depressive symptoms and interpersonal problems they reported. In contrast, the more they used pronouns such as “we” or “us,” the fewer depressive symptoms and interpersonal problems they reported. Note that these associations held even when we controlled for other variables that might have influenced participants’ pronoun use (e.g., having a partner or child).

In addition, we observed that self-referencing in the singular was associated with a (maladaptive) tendency to seek attention from others, whereas self-referencing in the plural seemed to reflect an (adaptive) tendency to balance social pressures with one’s own social needs. In sum, our findings confirm that the seemingly automatic use of first-person pronouns indeed provides a window into how people think, feel, and relate to others.

As always, our study had several limitations, and there are plenty of questions that cannot yet be answered satisfactorily. For example, it’s difficult to say what actually “determines” individual differences in pronoun use. The available data, including our own, are essentially correlational, so we cannot draw causal conclusions. Experimental designs are needed to explore the causes and consequences of pronoun use more thoroughly. Until then, my rather speculative answer to the question is that pronoun use is, at least partially, determined by our current mood and habitual way of interacting with others.

Johannes Zimmermann is a researcher in clinical psychology and psychotherapy at the University of Kassel in Germany.

Q&A

Why Did You Apply to Go to Mars? David Brin Answers

I believe that a one-way Mars mission is a viable enough idea for some people to consider, even knowing, as I do, that “one-way” has several possible connotations. On the surface, the claim is that you’ll strive hard upon arriving, unfold and deploy solar-powered units that can produce food and other necessities, and voila, become the first human colonist on the Red Planet. “One-way” then means you’re happy to spend the rest of a reasonable lifespan exploring, maintaining, and then greeting the next wave, knowing that you’ve helped immensely by forgoing the vast expense of a return trip. And knowing that all that time in low gravity has probably left you unfit for life on high-gravity Earth, in any event.

But, of course, this mission would have very low margins for error; even if the sustainability modules work perfectly, the odds are still strong that “one-way” will also mean “short duration.” In which case your hard work will have set the stage for follow-up missions that will use your base, build on and improve it … after they bury you. And future generations will erect a monument on that spot.

You’ll want very qualified people, who can take a decent stab at setting up the life-support technologies and perhaps (despite long odds) surviving to greet the second wave. But the first-wave volunteers must be realistic about those odds, and willing to go anyway. People who cannot imagine any sane person making that choice simply aren’t envisioning the wide range of human diversity.

Consider what I told my family. By the very earliest date that Mars One might launch, I expect to be a spry 75-year-old, whose kids are already successfully launched, and who might yet spend a few years doing something truly remarkable. I think you’ll find tens of thousands of people who—under those circumstances—will at least ponder it seriously.

David Brin is a physicist, best-selling science fiction writer, and futurist, who has applied to go to Mars.

Q&A

Are People Who Think They Are Good More Likely to Accept Morally Tainted Money?

The answer to this question depends in part on how people think about their own moral identity. Our research would suggest that individuals who are feeling particularly moral in the moment may feel greater license to accept morally tainted money, a phenomenon referred to as “moral licensing.” The idea that individuals who momentarily feel secure in their moral high ground subsequently act less morally has been documented by many researchers in a variety of domains, including volunteering and prejudice.

This work assumes that a person’s moral identity is relatively dynamic, like a barometer: when it is low, individuals are motivated to enact moral behaviors to feel more moral; when it is high, they feel licensed to behave a little less morally. However, this finding may seem confusing because it contradicts something else we know about people: when individuals believe they are moral, they often act in ways consistent with their moral values. That view assumes a person’s morality is stable, like a trait. We think we are moral people, so we behave morally; or we think we are immoral, so we act less morally.

The true answer likely lies somewhere in between. When individuals are led to think about morality as a trait, they tend to show moral consistency. However, in our experiments, they are led to think of morality in a more dynamic way (they recall a specific moral act they performed in the past, not whether they are a moral person), so they exhibit moral licensing.

So let’s get back to the question at hand. If you think of yourself as a good person and that is important to you, then you would avoid taking morally tainted money because it threatens to make you feel less moral. However, if something gives you a temporary boost in how moral you see yourself to be, you would be more willing to accept morally tainted money because, in that moment, the money is no longer a threat to your sense of your own morality. In short, this work suggests that it is in those moments when people are feeling particularly good that they may act a little bad.

Jennifer Stellar is a doctoral candidate in social psychology at the University of California, Berkeley.

Q&A

Why Would Feelings of Entitlement Have Increased Among Young Americans Over the Past Few Decades?

In our new study, we found that members of the young generation (known as Millennials or Generation Me) are more likely to say that having expensive material items is important than baby boomers were at the same age in the 1970s. At the same time, GenMe members are less likely to say they are willing to work hard. Thus, there’s a growing gap between wanting money and being willing to work to earn it. That’s one possible definition of entitlement.

We also tried to find out what aspects of the culture co-occurred with these trends. We found that entitlement was highest when the country spent a lot of money on advertising—which makes sense, as advertising tends to show the material items but not the hard work necessary to earn them. Entitlement was also higher when families were unstable and disconnected and unemployment was high, suggesting that insecurity may lead to unrealistic materialism. Overall, these generational shifts are rooted in a culture that emphasizes fame and money but not always the long hours of work usually necessary to achieve financial success. Even now, people who “get rich quick” are the exception, but they are often portrayed in the media as the rule.

Jean Twenge is the author of Generation Me and a professor of psychology at San Diego State University.
