Revelations


I’m in the middle of a research revelation and an ideological one, the two of which are struggling to meet in my mind.

First, the research. The movement known as Occupy is clearly not an isolated phenomenon. It has attracted many people from across the world to act in ways that seem radical, even to those of us who are statistically part of the 99 percent. It is also salient because it came during a time when revolts and protests are not exactly difficult to find in the news. For the second time this year, Tahrir Square in Egypt is full of citizen protestors. Two years ago, Iran made front-page news with the first truly Twitter-aided election protests. The Arab Spring, as it is called, has thrust the importance of the Middle East in the West’s face, probably more quickly than the West was ready to accept.

So, it is no surprise that we have the Occupy movements happening now, as well as the media’s consistent coverage of them. Because protests, revolts, and uprisings have become such hot news items in recent years, I wonder whether this has always been the case throughout the history of news media. Have the media always covered protests with such fervor? Have they ever been biased in their coverage? What is the history behind the coverage of protests in the United States? This is what I think I’m after.

Second, my ideological issues. Last night, I watched Page One, a documentary about The New York Times with a focus on the faltering newspaper market. There is a point where media reporter David Carr admits how comical he finds it when people are both excited and scared to talk to him. I mean, he is The New York Times. He is part of the institution of journalism. But he’s also a person doing his job. True, I’ve had people tell me they love what I do, but Carr operates in an entirely different universe, where even the people with the coolest jobs would take a pay cut and move to the Bronx just to set foot inside the Times building.

But Times reporters are just people. If you call their office phone lines, they may pick up. You can find them at local New York eateries or scattered around the globe doing their jobs. After watching this documentary, I’m reminded, as I often am, of just how much labor truly goes into crafting a solid front-page news story, or any story in the paper for that matter. They work long hours, are at the whim of their sources, wear their creativity proudly on their sleeves, and in the end are still dependent on their editors’ judgments of the quality of their work.

And researchers are always digging through the archives with the intent of uncovering some sort of framing bias or support for some theoretical shift in the way journalism happens.

Therefore, I’d like to approach this project with a historian’s eye: no bias, except perhaps one that allows me to respect the work of journalists.

Bathroom secrets: Why we wash our hands


There’s a person standing next to you in the bathroom. This person is washing his or her hands and the chances that you are doing, have done, or are about to do the same are quite high. That is, unless the toilet you’re doing your business in is awkwardly close to the sinks.

In the event that you are both washing your hands, I’d like you to mentally remove this extra person from the situation. In this new hypothetical bathroom, where you are the lone user, are you washing your hands? Or have you already left, tainted hands and all? Do you think that your bathroom mate, sans your presence, would have skipped out on washing?

Some communication researchers have done us all a favor by uncovering the oft-imagined, yet seldom discussed, loner-in-the-loo phenomenon. According to their study (they covertly observed some 600 college students in the bathroom), people who use the bathroom with one other person present are more likely to wash their hands than they would while using the bathroom alone or, for that matter, with a larger group of people.

They also found that this effect was more pronounced among women, who were more likely to feel the effects of that extra bathroom buddy and even spent more time washing their hands when someone else was around than men did.
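If you’re curious how researchers typically test a difference like this, here’s a minimal sketch of a chi-square test of independence. To be clear, the counts below are invented for illustration; they are not the study’s actual data.

```python
# A minimal, hypothetical sketch of how a finding like this is tested.
# The counts are invented for illustration; they are NOT the study's data.
from scipy.stats import chi2_contingency

# rows: bathroom condition, columns: [washed, did not wash]
observed = [
    [18, 22],  # alone (observer hidden in a stall)
    [35, 8],   # one other person present at the sinks
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p-value would suggest washing rates depend on company.
```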

The reason for this is almost too simple for research, which could be why the article is painfully short, terribly constructed, and discusses only 10 sources. Apparently, it’s standard practice to act “normally” around other people. But the freak flag flies free when we’re all alone. The research term for this is social impact theory, which basically states that humans act based on the perceived norms of a particular situation, regardless of personal preference, because the social impact of not adhering to cultural norms in the presence of others could be devastating.

Of course, if testing social impact theory is the research goal, the researcher must put him- or herself in some pretty awkward situations. In this paper, for example, researchers either feigned several non-bathroom tasks for the sake of observing bathroom-goers or hid in a stall. This hiding, of course, was the primary method employed to make bathroom users feel they were tinkling alone. It seems that in order to properly test social impact, you must position yourself as the secretive observer, often in some particularly intimate situations.

And here is the argument for an increased importance of research. If researchers could begin convincing the top government brass that they needed to “research” the many daily doings of influential individuals (and by “research” I mean “spy”), we’d have a whole lot fewer problems to talk about. Think of how the Clinton scandal would have played out if a university professor had been slouched behind the ficus tree, pen and paper in hand. We probably wouldn’t need presidential debates anymore, because researchers could just tell us everything the candidates have been doing and spare us the senseless chiding that removes intelligent discourse from the airwaves.

Influential individuals aren’t the only ones who need observing, as this paper clearly shows. The next challenge for researchers is to prove that people really do pick their noses, throw recyclables in the garbage, and cough without covering their mouths. Once we have done this, we will finally understand one another, and the need for societal norms will disappear. By default, we will no longer have a definition for weird.

Everyone will be creepy, but no one will care.

Reference:

Henningsen, D. D., Henningsen, M. L. M., Braz, L., & Davies, E. (2011). Are we being watched? A test of hand washing in public restrooms as communication behavior. Human Communication, 14(1), 31-38.

On a more serious note, I’m slightly concerned that this article has passed for communication scholarship. The researchers take the position that communication need not be word-based, interpersonal, or even obviously purposeful; seemingly unintended, activity-centric behavior can still count as goal-driven communication:

In their daily routines, people engage in thousands of communication acts. Although at times these behaviors may be mindless (Langer, 1989) our communication behaviors tend to be goal driven (Kellerman, 1992). Indeed, even some behaviors that may be perceived as mindless or as non-communicative may actually be communication acts intended to achieve specific goals. We propose that hand washing behavior in public restrooms represents such an act.

Thus, the researchers’ conclusion that washing or not washing our hands communicates an effort to be perceived as normal is just one of many readings of culture that could be applied to this situation. Unfortunately, we cannot be certain how specific individuals will react to differing situations; those data are not part of this research. How do we know, for example, that washing or not washing one’s hands is a function of social context rather than of an individual’s preconceived notions about hand washing?

Additionally, the sources used to support this article, including those most explanatory of social impact theory, come from social and abnormal psychology research. The communication sources cited are from the same journal and reflect theories that are now more than 20 years old.

A better test of social impact theory from a communication perspective might be to research communication among colleagues in a professional setting, especially in times of crisis or when a particularly thorny situation is of concern. See, for example, the two stories told in this past week’s This American Life. What could be gained, for example, by researchers who follow those individuals who choose to speak or act in ways that are at odds with the known culture of a business?

Then again, I have a sneaking suspicion that people aren’t so much interested in corruption or organizational culture as they are in how other people use the bathroom.

So you think you’re a pro


It’s 2011. Somehow we found ourselves in the middle of a social media revolution.

A recent article in the Journal of Sports Media1 spurred me to begin thinking more carefully about the connection between social media and the process of “professionalizing” journalism. Journalists and writers, it’s no secret, enjoy talking about journalism and writing. And what they say about journalism as a concept is not altogether good: “clearly, at the start of the 21st century, many experienced journalists fear for the future of journalism” (p. 1).2

Generally speaking, popular media effects research tends to posit the media as a negative entity that causes heavy users to behave differently, contrary to normal expectations. Think of Putnam’s Bowling Alone or Postman’s Technopoly and Amusing Ourselves to Death. At the most basic level, these books argue that something is wrong and that the cause of these social ills is the mass media.

However, in none of these accounts specifically, and in the larger world of research generally, do journalists, news-makers, or the news media (aside from complaints of bias against FOX News) act as a primary culprit in turning well-mannered folk into norm-upsetting hippies. It’s always entertainment media, the overuse of new technology, and so on. Yet journalists are the most forward-thinking group of media-makers discussing the process of “becoming more professional.” That is, of course, an entirely separate conversation.

The question facing all journalists is no longer “should I engage with social media,” but “how should I engage with social media.” The “revolution” of social media has forced anyone with a voice needing to be heard to adopt technologies they do not necessarily understand and to adapt their message to these technologies, which do not necessarily fit the journalist’s method of communication.

Engaging with social media is a casual enterprise. The first thing I did when I logged on to Facebook was accept an invitation to be in a relationship with the girl I’d been in a relationship with for a couple of months. Without knowing it, my online destiny was sealed. My relationship became a “news” item. In the world of the “news feed,” professional news outlets now compete for attention among the myriad content posted by individual users each day. In so doing, producers of content enter a realm outside of their control. At least when you print a paper, direct a television show, or air a radio broadcast, you are in control of how your content looks, feels, and sounds. Publishing a website gives you a large amount of control over your users’ experiences, provided users can actually find and use your site. But Facebook, for example, renders each user, brand, news outlet, and log-on-once-a-month Larry as equals.

To become relevant in the social world, one must somehow become similar to the social world, but in a way that does not sacrifice one’s professionalism. This is no small feat. Consider how quickly you post status updates. Do you put as much effort into source-, fact-, and spell-checking these updates before they go live? Have you ever had to delete a post or offer a correction to something you posted earlier? Has something you’ve posted ever caused a rift within your organization while being well received by your audience? Does your organization have a “best practices” guide to social media?

And questions of professionalism extend much further than merely what to post and what not to post. What I love about the Sports Media article is its focus on sourcing. Do you pull athlete quotes from Twitter, and how professional is that? Shouldn’t you be interviewing your sources for solid stories, not picking up stories from Twitter? The answer here is unclear. In fact, everything about the relationship between professionalism and social media is unclear. As Reed suggests, when things begin to clear up, “it will be up to journalists and the organizations for which they work to decide how to preserve credibility in this environment that arguably demands more of them” (p. 58).

Good luck. It’s been a difficult road to “professionalize” journalism until now. Social media is not making it any easier. But, if journalists can do what they have always done well and get ahead of the trends, they may soon be able to use the social movement to help shape what it means to be professionally social.

Notes

1 Reed, S. (2011). Sports journalists’ use of social media and its effects on professionalism. Journal of Sports Media, 6(2), 43-64.

2 Ornebring, H. (2008). The two professionalisms of journalism: Updating journalism research for the 21st century. Proceedings of the 2008 International Communication Association Annual Meeting, Quebec, Canada.

An old and a new


I read a fun blog post from the folks at Wired two weeks ago about the capitalization of the word “internet.” AP Style dictates that you ought to capitalize the word, but others across the globe have done away with this rule.

Why? Because it’s too commonplace to warrant the same status as other proper nouns, like “Web site”.

Actually, “web” and “net” go lowercase in Wired‘s case, too.

This caused me to think about things that still exist, regardless of their anachronistic style, as well as things that have not yet been invented, but may not be far off.

The one “old” thing that appeared in my line of sight this evening is the picture located to the right. If, for whatever reason, you’re unable to view the photo, it is an icon of a floppy disk. By clicking this icon, you can save a search, journal article, or citation from the online database I’m surfing. What’s funny is that I haven’t owned a computer in the past six years that even has a way to read these disks, and yet they are still the iconic representation of what we have all learned is the process of saving something electronically.

It would be interesting if the databases and online spaces that use this image replaced it with something more modern, say, a picture of a cloud. It would make logical sense, but would it get the message across?

My “new” thought for the day deals with social media. I was watching one of my authors on television two weeks ago and realized that, no matter how much I tweeted or shared on Facebook the link to the Book TV website advertising the show, I could not make people watch the show, nor could I make it any easier for them to do so. Someone without access to a television, or to one they can control, might get frustrated that I keep sharing non-television content about a live television show. This could even foster ill will toward me.

The technology already exists, depending on your television service provider, to program your television to record specific shows without ever having to be in front of it. I predict that this technology will develop to the point where even the lesser-known providers make recording apps available to DVR subscribers. When this technology is finally adopted by a sizable chunk of the population, the wise entrepreneur will design a social app that allows individuals to share television content, gives people the option to record that content directly from their devices, and even lets them interact with the programming, among peers.

In the same way I find out about dozens of new pieces of content each day through the social web (that is, content I would not have discovered otherwise), so too will people be able to watch television they may never have known existed. This could mean greater user engagement with television (seen until recently as a one-way medium), the lessening importance of the television schedule, and even the demise of primetime television.

Why science could use more Facebook fans


Vandermoere, F., Blanchemanche, S., Bieberstein, A., Marette, S., & Roosen, J. (2011). The public understanding of nanotechnology in the food domain: The hidden role of views on science, technology, and nature. Public Understanding of Science, 20(2), 195-206.

A disclaimer first: the study that provides the basis for this post was based on a non-probability sample taken in France, so its generalizability to the United States can be questioned.

This recent article in the journal Public Understanding of Science raises many questions, some new and some very old, about nanotechnology as it is applied to food. As the authors discuss, although nanotechnology promises many benefits that remain largely unknown to the public, the novel application of the technology to food has raised some serious concerns among the public about the technology’s environmental and societal risks. Their study works at this intersection to determine at least some of the factors that cause people to weigh the risks and benefits of this application one way or another.

What they find is largely not surprising. First, they determine that knowledge level is not a significant predictor of attitude, thus putting to bed some of the arguments for a knowledge-opinion deficit model (where more knowledge = more positive opinions). Second, the researchers found that trust and attitudes about science, nature, and food were the variables most likely to predict support for nanotech in food.
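For the statistically curious, here’s a rough sketch of the kind of regression behind claims like these. To be clear, this is not the authors’ actual model; the variable names and simulated data are mine, invented to illustrate how one tests whether knowledge or trust better predicts attitudes.

```python
# A minimal, hypothetical sketch of the kind of regression such survey
# studies run. The variables and simulated data are invented; this is
# NOT the authors' actual model or dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 600  # pretend survey respondents

knowledge = rng.integers(0, 6, n).astype(float)  # quiz score, 0-5 correct
trust = rng.normal(0.0, 1.0, n)                  # standardized trust-in-science scale

# Simulate the article's finding: attitudes driven by trust, not knowledge.
attitude = 3.0 + 0.8 * trust + rng.normal(0.0, 1.0, n)

X = sm.add_constant(np.column_stack([knowledge, trust]))
fit = sm.OLS(attitude, X).fit()
print(fit.summary())  # expect: knowledge coefficient near 0, trust near 0.8
```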

Both of these findings have been noted in past research that has used the survey as the predominant methodology (see, for example, Allum et al., 2008, and Bauer, 2005). Allum et al. (2008) provide a stellar reflection on the state of the knowledge-opinion gap research. This article is one of many focusing on the broad topic of science attitudes and science knowledge and, more specifically, on the scientific application of nanotechnology. The step to make the study more specific, and even more so, to focus research on an application with direct societal consequences, is the strength of this study. Yet the fact that this research is only now becoming popular (the General Social Survey finally added nanotechnology measures in 2008) is an unfortunate weakness in the broader science knowledge-attitude research phenomenon among scholars. If this study reveals anything at all, it is that people know almost nothing about nanotechnology (more than 80 percent of their sample responded this way). And with this being the case, researchers can’t really do much else aside from surveys. Yes, new technologies are a lot of fun to research, and for scientists, it’s very valuable to know just how much of your target audience is lacking science-specific knowledge. But until researchers can step back and embrace more qualitative techniques, survey results probably will not differ much in the coming years.

So, why is any of this important for society at large? First, studies like these continue to cement in place the idea that people do not regularly rely on factual knowledge to help them form opinions. The authors briefly mention heuristic processing but do not expound further, and their findings indicate that factors like trust in the government or in science as an institution are strong predictors of positive attitudes. So, while scientists get angry over the increasingly uninformed public, it might actually be in scientists’ best interests to stop inundating the public with information and start enacting overly hyped-up public relations campaigns.

YouTube, anyone?

Writing essays


Do you remember when we were in middle school? We were taught how to write essays in five paragraphs. The following outline should trigger memories of adolescence. This was probably the case for most of us in high school, too.

I.  Introduction (or Intro)

[Image: the umbrella diagram…which looked nothing like this]

Here, we were required to set up our topic of interest and, most importantly, include a final sentence called a “thesis.” I remember being critiqued for writing a multi-sentence thesis or for locating it somewhere within the intro paragraph other than at the end. I also remember agonizing over whether my thesis would be good enough for my teachers and spending a lot of time drawing some “umbrella” diagram where my thesis was my umbrella and everything under it was “support.”

II. – IV.  Body Paragraphs (1-3)

Body paragraphs made up that “support” underneath the thesis umbrella. Each paragraph included a “topic sentence,” which acted like a mini umbrella for its paragraph, three statements of support (for both the large and small umbrellas), and a concluding statement. Mathematically, paragraphs could be no fewer than five sentences. Unless you started using transition sentences, which threw everything off.

Transitions were those things we learned that always included words like “however” and “therefore,” or phrases like “on the other hand.” I admit to always being confused by transitions, especially since, within the five-paragraph system, there was no place for them. You certainly couldn’t create a paragraph from one transition sentence, but you couldn’t include transitions as the first or last sentences of a body paragraph, because those were topic sentences or conclusions.

Unless you were really good. The skilled scribe could creatively weave the transitional phrase into the topic sentence, ultimately pleasing the grading teacher.

That was not me. I liked my umbrellas plain and gray, uncolored by fancy flowers or cartoon characters. So I created new, one-sentence paragraphs. In hindsight, I can now see my disposition toward journalism flourishing back then. If only I had paid attention.

V.  Conclusion

You could never begin a conclusion without “In conclusion ….” Once you’d done that, the conclusion ended up being a jumbled collection of umbrella phrases, reworded, of course, for originality. But who were our teachers kidding? Young kids weren’t thinking about how their argument came together in that final paragraph. For us, it was a formula without any argument.

Somewhere along the line, we were also supposed to begin using vocabulary and creative rhetoric to show that we were masters of our language rather than just barely capable of vomiting up a few messy pages. And while I see the value of the form, I wonder how it has influenced me and others today. If we focus so heavily on the thesis, do we disregard the support? The conclusions? And what has that support, or data, been used for? You can shape data to support your thesis, or you can craft your thesis based on your data.

I distinctly remember my college freshman English professor teaching us to read the sources first before we made our argument. However, I also distinctly remember beginning my term paper with a solid argument in my mind. Luckily, none of the data supported my value-based thesis, and I think my paper ended up being better because of it.

For scholars, the “data first” approach is logical. You can’t approach research any other way. For everyone else, arguments and predispositions guide thought and speech.

Have these differences created a gap? How do we bridge it?

Research interests scattered in young minds: CCM


The second you think you’ve got it all figured out, you suddenly realize there’s another world existing beyond your library cubby hole.

After my first year of research, an utterly quantitative one spent figuring out what statistics really is/are and what it/they mean(s), it’s become obvious that historical research could be a guilty pleasure for those looking to do some in-depth research while also engaging their writing talents to tell a story.

I was having breakfast with one of my favorite historical researchers, and we began talking about why an up-to-date history of Contemporary Christian Music had not been written. For one, the field is extremely young, and the data are lacking. For another, with what some might see as a “secularization” of the mass media, research into Christianity’s move toward increased acknowledgment through contemporary media channels could be rejected by those both inside and outside the Church.

Yet, regardless of these two weaknesses, my interest is now piqued about the possibility of conducting some more qualitative, historical research into the fascinating world of CCM.

Why?

If you look just at the last 10 years of CCM, you see what seems to be an interesting shift. Now, what follows are my own observations after having lived and worked in the CCM world at a major Christian radio station. What we find when we listen to CCM via public media is a mission statement that has less to say about Christianity and more to say about keeping individuals focused on a family-values agenda. Indeed, the mission of K-Love is “Positive. Encouraging.” A short survey of major church Web sites will reveal that their mission statements have more to say about Christ than about positive encouragement.

I remember a meeting with a network employee at the station who spoke in frustration of how the mission of Christian radio stations did not mesh with what he believed to be an accurate Christian mission. When the primary audience, women in their late 20s with not-quite-school-aged kids riding back-seat shotgun, turn on the radio, they need something that’s family friendly for their kids to listen to. Thus, Christian radio has become about what it leaves out of its content rather than what it puts in.

My hypothesis is that when Christian musicians began producing what we know as CCM, they didn’t have in mind the audience that current radio formats do. These musicians wrote, performed, and produced what they believed to be God’s gift. They were evangelists. Utilizing the mass media to distribute these records may then have had a negative effect on the message originally produced, watering it down to a level that’s inoffensive to anyone who doesn’t share the same Christian values.

It’s also interesting that, in America, no other religion has yet dominated the airwaves with the same breadth as Christians have. There are a number of potential reasons for this, but what is in need of research is an unpacking of the interesting parallel that exists between the commercialization of CCM and the growth of the Religious Right. Here’s a dumb diagram:

And what do the commercialization of CCM and the rise of the Religious Right have in common? A family-values platform. For clarification, the diagram is not meant to say that the commercialization is directly the artists’ fault; the fault lies with the CCM media model.

Of course, as an aspiring researcher, I must concede that none of this may be true.