Why younger people don’t read the newspaper

[Image via Wikipedia: a girl holds The Washington Post of Monday, Ju...]

Three things happened within the last week or so that led me to write this post.

  1. The New York Times website alerted me that I had only four articles left to read before I would be locked out by the paywall for the rest of the month.
  2. The PIPA and SOPA bills attracted a lot of attention, if only momentarily, as Google, Wikipedia, and others protested on behalf of internet freedom.
  3. My autumn 2011 copy of Journalism & Mass Communication Quarterly arrived in the mail with an article about why young people have stopped reading the newspaper.

The bulk of this post will focus on a review of that article in the spirit of looking at journalism and mass communication through eyes seeking its advancement rather than its demise. For all the scholars out there, this is the article to which I’m referring:

Zerba, A. (2011). Young adults’ reasons behind avoidances of daily print newspapers and their ideas for change. Journalism & Mass Communication Quarterly, 88(3), 597-614.

What initially sets this study apart from others is its qualitative nature. Zerba cites several studies that addressed the phenomenon of, as she terms them, “nonusers,” from a uses and gratifications (U & G) approach. Because both U & G research and studies of newspaper use and nonuse have been prevalent since the advent of journalism research, the focus group method employed here can better unpack in detail the nonuse findings of earlier generations of research. These include lack of time, the availability of other media choices, access, and possible bias, to name a few. As the author states, she aims “to get at the underlying meaning of nonuse” (p. 597).

Possibly of greater importance, focus groups enable the researcher to capture a small slice of one of the primary news-reading audiences most affected by the current shift from print to digital. And in an age when the internet itself has become a site of legal upheaval, a study of print media is extremely relevant.

Zerba puts forth two research questions:

  1. What are the most popular reasons for not using a daily newspaper?
  2. What would their ideal print newspaper include?

Focus groups were formed using research companies in three major cities – Chicago, San Antonio, and Dallas. Sixty-four adults between the ages of 18 and 29 were assembled in total (an error in San Antonio resulted in eight groups for evaluation rather than the anticipated six). The group discussions revealed results that were both predictable in light of previous research and worth reflecting on given today’s somewhat tumultuous media environment.

In sum, adults 29 and under viewed newspapers as

  1. Difficult and inconvenient to access and use
  2. Environmentally unfriendly
  3. Slow to report the news and redundant
  4. Difficult to multitask with
  5. A drag on one’s time
  6. Irrelevant to the age group
  7. Boring
  8. Biased

When given the opportunity to design their own daily print newspaper, respondents decided that the perfect paper would be:

  1. Brief, with only the facts
  2. Local in focus
  3. Inclusive of diverse perspectives
  4. Simply formatted and aesthetically pleasing, with color, pictures, and a table of contents
  5. Easily accessible
  6. Heavier on entertainment content
  7. Topically specialized
  8. Lighter on negative news, with summaries of the leading items

[Image via Wikipedia: front page of the first issue of The New York ...]

In sum, young people want a newspaper that is easily accessible and up to date, aesthetically pleasing, and convenient for their multitasking, on-the-go lifestyles. In essence, young people want the internet.

In today’s world, to describe why people choose not to use a newspaper, even though they call themselves news aficionados, is to describe why people do use the internet. The real question for practitioners in the news world coming out of this study might not be, “How can we make our print product more accessible to the younger demographic of readers,” but, “How can we make our online product more accessible to everyone?”

Look at the New York Times online. It doesn’t look all that different from the company’s print product. Look at the Washington Post online. Stunning images, fonts, headlines, and masthead are all there. You can purchase a digital subscription and customize your own news on your mobile device.

The only way for a print enterprise to save its print product from extinction is to dramatically change the content offered through its online medium. This is why I am in favor of paywalls, especially when a particular newspaper, say a local one, does not have the human resources to produce new electronic content that will add value to its overall product.

I love this article because of its methodology and because of what it says about people my age. I like to live in my bubble and believe that most people are like me. But they are not, and my closed-minded perspective is selfish and unjustified. This does not, however, stop me from hoping that a more literate cohort of twentysomethings will rise up and work as hard to “save” journalism as the generations before us did to build it.


Algorithm bias and why we’re all angry about it


[Video: Kevin Slavin, “How algorithms shape our world”]

It’s not just #OccupyWallStreet-ers who are angry about algorithms. In any business that has been online for any amount of time, there exists at least one person who frets about algorithms, dreams up new ways to make them work in the company’s favor, and inevitably concludes that jumping feet-first into the world of algorithms is like entering a black hole. With no end in sight, we give up, angry and frustrated.

It’s just that the Occupy-ers are the only group of folks to complain about it and receive news coverage for having done so. The NPR story linked here does a solid job of unpacking what most of us already knew about algorithms – that they favor topics trending locally, at a given moment in time – and shows that anyone trying to get their name known through a chance appearance on Twitter’s trending list is really just wasting time and money on a frivolous pursuit. Like the article says, it’s more likely that #thingsthirstypeopledo will find its way into the top ten long before any advertising campaign message does. And it will inspire greater interaction.
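
To make that concrete, here is a minimal Python sketch of a burst-over-baseline score. This is not Twitter’s actual algorithm (which is proprietary); the event data and the scoring rule are invented purely to illustrate why a sudden local spike beats a steady advertising drip:

```python
from datetime import datetime, timedelta

def trending_score(events, tag, city, now, window=timedelta(hours=1)):
    """Score a hashtag by its recent local burst relative to its overall volume.

    `events` is an iterable of (hashtag, city, timestamp) tuples -- a
    made-up stand-in for a real firehose of posts.
    """
    recent = sum(1 for t, c, ts in events
                 if t == tag and c == city and now - ts <= window)
    total = sum(1 for t, c, ts in events if t == tag and c == city)
    # A steady drip (recent is a tiny slice of total) scores near zero;
    # a sudden burst (recent is most of total) scores near one.
    return recent / total if total else 0.0

now = datetime.now()
burst = [("#thingsthirstypeopledo", "Chicago", now - timedelta(minutes=m))
         for m in range(50)]            # 50 posts, all within the last hour
drip = [("#BrandCampaign", "Chicago", now - timedelta(hours=h))
        for h in range(50)]             # 50 posts spread over about two days

events = burst + drip
print(trending_score(events, "#thingsthirstypeopledo", "Chicago", now))  # ~1.0
print(trending_score(events, "#BrandCampaign", "Chicago", now))          # ~0.04
```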

The more intriguing part of this story, as is often the case with NPR pieces, is the final section, titled “Getting Used to an Algorithmic Editor.” Quoting Cornell University communication professor Tarleton Gillespie, the article paints an algorithm as nothing more than your local newspaper editor.

Or, better yet, think of an algorithm as yourself while you edit that term paper you’ve been working on for months. “This doesn’t belong there,” you say, and you scratch it out. “This would work perfectly over here,” and you add it. You are the filter, the gatekeeper over your work. What you let through is made public. What you don’t is not. And, the reason you let some things through while others are left behind is because you’ve likely received years of training in this particular discipline. You know intuitively what ought to belong, what may merit the highest grade, what sounds best, how many sources you should have, how many block quotes you shouldn’t, and so on and so on.

But your intuitive knowledge is loaded with subjectivity as well. Though you were taught how to write essays in junior high school, everything from your style and voice to your research techniques to the number of times you’ve written anything before this point serves to form in you a particular way of writing that is altogether different from your professor’s or your fellow classmates’. And you still have to consider the particular ideological positions that inform the way you think about your topic.

Algorithms had to be written and edited, and they are still being edited. The NPR article notes that just a few years ago, Amazon had an algorithm snafu that allowed adult-themed titles onto its best-seller list – something the company decided should not happen. There are people trained in the science of algorithm writing, and they, just like you writing your paper, are biased.

“The important point is that one can never generalize beyond known data without making at least some assumptions,” Martin Sewell wrote for Futures Magazine. The point of an algorithm is to represent what has not yet happened, based on a mixture of the known data and the assumptions drawn from that data.
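
Sewell’s point is easy to demonstrate. In the toy Python sketch below (the numbers are invented), two models agree perfectly well on the known data yet disagree about the future, purely because of the assumption baked into each:

```python
import numpy as np

# The same known data, two different modeling assumptions.
days = np.array([1, 2, 3, 4, 5])
sales = np.array([10, 12, 15, 19, 24])

linear = np.poly1d(np.polyfit(days, sales, 1))     # assume straight-line growth
quadratic = np.poly1d(np.polyfit(days, sales, 2))  # assume accelerating growth

# Both describe the past well, but "generalizing beyond known data"
# forces a choice of assumption -- and the forecasts diverge.
print(linear(10))     # about 40.5
print(quadratic(10))  # about 64.0
```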

Our problem as consumers is that we have been trained to think of anything electronic as inherently unbiased and objective. You can buy something on Amazon without ever interacting with a human being. You do this because there is little fuss and no salesmanship, and it can easily be done from in front of the television or as a distraction from work. You do it without thinking. Without engaging. What you do not think about is the host of work happening just on the other side of your computer screen as you click the “purchase” button. Every search term, click, and scroll you made while on that site has been recorded in a way that can be marketed back to you. Every second you stayed on one page longer than another and every source you used to get to that particular place are now part of your online DNA.
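
For a sense of what “recorded” means in practice, here is a hypothetical clickstream record. The field names are invented, and real trackers vary, but the shape of the data is roughly this:

```python
# One hypothetical clickstream event -- field names invented for
# illustration; real analytics pipelines differ in detail, not in kind.
click_event = {
    "user_id": "a1b2c3",                  # pseudonymous but persistent
    "referrer": "search: hdmi cable",     # how you arrived at the page
    "page": "/product/12345",
    "action": "purchase",
    "seconds_on_page": 48,                # dwell time, compared across pages
    "timestamp": "2011-10-14T20:31:07Z",
}
```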

But we ignore this. We get angry when the algorithms don’t work in our favor, and we fight about online privacy only when we’re the ones exposed. It certainly seems that until we purposely get hold of our online habits, this is one fight mankind will not win.

So you think you’re a pro


It’s 2011. Somehow we found ourselves in the middle of a social media revolution.

A recent article in the Journal of Sports Media [1] spurred me to begin thinking more carefully about the connection between social media and the process of “professionalizing” journalism. Journalists and writers, it’s no secret, enjoy talking about journalism and writing. And what they say about journalism as a concept is not altogether good: “clearly, at the start of the 21st century, many experienced journalists fear for the future of journalism” (p. 1) [2].

Generally speaking, popular media-effects research tends to posit the media as a negative entity that causes heavy users to behave differently, contrary to normal expectations. Think of Putnam’s Bowling Alone or Postman’s Technopoly and Amusing Ourselves to Death. At the most basic level, these books argue that something is wrong and that the cause of these social ills is the mass media.

However, in none of these accounts specifically, and in the larger world of research generally, do journalists, news-makers, or the news media (complaints of bias against FOX News aside) act as a primary culprit in turning well-mannered folk into norm-upsetting hippies. It’s always entertainment media, or the overuse of new technology, and so on. Yet journalists are the most forward-thinking group of media-makers in discussing the process of “becoming more professional.” That is, of course, an entirely separate conversation.

The question facing all journalists is no longer “should I engage with social media,” but “how should I engage with social media.” The “revolution” of social media has forced anyone with a voice needing to be heard to adopt technologies they do not necessarily understand and to adapt their message to technologies that do not necessarily fit the journalist’s method of communication.

Engaging with social media is a casual enterprise. The first thing I did when I logged on to Facebook was accept an invitation to be in a relationship with the girl I’d been in a relationship with for a couple of months. Without knowing it, my online destiny was sealed: my relationship became a “news” item. In the world of the “news feed,” professional news outlets now compete for attention amid the mass of content posted by individual users each day. In so doing, producers of content enter a realm outside of their control. When you print a paper, direct a television show, or air a radio broadcast, you are at least in control of how your content looks, feels, and sounds. Publishing a website gives you a large amount of control over your users’ experiences, provided users can actually find and use your site. But Facebook, for example, renders each user, brand, news outlet, and log-on-once-a-month Larry equal.

To become relevant in the social world, one must somehow become similar to the social world, but in a way that does not sacrifice one’s professionalism. This is no small feat. Consider how quickly you post status updates. Do you put as much effort into source-, fact-, and spell-checking these updates before they go live? Have you ever had to delete a post or offer a correction to something you posted earlier? Has something you’ve posted ever divided opinion within your organization while being well received by your audience? Does your organization have a “best practices” guide to social media?

And questions of professionalism extend much further than merely what to post and what not to post. What I love about the Sports Media article is its focus on sourcing. Do you pull athlete quotes from Twitter, and is that professional? Shouldn’t you be interviewing your sources for solid stories rather than picking up stories from Twitter? The answer here is unclear. In fact, everything about the relationship between professionalism and social media is unclear. As Reed suggests, when things begin to clear up, “it will be up to journalists and the organizations for which they work to decide how to preserve credibility in this environment that arguably demands more of them” (p. 58).

Good luck. It’s been a difficult road to “professionalize” journalism until now. Social media is not making it any easier. But, if journalists can do what they have always done well and get ahead of the trends, they may soon be able to use the social movement to help shape what it means to be professionally social.

Notes

[1] Reed, S. (2011). “Sports Journalists’ Use of Social Media and Its Effects on Professionalism.” Journal of Sports Media, 6(2), 43-64.

[2] Ornebring, H. (2008). “The Two Professionalisms of Journalism: Updating Journalism Research for the 21st Century.” Proceedings of the 2008 International Communication Association Annual Meeting, Quebec, Canada.

Risks of early adoption, or my ‘cloud’ hasn’t rolled in yet

[Image: iPhone launch. I sold my soul for the 3GS, but the 4 isn't much cheaper.]

With the most profound technological revolution in modern history currently at hand, it would appear that being an early adopter of technology – someone who is first in line for the newest tablet, smartphone, or cloud-enabled device – is the coolest thing someone could be.

But maybe not.

Very recently, both Sony and Amazon, two large companies that have taken advantage of the newly popular “cloud computing” model, came under fire for what are being called “elementary error[s]”: a hacked system in Sony’s case and logistical glitches that caused major network slowdowns in Amazon’s. While this makes for some great critique of the still-nebulous cloud technology at large, what’s to be made of technology’s earliest adopters?

Well, in Sony’s case, if you were an early adopter, there’s a chance that hackers had brief access to your secured credit card information. And that’s not fun. In an age when it has become not only increasingly handy to possess a high-powered smartphone but also fashionable to be seen carrying one, it’s no surprise that being an early adopter is something to be desired. But should it be?

[Image: Diffusion of Innovations graph. Get it now, laggard.]

Quite often the finances of adopting new technologies are taken into account (price is a classic variable in diffusion of innovations theory), and customers wait for price cuts before making a purchase. But much more is at stake. While we can search CNET for tech reviews at all hours of the day, no one predicted the glitches experienced by Sony and Amazon. Glitches may be minor or require a quick software update, but the true risks associated with many of these technologies are still unknown. The same is true of any and all social media (think privacy).

All this to say that until it is no longer “cool” to be an early adopter, a lot of innocent people may unnecessarily be putting themselves at risk without fully understanding the potential consequences. That fraternity party sounded like a blast a few hours before when all the fun people couldn’t stop talking about it, but from under the kitchen table where everyone woke up, things look a lot different.

For some comic relief, here’s a cartoon from The New Yorker:

Harry Potter reflecting apps-centered society?

[Image: On a scale from 1 to 10, how confused is Ron?]

It strikes me that Harry Potter is remarkably similar to our apps-centered world of technology. In the seventh book/movie, Hermione casts a spell that turns her bag/purse into a voluminous contraption that can hold just about anything without adding size or weight. The boys, Harry and Ron, are impressed, and it’s obvious they’d never thought of such a thing.

In our world of technology, Hermione would be considered an “early adopter,” or possibly even an “innovator,” who finds out about spells, incantations, and enchantments earlier than most and essentially influences others to use them. What if we thought of our smartphones and other gadgets, complete with their never-ending world of apps, as the wizard’s or witch’s magic wand? Certainly we’ll never learn all the spells, and may never really want to. But we can know some, and in exploring their uses – their pros and cons – for ourselves, we will simplify our lives with the use of magic: something not created, or even fully understood, by us.

But we also have to be aware of the side effects of dependence on “magic.” Magic is costly ($$$ per app), addictive (think of the Windows Phone commercials below), and will almost always end up controlling its user when it is not respected.

So, are you Hermione, whose use of magic is the result of her own exhaustive research? Are you like Ron, who, although rather inept at performing magic, still pushes forward despite his shortcomings, even if such effort results in nothing but frustration? Or are you more like Harry, whose natural talent and distracted personality leave him somewhere in the middle, knowing very well how to use some of the existing magic most applicable to his situation?

You know what comes next: I hope you’re not a Voldemort, punch-drunk on magic and the power it brings.

Apple’s losing it, literally

[Image: new iPhone prototype, courtesy of GIZMODO.com]

At first, I couldn’t believe it. An Apple employee mistakenly left his iPhone prototype at the pub. It’s hilarious.

And after a little while spent trying to figure out whether it was legitimately from Apple and not some imitation, the website that has been reporting the entire saga was asked to give the prototype device back in a kindly written letter from a senior vice president at the company.

This all comes just weeks after Apple released the iPad to the general public, and only a few months after it debuted to the media. In the days leading up to its original debut, media prophets were consulting their crystal balls in hopes that their greatest fantasies would finally come true in this new invention. Unfortunately, many critics were disappointed that the iPad didn’t support Flash technology and that you couldn’t make phone calls or multitask.

I’m just sad that they called it the iPad instead of the iSlate.

Regardless, the release of the iPad said something about Apple’s serious commitment to securing its creative property under lock, key, and electronic password until the exact point in time it should be made public.

This little incident, on the other hand, says something entirely different about media companies and their all-too-serious outlook on the future of technology. I hope the corporate executives of the large media companies understand this: the future is going to come, and there is nothing you can do to stop it. One day your product will be released, people will purchase it, and you will release a subsequent product with minor adjustments that even more people will purchase.

This is why the leak of the new iPhone prototype is so comical to me. It’s not that new a piece of technology; it’s an improvement on something they’ve already created. If you had given iPhone users the chance to come together and brainstorm what the next generation of iPhone would look like, they probably would have come up with something remarkably similar to what was leaked (social construction of technology theory in action).

Then again, this could all be an elaborate media stunt created by Apple to distract us from the real future iPhone.

In conclusion, my undergraduate advanced writing class taught me a few things, but one of the important things it taught me was not to take myself (or sports writing) so seriously. Technology is technology is technology. It’s going to come out, and it’s going to be replaced. We live in an extremely disposable world, and we are becoming more disposable by the day. The quicker technology inventors understand this, the more productive I think we’ll be as a society in the future.

Life in the fast lane: Immediate gratification


Take a look at this quote from Discoblog’s RSS feed:

new research suggests that ‘we are how we eat’ and that the mere thought of fast food can result in general impatience. Researchers from the University of Toronto conducted a series of experiments in which they showed volunteers logos from several fast-food chains or asked them to recall the last time they’d visited, writes Scientific American. And they found that folks who had thought about fast food would then read faster, even though no one told them to hurry.

The study reported a number of other findings linked to the need for convenience and speed. I wonder, though, what the findings of a similar study might be if researchers substituted Internet technology – and the personal devices capable of using high-speed Wi-Fi – for fast food. Maybe such a study already exists, but what if researchers compared groups of individuals who had access to iPhones, for example, with those who did not, and then asked everyone to track down a specific bit of information without the aid of an iPhone?

I have a feeling that the iPhone users might get stressed out more quickly. But until the study comes out, I’ll continue to assume that the constant need for ease of access in our country is turning us into stressed-out workaholics who value technology over, well, everything.