Category Archives: The Public Square

iDistraction

As you read this, you’re probably thinking about whether your classmate has written you back yet. You might have stopped halfway through that last sentence to check your e-mail. Maybe you checked your Facebook to see what time some event starts tonight. If I still have your attention right now, I have overcome the most difficult hurdle for a writer in the age of technology: keeping someone’s attention when there are so many other things at your fingertips.

Still with me? Good.

Last week, I was at dinner with four friends. Each one had his sleek black iPhone sitting out. In the middle of dinner, one friend responded to a buzz and proceeded to text message under the table. He checked CNN. With time to spare, he checked the weather. This was all before the entrees arrived. Twitter was during dessert.

I felt left out with my Sony Ericsson sitting on the table beside me. By comparison, my phone can send text messages and take pictures. That’s about it. I was at a crossroads. Would I spring for a more versatile phone to fit in with my friends, or would I keep this limited, practically archaic phone?

The answer came when my mom asked me if I’d like a BlackBerry for my soon-approaching birthday. Almost without hesitation, I said no.

I’m afraid of what such a multifaceted piece of technology will do to my already screen-oriented lifestyle as a college student. I’m addicted to Facebook, and I don’t want it to take over my life any more than it already has. I’m completely vulnerable to distractions whenever I sit down in front of my computer to do something, like, say, write this blog. Do I want to finish this assignment, or do I want to watch The Office instantly on Netflix? What could have been an hour spent editing a paper for class turns into a three-hour marathon where people can chat with me through at least four different media: Skype, Gmail, AOL or Facebook. Then comes 2 a.m. and that paper still isn’t finished. And still, chat windows are popping up from friends and acquaintances alike who want to engage in entirely insignificant conversations at such a late hour. I know I can’t be the only one who allows distractions to get in the way.

When I leave my computer, that’s my only opportunity to be free of all the confines that technology brings. We are all but tied to a screen. Don’t get me started on the future of bookstores thanks to the Kindle. Call this all Emersonian, but I’m afraid that the advances of technology are making us so distracted that we can’t enjoy the moment anymore. I consider the compaction of different technologies to be more of a setback than progress. In some ways, it is completely helpful to have infinite knowledge only a click away. It’s great when you’re lost and need to look up a map, or for communicating with people you would never have kept in touch with otherwise. But when you can’t get through a dinner without everyone whipping out their phones … I see that as problematic. How important is it to know instantaneously whether someone has e-mailed you, at the expense of giving the person in front of you your undivided attention?

I am personally taking small steps to help combat my own proneness to distractions. Ironically, one of the solutions has come via a new application for Macs.

This is what my screen looks like as I write this:

This program, WriteRoom, blocks out everything else — every Internet page I have open, every chat window — to create an entirely black screen with, I admit, cryptic green writing. As long as it’s open, there is no Twitter or e-mail to distract me. Unlike the programs that try to do everything, WriteRoom is a completely minimalist writing tool. I can’t even distract myself by changing fonts.

I use this in class so that I don’t become one of those students who check Facebook and tune their teacher out. I use it at home when I’m writing an assignment so that my desire to procrastinate doesn’t trump the need to work. I’m sure the kids sitting in back of me in class think that some alien force has taken over my computer. Maybe so, but it’s worth it.

Now if only there were an application to block out every distraction when you’re at dinner with a friend.

3 Comments

Filed under The Public Square

In this corner, a rogue. In the other, Oprah.

As always, Sarah Palin has been attracting more attention than she probably deserves. Alaska’s resident hockey mom had a lot going on this week. First, the release of her memoir, Going Rogue. Then, she duked it out for top billing against the father of her daughter’s baby, Levi Johnston, who made a…erm… debut of his own in Playgirl. Throw in an unauthorized Newsweek cover that had her crying out sexism and an interview with Oprah, and we had ourselves a Sarah Palin bonanza.

Palin detractors are astounded by her staying power months after her media gaffes have tallied way past the point of excusable. How, they ask, is Palin, an ex-governor and former vice presidential candidate, able to score Newsweek’s cover story and Oprah’s hot seat all in such quick succession?

If anything, I blame the media personalities who have had the opportunity to interview her. The way they have framed their questions and depicted her has given Palin a spot forevermore on the political stage.

Ever since Palin captivated American audiences with her debut speech at the Republican National Convention, people have had either positive or negative feelings toward her. Palin moderates are few and far between — so few and far that I don’t know of any. She has been an extremely polarizing figure since day one. In turn, the news media also fed into the fixation. Interviewers like Katie Couric and Oprah do not treat Palin as an ordinary politician. Their respective approaches toward Palin have created a celebrity out of her.

Because interviewers are the people who lead the dialogue, interviews are primarily reflective of the interviewer’s personality and motives. Most reporters go in with a preconceived notion of what they expect to extract from their subject. Whether they are successful depends on their own preparedness, as well as the preparedness and personality of their subject. Both players, therefore, are crucial to the outcome of the interview.

The David Frost/Richard Nixon interviews proved how the personalities of the two combatants affect the outcome of an interview. Here were two men floundering for their former glory, who both had everything to lose and everything to gain during those 1977 interviews. So set was Frost on getting what he and the American public wanted — a confession and an apology — that the whole interview took on the feel of a hostile interrogator grilling a defendant. Outside of this zero-sum situation, Frost and Nixon had a mutual respect for each other’s modest beginnings. But Frost’s motivations outweighed his desire to approach Nixon on human grounds. In other words, he was no Oprah.

Couric’s sit-down with Palin during the 2008 election was, like the Frost/Nixon session, a battle of wits. Couric’s skepticism and negativity toward Palin were evident. She tossed curveballs at Palin that would seem like throwaway questions for any other person. For example: “What do you read?” Here, Couric strategically wanted to catch Palin off guard. She did not ask that question expecting Palin to come back with a rundown of her favorite publications. Palin, unsurprisingly, offered so many unintelligent, rambling answers that they quickly became fodder for Tina Fey and the rest of the Saturday Night Live writers. (Palin also told Oprah that she was flustered and put off by Couric’s inquiry into her choice of reading materials, deeming the question an insult to her intelligence.) That one interview contributed to the undermining of the McCain-Palin ticket and our endless fascination with her.

In contrast, critics noticed that Palin’s recent interview with Oprah on Monday didn’t result in as many flubs as her interview with Couric. Palin appeared confident and more like the Sarah we first saw when she clued us in to the difference between hockey moms and pit bulls. Oprah scaled back her usual “How do you feel…” line of questioning. Though she did push Palin on certain other subjects, Oprah was for the most part willing to stick to a Palin-approved script about Going Rogue. It didn’t hurt either that Palin had pages of recently published autobiography material to draw upon. Getting to mull over and write down her version of history — the script of her life — before being asked about it on live TV did wonders for preparedness, I’m sure.

Neither interview was that of an unbiased interlocutor asking a politician straightforward questions. Palin was either the subject of too much scrutiny and gotcha journalism, as with Katie Couric, or she was not scrutinized enough, as with Oprah. In short, Palin has never been treated as a normal guest. As a result, the public does not see her as ordinary, and she has risen in the ranks from mere politician to a celebrity wonder because of that.

If the interviewer and subject do not have the same goals for the interview or do not see eye-to-eye, the interview quickly escalates from a question-and-answer session into a journalism duel. A seasoned politician, a rogue even, should be able to handle either situation. Chalk it up to inexperience and unpreparedness on Palin’s part. Still, Couric and Oprah have bought into, and even fueled, the media’s obsession with Palin. Thus, it is Palin who comes out the true winner.

2 Comments

Filed under The Public Square

Lord of the blogs

Most professors want students to continue to avoid Wikipedia like the plague. Almost every syllabus I’ve ever skimmed on the first day of class said that the site does not count as a source in any bibliography for any assignment. Ever. Professionals and journalists, too, lament how the emerging blogosphere allows Joe Schmoe, who has no press credentials, to write anything as fact without being held accountable, thereby skirting the traditional information gatekeepers. Still, the popular online encyclopedia says a lot about the future of the Internet and information.

Academics’ disdain for Wikipedia is legitimate to an extent. We shouldn’t necessarily rely on one page for all our information. The evolution of Wikipedia, however, shows that people demand the same order and bureaucracy in the digital age that they require in everyday society. As much as we brandish our First Amendment rights, there is resistance to user-generated content being uncensored. Wikipedia is already going in a direction where its users are expected to cite their sources. Eventually the site could become just as reliable as an academic journal.

Further proof that Wikipedia is susceptible to intervention and not just an academic Lord of the Flies free-for-all occurred last week when two Germans convicted of murdering an actor sued Wikipedia for publishing their names on the site. According to German privacy law, criminals are granted anonymity in the news after they have paid their debt to society. The two men, who have served their sentence, want Wikipedia to grant them the anonymity they are given back home — pitting American First Amendment rights against other countries’ laws.

Though the outcome hasn’t been decided, the case shows that the information posted by anyone on Wikipedia is not immune to being challenged. Wikipedia’s content is now subject to the law, and to other institutions. For example, during journalist David Rohde’s abduction, Wikipedia honored The New York Times’ request that no one write of his kidnapping. The Times felt that such news would only further endanger Rohde. Not only did Wikipedia participate in this unprecedented media blackout, but editors also went in and revised information to make Rohde appear more sympathetic toward the Muslim faith, so as to appease his kidnappers.

Online articles have begun linking to Wikipedia. One article I was reading on Slate about physiognomy directed me to Wikipedia’s definition of the art. Though our professors might still scorn the site, newspapers — which are supposed to be the ultimate source of credibility — are now entrusting Wikipedia to give readers accurate information.

And as of August, Wikipedia editors control the information on pages about living people. Though anyone can still go in and update such a page, the changes don’t show up until an editor has reviewed and approved the revision.

This democratic information age is mirroring a real life democratic society. The people’s voices still matter, but gatekeepers are springing up to bring order. Some people have cried out against this new Wikipedia filter, the way some people desire limited government.

Though Wikipedia is still riddled with errors and isn’t the most reliable source, the direction it is aiming for indicates that online information can only be entirely unrestricted for so long.

6 Comments

Filed under The Public Square

Oh, Mickey, you’re so…fine?

M-I-C

See you real soon!

K-E-Y

Why? Because we like you.

M-O-U-S-E.

Ah, the Mickey Mouse Club. Those were the days. Britney Spears, fresh-faced and pure, existing in harmony with fellow cast member Christina Aguilera, sans shaved head and pregnant Jamie Lynn. Little Justin Timberlake, from head to toe in gray flannel, not a tattoo in sight. Those were the days before putting Britney and Justin together in a sentence prompted questions about Britney’s virginity, Lance Bass’s sexuality and Kevin Federline’s existence.

Those were the days I preferred. Mickey Mouse was the umbrella under which all this squeaky clean, good fun existed.

Now replace that image of cheery Mickey with a dark, brooding Mickey.

This is Disney’s latest rendering of Mickey’s world in the video game Epic Mickey. If this were the world Mickey lived in during the Mickey Mouse Club, the show probably would have been a lot less early 1990s and a lot more midriff from Britney and Christina.

Some people have taken to decrying the transformation of the Disney icon. Significantly, those people are my age and older. The Mickey debate reveals a deep generational conflict. It seems that those close in age to me and older are nostalgic for the feel-good shows that were on a few years ago. I can’t even begin to express how many times I’ve sat through a conversation rehashing old Nickelodeon cartoons and how much better they were than the shows on today.

“Doug!”

“What about Salute Your Shorts?”

“Oh yeah!” a few people exclaim, pretending to conjure up the tune of Camp Anawanna from the deep abyss of their memory, though they just had this same conversation last week.

“And Hey Arnold!”

Many of the same people, including myself, aren’t interested in the less heartfelt cartoons that are on TV today. Our preference for the gentle and mild is drastically different from a more desensitized generation that prefers Epic Mickey. Disney’s team of researchers has concluded that, indeed, Epic Mickey is what the kids want.

BusinessWeek took note that businesses would have to completely revamp their marketing approach for the latest generation:

Marketers haven’t been dealt an opportunity like this since the baby boom hit. Yet for a lot of entrenched brands, Gen Y poses mammoth risks. Boomer brands flopped in their attempts to reach Generation X, but with a mere 17 million in its ranks, that miss was tolerable. The boomer brands won’t get off so lightly with Gen Y. This is the first generation to come along that’s big enough to hurt a boomer brand simply by giving it the cold shoulder–and big enough to launch rival brands with enough heft to threaten the status quo.

Generation Y, at 60 million strong according to BusinessWeek in 1999, is a generation that is exposed to more graphic and explicit images at an early age. With far less censorship, movies are packed with sex, action and violence. As a result, desensitization occurs.

I see this within my own family. Though my brother and I are part of the same generation, I grew up much more sheltered since I’m almost 10 years older. My brother, 12, spends a lot more time than I did playing violent video games and watching cruder cartoons. When I was 12, there was no Facebook. My brother uses it to talk to his friends about the fights he witnesses at school. Same parenting, different generations.

Interestingly, Epic Mickey was not a concept dreamed up by the old-timers at Disney, but by interns working there in 2004. This further supports the notion that the desensitization of Mickey is an idea embraced by a younger, less nostalgic demographic.

The premise of the game is ironically symbolic of Mickey’s struggle with his celebrity status:

Mickey is forced to become more aggressive because he’s entered a dark Disney world where he is no longer famous. He must scribble and draw his way through different levels (which, granted, doesn’t seem so frightening) to reclaim his spotlight as Disney’s mascot. This evil underworld — where celebrities’ stardom ceases to shine — is run by Oswald the Lucky Rabbit. (Desperate for fame … evil underworld … I wonder if Lindsay Lohan was one of those interns.) If you don’t know who Oswald is, that’s the point. He was the Disney star pre-Mickey Mouse, before Disney disputes led the character down the rabbit hole and into obscurity. Disney just reacquired the rights to the character in 2006.

Why bother to touch a beloved character and overhaul a trademark? Disney thinks this risky venture is going to breathe new life into a character that generates about $5 billion in merchandise a year. (That’s a lot of cheese for a little mouse.) Disney is afraid that Mickey has lost his appeal to younger audiences. On a different note, Mickey remains extremely popular internationally, which suggests that perhaps it is specifically the American mindset that is desensitized. It’s unknown whether the Disney researchers have made any conclusive findings about this, though.

Still, the generational divide seems to be the biggest issue at play. A secondary divide is gender. The company is actively trying to draw in young boys, whose interests lie with edgier characters. Disney’s studies on boys’ interests have concluded, not surprisingly, that Disney princesses and everything sweet and nice don’t cut it for the boys the way they do for the girls. Boys are still watching Disney, but girls are more likely to follow through with buying merchandise. Because of this, we have new Mickey merchandise that looks like this:

Boys, Disney found, are more likely to publicly proclaim their fandom for cars, dinosaurs, and now, conniving Mickey Mouse. I can see where there is some logic to Disney’s approach. My friend touts his love for Disney/Pixar’s Cars. “Why do you like Cars so much?” I asked him, expecting him to gush about the music, or the subtle innuendos.

“Um…it has cars.” Well put.

I for one would rather not see Mickey become part of some Disney black hole. I am much fonder of the Mickey Mouse that — forget a conniving grin — had a toothless smile. Rather than reinventing an icon, there must be a way to bring back the Mickey we know and love so that everyone wins — the young and the old, the boys and the girls. Hopefully the next set of Disney interns gets to work.

5 Comments

Filed under The Public Square

War of the Worlds Strikes Again

Almost 71 years ago this Halloween, people cowered in their living rooms. They prayed and cried. The aliens were invading.

Given the fun Orson Welles undoubtedly had spooking the public that night, it might not be so far-fetched to think he is the culprit behind nearly every fright story reported on cable TV news. His being dead, however, quickly kills that hypothesis. This can only mean that the sensationalistic stories about the rise of teenage “Super-Predators” and “bio-underclass” crack babies are neither parody nor farce.

If we bought the alien story in 1938, on Halloween no less, who would bother to question stories like Newsweek’s “After Iran Gets the Bomb” issue last week, with its none-too-subtle photo of a mushroom cloud? After all, how many people even know that Iran has a “no strike first” doctrine?

Almost every article is susceptible to scaremongering these days. Just today, a gem of an article on CNN about online identity theft was one of the headlining stories:  “If you’re on Facebook, Twitter or any other social networking site, you could be the next victim,” it began.

But before I cower under my bed and delete my Facebook account, lest I become a victim of something called “phishing,” which sounds ominous and yet is left undefined, let’s do some quick math (something that journalists are averse to):

There are 300 million users on Facebook. Since 2006, the article said, there have been 3,200 cases of this social networking “cybercrime.” (Again with the scary words.) A few clicks on my calculator widget tell me that this works out to roughly a thousandth of one percent of users over the whole period — about 0.0003 percent a year — and even less if we counted only the cases involving Facebook.
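For anyone who wants to skip the calculator widget, here is the same back-of-the-envelope math as a short Python sketch. It uses only the figures cited above; the 3.5-year span (2006 through late 2009) is my assumption for the per-year estimate.

```python
# Back-of-the-envelope check using the article's own numbers.
users = 300_000_000   # Facebook users, as cited
cases = 3_200         # social-networking cybercrime cases since 2006, as cited
years = 3.5           # assumed: roughly 2006 through late 2009

overall_pct = cases / users * 100   # share of users affected over the whole period
yearly_pct = overall_pct / years    # rough annualized share

print(f"overall: {overall_pct:.4f}% of users")   # about 0.0011%
print(f"per year: {yearly_pct:.4f}% of users")   # about 0.0003%
```

Either way you slice it, the figure is thousands of times smaller than the "less than 1 percent" ceiling, which is the whole point: the scary headline survives only if nobody does the division.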

I think my Facebook account may just live to see another day.

Journalists are abandoning their purpose as they continue to twist and distort facts into dramatics. President Obama is spurning Fox News for its one-sided, skewed reporting. If he really wanted to make a statement for the public good and not just his administration, he could extend the same treatment, or at least a rebuke, to every news organization that does a disservice to the public by paralyzing it with hair-raising phrases and inaccurate statistics.

So what if the media is prone to spurious reporting, you might ask.  Shouldn’t the average news consumer shoulder the blame for being so darn gullible?

I concede that it might do us some good to approach what we read and hear more analytically, and take the time to fact check the likelihood of becoming the next target on a cyberthief’s to-do list. Still, we shouldn’t just chalk our gullibility up to stupidity or laziness.

As for the War of the Worlds incident, let’s give listeners the benefit of the doubt. Perhaps they were so hysterical that they missed the disclaimers that it was a hoax. (Please, humanity, let this be true.) Also, consider the time it was broadcast — shortly before World War II. People were especially on edge, the way we were after September 11. No one expected it to happen, and no one could anticipate what might come next. Things we never dreamed of happening really did turn our world upside down. That, with the onslaught of bleak news — some accurate and some not so much — leaves us vulnerable to the idea that we could be the next victims of something out of this world. And the news media capitalize on that.

That’s why we don’t always think to second-guess reporters. The way we don’t expect someone who has spent every Sunday in services to steal, we also don’t expect a reporter who has been trained in a code of ethics to create words out of thin air like “Super-Predators” and “crack babies.” Et tu, Nancy Grace?

So no, I don’t believe we can place too much blame on the consumer. To borrow a line from Spiderman’s Uncle Ben, I think that we’ve entrusted journalists with “great power and great responsibility.” We’ve bestowed them with the job of gatekeeper. We’ve put our trust in their hands. It is the journalist’s job to produce work that does not require hours on end of sifting through information to figure out where the cool logic and reasoning is hidden among vague and misreported facts.

The problem boils down to just plain careless reporting. There are two main reasons I see for this:

First, to throw in another superhero reference for good measure: A news company’s kryptonite is its profits. More than ever, the media are driven by the need to survive, and that means money. To be the first across the finish line with the story, decisions are made on the fly, instead of being painstakingly deliberated. (Now if only Clark Kent applied his superhero powers to ridding The Daily Planet of sensationalism.)

Second, journalists are paranoid freaks like the rest of us. (Guilty.) Maybe they are writing about the rise of Super-Predators because they truly believe they’re out there lurking in the dark, waiting. Despite what journalists say about being unbiased watchdogs, reporters do not just throw away their identities, morals, personalities and beliefs when they sit down to write. The words they write and say are the best window into their cognition. It’s likely that if reporters are spewing out jitter-inducing phrases, they are allowing their own paranoia to blind them from reality. They too are vulnerable to the frightful ideas their sources leak.

So, journalists, I implore you to examine your priorities and your frame of mind. Both are preventing you from upholding the good name of journalism. Remember: With great power comes great responsibility.

2 Comments

Filed under The Public Square

Why do I have to pay to work for you?

It’s application season for students across America. Parents of high school students are preparing to reach deep into their pockets to pay for hefty tuition bills. They might expect this to be the last major investment they make before their children are out the door for good. But they might be wrong. In today’s economic environment, even work experience is a commodity.

For example: One father bid $30,200 on an internship at GQ for his son.

During college, advisers stress the importance of internships.  There are two types: paid and unpaid. Because of the economy, it is not so surprising that internships are usually unpaid, considering how many businesses can barely even afford to pay the salaries of their full-time staffers. Employers insist that if the internship is indeed an unpaid position, students must get class credit through their university. The catch: Registering for a class, especially at an expensive, private university, can cost thousands of dollars. At USC, a normal internship would cost roughly $2,000. As a result, students — or, I should say, the students who are able to — are shelling out thousands for the opportunity to use internships as a resume booster.

Oftentimes, whether a student applies for an internship boils down to whether he or she can afford it. What I see here is a division — a class divide — between who can and cannot get internships. In the next few years, we can expect to see that the people with jobs are the ones who were able to pay to get their foot in the door. This system is undermining colleges’ need-blind policies. Even if a financially disadvantaged student is able to get into Harvard or Stanford, or any other elite university, based solely on his merits, he will be at a disadvantage getting an internship unless he can find a way to fund it.

And because jobs are so scarce, it’s especially important now that students have previous work experience to put on their resumes to have a competitive edge. Internships can determine future employment, which means it’s the affluent students who have an advantage in the working world.

Talent can still emerge victorious when it comes to paid internships. For obvious reasons, these are the most sought after by students. Here, businesses will at least offer a stipend to help with the cost of housing for the duration of the internship. These positions, however, are harder to come by and especially competitive. A Los Angeles Times recruiter said recently that he insists on paying interns. The downside is that it limits the number of interns he can hire. Because of this, he can only offer internships in the summer instead of year-round.

Some companies that once paid their interns are now cutting back. A Pittsburgh Post-Gazette contact wrote that they no longer pay their interns.

Because unpaid internships are most common, our current internship system primarily supports a class divide.

Can’t these businesses get creative and be flexible in their definition of “compensation”? Or, what about the spirit of volunteerism? Why can’t someone choose to be a “volunteer” and forgo the title of “intern”? People can spend hours volunteering without any compensation.

Students who want an internship but can’t afford it are forced to think of ways to circumvent the system. Some students get their class credit at a community college because the classes are cheaper. If you’re like me, you hope your employer forgets all about it. That worked the first time around, because the senator’s office had more to worry about, like saving the United States from economic ruin and what flavor fro-yo to get.

This year, I may not be so lucky. One internship program at USC sounded fantastic … until I heard how much it cost. Immediately I was deterred from applying. The program matches students with companies in New York for the summer and costs $6,800 — not including airfare, food or the commute to the internship site every day. I don’t know what the exact breakdown was, but $2,000 accounted for the cost of a 2-unit class, and I’m assuming about $2,000 for housing. The rest, I’m not sure. Also — financial aid does not apply.

I’m sure some parents are more than willing to pay for their child to have the internship opportunity. After all, parents pay for private schools, private tutors and private college counselors to put their children ahead in the college admissions game. I don’t think it’s necessarily wrong. They want the best for their kids, and it’s unfortunate that paying exorbitant fees is what it takes to get the best for them.

I have hope that merit still means something regardless of how much you can pay. I hope that one day there will be a more efficient internship system that realizes not every student can pay thousands of dollars to learn valuable work skills.


Fear Mongering in the 21st Century Newsroom

The other night I was riding the metro home when a kind, albeit chatty, woman sitting beside me said she was relieved to see three officers patrolling our compartment. “Anyone could be on this train. There could be a terrorist here. For all anyone knows, there could be a bomb in your bag,” she said, nudging my modest tote bag with her foot.

Deep down, I think she probably knew the chances of a terrorist riding from Los Angeles to Lancaster were slim to none.  Even someone concealing a weapon would have been a stretch. Though I know this to be true, I am not immune either to letting irrational fears take over, to surveying fellow passengers, strangers and classmates, and sometimes wondering, What if? I am not immune to jumping at an abrupt sound, only to realize it’s the loud slap of a flip-flop coming down a stairway.

In an attempt to combat the paranoia I sometimes see in myself and in others, I decided to analyze what could possibly allow our minds to overhype normal, day-to-day activities and transform them into the next potential Columbine.

The answer, I believe, lies in my chosen career path, my self-imposed calling: journalism.

Here’s the thing about what I write and what I read in the news: It’s almost entirely depressing. A father worried about the economy kills his family and then himself. Identity theft is on the rise. Disabled people are more likely to be victims of crime than anyone else. The top three viewed stories on CNN.com right now are, “Suspect named in death of actress’s fiancé,” “Former Japanese finance minister found dead,” and “Beaten teen’s funeral held in Chicago.”

But before I continue, I want to be clear that I’m not arguing we shield ourselves from the ugly in the world. Journalists have a responsibility to report what’s going on, even if that means reporting the disheartening and the scary. I don’t expect, nor do I think, that bad news is going to go away. However — if journalists are going to engage in this type of reporting, I am calling on them to take a look at how they present these stories, to ask themselves why they’re presenting them that way and to take a moment and consider the effect that presentation can have on an impressionable public. Though fear mongering and sensationalism are by no means new to journalism, the enhanced pressure of deadlines and competition in the 24/7 news era lends itself to irresponsibility in the newsroom. This, in turn, fosters an unrealistic sense of imminent danger and perpetuates feelings of anxiety for both journalists and news consumers.

Just by considering the very definition of news, it becomes immediately apparent why the stories we read are making the front page. News, by definition, is something new. (Earth-shattering, I know.) If it happened every day, it wouldn’t be newsworthy. It is supposed to take us by surprise, or at least present something unusual. By that definition, it makes sense that headlining stories involve car chases, gang wars and homicides. What is ironic is that the more we read and watch stories about danger and violence, the more we begin to think these aren’t just isolated, extraordinary circumstances — which is the very reason they made the news in the first place. They are so prevalent in everything we read and watch that over time they begin to fuse, creating a horror-filled world. They begin to manifest as real threats and trends. When a channel dedicates 55 minutes of its hour-long news program to evildoers, kidnappings and shootings, it impresses upon us that this is the end-all of what’s happening in the world.

Because of this, a “culture of fear” has emerged. Author and USC executive vice provost Barry Glassner’s book by the same name, subtitled “Why Americans Are Afraid of the Wrong Things,” argues that the media are very much responsible for the widespread pathology of paranoia in the United States.  Though journalists have a right and a duty to report distressing statistics and stories, he makes a convincing case that they are also misreporting, focusing on the wrong topics, and in some cases, creating doom and gloom where it isn’t called for.

In one example from Glassner’s book, he addresses a common fear: the fear of flying. He notes that the media do tell audiences that three times as many people die in car accidents in any given year as have died on flights in the entire history of commercial aviation. At the same time, they continue to play up the fear of flying. Misunderstandings also lead to misreporting. Glassner cites a 1998 story in the Washington Post called “Airline Accident Rate is Highest in 13 Years” (Glassner 183). In actuality, Glassner said, the rate of accidents was on a decline. The number of accidents had gone up because more flights were taking off — but this didn’t affect the rate. We will explore possible reasons for such false reporting a little bit later.

Like the plane example, reporters tend to invoke fear over the wrong topics by blowing isolated events out of proportion, distracting us from more worthwhile fears. With so much bad news to choose from, editors and journalists sometimes pick the wrong bad news, emphasizing illegitimate concerns instead of the real problems. One story that garnered a lot of attention back in the 1990s was the “growing epidemic” of road rage. In a Los Angeles Times article, the reporter opens, as the L.A. Times is apt to do, with an anecdote about someone being shot on the road. Several paragraphs down, the reporter writes:

“Road rage has become an exploding phenomenon across the country, but nowhere has it been more painful or pronounced than in the Pacific Northwest. Five drivers or their passengers have died since 1993” (Murphy).

Given that the article was written in 1998, that statistic averages out to about one person per year dying from road rage, which hardly classifies as “an exploding phenomenon.” The topic was not confined to this one article. Just last year, the L.A. Times ran another story called “Road-rage killings strike again: They are random and senseless” (Vartabedian). The reporter says that neither the Los Angeles Police Department nor the California Highway Patrol keeps track of how many people are shot on the road by other motorists. Regardless, he says, shootings have a long history — the most famous being John F. Kennedy’s. That calls into question exactly what falls under this ambiguous “road rage” categorization. I was under the impression JFK’s death was a planned assassination, not a knee-jerk reaction incited by JFK’s motorcade tailgating another driver.

Other times, a fear is warranted, but it lacks proper context. Sometimes only a few minutes more of information gathering could put a story in perspective. Let’s go back to that headline about the disabled being the most likely victims of crimes (Freiden). As soon as I read this story on CNN, my mind raced to my autistic brother, a sweet, unassuming little boy, who would never hurt anyone. This story would have me believe my brother is just another statistic waiting to happen. What the story didn’t tell me is what percentage of disabled people are assaulted. My guess is there are millions of disabled people, which means that, hypothetically, maybe 3 percent of disabled people are ever victimized. Also, what types of people are being defined as disabled? The story is vague and could mean anyone — from someone who is temporarily on crutches to someone who is dyslexic. Journalists, so pressed for time, don’t stop to consider the bigger picture or the effect their words might have on a reader like me.

This leads me to my next question: Why is this happening? I’ve been taught in journalism school to put aside my biases, but I firmly believe that most journalists who aspire to write hard news enter the industry with good intentions. Why does ethical, thorough reporting become skewed and sensationalized?

More than ever before, journalism has become a profit-driven industry. To stay alive in an era that my professors are more or less referring to as the demise of the print industry, reporters have the pressure of fierce competition from all directions, including the blogosphere. Looking back at the airline story where the reporters and editors mistook the word “rates” for “incidents,” it is possible it was a careless mistake made under a tight deadline. It is a writer’s job to ask all the right questions and make sure he or she understands the concepts behind whatever technical jargon their interview subjects may spit out. If the reporter asked for clarification, it’s possible the “rates” versus “incidents” misunderstanding could have been avoided. Also, because of layoffs in the newsroom and time pressure, there isn’t as much staff or as much time to double check and question what a writer puts down on paper. Writers have to rely on themselves to get things right more than ever.

Picking the stories that will generate the most attention is an unfortunate given. Unfortunate, but necessary in order to keep afloat among such fierce competition. In Social Marketing in the 21st Century, Alan R. Andreasen set out to explain why the media, as Glassner pointed out, acknowledge that one is more likely to die in a bathtub than in a plane crash, yet continue to stir up alarm over flying:

“Crashes make for vivid reporting, tragic human interest stories, and sometimes episodes of bravery and sacrifice. Such stories play out over many news cycles and can reinforce a common fear that flying is quite risky” (44).

The problem here is that we, the news consumers, buy into these stories. We eat up tragic stories. It’s like driving by an accident; it’s upsetting and scarring, but we slow down even though our instincts tell us to look away. It’s a symbiotic relationship: The readers are feeding the journalists, and the journalists are feeding the readers.

Profit-driven reporting spills over into how the news media brand events. As a viewer is flipping through channels, the news organization knows it has about a five-second window of opportunity to grab the viewer’s attention with a compelling phrase. For example, Virginia Tech wasn’t just a mass shooting. CNN dubbed it “Virginia Tech Massacre” — the text oozing down the screen in red, bloodlike script. Here, although sensationalism sells and scare tactics do attract immediate attention, the media are also running the risk of turning away the faint of heart. I personally know people who can’t stand to watch the news, because it’s so overwhelming and emotionally draining. Language is a powerful tool, and journalists who are mindful of their audience would do well to wield their newspeak carefully.

The last explanation I will offer for journalists’ fear mongering is the least mainstream, but something I see from personal experience as having a large impact on the way reporters report: the psychological effect of news on the journalist. If the constant barrage of bad news has an effect on the news audience, consider the effect it has on the journalist, whose life is dedicated to scoping out these stories in the 24/7 news age. In the Internet era, readers can pick and choose what they want to know about. They have the option of filtering out the disillusioning and the depressing. They can choose to stock up on the latest from Perez Hilton instead. For journalists, on the other hand, if they aren’t practicing community journalism — which is more attuned to local Little League victories than catastrophic events — or entertainment journalism, they can either grin and bear the downpour of bad news they’re dealt, or they can submit to it.

One indication that journalists are negatively affected by what they report is their reputation for smoking and drinking, not only because of the stress of deadline — although that is no minor factor — but also because of the nature of the stories they cover. Truman Capote said, “I drink, because it’s the only time I can stand it” (Waldron). Similarly, photojournalist and recovering alcoholic David Ogot said “most journalists smoke and drink to drown the images and pitiful stories they have to cover” (Arogo).

Former reporter William J. Drummond warned his colleagues to take the proper steps to heal their wounds:

“When [journalists] tell stories of trauma, loss, suffering, they do not walk away clean. Yet, as an industry, the news media offer little if any preparation or comfort to its workers who face this kind of emotional meat grinder. Instead the journalists are expected to suck it up and move on. When somebody shoots up a schoolyard, the school officials routinely call in grief counselors to tend to the survivors. But the reporters and camera people who covered that story get no such attention. They saddle up and go off to the next crime scene” (par. 13).

What results, I believe, is that the more journalists report on bad news, the more their rose-tinted lens on the world turns to gray. Journalists are often cynics. I would argue that their psychology, their inflicted state of mind brought on over time by the nature of their work, pours into how they approach every new story. What’s the catch? they wonder. It is understandable why writers might have a worst-case scenario mentality, since the worst case is often what they are exposed to. One recent example of a writer scrounging for something negative in what could have been an all-around happy story was an article about Elizabeth Smart finally getting her day in court to testify. “Too often, it seems, the world is filled with bad news,” the reporter began. “But every now and then […] a story has a happy ending.” Then, the article takes an abrupt turn for the worse, and unnecessarily, too. He concludes: “Thankfully, we aren’t left to deal with the aftermath of what may have happened had she not been found. Unfortunately, not all children are as lucky,” and then goes on to list statistics about the hundreds of thousands of children gone missing each year. The transition from happy to downright Debbie Downer was tangential at best. You can just see him floundering, trying to rid cynicism from his system and then…failing.

It’s almost as if journalists have entered a frame of mind where they so expect things to be wrong that, when they aren’t wrong, this presents a fear of a looming catastrophe — a fear of the unknown. This, in my opinion, is the worst form of fear mongering — when a writer’s own trepidation becomes warrant for a story. A classic example is a Newsweek story — “The Lull Before the Storm?” The article, written in 1995, begins by saying crime fell by 7 percent nationally since 1990 — a fact that mayors were “euphoric” about. The writer follows up that spoonful of sugar with this:

“… there is bad news ahead. Criminologists are already warning that the United States can expect another wave of violent crime in the coming decade, and some say it will be much worse than the one that is now subsiding.”

The writer next references another article that “ominously predicts ‘The Coming of the Super-Predators’—teenage boys who routinely carry guns, who ‘have absolutely no respect for human life’ and who ‘kill and maim on impulse, without any intelligible motive’” (Smith, Beals, Brant, Annin).

First, what sets “Super-Predators” apart from the passable, OK-At-Best Predators? Do they wear capes? Second, why are these fear-based predictions making national news? The effect on readers, especially young readers who have to worry about their lab partner becoming the next Super-Predator, is unnecessarily paralyzing.  The writer should hold off on writing about these mutant teenagers until that fear comes to pass.

Perhaps even more discouraging: As I read this story, I saw an adjacent and very graphic ad of a mutated, scabbed and pothole-infested back that read, “If you’ve had chickenpox, you’re at risk of Shingles.” Great. If I’m not worried about a Super-Predator killing my roommate, I now have to worry about my 6-year-old bout of chickenpox becoming a blistering skin rash. Dan Rather once said, “Once we begin to see ourselves as more of a business and less a public service, the decline in quality is accelerated” (Mindich 80). Though undoubtedly good journalism is practiced every day, it is disappointing when business considerations and scare tactics overshadow quality reporting.

Journalists owe it to their public and to themselves to take a step back and put the story in perspective — to consider the emotional toll that resonates with news consumers. “It can happen to you” and “you’re next” is not ethical, whether it’s allowing a suspect ad about shingles to interrupt the flow of text to make money, or letting an isolated road rage incident become an exploding phenomenon. We should remember that one of the most revered journalists in history is Edward R. Murrow, who incidentally made it his job to dispel irrational fears. News corporations would also do well to put scaremongers in check, acknowledge that happy stories sell, and realize that striking an emotional balance might not hurt sales.

What’s more, “real” journalists grumble about gossip magazines, entertainment television and tabloids. The proliferation of celebrity news disturbs some journalists who consider themselves the stalwarts of hard news, but they need to ask themselves why people choose to consume “guilty pleasures” so voraciously. The answer is in the word “pleasure.” It’s light. It makes people happy, and happiness sells. Journalists don’t have to smear their good name and debate “Who wore it better? Paris Hilton or Lindsay Lohan?”, but they can tell respectable, happy stories with no catch. In fact, one of the biggest stories earlier this year was pilot Chesley “Sully” Sullenberger’s successful landing of US Airways Flight 1549 on the Hudson River. Heroism and feel-good stories have a place in the market.

If journalists can temper their language and report the right stories the right way, it is possible that the faint of heart can watch the 11 o’clock news and still manage to clock in some peaceful sleep. Misplaced fears are detrimental to the minds of Americans and are undermining our ability to address the truly pressing issues of the day. The world we see on our televisions is scarier than the world outside our front doors. Journalists are supposed to be the watchdogs of society, not the harbingers of hysteria.

Works Cited
