Monday, October 22, 2007

IGNORE HOLLYWOOD: FOLLOW ONE PATH

Published Oct. 5, 2006 in "The Oklahoma Daily"

Over the past two weeks, two movies employing actor Ashton Kutcher’s talents have been released. One is a drama/action flick with Kutcher playing one of the main roles. The second features the actor’s voice talents in a comedic animated feature.

While both films have performed respectably, the animated movie has easily outdone the live-action production.

Kutcher made his name in Hollywood as a solid comedic actor, the star of movies revolving around wondering where one’s car might be. In that vein, he was and remains successful, with numerous laugh-happy roles under his belt.

In the world of the small screen, he has gone on to produce several humorous television shows, in addition to his long-running sitcom work.

Kutcher is good at what he does.

As long as what he does is comedy.

Of late, he has forayed into dramatic roles, such as the aforementioned one. While name recognition and acting ability ensure that he does a decent job in these movies, they are generally nowhere near as successful as his comedic turns.

Without unduly insulting Kutcher or his career choices, his example reflects something more and more prevalent today — people trying to do too much.

This antithesis of specialization is most prominent in the entertainment field. It seems it is no longer profitable or desirable to remain just an actor or just a singer.

Singers try to become actors, and actors try to sing. A modeling career is almost obligatory. Let’s not forget the clothing and fragrance lines. A personal restaurant is always good. And of course, there are the reality shows about, ironically, one’s life away from the spotlight.

In the process of pursuing these divergent ends, most entertainers’ resumes suffer greatly. This is due to a combination of overexposure and concentrating more on pursuits beyond their original one. Sadly, once glittering careers are often reduced to a shadow of their former selves, supported by a lifeline of celebrity endorsements and talk show appearances.

If this were a phenomenon limited to the arena of entertainment, it would not be so bad. This mantra of “do everything,” however, has trickled down to other areas of society as well. Particularly ominous is the infatuation and support this theory enjoys in the arena of education.

“Well-rounded” has transcended the label of buzzword and become something of a holy grail when it comes to college applications.

Each year, more and more high school students hear that it is not enough to simply have good grades. They must have musical talent, leadership experience, volunteer work, athletic skill and any number of other attributes.

While the pursuit of these multiple endeavors is beneficial and worthy, a very crucial point is missed by many in our multitask-loving world — not all of these areas have to be mastered.

It is perfectly all right to specialize in one area, and maintain the others as hobbies or secondary pursuits.

In fact, this is a much closer model of the real world than one in which people are masters of all trades. True professional success is much more likely to come as a result of mastery of one skill rather than proficiency in many.

Let’s pretend that you’re about to undergo surgery to remove a tumor in your brain.

Would you even care whether your surgeon helped build an orphanage in Djibouti or could play the Brandenburg Concertos on three instruments while blindfolded?

No, you wouldn’t. As long as the surgeon was skilled with a scalpel and had thorough knowledge of neurosurgery, then it really wouldn’t matter what else he could or could not do.

This is just one example. In most any field, success and professional satisfaction are contingent not on wielding a plethora of skills all equally well, but on honing just the skills required for one’s particular arena.

By not focusing on other skills, one isn’t selling himself short, but rather is allowing himself to sharpen the skills that are truly important, the ones he cannot do without.

Take a lesson from the entertainers. Almost everyone who has tried to follow multiple career paths simultaneously has fallen into the trap and ended up with almost no career at all.

By all means, do anything and everything that interests you. But keep in mind that not all those things must be done to equally high standards.

Don’t let the pursuit of this specious notion derail you from the path of your true niche, whatever that may be.

Let Hollywood be alone in its mistakes. Your true calling in life is much too great a prize to sacrifice.

HELP YOURSELF BEFORE TURNING TO COURTS

Published Sep. 21, 2006 in "The Oklahoma Daily"

A few days ago, a woman was attacked by a neighbor’s dog in Tampa, Fla. There’s nothing unusual about that, as attacks by vicious dogs are an unfortunately common occurrence.

What makes this story stand out is that the poor woman, instead of simply wilting under the force of the hundred-pound Rottweiler’s bites, fought back. She took matters into her own hands and actually bit the dog. The dog immediately released her arm and retreated.

Beyond its amusing and heroic elements, this story also carries a symbolic one, with respect to the abuse of the legal system.

Our society seems to generate ever-increasing numbers of lawsuits. There’s nothing inherently wrong with lawsuits, because they are an integral part of the judicial system guaranteed by the Constitution.

America’s founding fathers knew that in any pluralistic nation, there would be disagreement between people. Thus, they designed a mechanism to settle those disputes in a civil manner with the rule of law.

Those founding fathers, however, probably did not foresee the sort of lawsuits being increasingly filed today.

For example, there have been a number of lawsuits seeking damages from fast-food chain McDonald’s for making their food so addictive. The claimants argue that this addictive property made them unable to resist the fatty foods and, as a result, they became obese. In other words, McDonald’s should pay for serving good-tasting food that their customers kept voluntarily returning for.

Now, I’m no authority on the legal codes of this country, but from a logical standpoint, customers suing a restaurant because its food is irresistibly tasty is truly baffling to me. Particularly when the lawsuit demands many thousands of dollars (if not more) as compensation.

Well now, why should it stop at McDonald’s? The cow that the burger patties came from is also to blame. The estate of the obviously dead cow and the rancher that raised it should also be made to pay for their parts in this fat-laden odyssey. And let’s not forget the fries that were likely super-sized by the disgruntled customers. Aren’t the potato farmers and the suppliers of the fry oil to blame also? Should they not also add their apologies (in the form of dollars, of course) to the settlement the plaintiffs seek?

No. In fact, this sort of lawsuit is flawed in its entirety. This is why almost all of them have been thrown out of court.

That’s all fine, but by the time they were thrown out, these lawsuits had wasted the time and resources of the courts they were heard in. Almost all courts around the nation are severely strained and have a solid backlog of cases. Baseless cases such as the aforementioned only aggravate the existing difficulties by adding to the courts’ workload and diverting resources from cases that actually have a valid basis for existing.

And what is truly disheartening is that the McDonald’s illustration is just one example.

There has been a deluge of these cases in recent years. Examples include parents suing a cheerleading coach for failing to appoint their young daughter squad captain and coffee shops being sued because their cups did not clearly indicate that the coffee contained within was indeed hot.

The story of the dog-biting woman from Florida illustrates two virtues which are crucial to lessening the number of these frivolous and laughable lawsuits.

First, simply exercising common sense would ensure that these lawsuits remain exactly where they should — deep inside the recesses of the minds of those who are thinking about suing. Secondly, and more importantly, one should take steps to address one’s own grievances.

I’m not advocating vigilantism; rather, I’m saying one should do exactly as the dog-bite victim did: work to resolve one’s troubles in a meaningful way.

Instead of suffering the dog bites and playing the role of helpless victim later, she used the human attributes of reason and rationality to improve the situation for all those involved, the dog included.

Does she have reason to sue? Definitely. Will she? Possibly.

If she does, will the fact that she actually fought off the dog add to her credibility? Absolutely.

This is the crucial piece that many money-hungry plaintiffs in our sue-happy society often overlook.

Before automatically dialing the nearest attorney of flexible standards, we should think about what we ourselves can do to improve our situation.

Everything is simply less complicated that way.

HOLIDAYS ARE LOSING MEANING

Published Sep. 7, 2006 in "The Oklahoma Daily"

By the time this is printed, the short week will almost be over.

For that short week, we can thank the enterprising spirit of the American worker. At least, that was what Labor Day originally commemorated. The first Labor Day celebrations were full of parading workers’ unions, speeches and celebrations for workers’ families. Since then, though, things have changed.

Granted, some of these changes have been undoubtedly for the better. The annual Labor Day Muscular Dystrophy Association Telethon, a relatively recent addition, raises millions of dollars every year.

Most of the changes, however, have simply turned this celebration of hard work and fruitful labor into an ironic smorgasbord of lethargy and unproductiveness.

For most people, the most significant aspect of Labor Day is that it offers an extra day’s respite from classes and work. The most profound and widespread celebration of this day may be reveling in a few more hours of sleep. And let’s not forget the killer sales at Best Buy and the mall. There’s nothing wrong with relaxing and shopping. But there is a problem with forgetting the original reason and significance for the holiday. That problem is all the more serious because it is not limited to just Labor Day.

Unfortunately, there are a number of holidays whose commemorations are getting farther and farther from their original intent. In some cases, certain holidays are not even being celebrated. To be fair, there is merit in some of these holiday boycotts.

An example is Columbus Day. Many Native Americans take much offense (rightly so, some would say) that a man whom they see as the primary reason for the eventual decimation of their people would be commemorated. Thus, in some areas, Columbus Day is celebrated with a minimum of festivities — if any at all.

However, there are no such controversies surrounding several other holidays fading into obscurity. Examples include Flag Day and Veterans’ Day.

Flag Day commemorates the choosing of the first American flag. Old Glory, the red, white and blue, the very symbol of the United States known the world over, was adopted on this day. And most people don’t even know when that day is.

Veterans’ Day is an even more meaningful holiday: it highlights the sacrifices and achievements of American veterans in all wars, regardless of outcome or public popularity. The holiday is intended to be a commemoration of the men and women who readily sacrifice for the defense of their country and fellow citizens at a moment’s notice.

Shamefully, outside of military bases and VFW posts, the celebrations of this solemn holiday are extremely limited, if existent at all.

Both of these holidays were once very widely celebrated. Today, however, most of us would be hard-pressed even to name the months they fall in.

Often, the only clue to these holidays’ arrival is a lack of postal service and a brief announcement on the local news. This is disheartening in and of itself, particularly since these holidays each have such a special significance. Even more foreboding is what this suggests could happen to today’s popularly celebrated holidays in the future.

Considering that Veterans’ Day and Flag Day were once so popular and well-known, one can’t help but wonder if this foreshadows the demise of more holidays. Particularly jarring is the idea that a holiday is only remembered for its material associations — extra sleep, cheaper shopping and so forth.

The reason and emotion behind the holiday, its true substance, seem worthy of remembrance only in the context of those things.

So, without the requisite sales and mass turkey dinners, are our children’s children destined to celebrate Thanksgiving like much of our generation celebrates Labor Day?

That is, concerned only with the extra sleep, and not at all with what it commemorates.

And is the Fourth of July to be known only for sales of fireworks and hot dogs sometime in the future?

I sincerely hope not, because it’s not a pretty picture.

While Flag Day and Veterans’ Day may never regain their once-popular and widespread commemorations, we can, and should, prevent this from happening to other holidays in the future.

When you find yourself with no classes on a weekday, think about why that is. Think about whose accomplishments or what commemoration allowed you to not have to listen to lectures or not take a test that day. Think about the significance of that day in history. And finally, take time to remember that day’s date for the years to come.

Oh, by the way, Flag Day is June 14, and Veterans Day is Nov. 11. You’re welcome.

COLUMN: NO SIDEWALK, BUT SAME PROBLEMS

Published Aug. 24, 2006 in "The Oklahoma Daily"

“Where the Sidewalk Ends” is the title of a book of Shel Silverstein’s poetry. As of late, it’s also a good description of the western side of South Oval.

Where there was once a well-trodden sidewalk, there are now new patches of grass squares. The old asphalt has been completely redone, and all traffic is now to use the new surface.

Since that part of the oval has always been off limits to anything larger than a golf cart, all traffic there is on foot, on two wheels or in the occasional golf cart.

While the new road surface is undoubtedly much improved, I am not so sure that it’s a good long-term traffic solution.

I hate to delve into the hard-science-major part of my brain and drag out quantitative comparisons so early on, but here, I must.

The old system of a sidewalk and the road simply had more usable surface area.

As the sidewalk in question was about 1,000 feet long (according to Google Maps) and most sidewalks are about four feet wide, 4,000 square feet of usable concrete have been done away with.

Less usable area means more congested traffic. This does not bode well, as traffic volume has not decreased at all.

In addition, not only did the old system have more space, it was better organized as well.

Most pedestrians, me included, used the sidewalk. Perhaps this was a result of the same innate mental guardian that makes me look both ways before crossing streets.

Whatever the reason, for the most part, cyclists had the road surface to themselves and this worked out well.

Pedestrians could travel at a comfortable pace, and not have to worry about bikes bearing down on them. Bike riders, in turn, could travel faster without having to look out for oblivious pedestrians blocking their path.

That’s no more. The more spacious, better organized South Oval of old has been turned into one all-purpose thoroughfare. This freshly resurfaced, single swath of concrete is now used by all forms of traffic. Based on first-hand observations over the past few days, this is less than ideal.

The fault for this, though, rests on no single group.

Many pedestrians see the new road as one giant sidewalk, and feel free to walk in groups of five or six abreast in the middle of the road. This effectively blocks most of the road for any bikes and the ubiquitous opening-week golf carts.

Some of those vehicle operators, however, feel that the new road is solely their domain. This results in high-speed runs and weaving maneuvers worthy of the Tour de France. Keep in mind, though, there are no pedestrians on the course of the actual Tour — and for very good reason.

So where does that leave us, the bikers and pedestrians of South Oval?

I would say that the old layout was a much better one. However, we can’t undo what has already been done, so we’re stuck with the new road. I wouldn’t really recommend simply using the other side of the oval — unless you particularly enjoy dodging buses and inhaling diesel fumes.

So, the thing to do, it seems, is to simply heed the advice of many a crossing guard and elementary school hall monitor: watch out for other people.

If you prefer walking, think of those on bikes. Keep in mind that it is much easier and safer for you to stop and turn on your feet than someone balancing on two thin tires and a bunch of metal tubes. There’s a good reason that bike helmets have been invented and walking helmets have not.

If you’d rather pedal, then keep in mind that most pedestrians want you to crash into them just as much as you want to crash into them. They’ll move out of your way given proper warning, especially from behind.

We all have to conscientiously share and use this newfangled path, whether we like it or not. So do just that — share the road. And spread the love.

LEARN TO LEARN FROM OTHERS

Published May 2, 2006 in "The Oklahoma Daily"

The human race is constantly progressing in its accumulation of knowledge. From the discovery of fire by the cavemen to Roman law to Renaissance architecture to modern medicine, the sum total of human knowledge is astounding in breadth and depth.


This knowledge has been and continues to be gained in one of two ways. The first is by individual minds deducing and inducing ideas from nature. Newton's now-famous gravity experiments illustrate this.


The second way, the more important way, is by transferring and enhancing knowledge among people and groups.


Consider the idea of inoculation, which was used to counter smallpox. It was first developed by a Turkish physician, but shot to prominence when successfully employed by an American doctor.


I call the second aforementioned method more important simply because when more people are involved, it follows that the knowledge gained will usually be greater.


This is all the more important given the diverse and disparate groups of people on this planet. The population of the planet has well surpassed six billion, and the growth shows no signs of slowing down.


Those six billion souls occupy every corner of the planet, at least some of the time. In the time that they are there, they undoubtedly acquire knowledge about their surroundings.


They have to because that knowledge stems from every facet of life. Thus, generally speaking, humankind is always learning because humankind is always living.


If all this knowledge were transferred and shared in the way that inoculation was, the amount amassed through cooperative learning would truly be indescribably massive.


Or, in simpler terms, as long as people in one part of the globe had knowledge of something, it would only be a matter of time and interaction before all humankind shared in its knowledge.


There is, however, one major obstacle preventing the success of the above scenario. It is man-made, baseless and completely preventable.


I'm speaking of the evil of prejudice. Throughout history, and unfortunately even today, people have thought of others as inferior, based on things such as lifestyle and skin color.


Some of that inferiority is tangible, such as the level of technological sophistication or military strength.


However, such material deficiencies should never be held to be indicative of a shortfall of useful, advanced knowledge. Consider the following:


The settlers of the first permanent English settlement in North America, Jamestown, Va., refused to adopt Native American farming methods.


Such "primitive" people obviously could not know anything that English gentlemen didn't, or so the thinking went.


That most of the settlers nearly perished from a lack of food simply underscores the sheer fallacy of such a pretense.


Even in more recent times, such prejudices have persisted. Indigenous tribes, living in numerous areas, employ natural remedies and folk treatments.


These were derided as unscientific voodoo-esque practices by many in the scientific community well into the 20th century.


Recent research, however, has shown that a great many of those remedies actually have the potential to stop many deadly diseases that have plagued humans for centuries.


Just because these tribespeople have never seen the inside of a lab or a doctor's waiting room does not mean that their medicine is any less potent or innovative.


Indeed, some of their previous detractors are now using their knowledge of the natural world to forge new and powerful treatments to serve all people.


Humans are much the same, no matter where or how we live. We look similar, overall. We have the same needs.


Most importantly, we are always trying to better our lives. We are continually learning new things to do just that, both consciously and unconsciously.


Given all these glaring similarities, something as heinous and baseless as prejudice is simply counterproductive.


Not only does it belittle others unfairly, but it also keeps their accumulated knowledge from being put to widespread good use because of a false belief of its inferiority.


So when it comes to people, what they wear, how they live and where they live should not be of paramount importance.


What truly matters is what they know, and how that knowledge can be used to improve the lives of all humans.

'REALITY' TELEVISION USUALLY NOT WORTH WATCHING

Published Apr. 19, 2006 in "The Oklahoma Daily"



For the vast majority of Saturdays between the ages of 8 and 11, I was up at 6 a.m.


Watching those morning cartoons was the highlight of my week.


As I got older, school, activities and, well, life, began to take up more of my time.


My trusty Panasonic 20-inch became more and more a mirror in a fancy plastic box. Now that I'm older, I realize that television programming has never been near the zenith of human achievement.


Even so, in the last few years, it has inched ever closer to rock bottom, if you ask me. The little television I watch, and used to enjoy, has become permeated with substandard wastes of camera time.


What vile vehicle has dragged the boob tube to such abysmal depths? I give you the reality show.


"Survivor" was arguably the first of the modern breed. It was actually somewhat decent entertainment. Then came copycats like "The Amazing Race" and "Big Brother."


With each passing iteration of these shows, quality dropped lower and lower. This downward spiral continued until now such shows as "Fear Factor" and "The Bachelor" serve as primetime viewing.


The actors in the shows may be average people, but the situations certainly are not.


Think about it. How often are you on a small island playing asinine games for water? Have you tried to get your father married off to women half his age?


Even on a much simpler level, consider this: Anytime a camera crew is present, no situation, no matter how natural, can be called reality.


The so-called "reality" of these shows is nothing more than a contrived amalgam of sketches designed to garner ratings.


Not that there is anything wrong with contrived amalgams of sketches. It's what all TV programming is.


It's not because of this that I decry reality shows. Here is my main beef with so-called reality TV: It's not reality.


These shows are about as far from most viewers' realities as PBS is from Cinemax.


Well, why not drop the reality-show moniker and just call it a "show"? That only solves half the problem.


The shows in question are hardly quality entertainment. Granted, networks have the right to broadcast whatever shows they like, as long as there are no violations of decency standards, a la Janet Jackson.


However, being an opinion columnist, I must relay what I think of the aforementioned shows.


In a word: horrible. I'm not a prude when it comes to my entertainment. I don't expect every show I watch to reinforce family values or enlighten me in some way. I'm open to meaningless entertainment.


But seriously, I don't see how people being compelled to eat insects and horse innards constitutes entertainment in any form. My tolerance for entertainment is pretty broad, but I really don't like it to kill my appetite and make me gag.


And call me old-fashioned, but isn't finding a fiancée supposed to be a romantic affair, and not a TV smorgasbord of speed-dating beauty queens?


Basically, calling most of these sorts of shows "entertainment" is insulting to the very idea of entertainment.


So there you have it: "Reality shows" certainly aren't reality, and in their extreme forms are borderline repulsive.


So, why are they broadcast in the time slots previously occupied by traditional sitcoms, dramas and such?


The answer lies in the copycat nature of much of the telecommunications sector. Given the runaway success of the original "Survivor," suddenly every network rushed to produce reality shows.


In that rush, the thought and foresight that went into planning "Survivor" was not apparent. Therefore, the new shows had little of the entertainment and none of the originality of that show.


As a testament to this, most of those copycats were canceled after just a few seasons, while "Survivor" has continued, now in its 12th season.


So, where does all this leave me, the disgruntled viewer? It leaves me imploring networks to not develop new reality shows.


The concept has been overdone, and the only shows continually succeeding are those that were the original pioneers.


More importantly, it leaves me asking you, my fellow viewers, to not give the rating-chasers any reason to justify creating any more shows.


If you really want reality, just turn off the TVs and go outside.

UOSA CANDIDATES MODELS OF POLITICAL CIVILITY

Published Apr. 5, 2006 in "The Oklahoma Daily"

Your first clue was probably the barrage of signs standing at attention in the grass.


Or maybe you grew accustomed to being handed some kind of vote flyer every time you passed Dale Hall.


In any case, unless you were staging a one-student sit-in within the confines of your room during the previous week, you're likely aware that the UOSA presidential election recently took place.


As with all elections, the preceding campaigning was done tirelessly by candidates and their legions of supporters.


From the aforementioned flyers and signs to chalkings and individual handshakes, virtually every campaign tactic possible was used.


There was one notable exception, at least in the public arena. Nowhere was there any negative campaigning or mudslinging to be found.


On a well-regulated campus such as ours, this is as likely due to OU rules as it is to candidate decency.


No matter the reason, the mudslinging was absent, and that is to be commended.


Look around at virtually any election of consequence at any level of government.


Almost guaranteed, every candidate will precede it with a campaign that paints every other candidate in a negative light.


Many will say this is a necessary part of establishing oneself as the best candidate. With this I cannot agree.


While it is necessary to establish superiority over the other candidates, this method of negative campaigning is not the way to do it.


An ideal election is one in which the voters choose the best candidate.


However, due to the maelstrom of mudslinging that characterizes current elections, most voters do not receive enough credible information to determine who is the best candidate. Instead, most simply vote for the person who appears to not be the worst.


This method of simply picking the "least of the evils" is not what a democratic election ought to be. But because of the rampant smear campaigns, it has emerged as the only way.


Indeed, I would go so far as to say that these campaigns have played a significant role in the increasingly negative image of the political system when compared to previous decades.


The main force behind this is probably the general voter's view of most negative campaigns.


A propensity to attack competing candidates while not highlighting one's own platform generally conveys a lack of confidence in that platform.


And if the candidate himself does not believe in the value of his platform, then why should the voter?


While not all politicians practice negative campaigns, the vast majority of them unfortunately do.


This election season, count the number of negative ads each candidate sponsors, and you'll see what I mean.


With each negative ad they run, the candidates get closer and closer to the increasingly popular caricature of shady, conniving, opportunistic baby-kissers.


The UOSA candidates certainly aren't such people. They are running simply because they want to be a larger part of this university, not because their livelihood will benefit from their potential office.


As cliché as it may sound, the candidates are truly running their own campaigns.


As such, they cannot shift blame for bad decisions or unfavorable campaign stunts onto party strategists or bellicose campaign managers.


Each candidate really is only representing him or herself.


Whether a candidate had your vote from the very moment of announcing his or her candidacy, or won it through long hours of campaigning, each stood as a public example of political civility.


They did their best to portray themselves favorably, but stopped well short of sullying their opponents' names.


Whether you particularly cared for any of the candidates or not, I hope you at least enjoyed the campaigns of each ticket.


In our modern democratic system of party strategies, focus groups, multimillionaire donors and ultra-active PACs, it's pretty safe to assume that campaign civility is likely seen nowhere except on South Oval.