(10/10/07 11:17pm)
I personally believe that there should be a requirement that every politician who runs for public office must have smoked pot at some point. Even if that experience doesn’t make the politician want to legalize it, he or she will at least realize how dangerous it isn’t.
My position on this issue was only strengthened this week after watching a CNN video of Mitt Romney, in typical 2008 Republican front-runner style, dismissing a multiple sclerosis sufferer who was advocating an end to medical marijuana arrests. The MS sufferer caught Romney on camera and explained to him that, although he is against legalizing marijuana, the smoked form of the drug is the only pain reliever for his lifelong illness that he can use without getting sick.
His question, then, was “Will you arrest me and my doctors if I get medical marijuana prescribed to me?”
Romney dodged the question, answering, “I’m not in favor of medical marijuana being legal.” After that, he returned to his mission of shaking hands with as many rally attendees as possible, ignoring journalists who pressed him to answer the man’s question.
Romney’s attitude toward the MS patient exemplifies the 2008 Republican front-runners’ chronic dodging of the issue of medical marijuana arrests and raids on medical marijuana dispensaries, which have been common since the U.S. Supreme Court decided Gonzales v. Raich in 2005. The ruling allowed federal officers to arrest sellers and users of medical marijuana, regardless of individual state laws.
Determined to at least appear concerned for everyone’s well-being, the candidates have tried to justify their anti-medical marijuana stance by pointing out the drug’s safety issues, health risks and potential to proliferate recreational drug use.
But that appearance falls apart when someone brings up the topic of medical marijuana arrests and dispensary raids.
Standing firm in the belief that cancer patients and well-meaning doctors should be tossed in the slammer doesn’t exude that same sense of compassion about public health.
So to avoid the hypocrisy, the candidates draw attention away from the arrests and toward the drug’s risks.
When a woman at a New Hampshire conference last week asked John McCain whether he would legally allow her use of medical marijuana, he replied:
“You may be one of the unique cases in America that only medical marijuana can relieve pain from ... Every medical expert I know of, including the (American Medical Association), says there are much more effective and much more, uh, better treatments for pain.”
And last week at another conference, when a woman asked Rudy Giuliani about his position on the raids, he, too, avoided the topic and talked about the FDA’s evaluation of cannabis alternatives.
The health and safety issues medical marijuana presents are important topics for political discussion. But the discussion that needs to come first is the one about people who are getting arrested for trying to put themselves out of agony while hurting no one else – and how to stop those arrests.
(09/27/07 2:11am)
Being the language addict I am, I was drawn this past week to an Associated Press article about how thousands of languages around the world are dying out as speakers turn to more commercially useful or national languages.
According to the article, the world has about 7,000 living languages, as many as half of which have never been written down. For some languages, that is because researchers have not yet gained cultural access to them; for others, it is because we do not yet even know they exist. Languages are dying out at a rate of about one every other week.
People hardly ever think about the importance of language on a greater scale than its usefulness as a personal communication tool. Especially as English-speaking Americans, we tend to have a “Tower of Babel” mentality that language plurality is evil. That unfortunate mentality often gets in the way of being able to communicate with everyone. But we would not be better off if everyone spoke English, because every language teaches us something new about human thought.
A lot of people are apathetic about language extinction. Their philosophy: Times change, attitudes change; languages die out when they are not useful anymore. Speakers of a language obviously have a nostalgic drive to keep it alive because of cultural heritage, but evolution will eventually kill it just like every other language in history. It’s silly to keep a language alive if the only culture that needed it to communicate is dead.
This viewpoint is right, to an extent. Language is meant for communication, and language death is an inevitable part of history. But for linguistic researchers, language is a tool for a different purpose. Whereas speakers use it to make meaning of their place and time in the world, researchers use it to compare those meanings.
Of course, living languages are the most useful for making such comparisons because they are more closely connected to their cultures.
But even languages that have been artificially kept alive allow researchers to see helpful cultural remnants. Because no two languages are the same, even these remnants provide valuable information about language that cannot be obtained from other languages.
If we all spoke English, we would be less likely to find out how much we really “know” about the world around us. By studying languages, we learn that what we thought was “real” or universal is often metaphorical, or related only to knowledge of what is current and immediately around us. In English, we might think objects relate to other objects in a certain way, then realize how relative our ideas were when we look at how those objects are expressed in another language.
Although, as college students hanging around IU, we do not have much control over language extinction, we should try to understand that the plurality of languages is important. Any knowledge we can gain from foreign languages is knowledge we would not otherwise have about the human experience.
(09/13/07 1:47am)
In the epic Technicolor film “Spartacus,” Julius Caesar sneers at a comment the politician Marcus Licinius Crassus makes about God putting Rome on a pedestal. Caesar says, “I’d no idea you’d grown so religious.”
Crassus responds with laughter, “It doesn’t matter. If there were no gods at all, I’d still revere them.”
Using the appearance of piety as a control mechanism is one of government’s oldest tricks. Yet people still elect visible public leaders on the not only incorrect but also dangerous belief that a politician’s theism makes him more trustworthy.
The Pew Research Center and the Pew Forum on Religion and Public Life released the results this week of a telephone poll they conducted last month about which religious beliefs held by a presidential candidate would make voters more or less likely to vote for that politician.
The poll found that 61 percent of Americans would be less likely to vote for an atheist, 45 percent would be less likely to vote for a Muslim and 25 percent would be less likely to vote for a Mormon, while only 16 percent said they would be less likely to vote for an evangelical.
The argument I usually hear from non-religious people against most Americans’ preference for a theistic leader over an atheistic one is that people can be moral without being religious.
They are right, but this should not even be the issue. Yes, politicians often use religion as a public-relations ploy. And yes, belief in God and claimed adherence to that belief are not a good gauge of morality.
But more importantly, when politicians let any moral desires – any that are extraneous to the desire to keep their electors alive and safe – influence their decisions, they perpetuate the hazardous myth that “legal” equals “moral” and “illegal” equals “immoral.”
Many people inevitably realize, when a crime is outlawed because of its “immorality,” that the government has exaggerated the danger.
Although these crimes often have at least a somewhat logical basis for being illegal, officials demean that logic by resorting to talk of the crimes’ “immorality.” And because not everyone shares those morals, some people then see not getting caught as the only logical reason for not committing them.
At the same time, plenty of legal activity goes on that, by the same alleged logic, seems just as immoral or more so. It is legal to gamble on a riverboat, for instance, but not on the Internet, and it is legal to be drunk but not high. These laws, of course, do not prevent illegal gambling or drug use, and they sidestep the shaky reasoning that the banned activities are “immoral.”
Religion serves as an easy way to control politics, and the best policies would be made in a world where public leaders did not feel their electors needed to know their religious beliefs at all. Although politicians don’t need to keep their religion a secret, it should never play a part in their policy.
(09/06/07 4:00am)
"Year of the Dog" is an endearing movie, but just how endearing you find it depends on how much you like animals.
If you love animals, it will probably be your kind of movie. If you're ambivalent about animals, you'll probably be ambivalent about the movie. While "Year of the Dog" has its charms, most of its "heartwarming" factor is wrapped up in its furrier characters.
The plot revolves around awkward spinster secretary Peggy (Molly Shannon), a reliable woman whom everyone takes for granted but who has no real soul mate except her dog, Pencil. When Pencil dies after eating poison in the neighbor's yard, Peggy tries to cope with the loss by deepening her relationships with people. None of them, however, are her cup of tea.
She finally finds shelter volunteering with a dog adoption group. There she also meets a charming vegan named Newt (Peter Sarsgaard), who turns out to be celibate but who nonetheless convinces her, too, to give up the "murder" of animals and pursue the valiant cause of animal rights activism.
Your investment in the plot might be relative to your affection for smaller mammals, but the movie's characters are its saving grace. They are quirky, slightly detached from reality and interact in hilarious ways.
But even the appealing plot and fabulous characters can't save the movie from its awful ending. It's sappy, none of the plot points gets any closure, and I would have felt completely cheated had it not been for the movie's witty interactions.
The special features, however, are better. There are interviews with Molly Shannon and director Mike White and features on the making of the movie and on the group that trained its animals.
Overall, "Year of the Dog" is worth seeing if you're an animal lover. If not, it's just decent.
(09/04/07 3:18am)
I hate exclamation points.
If one grammatical matter can give away someone’s lack of writing ability and inept vocabulary faster than anything else, it’s the use of several exclamation points in a row.
I thought anyone who knew anything about writing agreed with me here, but on Thursday, Slate ran an article arguing not only that exclamation points are a good thing but that we should use more of them.
It was not, however, actually talking about “writing.” Rather, it was talking about the speaking-writing hybrid we’ve come to know as Internet talk.
The Slate article claimed that we use exclamation points in Internet chatter out of a subconscious admission that what we are typing is not really worthy of being written at all. Our writing evolved from a system in which time and writing materials were scarce, the article argues, so every word had to count.
Fourteenth-century monks never would have written to their friends that they would “totally, like, meet them in 15 minutes,” and such traditions of writing have been passed down to us. That’s why you probably have friends who get peeved if you type “lol” or omit the first two letters of “you.”
Slate, and many people I know, seem to think we are demeaning writing by letting it take on this shallow purpose of saying things that don’t “deserve” to be said. However, I’m inclined to disagree, because I wouldn’t call Internet chatter “writing,” per se.
If language were movies, verbal speech would be analogous to live action, and writing would correspond to animation. Neither can completely represent real life, but the first comes more naturally and can more naturally reflect lifelike situations. The latter is composed of a set of symbols whose meaning and use must be consciously learned.
Internet speech is more like claymation. It’s a hybrid we created to strike a balance for a task that neither of our previously established systems could totally cover.
Speaking seemed inadequate because it was too lifelike to be written; writing seemed inadequate because it was too formal for the quick conversation created through typing and the instant sending and receiving of messages.
Eventually, Internet speech became its own form of language expression, equally valid in its semantics. Internet chatterers actually use a surprisingly complex thought process when determining whether to use “lol” or “hahahaha” in an AIM conversation, or when deciding whether to follow up a statement with two exclamation points or three after a friend e-mails to ask how much fun an event was.
This fine distinction doesn’t present itself in your term paper. Regardless of whether you write “Nero played the violin while Rome burned!!” or “Nero played the violin while Rome burned!!!”, you look like you don’t know the rules of writing.
Even though I, as a writer, still cringe at the sight of exclamation points, I think it’s good that they’re finding use in Internet speech. It’s always exciting to see the possibilities of language evolve.
(08/29/07 10:39pm)
Last week I received an e-mail, disguised as a press release, informing me that Michael Vick’s dogfighting sentence clearly means abortion should be outlawed.
I never would have made this brilliant connection by myself. But because I am always looking for ways to gain more press credibility, I decided to try my hand at emulating this news wire’s superior reasoning skills by making equally well-thought-out correlations among other important news topics from this past week.
The week’s most compelling story was America’s increasing waistline. Obesity rates increased in 31 states last year and decreased in none, according to data released by the nonprofit disease prevention agency Trust for America’s Health. Nineteen states now have obesity rates of more than 25 percent, up from 14 states last year.
Though we’re staying at home to sit around, we’ve been active on the other side of the globe. Due in large part to our efforts to fight the Taliban insurgency, more Afghans have been staying home to farm opium. The country’s production of the drug reached an all-time high this year, up 34 percent from its all-time high last year.
Perhaps this sounds like bad news because we’re helping to fuel a dangerous black market, but I see it as the solution to our obesity epidemic. Who ever heard of getting the munchies from heroin? If we replaced, say, half of our food production industry with the opium trade and promoted the drug as a healthy alternative to alcohol and marijuana, I’m sure we’d see a reduction in our nation’s collective waist size.
In food news, the Federal Trade Commission failed in its attempt to block the $565 million merger of the natural-foods chains Whole Foods Market and Wild Oats Market on the basis that the merger would decrease competition. Apparently some judicial branch employee was intelligent enough to realize that the chains didn’t usually have stores in the same cities, but that’s not why this event is important.
It is important because now that Big Organic is taking over food (which will assist all that opium in making us healthier), we’ll have to take our penchant for engineering genetic mutations to other objects, such as shoes. While shopping in Houston, a reader of the popular Internet blog Boing Boing spotted pairs of “Cruggs,” the unfortunate result of a footwear mating between Crocs and Uggs.
President Bush made an equally charming connection when he pointed to Vietnam as a reason to stay in Iraq. Attorney General Alberto Gonzales stood up for him by resigning, walking out in hopes of still maintaining a relationship with something that once had lunch with dignity.
Bush is looking for potential appointees to the empty position, and I have no doubt they are jumping at the chance to fulfill such an honor.
But snark mostly aside, I say all this should circle back to Michael Vick, probably the second-most-hated man in America, and Bush should make him attorney general for the year instead of letting him hang around prison.
(08/01/07 9:22pm)
Not a week goes by that I don’t notice one of my Facebook friends crawling across my News Feed, informing me they’ve joined either a group declaring the superiority of good grammar or a group that wants the Grammar Nazis to shut up.
I understand both where these people are coming from and from where they are coming.
As an English major, a linguistics major and the IDS copy chief, I have conflicting grammar addictions. But for that same reason, I can’t help feeling that the people who wish everyone would talk “right” and the people who want to do away with this kind of standardization altogether are missing out on the freeing capabilities of language.
Language is like any sport: To get the most out of it, you have to respect the rules.
Both the crowd that wants grammar standardized – in linguistics, we call these people “prescriptive grammarians” – and the crowd that wants language left to standardize itself without snooty people restricting it – the “descriptive grammarians” – hold the positions they do because they want language to be respected.
Prescriptive grammarians think people who carelessly throw out sentences that end in prepositions, and grocery store owners who post signs at express checkout counters reading “15 items or less” (if you’re not aware of this old English major jab, it’s supposed to be “15 items or fewer”), are disrespecting language because they don’t care enough about it to learn the rules.
Descriptive grammarians, on the other hand, point out that prescription is dictated by upper socioeconomic classes and majorities. The reason, for instance, that you’ve been taught since second grade not to use “ain’t” isn’t that the word doesn’t make sense; it’s that the word makes you sound lower-class. Descriptive grammarians contend that anything a person says that makes sense to the party they are talking to should be considered “grammatical.”
I agree with the descriptive grammarians.
Distinctions that parade under the guise of “good grammar” are just prejudices against lower classes and minorities. But I also agree with the old proverb: Rules are made to be broken.
As long as we have upper classes, we’ll have prescriptive grammar rules. And yes, those rules are stupid and arbitrary. But knowing the rules, and knowing how ridiculous they are, gives you the power to break them.
Standardization is OK to an extent. It can improve communication by decreasing ambiguity and by using structure to help people say what they mean. But it also pressures people into identifying with classes, races, genders and other groups to which they feel they do not belong, because they want to avoid standing out as inferior.
Believing in “proper grammar” is prejudiced, and believing all standardization of grammar should be abolished or ignored is unrealistic. Knowing the time and place to use standardized grammar is helpful, but saying it should be used all the time or should serve as a universal measure of intelligence is just pretentious. The best way to respect language and get the most out of it is to learn the rules, then break them as often as possible.
(07/25/07 8:18pm)
You know something’s ingrained in American culture when the government finally catches on to it.
CNN sped up that process earlier this week by introducing the Democratic candidates to the cultural phenomenon that is YouTube.
Monday evening, the cable news channel aired a debate in which, rather than being asked questions by a moderator in the traditional fashion, the candidates answered questions ordinary Americans had posed to them in YouTube clips.
I never watch presidential debates because they’re full of candidates sidestepping issues and steering topics toward whatever they want to babble about. But this one piqued my interest. Despite not being impressed by any of the candidates, I was impressed by the way the debate brought democracy to the forefront of American politics.
American politics is plagued by the problem of looking bureaucratic and seeming irrelevant to the everyman. Even if you feel close to the most prominent political issues because you are gay or plan on chucking your abstinence-only education by getting as many abortions as possible, it’s hard to feel like you have any swaying power over the American government’s course of action in Iraq or Darfur.
Plenty of us discuss these topics with friends and on Internet message boards and blogs, but we rarely bother to communicate them to our government. We figure they will get lost in the stacks of probably better-written letters sitting in our representative’s office, or we think officials might not listen at all if we do not represent a significant interest group or if we represent an idea they oppose.
Discussing our opinions with friends and Internet communities hardly gives us any more political power, and instead, politics ends up serving as a thought experiment.
But this debate attempted to bridge this impasse.
It allowed the thought experiment to serve as communication with the government and therefore made the everyman relevant on a personal level in American politics.
It also showed which candidates were more in tune with the lives of the average American and knew how to communicate in layman’s terms, as opposed to those so wrapped up in Washington and so used to spinning issues that they no longer find the American people themselves important.
I am not sure whether, in practice, the debate highlighted the relevance of candidates who would not otherwise have received the support of the American people (Hillary Clinton clearly came out on top). However, it did bring attention to the importance of candidates being able to communicate clearly with the American people.
Of course, not all presidential debates in the future should be formatted this way. As boring as they are, the traditional-style debates show which candidates look better suited for the formal, bureaucratic tasks required of a president.
But Monday night’s debate was good for the conversation between the American people and their leaders, a conversation that is all too often nonexistent.
(07/18/07 9:18pm)
People have many good reasons for wearing clothes. Not getting arrested, for instance. But sexual advertisement? Not so much.
This week’s online edition of Newsweek features an article by Jennie Yabroff titled “Girls Gone Mild(er): A New Modesty Movement.” The article discusses a trend of teenage girls who are “rejecting ‘bad girl’ roles embodied by Britney Spears, Bratz Dolls and the nameless, shirtless thousands in ‘Girls Gone Wild’ videos.” Instead, they are choosing to wear the most boring clothes imaginable in the name of looking “modest.”
The article talks in large part about one author who believes this trend is “a welcome corrective to our licentious, oversexed times.” But if it’s a “corrective” force against over-sexualization, it’s an even more sexual movement.
Dressing modestly is in itself no less sexual than wearing stripper attire. The latter is more overt about its intentions, but the purpose of both is to advertise a sexual attitude. Onlookers often interpret gender based on an individual’s outward appearance, and performative gender is practically impossible to tease entirely out of the way one dresses.
In light of that problem, it is healthier for teenage girls who are not ready to have sex to display that attitude through the way they dress than to dress in a way that sends a message about their sexuality they don’t mean, solely to meet societal pressure.
But just because sexuality can’t be eliminated from dress doesn’t mean it has to be the focus. Applauding girls for displaying a more truthful representation of their sexuality is still applauding them for thinking the point of the way they dress is to represent a gender role.
As I said before, representing gender is not the only reason to wear clothes. And since we have to wear them, we may as well make them into a form of self-expression.
Clothes can express their wearer’s interest in plenty of ideas more interesting and less outmoded than the gender expectations they abide by. They can be used to express interests in art, music, politics or any number of other opinions.
I’m not saying it’s necessarily wrong to use dress to display sexual interest, just that making it the central focus only propels the nonsense that girls’ and women’s express purpose in dressing should be sex.
Neither dressing modestly nor leaving nothing to the imagination empowers women, because both reduce their self-expression to their gender. On the other hand, using dress first and foremost as a display of other forms of self-expression automatically brings about sexuality in a much more interesting and empowering way. Rather than looking “sexy” because of their position on sex, women look sexy because of a more rounded presentation of their personalities.
This rhetoric of “dressing modestly” only adds to the problem of over-sexualization. To truly correct the problem, we should let girls – and women – know that the empowering way to be sexy is to be an individual.
(07/11/07 8:41pm)
I would like to think that if Al Gore, Sting and Madonna would stop abusing the privilege of exhaling carbon dioxide, global warming would cease to be a problem.
Much to my dismay, their efforts last Saturday with Live Earth, the music event that featured 100-plus artists on seven continents rocking against global warming for 24 hours straight, did not correct this issue.
You might be under the impression, then, that I, a person with a brain and the Internet, am among the critics who condemn the event as useless because it was full of boring entertainers with carbon footprints greater than small countries’ who propose inane “solutions” to global warming.
On the contrary, I happen to believe the event was productive. Not because AFI and Alec Baldwin did much worthwhile themselves, but because the press slammed them for being unproductive.
I don’t know if it’s what he was attempting, but Al Gore deserves kudos for brain-parenting an event with so much potential for satire. I barely made it through watching Leonardo DiCaprio introduce Gore before I started nodding off, but the most entertaining thing I’ve read all week has been National Review Online writer Mark Hemingway’s “Living Through Live Earth” diary of the event.
In his scathing breakdown of the coverage, he points out how ineffectual the “solutions” these celebrities propose for global warming really are. KT Tunstall, for instance, boasts of offsetting her carbon emissions via the impractical-if-everyone-practiced-it plan of planting 6,000 trees. A Czech supermodel suggested that global warming, rather than shifting tectonic plates, was what caused her husband’s death in the 2004 tsunami. And Melissa Etheridge doesn’t even realize that the climate-change problem she’s droning on about is the now-irrelevant theory of global cooling.
But Hemingway was by no means the only one to denounce Live Earth as counterproductive.
The press, including such major papers as the Washington Post, the New York Times and the Los Angeles Times, brought to attention many hypocrisies of the event, from its massive carbon footprint to its overpriced organic T-shirt sales to the apparent indifference of performers who were supposedly taking the stage to stir up passion about the issue at hand.
I’m not saying the event in itself was completely useless. It brought attention to easy ways of conserving energy and using eco-friendly energy sources. And, I’ll admit, the used-tire stage backdrops looked pretty neat. But by inviting so many well-hated personalities to take the stage and putting on a tepid show, Gore and company made the press the real heroes of Live Earth.
Maybe its flaws as a hypocritical snooze-fest weren’t supposed to be the reasons for its success (then again, maybe Al Gore is shrewder than I thought and figured it would be funny for his celebrities to look like idiots so the press would condemn them), but Live Earth seems to have succeeded in attracting attention to global-warming solutions for the very reason that it wasn’t successful at doing so itself.
(07/05/07 4:47pm)
On Monday night, I came to an embarrassing realization: I didn’t know who won the last four seasons of “American Idol.”
I consulted Wikipedia to solve this crisis, where I further realized I had not even heard of two of those winners. More research showed me that I wasn’t the only one who had stopped paying attention – Nielsen ratings for the show last season were down nearly 7 million viewers from season five.
Had I noticed this trend a couple of weeks ago, I would have given it the same attention as the show’s last 5.9 seasons. But this past week, I was also introduced to the series finale of the Palestinian Hamas children’s TV show “Tomorrow’s Pioneers.”
The show, which was canceled last week, had been on the air only since April. It featured a costumed mouse named Farfur, obviously plagiarized from Mickey Mouse, who preached to kids such endearing values as anti-Semitism, Islamic domination and how to use an AK-47 assault rifle. In the final episode, Farfur was beaten to death by an Israeli official trying to seize his land.
The show may have been taken off the air due to poor ratings, but a clip of the finale’s last five minutes on YouTube has gained popularity. On Tuesday, it received honors as the site’s No. 10 most viewed and most discussed video.
Auditions for the next Idol competition begin at the end of this month. If the show takes some pointers from the finale of “Tomorrow’s Pioneers,” it could – like Farfur tried to do for Palestine – restore its glory. Here are my suggestions:
• Force contestants to wear knock-off cartoon character costumes. Of course, the reason “Tomorrow’s Pioneers” got away with its knock-off was that Farfur was less likely to be recognized as such by Arab audiences than by Americans.
Therefore, contestants would have to pick characters from movies such as “FernGully: The Last Rainforest” or “Titanic: The Animated Movie.”
• Have contestants write songs spewing hatred toward groups against whom they are prejudiced – blacks, women, etc. One of the major reasons I’ve always hated “American Idol” is that the contestants sing crappy covers. We don’t need more pop stars who don’t know how to write their own music. Granted, increasing songwriting talent would mean sacrificing singing talent, and the people interested in fresh songwriting talent tend to be indie rockers who don’t watch TV. But if the messages are so offensive that no one can concentrate on the singing, sick viewers will cause the ratings to skyrocket.
• Have the judges violently beat the season winners. The celebratory atmosphere after the winner is announced is thoughtful but propels the contestants’ delusions that their talent will be useful after that evening. Beating them would provide more realistic symbolism about their situation – that despite all that artistic talent, The Man will sack them.
By taking these lessons, “American Idol” could reach the entertainment heights of “Tomorrow’s Pioneers” – and, hopefully, meet its fate, too.
(06/27/07 11:34pm)
Finances are the most central matter in the policies the board of trustees deals with, so the elected trustee needs a firm grasp of the University’s economics. For this reason, I endorse candidate Steve Miller.

Miller has previously served as IU treasurer and is the most financially realistic candidate running for the position. While the other candidates gave vague or unworkable ideas about how to raise money for the school and get it where it needs to go, Miller believes the University would be improved by doing more cost-benefit analyses of our financial problems and choosing what will be most beneficial in the long run rather than what will be most popular at the moment.

He supports a more comprehensive financial plan than the other candidates, saying in an e-mail interview that to raise more funds for IU, “We need to partner with Purdue, the state government and private industry,” whereas other candidates merely said they would look at their options or offered a way to improve one minor aspect of University finances.

Because he is the most financially realistic candidate, I endorse Steve Miller.
(06/27/07 10:03pm)
As a person who doesn’t see herself fitting anywhere on the American conservative-to-liberal political spectrum, I tend to notice the mainstream media’s biases not only in the opinion balance within articles but also in what they choose to report in the first place.

Many of us who hold minority political persuasions see issues important to us get only occasional and passing attention, if any at all, from mainstream news sources.

But when I read this week that a fifth member of Congress this year, Sen. Dianne Feinstein, D-Calif., had declared her desire to revive radio broadcasting’s Fairness Doctrine, I didn’t react with excitement that Congress is becoming more interested in giving opinions like mine a better shot at airtime.

No – this “Fairness” Doctrine in which Congress has taken interest is only meant to balance airtime between the two major parties’ opinions. It disregards the notion that those of us who aren’t Republicans or Democrats are just as entitled to the right of free speech, and it suggests that the First Amendment applies only to those who hold majority opinions.

The doctrine, a regulation the Federal Communications Commission policed from 1949 to 1987, stated that broadcasters had to provide “reasonable opportunity for discussion of conflicting views on matters of public importance.”

In theory, it was a great way to let different voices be heard that couldn’t otherwise get access to the crowded spectrum’s airwaves. But in practice, it made stations hesitant to run controversial material at all because they found it too hard to balance out opinions without being shut down by a group with an agenda to get them off the air. And shutting them down wasn’t difficult because, as holders of minority political opinions know, mainstream news sources rarely do justice to every side of an issue.

In 1987, the FCC finally conceded the problem and repealed the regulation.

Since then, talk radio has been dominated by conservatives, which is what has brought four of Congress’s Democrats and one of its independents (a self-identified socialist) to complain: Their position isn’t being heard.

Although these people aren’t explicitly excluding those of us who don’t hold liberal or conservative views, their entire conversation revolves around those positions. And anyone with any intention of including minority viewpoints would know it’s impossible to include all of them.

Few people with brains currently view Rush Limbaugh, for instance, as an objective news source. But if he’s forced to appear that way, he’ll only do it to keep liberal Congress members and interest groups off his back, and that won’t mean he’ll go the extra mile to get the voices of anarchists, libertarians and communists heard, too.

The members of Congress pushing for a return to this regulation are merely operating under the guise of fairness to gain the support of a major voting group – liberals – and they’re willing to limit the free speech of the rest of us to do it.

The First Amendment was created to protect those of us with minority opinions. Let us keep it.
(06/20/07 10:37pm)
This month, the Indiana Bureau of Motor Vehicles started issuing its newly redesigned driver’s licenses and identification cards.

The cards look quite different from their predecessors because of a slew of added security features. The changes include colorful digital watermarks; “ghost portraits,” in which the person’s picture is copied in a lighter, smaller version at the card’s edge; and a barcode on the back embedded with secure data that, when scanned, should verify the information on the front of the card. The cards also use a vertical format if the person is under 21 and carry bands highlighted in red and yellow showing when the person turns 18 and 21.

According to the BMV, these features are meant in large part to prevent minors from purchasing alcohol. Once again, middle-aged government bureaucrats believe that something they haven’t tried can cut down on underage drinking.

But not only will the cards’ new features fail to stop counterfeiting and underage alcohol consumption, they will actually lead to more of both.

Counterfeiters might slow their manufacturing of Indiana cards when they realize the cards have become too difficult to copy, but that doesn’t mean they will stop counterfeiting altogether. Instead, they will counterfeit cards from other states, the way many of them already do.

Officer Travis Thickstun of the Indiana State Excise Police told the IDS in a June 14 article that he’d seen “close to 100 instances” of people changing their birthdays or the date they turn 21 on their IDs, and that the new design would deter people from tampering with their cards.

Maybe the new highlighted bands will make tampering easier to detect, but at the same time, they make more room for counterfeiting. If ID checkers know they can determine whether a person is of age simply by looking for the highlighted bands and checking whether the card is in vertical format, they will be less likely to pay attention to the card as a whole.

As long as fakes are horizontal and don’t bear the highlighted bands, even some of the sloppiest ones will slip past casual ID checkers.

Of course, more adamant ID checkers such as bouncers would quickly catch these fakes. While it will be easier to slip them by gas station clerks, the same won’t be true at bars, which are more likely to run black light and barcode tests.

But, as anyone who attends IU should know, just because people can’t drink at bars doesn’t mean they won’t drink. Instead of drinking at bars, they go to parties or have of-age friends buy them alcohol. In these situations, they drink even more than they would have at bars because the alcohol is cheaper.

This effort on the government’s part to discourage underage drinking, like many before it, is not only futile but ultimately counterproductive. The government can encourage programs that teach responsible drinking, but trying to coerce kids into abstinence only convinces them to find other ways around the law.
(06/13/07 11:19pm)
In the movie “Saved!” starring super-talent Mandy Moore, a pregnancy-troubled teenage girl named Mary, the film’s protagonist, makes a remark about the biblical Mary’s virgin birth that I’ve always thought solid:

“…do you ever wonder if she made the whole thing up? I mean, it’s a pretty good one. It’s not like anyone can use virgin birth as an excuse again.”

At least I thought it was solid until this week, when, while perusing Slate.com, I found a June 9 article about a shark recently born in a Nebraska tank with no evidence of a father – meaning the pup would have been the result of parthenogenesis, or virgin birth. Virgin birth is common in many species, but until now it had been thought impossible among sharks and mammals because of the way our genes are arranged.

Obviously, the moral here is that if you end up accidentally pregnant and in trouble, you can now realistically use virgin birth as an excuse.

But this got me thinking: Had personal responsibility been the only viable way to explain a pregnancy previously? After some research, I realized I was right to question this assumption. So don’t worry – if you need an alternative reason for your pregnancy, but the party to whom you speak doesn’t believe the parthenogenesis news, you can still use one of these:

• You’re pregnant with the Antichrist. Mary might have populist dibs on the son-of-God story, but no one has the same claim on the spawn-of-Satan story. Some Christians interpret biblical prophecies to mean that an Antichrist, an evil antithesis of Jesus, will appear during the world’s End Times. This story is brilliant if you’re lucky enough to be explaining yourself to someone who believes such philosophy. Its downside is that when it comes time for you to pop the kid out, few people will want to help you raise the son of Satan.

• You’re making extra cash as a surrogate mother. Surrogate mothering is a process whereby a woman is inseminated with another couple’s sperm or implanted with their fertilized egg. The advantage of this excuse is that, unlike claiming you’ve procreated with the devil, you come off as a fantastic human being. Not only are you helping the disadvantaged, you’re so eager to do it that you’re willing to put yourself through the grueling process of childbirth. The drawback, however, is that people will get suspicious when you don’t have the $15,000-$30,000 you were supposed to make for the favor. I recommend black market activity to compensate.

• You’re smuggling illegal contraband, but your drug lord was beaten into a coma and you have to wait until he wakes up to remove it. The only problem with this story is that people will pay closer attention to you because they’re scared for your life and, unless they’re morons, will inevitably realize you’re a dirty liar.

If you can’t get anyone to believe any of these, just keep sporting that baby doll top. Everyone wearing one looks pregnant anyway.
(06/06/07 11:04pm)
Last week, the Connecticut Senate passed a bill that, if signed by the governor, will allow residents who are the children of illegal immigrants to attend the state’s public universities at the same tuition rates as in-state citizens.

Score one for rewarding illegal activity! Next thing you know, they’ll be repealing Prohibition.

Then again, maybe the reasoning behind this bill is sounder than it initially appears.

Critics of the bill say it’s unfair that U.S. residents who legally dwell in other states must pay twice the tuition of students who aren’t even legally supposed to be in the country. But stripped of its comfy morphological parallelism, this argument is problematic: It ignores both the bill’s economic fairness and its economic advantages.

First, illegal immigrants are just as compelled as legal residents to pay the state taxes that contribute to state college funding.

Critics counter by saying that, while this is true, they aren’t contributing as much as legal residents. Illegals drain $10.4 billion more per year in expenses from the federal government than they pay in taxes, according to the Center for Immigration Studies.

The same study from which this figure came, however, acknowledged that the gap results from the fact that first-generation immigrants are overwhelmingly lower-class and therefore pay less in taxes, not that they’re freeloading. Illegals are actually less likely than lower-class legal immigrants and citizens to take advantage of government resources because they don’t want to be discovered.

Many groups pay less in taxes than they use in government resources – not only lower-class residents but members of the middle and upper classes who take advantage of tax exemptions and loopholes. Illegals are not unique in taking more out of the system than they put into it, and indeed they might be doing so to a lesser extent than a lot of legal residents.

Of course, having a large population of lower-class residents hanging around, perpetually using more than they’re investing, isn’t a problem that’s in the government’s best interest to leave alone.

Governments, especially those in non-border states like Connecticut, might have few ways of curbing the flow of lower-class first-generation illegal immigrants. But they have more leeway in seeing to it that the children of those immigrants don’t pose the same financial problems.

Not only are illegals at a financial disadvantage when it comes to higher education because they are lower-class, they often cannot receive federal aid. Therefore, many don’t go to college and remain in the same position as their parents.

By making it easier for these children to go to college, Connecticut is improving its entire future economy.

Illegal immigration won’t stop – it may not even slow down – regardless of how many walls we build, how much we tighten border security and how many individuals we manage to deport. But we can follow Connecticut’s example of giving illegals’ children a better chance at a better life and in turn giving ourselves a better economy.

Here’s to Indiana following suit.
(05/30/07 11:39pm)
No one could have seen it coming. Despite her recent stint in rehab, during which she surely gained a considerable amount of wisdom about the evils of drugs, Lindsay Lohan was arrested last week on DUI charges.

The story may be another shallow celebrity gossip piece the press is covering to cater to people’s interest in lives they can’t have, but her lack of success in rehab shows the futility of a problematic trend among celebrities: entering rehab on the false pretense that making sour, drug-induced choices is the drug’s fault, not the user’s.

To name some of the more memorable celebrities who have recently done this: Britney Spears entered drug rehab earlier this spring after nearly a decade of progressively zanier stunts. Last fall, ex-Congressman Mark Foley entered rehab, claiming that his sexually explicit messages to congressional pages were the result of alcoholism. And Mel Gibson blamed the anti-Semitic remarks he made last year during his DUI arrest on the influence of alcohol, checking into rehab shortly thereafter.

I can’t blame celebrities for wanting to save face when they get caught using drugs and alcohol. But while they’re using rehab to make it look like they’re taking control of their actions, they’re doing the opposite. They’re saying instead that drug and alcohol use is so hard to control that the most responsible decision a person can make when “substances” cause problems is to have rehab fix them.

But it’s not the substance that makes the bad decisions – it’s the user. Because the person ingests the substance, he is accountable for his actions under its influence.

Drugs can be used responsibly, and most users are a testament to this fact. In the past month, 126 million Americans have drunk alcohol, according to the National Survey on Drug Use and Health’s most recent data, and 19.6 million have used illicit drugs in the same time period. Of those, only 18.7 million drinkers and 6.8 million illicit drug users were classified as dependent or abusive.

True, not all celebrities who enter drug rehab are playing publicity cards. Some do have substance problems that, despite their best efforts to control them, have taken such deep root that they can only be solved with professional help. And maybe drugs and alcohol did play a part in the mistakes of celebrities such as those above, but they don’t cause anti-Semitism, pederasty or chronic attention-whoring, either.

Instead of saving face by admitting they had made bad decisions and then publicly demonstrating their reform, these celebrities continue the false rhetoric that drugs are an evil demon that lures in the innocent and strangles them. Better to acknowledge that demon’s foothold and have that savior “rehab” stop it, the rhetoric says, than to let drugs destroy your life.

Responsible drug or alcohol use is possible. The decisions users make under the influence of substances to which they choose to submit are their own, and they should take responsibility for their actions.

As Lohan has reminded us, rehab doesn’t cure stupidity.
(05/23/07 11:23pm)
As the daughter of a non-denominational pastor, I have many fond memories of the more zealous brethren.

Some were clever zealots who brightened up life with their off-kilter humor, such as my youth pastor, who fried an egg on the church camp basketball court one day when the western Kansas temperature reached 105 degrees.

Others were nutters who seemed to be some malfunctioning species of Borg, such as Fred Phelps and his Westboro Baptist Church, whom I remember fondly for their extraordinary talent at keeping our hometown of Topeka entertained with their self-parody.

I do not, however, have any fond memories of Moral Majority founder Jerry Falwell, who died May 15.

Yes, the man pulled many outlandish stunts in the name of zealotry. For starters, he was at the front of the “Clinton Chronicles” scandal, funding and publicizing a documentary that featured people falsely suggesting then-President Bill Clinton was involved in drug-smuggling operations and had sent many of his dissenters running for their lives. And of course, who could forget his denouncement of the purple Teletubby for being gay?

But despite appearing a match for Fred Phelps in the realm of religious wackjobbery, Falwell does not fall into the same category. Unlike Phelps and his Westboro gang, Falwell was taken seriously by many people.

Early last week, Focus on the Family Chairman James Dobson sent out a press release commending Falwell’s contributions, saying, “Because Jerry, and his Moral Majority, were the first ones out of the trenches in the culture war, they got shot at repeatedly by the national media and by liberal church leaders. But he always weathered the onslaught, permanently stamping the conservative American church with respectability on social action.”

Falwell always had his loud-mouthed, media-candy subculture of followers, but “respectability” isn’t the first word a substantial portion of the evangelicals I’ve known would associate with the man.

Most people realize Falwell was never representative of America’s “moral” or general Christian voting population, but few realize exactly how poorly he represented the groups whose mention brings his face to mind: “evangelicals,” “fundamentalists,” the “Religious Right.”

The media’s common portrayal of evangelicals as emotionally driven zombies is not entirely devoid of truth, due in large part to Falwell’s immense effect on the movement.

But many evangelical Christians found themselves between a rock and a hard place regarding Falwell’s influence. Some were far more liberal than Falwell told them – and the media – was appropriate, making them hesitant about their religious identification. Others agreed with some of his political positions, e.g., pro-life activism, Zionism and skepticism about homosexual rights, but didn’t want to be connected with him because they had broken away from any number of his other “traditional” views. But the majority of evangelicals didn’t associate their movement with him in the first place: They considered him a phony.

Evangelical Christians can think for themselves – Falwell only discouraged free thought and gave the media false impressions of the faithful.

The man is dead. May his legacy die with him.
(05/20/07 11:46pm)
This August, you’ll say goodbye to your high school circle and be thrust into a world where you will have few or no friends. But don’t worry; thousands of freshmen will be having the same problem, and in no time you’ll have a gang of newbies with whom you’ll be wreaking havoc in the dorm lounges.

Eventually, though, the appeal of dorm floor events will wear off when you and your crew get tired of listening to that omnipresent guy on your floor who never shuts up because he has lots of “important” things to say. (Warning: Despite this character’s seeming dysfunction, he is more powerful than he appears. He is capable of shape-shifting and is enrolled in every philosophy class at IU.)

By this point, it will be your first Monday here.

Fortunately, the solution to your problem lies straight ahead. You’ll be about to head off to your first classes, where you will have the chance to mix with upperclassmen.

Most of these people live in apartments, houses, fraternities and sororities. If not, they have plenty of friends who do. Not only can they get you out of the dorms, they will take you to all the places in Bloomington you’ll want to be.

But first, you have to make friends with them.

Upperclassmen are a bit more difficult to befriend than freshmen. They already have a group of friends, and having endured their own pretentious floormate experiences, they won’t need you like you’ll need them. But as long as you’re aware of your secret weapon, you’ll be fine. Two words: meal points.

You can say whatever you want in your first-day conversations with upperclassmen, utilizing Woody Allen-level wit or “Hi, I’m majoring in Grass Growth,” but chances are they’ll see you as just another freshman in a lecture class who introduces himself on the first day and never talks to them again.

If, however, you conclude your monologue on fescue by saying, “Would you like to eat lunch with me? I have meal points,” they will suddenly believe you are the most fascinating person they have ever met.

As a freshman, you will have an excessive number of meal points, no matter which plan you choose. Unless you regularly consume between 3,000 and 7,000 pounds of food a day, you’ll have plenty to share.

Upperclassmen have come to the sad realization that the grocery store food they moved off campus for, sold at reasonable prices someone in their right mind would pay, can’t compete with your wealth of money-less “meal points” just waiting to be exploited.

Sharing your meal points also means they don’t have to cook (microwave), they can replace their diet of ramen and Quaker Instant Oatmeal with foods of recognizable nutritional value and they can make a new friend. For you, it will mean a free pass to the best Bloomington has to offer.

So when you’re low on weekend activities and high on meal points, remember – friends in high places are easy to buy.
(05/17/07 12:37am)
Hillary Clinton is a phenomenon, and I don’t mean that in a good way.

According to data released Monday by the American Research Group, Clinton holds the lead in the race for the Democratic presidential nomination, with 39 percent of expected Democratic primary voters declaring their support. Illinois Sen. Barack Obama is second at 22 percent, 17 points behind Clinton, and former Sen. John Edwards takes third at 19 percent.

The same study shows her numbers have risen since March, when they were at 34 percent, while Obama fell from 31 percent.

But contrary to these statistics, I know only one or two Clintonites among all my politically charged friends, and I’m not entirely sure they aren’t joking.

Facebook says I’m not imagining things: When I typed “Hillary Clinton” into a group search, the first five pages of Hillary-titled groups yielded 37 anti-Hillary groups and seven pro-Hillary groups, with the latter generally claiming far fewer members. The opposite proved true for Obama and Edwards: There were anti-candidate groups for both, but far more pro-candidate groups.

Apparently, the people making Hillary’s numbers rise are our elders.

The problem isn’t that Clinton isn’t paying attention to the 18-24 age group. She has pledged to use her presidency to make college more affordable and pull American troops out of Iraq. She’s taking advantage of MySpace and YouTube, and her campaign Web site has a service that sends campaign updates via text message.

But she has been flip-flopping on issues for as long as we’ve been old enough to pay attention to politics. Her attempts to prove she’s knowledgeable about the composition of her audience have ended in disaster (see: Southern accent fiasco), and she hasn’t convinced us she can stand on her own without a husband we’ve seen mostly as a gripping speaker and the butt of a lot of late-night TV sex jokes.

Our elders, by contrast, remember more about her as a first lady and have a longer memory of her workings under the public eye. To them, she has established herself over time as an intelligent and strong leader.

But I’m not saying their experience justifies their support any more than our lack of experience justifies our position.

Older generations may be swayed by her ability to rise above expectations in her political roles, and they may consider her record of influencing significant political change while maintaining a commanding presence in the media an outstanding qualification for president. But our generation, the most culturally prominent force in America, needs more.

We want a leader who’s genuine, who is interested in us on a more intimate level than through dispassionate research and calculated statistics. No one doubts that we are technologically savvy, but that’s not enough. We want a leader who understands our concerns beyond obvious issues like our not having $100,000 right out of high school and wanting our friends in the armed forces back alive.

Hillary Clinton may win the Democratic nomination, and she may even win the presidency, but if we keep up our demand for genuineness in government officers, our generation may revolutionize politics. Keep up the good work.