Just saw Snopes’ post on Ben Stein’s commentary on the Oscars and the politics of Hollywood, including this rather disingenuous statement:
Basically, the sad truth is that Hollywood does not think of itself as part of America, and so, to Hollywood, the war to save freedom from Islamic terrorists is happening to someone else.
Sure, he’s talking about Hollywood specifically, but it’s the kind of “You’re not really American” rhetoric we see a lot in political polemic.
Has it occurred to people on the right that we “lefties” (which seems to mean anyone less conservative than President Bush) do think that fighting terrorism is a good thing, but that our nation is currently going about it the wrong way? That maybe invading Iraq wasn’t the best way to curtail global terrorism? That it might be possible to spy on terrorists without bypassing that constitutionally guaranteed “due process of law” in a way that sets a precedent for warrantless spying on citizens who aren’t terrorists?
We don’t hate America, but we’re not particularly thrilled about some of the things our government has been doing lately.
I do agree that the Academy Awards are pointless in the grand scheme of things, but I’m sick and tired of the false dilemmas rampant in what passes for political discourse these days.