When was the last time that Americans stood united? And I mean for an elongated period of time, without a tragedy to prompt it.
I know you already think I’m going to turn this into some right-wing country song–but I’m not.
Don’t get me wrong–I’m sure we all felt a lot more American and a lot more united after 9/11. Who wouldn’t? And 10 years later, we had this:
But why should we only unite over tragedy? I just feel like it’s been a long time since we’ve all simply felt proud to be Americans. And I know that Americans get a bad rap, and that we’re terribly ignorant tourists who wear fanny packs and complain about our first-world problems. I’m not saying we’re perfect, particularly to the outside eye, but when did we lose pride in just being American?
The last time that I recall was the 1990s. Maybe it’s because I was a kid not facing adult issues, but weren’t we all proud of where we lived, where we worked, what we did? And before that, the 1950s, when folks were proud of the lives they led. The times were great–kids got to be kids, people took pride in their homes and their jobs, and they were truly enjoying the times. Everyone was united and proud to be American. But the 1950s came right after a war…and the 1990s? Do we count Desert Storm?
I think part of the problem is this: People are so focused on celebrating their individualism and accentuating their differences that they’ve forgotten that they are a part of something bigger.
I’m sure some of you think I don’t actually believe this, but I do appreciate individualism, and I do think it should be celebrated. I’m just not sure there’s really anything we can do about the bigger problem. Maybe we only feel that pride when something goes really great for the USA, or when something bad happens to the USA.
I don’t know. I just think this whole country would be stronger and happier if we realized that we’re a part of something bigger and if we quit focusing so much on being different from one another.