Africa's Got Nothing on Us

I'm getting sick and tired of seeing all these promo cards and fliers around campus for various events intended to "raise awareness" about how much it sucks to live in Africa. Is there anyone who doesn't know that Africa is the most underdeveloped, impoverished, and politically corrupt continent in the world? Is anyone arguing that Africa is actually a good place to live right now? Aside from those in Africa who hold power and privilege, nobody thinks Africa is in good shape.

So why are we constantly being reminded of what we already know? If we can all agree that Africa is a humanitarian shithole, why don't these groups, which spend immense amounts of time and resources on "raising awareness," work to actually SOLVE the problems in Africa? Is it because they don't know how to solve the problems, or simply because they can't? Perhaps both? Even if there were some grand scheme drawn up to solve all of Africa's problems, there would be no way to implement it, because most leaders of African nations don't care about their people. Whatever the case, these groups are not going to solve the problems in Africa. So instead they educate people about the problems, which is all they can really do. That's fine. After all, the first step in solving a problem is recognizing it.

So education is the best these groups can do. Fine. My next question for them, then, is this: who can solve the problems in Africa? Which "student" of theirs is going to initiate change for the better in Africa? Will it be other ordinary American citizens like the ones doing the educating? Obviously not. If an ordinary American citizen could make Africa all better, then these groups would be doing just that instead of spending their time on education. Their answer to my question would be that the U.S. government can make things better in Africa. The idea is that you educate people and have them contact their representatives to say they want something done to help the people of Africa.

Okay... And what exactly should be done? Should the Secretary of State politely ask the despots of each African nation to stop being despots? Should we conquer and colonize Africa, installing democracy in every nation on the continent? The fact of the matter is that the U.S. government will never take action in Africa purely for humanitarian reasons. If there's no economic gain to be had from invading Africa, then it's not going to happen. The military costs a lot of money, as we're seeing in Iraq, so governments don't enter into armed conflict just for the hell of it. They do it for the money. The Rwandan Genocide of 1994 left more than 800,000 dead in 100 days (a higher daily death toll than the Holocaust), but even that was not enough to get the U.S. government to intervene. If genocide isn't enough, you can forget about disease and poverty.

Africa isn't the only place on earth with disease and poverty, though. Guess where else those things exist: THE UNITED STATES OF AMERICA. I know it's hard to fathom, but there are people in the U.S. who struggle through life both physically and financially. Terrible and shocking as that may be, there's good news: American citizens can actually have an impact on American policy. Unlike with Africa, educating people about the poor conditions in America can turn into change, because the government that represents those people can actually do something about those conditions. Whether or not it does is another story, but at least the possibility is there.

It's hypocritical to gawk at the poor conditions in Africa when we have some of the same problems right here at home. You don't need to look to the other side of the globe to find people struggling to get by. They're right outside your door.


© 2006 Pflanzenfaser

pflanzenfaser@gmail.com