Sunday, January 17, 2010

WHY?


Why are white Americans saying we shouldn't help Haiti because Haitians fought for and won their independence from French rule in the 1800s? Didn't the newly proclaimed white Americans fight the British for America's independence? Should we refuse to help them because they fought for what they believed in, centuries ago? Didn't white Americans fight to take Native American land? See the contradiction? Enough of this contradictory bullshit. It's 2010. Get some real knowledge! Stop reading American history books.