I have a question for all of you Americans, if you would care to answer. Anyone who wants to!
Did you learn in school that the USA "won" the Vietnam War?
A social studies teacher I had in high school (this was only 2 or 3 years ago) told us that nowadays American kids are generally taught that the USA won that war. Only a high school teacher, I know, but this guy was actually smarter than most.
I'm Canadian, and I lean toward the view that the US won the war; they crushed their enemy. If the public hadn't been against it, I'm sure the US would have been a clear winner.