Do they learn about it in more of a pro-England way, or do their history books show themselves as being in the wrong?
-
At the school that I came from, they don't show us much about American history. We have to learn about our own country and how it became what it is today. To be honest, I did not even know what Thanksgiving (is that correct?) was until someone told me.