For many thousands of years, North and South America were inhabited solely by Native Americans. As we all know, Europeans eventually showed up and decimated Native populations, committing countless atrocities. European settlers devastated native ways of life and culture; today, people of European descent are the dominant group, while Native Americans are largely an oppressed minority.
Many people now make arguments about reparations and the like, and I wonder what people on this forum think about that concept. Do the descendants of the Europeans who wronged natives owe anything to the descendants of those natives? Does the fact that Europeans 'won the war' against natives mean anything beyond the fact that they're now the economically dominant group? What, if anything, should be done to rectify what happened in the past? Can we even draw a direct link between the present day and eras gone by?
What do you think?