Native Americans in the United States
Native Americans, sometimes called American Indians, First Americans or Indigenous Americans, are the Indigenous peoples of the United States or portions thereof, such as American Indians from the contiguous United States and Alaska Natives. (Wikipedia)