adividedworld.com
Is The United States a Racist Country?
The Left would have us believe the U.S. is a fundamentally racist country. Yet much empirical evidence says exactly the opposite.
C. B. Thorington