Most American elementary and high schools, and nearly all colleges and universities, teach nearly everything of significance from a liberal/Left perspective.
Historically Black Colleges and Universities, or HBCUs, have played an important role in enriching not just the lives of African Americans but our entire country.