US University Faculty Political Bias

The claim that faculty at US universities and colleges skew liberal, potentially leading to indoctrination of students or discrimination in hiring.