Colleges, universities have become so "woke" that students are shifting back to technical education to learn useful skills
(Natural News) It is no longer the case that getting a college or university degree expands an individual’s horizons, offering him or her better job prospects and the hope of a better life. Ever since far-left “wokeism” took over American education, more and more young people have come to the stark realization that getting a Bachelor’s...