Many colleges teach and practice DEI and other woke BS. They are full of liberal professors peddling their personal beliefs. I did not graduate with a degree; I had enough credits overall, just not the "liberal" credits required. In other words, I only took courses that interested and benefited me: mostly real estate, finance, business, and foreign languages.
Glad you feel better about yourself.