My boyfriend owns his own construction business. I feel like this is the only way to make a decent living anymore.
I have my bachelor's degree along with a few other degrees and I can't make a quarter of what he makes.
Part of me feels like colleges are strictly money-making businesses that leave students with little after graduation except a pile of debt.
Things used to be different. What does everyone think?