This post is going to talk about real life. As I, and most of you, went through college, we had the idea that we would be marketable, wanted, and have people asking if we wanted a job. I think most of us were under the delusion that having a degree would put us ahead of the game. That is still somewhat true: studies have shown a positive correlation between holding a higher degree, better socioeconomic status, and better health. But those are long-term findings.

Even though my classmates and I hold degrees, we are facing unemployment. According to an article in the Huffington Post, the unemployment rate for college graduates is nearly 11%, double what it has been in the past. Not to mention, 60% of us are working jobs that don't require a degree. What is up with that? I mean, really? Here you have us 20-somethings, some of us with debt out the yin-yang, and we aren't able to find a job in the field we trained for; the field we studied for, struggled for, and spent thousands of dollars on.

So what are we left with? A minimum-wage service job, living with our parents, making just enough to pay Sallie Mae. It is simply ridiculous.
In a nutshell, I am just saying: Wake up, employers of America! You have thousands of fresh new workers who can make your business flourish! Just take a chance on us and we won't let you down.
***Disclaimer: I understand that this is not the case 100% of the time; it is just a general trend. I know that some people were actually offered jobs before they graduated :)