All my life I’ve been told that college is the key to success, that you can’t get anywhere in life without a degree, especially in this economy. What I’ve seen is something completely different. College is a great way to figure out what you want to do, and possibly a great way to find the people you need to make that dream a reality. That is, if you’re in the right place at the right time to get those things. Most people (at least the ones I know) don’t know what they want to do; they go to school because that’s just the next step. In reality, it shouldn’t be, and isn’t, the next step. Figuring out what you want to do in life should come first.

That’s supposedly what high school is for, but when you’re actually in high school, all you hear are people telling you to go into nursing or do something involving computers because “that’s where the money’s at.” Those are great career choices, but not everyone is into long hours helping sick people, or into technology. Why are we hardwiring kids into careers they could potentially hate, careers that could still leave them living paycheck to paycheck like most Americans? And how long do you think those nursing jobs will last?

I went around and asked working people and some college students, “Why did you choose this career or course?” Very few actually said it was because it was something they loved. Most said something along the lines of “I’m guaranteed a job,” “I fell into it,” or “I get paid well.” While I do believe everyone should attend college, it shouldn’t be just so you can make money. That’s what’s being taught in school: it’s all about money. In reality, you should be learning to achieve your dream, not just to make ends meet.