Society treats it as necessary, but whether it's necessary for you really depends on what you want to do. If you're ambitious and aiming for high-paying jobs, a college degree is usually required in most cases, though there are always exceptions. To compete for good-paying jobs, you need something that sets you apart from the rest of the competition; otherwise you won't get the job. The two main things employers look at are education and experience, so ideally you want both.