Obligatory college thinkpiece.

25 August 2020

It seems that every blogger, at one point or another, has to write a thinkpiece about whether college is relevant or worthwhile in the 21st century.  I seem to have some spare time on my hands, and I haven't bothered to write one yet, so I figured that I might as well.  I've been out of college for about seventeen years as I write this, so I haven't completely forgotten everything about the experience.  Unfortunately, because I can only speak to my own education, this text will be unavoidably skewed toward my perspective.

The context of such a question is this: The world is changing faster than anyone can keep up with.  This includes society and culture (and the myriad subparts thereof), technology, finance, and politics.  Schools necessarily teach a dated snapshot of knowledge at any one time, not only because the instructors' expertise ages but because of how long it takes to write and edit instructional materials.  On top of that, the quality and comprehensiveness of education can vary wildly from place to place, and even from text to text.  Even the style and intended purpose of education varies greatly from country to country.

So.  Is college worthwhile or even relevant anymore?

I don't know.  It depends on why you want to go to college, and what you intend to do while you're there.

I may as well give my background first, so that there's some context for where I'm writing from.  I started college in the fall of 1996, but didn't actually graduate until 2003, for a number of reasons.  I pretty much worked full time for the latter half, which meant that I could only take so many credits of classes at a time; it also guaranteed that I had time in my schedule for homework and sleep.  I also had pretty serious reservations about the classes available to me - two semesters of COBOL were a prerequisite for graduating with a CS degree, but the course on Java was cancelled after a single semester because "Java was obsolete and wouldn't be useful."  On the other hand, I had the opportunity to abuse the selection of electives available to me, because they afforded me the chance to study nontechnical things to satisfy my curiosity and get class credits for it.

For the record, I will never work with COBOL again.  You can fuck right off.

After changing schools (and losing roughly a third of the course credits I'd accumulated) I realized that not only would I have a chance to round out my CS education with more modern and interesting courses, but I now had the opportunity to take different electives.  Long ago, a certain movie I'd watched as a kid impressed upon me the importance of studying things outside of one's chosen field - "All science, no philosophy" would lead to nothing good.  Having interests that don't involve computers (contrary to the stereotypes of computer geeks) means that you learn to look at things from different perspectives and pick up new ways to think.  It's made building credibility in some fields difficult, I'll admit, but I like to think that being successful enough to build a good life for myself and my family is all the counterargument I need.

Something that I noticed in college is that you could tell who was just there because they smelled money.  In the late 90's computers became the hot new industry to work in, and it paid extremely well.  Some of you may remember the first dot-com boom and the utter madness surrounding it; for others, it might be before your time.  Suffice it to say that salaries and bonuses during that time were simply insane.  It was common for people to drop out of college to go work for a dot-com startup that had just gotten one or two hundred million dollars of startup capital from angel investors (no, I'm not joking) and immediately start making $150k-$250k US right off the bat just because they knew HTML or SQL (again, I'm not kidding).  I don't know how well they knew their stuff or how well they fared during those times, at those ages.  I think the dot-com bubble blowing up in everyone's faces between 2001 and 2002 might say something about that, though.

The folks who were studying CS because of the paycheques they hoped they'd be pulling down sooner or later could be easily distinguished from the folks who genuinely loved computers and wanted to work with them professionally.  The folks who had their eyes on the money did only what they had to do for their classes, and that was pretty much it.  The folks who loved computers... loved computers.  They messed around with them in their spare time, when they didn't have to.  One guy I remember hacked around on a homemade 3D graphics system for fun.  Another guy organized the comp.sci club, built a computer lab in our dorm out of spare parts (way more successfully than I, might I add), and taught classes in the very first iteration of JavaScript on the weekends.  The usual less-than-legal shenanigans also took place from time to time, as one might expect.  My point is, they put in the work to get as good in their respective fields as they could, and didn't just want a paycheque, and it shows in what they do.

Whether the old saying "if you do what you love, you'll never work a day in your whole life" is a load of bollocks is not something I will address.

So, given that pretty much everything taught in college is a snapshot of something at that period in time, how useful can it possibly be?  The answer is: very.  Just because something is a snapshot doesn't mean it's obsolete.  A great deal of human knowledge as we know it today is built on top of older knowledge, which is in turn built atop even older discoveries.  Take math: Basic arithmetic dates back to 3000 years before the common era.  Does that mean adding, subtracting, multiplying, and dividing are worthless and obsolete?  No!  Then algebra was created: basic problem solving in math.  Then geometry.  And so forth.  Everything we take for granted is built using fundamental principles from long ago, put together like Legos if you want to think about it that way.  Sometimes those fundamentals provide context by answering the (often unspoken) questions "Why are things this way?" or "What happened?"  Also, and this seems to get a bit of short shrift, you keep your knowledge up to date by keeping abreast of what's happening in your field and by reading.  Seriously.  Have you ever heard of medical journals?  Engineering journals?  Hacker 'zines?  That's how experts keep their knowledge current: by reading about the new things happening, maybe publishing some of their own work, and in essence using those publications as the latest generation of textbooks to learn from.

Unpaid internships are a scam.  Run away.

But can't you teach yourself all this stuff on your own?  The Silicon Valley webshit community learns four new programming languages a year and masters a new web framework before brunch every Sunday.  Well... yes.  You can.  And there is no shortage of people who teach themselves new things all the time.  However, how many folks out there would sit down and teach themselves calculus just because it seemed like a useful thing to know?  Probably not too many.  In addition to enculturation and social conditioning, schools provide motivation for learning things that don't seem useful or relevant, like history and literature.  The idea is more or less to make students well rounded by giving everyone the same "background knowledge," if you will.  It seems, however, very rare that schools teach the sorts of thinking and reasoning that go along with those subjects, or even bother to explain why being well rounded is important.  What I'm trying to say (and seem to be failing at) is that your average person is unlikely to assimilate an education as completely and thoroughly solo as they could inside of a guided framework.  I know, I know... there are going to be folks out there saying that they taught themselves this or that "just because" and that I'm full of it.  Statistical outliers don't get counted.  To some extent, everyone needs some kind of external framework to guide what they do, especially at an age when they have relatively little life experience.

This doesn't make me any better or worse than anyone else.  My intent is not to brag about my education or my accomplishments.  When I graduated from high school I didn't have any of the advantages that are usually associated with people who go away to college - I didn't have parents who could introduce me to well-connected people who could pull strings for me.  I didn't have a huge college fund to pay for at least a couple of semesters of school.  I didn't have the savings that would have made a gap year possible, and in fact taking one would probably have made it difficult to attend school, because colleges and employers alike in the United States look upon gaps in one's history somewhat unfavorably (unless one has the kind of background where it's expected of you, of course).  What I did was use all the different ways that I think about computers, hacking, literature, and everything else to figure out some of those tricks from first principles, and, to be fair, I got really lucky in other ways.  I didn't go to college with the mindset that a lot of people brought to public school (in fact, that mindset caused no end of trouble for me in middle and high school, but that's neither here nor there), which was "I don't want to be here, but they shovel crap into my head because they have to."  I wanted a real education, and you don't get one passively.  To get a real education you have to work for it, just as building a ripped body means going to the gym every day and working out.