Sudikoff, the computer science building at Dartmouth College, whose basement is filled later than any fraternity’s, has a way of making people go slightly insane. Maybe it’s the grueling lab assignments. Maybe it’s the long hours without sunlight. Maybe it’s the frustration of being told by a teaching assistant that you have to completely rewrite your implementation of breadth-first search, as I would be told at 4 am one late night.
Whatever it is, the building now known as Sudikoff has not changed much since it was first built in the late 1800s. The bathrooms, complete with showers, hint at its last incarnation: the building used to be a mental ward for a now long-gone hospital. Today, “the Koff” houses the crazies who decide to major in computer science at a liberal arts college.
The stereotypical computer science nerd is male, wears thick-rimmed glasses and pocket protectors, and can speak shell script to the Linux kernel better than he can speak to girls. I’ve met this stereotype many times, and many of them are great people that I consider my good friends. But at Dartmouth, I have seen jocks from the South, preppy kids from New England boarding schools, and hipsters from California join the ranks of the outwardly eccentric in high-level computer science classes.
Regardless of their outward appearance or the initial impressions they are capable of giving, computer science majors at Dartmouth are distinct from the rest of campus. All of the students that get through the intro courses and still want more share a passion bordering on obsessiveness for building things with computers, and their “inner nerd” invariably seeps out of their academic lives.
“Obsessiveness” is definitely the correct word to describe my long history with programming. My fascination with computers has been with me my entire life – some of my earliest memories are of me playing Reader Rabbit games and dreaming up stories for my own video games. The problem was that, while I was an adept consumer of computer content and I had some conception that to make video games you had to type code into a computer, I had no idea where to actually type the code. Simply opening up Microsoft Word and typing instructions didn’t work. (Trust me, I’ve tried.)
Then came sixth grade computer class. Most of the year was spent bringing my deficient typing speed up to 30 words per minute, but the last project of the year involved building a website about a famous person who shared your birthday. I picked Harry Houdini.
I was so excited that I spent about 15 hours and learned the basics of two computer languages for an assignment that everyone else in the class finished using a website builder in two 45-minute class periods. While most people’s sites were nothing more than a white background with blocks of text largely copied from Wikipedia, mine was awesome. It was ominously black with red links. It had a rotating and fading spooky animation of the words “Harry Houdini” at the top. It had a section with three different Harry Houdini-related games accessible only if you knew the password. It represented everything that defined how the Internet looked nine years ago and makes the web-savvy of 2012 groan. I was incredibly proud of it.
The Harry Houdini website turned out to be just the tip of the iceberg. I remember many late nights spent trying to get a picture of a blue ball to bounce around my screen. I would give up in frustration and get into bed when it wasn’t working, only to have a sudden flash of insight and jump up to the computer to give it another shot. The “Tests and Random Websites” folder on my old laptop has about 50 different files, including a game where you have to drop a ball from one moving platform to another, a calculator program, and a clone of Frogger.
In high school, the games I made got more complicated. My two favorites were a lunar lander game, where the user flies a spacecraft through various obstacles, and a game called “Ant” that I emailed to my entire class where the user is an ant that has to collect leaves and twigs while avoiding a horde of bees.
Throughout the trials and tribulations of my early years as a programmer, I never thought what I was doing was a legitimate academic discipline that I might pursue in college. Making the ant game did manage to teach me the basics of trigonometry long before we covered it in math class. But programming was just a hobby. All I really wanted to do was tell stories of my own creation through video games and have a cool personal website that I could show off to people. Because of that mindset, computer people would call me a “hacker,” and they wouldn’t mean that I was skilled in accessing people’s personal, password-protected data in the way that most people use the term. They would use the term’s original meaning: a hacker is someone who builds things (or “hacks” them together) using computers.
My story, though strange, is far from unique. If you keep track of Hacker News, you’ll read a similar origin story once a week. People do not learn to program because they love code. They do it because they love the end result. Oftentimes they want to build a better video game or website. Sometimes they want to build a bot for the MMORPG they’ve spent way too much time playing. Nevertheless, every programmer that I’ve met recalls the wonder and excitement they felt when they first started to work with computers, and looking into their eyes, it’s easy to see that they still haven’t fully lost that sense.
Even at Dartmouth, in the more highbrow academic world of “computer scientists,” the hacker ethos in its original sense is still very much alive. Even computer science professors love the playfully creative, and sometimes devious, nature of hackers. One of my professors is fond of telling a story about a student who, during a lecture, figured out how to remotely access the professor’s screen and began writing “offensive” messages for the whole class to see. When the student confessed to the crime a couple of terms later, the professor was more impressed than angry. He would go on to judge that student’s thesis, and while the professor pretends that he sabotaged the student’s thesis and “got the last laugh,” he called the student a “genius.”
Dartmouth hackers sometimes switch into what appears to be a foreign language. While editing an issue of the school paper, I had a conversation with the paper’s technical director about the inner workings of their website. We were discussing how to make a headline span the entire top portion of TheDartmouth.com if it was especially important. What we actually said was: “yeah you’re right that PHP is close to C, but Ruby is way more expressive, and gives you really modular code running on Rails compared to Zend or WordPress” and “that’s easy with MVC, you’ll just have the same model, but you’ll call a different layout in the controller” and “can you add my handle to the git repo?”
“Look at them in their natural habitat,” laughed the editor-in-chief, who was seeing the hacker side of me for the first time.
Hackers can get into arguments about seemingly trivial things, like how to write code efficiently. The more code you write, the more the little extra keystrokes start to get on your nerves – clicking on File, then Save, switching between windows, even using the mouse at all. Many hackers prefer to use primitive text editors that were made before computers had mice, so that their hands never have to leave the “left pointer on f, right pointer on j” typing position. In the text editor known as vim, you hit Escape to enter command mode, where keystrokes are commands – typing “j” moves the cursor down one line of text – and you press “i” if you want to actually type text at the position of the cursor. That’s overly complicated for the average user, but a lifesaver for the hacker.
Watching a fellow hacker code inefficiently can be painful. During my second college computer science course, I went to office hours to get help on an assignment, and as the TA watched me work with my computer, he became progressively more flustered. Finally, he stopped me, went on a rant about how Sublime Text 2 was a better text editor than the one I was using, and refused to help me further until I downloaded it. When I complained that I had already paid for my text editor and did not want to spend another $50, he said my argument did not make any sense according to fundamental principles of economics.
“If you’ve already paid for Justin Bieber tickets and you have the opportunity to buy tickets to a better concert, do you go to the Bieber concert, or do you buy the other tickets?” he asked.
I had no choice but to concede that he was right.
His reaction, however, did bring out one of the less amiable traits of hackers: while they are typically right, they are frequently stubbornly and arrogantly right. In a discipline where there are infinite ways to solve a given problem but usually only one elegant way, seeing “bad” code often triggers hacker disdain. What exactly is elegant code? That’s a question over which many millions of words have been spilled on tech blogs. In general, it means code that is written in short, expressive lines (Facebook requires that lines of code never exceed 80 characters). The function of a line of code should be almost immediately clear to a proficient reader.
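To make the distinction concrete, here is a made-up illustration of my own (not drawn from any class or from Facebook’s codebase): the same leap-year check written the long way and the “elegant” way.

```python
def is_leap_year_verbose(year):
    # The long way: nested branches force the reader to trace every path.
    if year % 4 == 0:
        if year % 100 == 0:
            if year % 400 == 0:
                return True
            else:
                return False
        else:
            return True
    else:
        return False

def is_leap_year(year):
    # The "elegant" way: one short, expressive line, well under 80 characters,
    # whose meaning a proficient reader can absorb almost at a glance.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

Both functions compute exactly the same answer; the disdain hackers feel is reserved for the first one.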
What is certain is that code that does not meet this elusive criterion of elegance can often be mocked or immediately written off by good programmers. In that same meeting with the TA, when I was not immediately able to see the right way to solve my problems, he just told me that I must be brain-dead from lack of sleep. He prescribed a nap followed by caffeine. (I will admit that this was helpful.)
I am guilty of this arrogance as well. When I’ve tried to help a fellow student with a computer science problem, and they don’t immediately understand what I am saying, I have to muster all of my strength not to burst out laughing.
I remember having trouble explaining an implementation of a doubly-linked list to a fellow classmate in my intro to computer science course. The ten lines of code were intimidating when looked at as a block, and filled with odd syntax, but when taken slowly, line-by-line, every line had a purpose and made sense. After drawing numerous diagrams, going through each line multiple times, and explaining the overall concept more than once, I burst out laughing. She was not pleased, to say the least. But to me, the arrogant hacker who already understood the problem, it seemed as simple as understanding her next sentence: “You’re not very good at explaining this stuff.”
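For the curious, a doubly-linked list really is only a handful of lines once taken apart. This Python sketch is my own reconstruction, not the course’s actual code: each node holds a value plus pointers to its neighbors in both directions, and inserting a node is just a matter of fixing both sets of pointers.

```python
class Node:
    """One element of a doubly-linked list: a value and two pointers."""
    def __init__(self, value):
        self.value = value
        self.prev = None  # the node before this one (None at the head)
        self.next = None  # the node after this one (None at the tail)

def insert_after(node, value):
    """Splice a new node in right after `node`, fixing links in both directions."""
    new = Node(value)
    new.prev = node
    new.next = node.next
    if node.next is not None:
        node.next.prev = new  # the old successor now points back at the new node
    node.next = new
    return new
```

Each line does one job; the intimidation comes from reading the block all at once instead of tracing the pointers one step at a time.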
While many hackers will wave their arms and say “well, and then there’s some magic” as they explain how some code works, they are just being lazy. Computer science is not magic. Every last character in a line of code has meaning, and there are no concepts that you “just have to memorize,” as a Chemistry major once told me as he was describing how awful Organic Chemistry is. In programming, everything can be explained down to the level of transistors transferring electricity.
Those few who actually understand computer science feel a sense of geek-superiority over the rest of the Dartmouth campus. Certainly we anticipate a quick payoff for our hours of work. At 4 am in the basement of Sudikoff, I asked the TA – who, due to both dedication and his being paid by the hour, was still in Sudikoff with the ten students trying to finish the assignment – whether this late night would typify the rest of my time at Dartmouth. With a smile that said “you don’t even know how right you are,” he nodded.
“It’s three years of your life for a guaranteed six-figure salary,” he said. “It’s worth it.”
Even the computer science professors are arrogant, if a bit facetiously. “What did the computer science major say to the art history major?” my professor asked on the first day of class. He paused, then answered, smirking, “Can I get fries with my burger?”
Often enough, computer science majors get so caught up in this mindset that they find themselves unable to perform in classes that are not math- or engineering-based. A friend of mine who is planning to double major in computer science and engineering sciences told me that he tries to minimize the time he spends in classes he deems useless. Focusing on his international studies class in the previous term, he said recently, proved impossible.
“I just kept thinking, what am I doing here? And so I stopped doing the readings,” he explained.
But that attitude is changing, as a more diverse crowd of people flock to the major. What used to feel like an exclusive club has very recently begun to broaden to include people who are also interested in other disciplines. One girl I talked to decided to stop majoring in computer science and instead major in creative writing, but still kept computer science as her minor. These two disciplines seem to epitomize the unbreachable left-brain right-brain split, and yet she told me she felt they complemented each other nicely.
The numbers speak for themselves: one of my professors spent the whole first lecture trying to scare us into dropping the class because the enrollment was three times too high this term. Yet the room was still packed for the next class session.
“Mission: Scare people away from this class = fail,” a friend sitting next to me scribbled in his notebook.
A hacker friend at an Internet startup where I worked has a theory on why interest in computer science has skyrocketed over the past year: he blames The Social Network, the 2010 blockbuster film that glorifies the college years of hacker-extraordinaire Mark Zuckerberg, the founder of Facebook. While the movie probably is not solely to blame, it is true that computer programming has suddenly become cool. With the wild success of Facebook, Twitter, Foursquare, Instagram, and countless other companies founded and run by hackers, dropping out of school to be a tech entrepreneur seems to be slowly replacing corporate recruiting as the most sought-after outcome of an undergraduate education.
Right now, many Dartmouth hackers who saw themselves as a special breed aren’t too happy with the influx of new majors. If just anyone can hack, then it’s not quite as exclusive as we thought. Time will tell whether those lured by the get-rich-quick promises of hacking will endure the suffering required of computer science majors.
Back in the basement of Sudikoff, I put these grand thoughts in the back of my mind as I frantically tried to finish a computer program that could, given any actor’s name, print out the shortest distance between that actor and Kevin Bacon, using only their co-stars and the movies they have been in. That’s a pretty difficult problem, made even more difficult by the fact that it needed to be written in the esoteric programming language called Haskell. It’s not easy to describe why Haskell is so challenging – suffice it to say that it edges out “moments of inertia” from AP physics as the most difficult concept I’ve ever had to hold in my brain. The assignment was due the next day.
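The idea underneath that assignment is breadth-first search over a graph whose nodes are actors and whose edges are shared movies. Sketched in Python rather than Haskell (and with a simplified input format of my own invention, not the course’s), the core looks like this:

```python
from collections import deque

def bacon_distance(costars, start, target="Kevin Bacon"):
    """Breadth-first search: the fewest co-star hops from `start` to `target`.

    `costars` maps each actor's name to the set of actors they have shared
    a movie with (a hypothetical format; the real assignment built this
    graph from actor/movie data files).
    """
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        actor, dist = queue.popleft()
        for co in costars.get(actor, ()):
            if co == target:
                return dist + 1
            if co not in seen:
                seen.add(co)
                queue.append((co, dist + 1))
    return None  # no chain of co-stars connects the two actors
```

Because BFS explores the graph one hop at a time, the first time it reaches Kevin Bacon is guaranteed to be along a shortest path. The hard part of the assignment was not this idea but expressing it, queue and all, in Haskell’s purely functional terms.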
At 5 am, the TA announced that those of us that had entered the cramped lab at 9 pm and were still around had just completed a full 9-to-5 workday. So had he.
“Congratulations,” he said.
A couple of months later, as I walked into Sudikoff yet again, I knew that I had easily another 15 hours hunched over my MacBook before my latest lab assignment could be submitted. I thanked myself for having learned that starting early is not only recommended, but essential to surviving a computer science major.
I settled into the desk chair and heard the familiar whirr of computer fans and the sound of typing and the buzzing of the electric lights above my head. I was actually looking forward to those 15 hours. Maybe I really had gone a little insane.