Archive for December 2011

My students are awesome (part 2): cs100

CS100 is your basic CS2 course, providing an introduction to data structures, big O, and of course plenty of practice coding tricky problems. This course is a great favorite of mine to teach, mostly because of the office hours. It’s just fun to work with students who are starting to get serious about Computer Science; they are still capable of having fun just getting the computer to output the right stuff.

For extra credit, we gave students the opportunity to write a song about Computer Science. There were some super awesome submissions. One of my favorite lines is the refrain from this adaptation of Avril Lavigne’s Complicated (Noelle Suaifan):

Why’d you have to go and make Java so complicated?
I see the way you’re forgetting to import packages gets me frustrated
CompSci’s like this, you
And you code and your program implodes and takes forever to load
and you don’t insert that node and your code is the biggest mess
But promise me you’re never gonna switch your major to Art History
No, no, no

Two other great submissions:

Check out the complete list.

Pretty much all of these songs are about how difficult the assignments in CS100 are. Should I be concerned?



Using an Intermediate Network to Optimize Parameters in Backpropagation Neural Networks

This is a post not really about what I did, but about what three of the high-school students I worked with did. I met Gil and Claus in my automata class at GHP this summer. As part of the summer research project, they worked on developing their own kind of automata and ended up with something similar to a Neural Network. This fall, Gil, Claus, and Anirudh contacted me and proposed continuing that research, expanding the neural network connection, for the Siemens Science Competition. Although I agreed to advise them where I could, this project was done start-to-finish by Gil, Claus, and Anirudh.

What eventually came out of this project was an attempt to build a Neural Network capable of optimizing the parameters of other neural networks. As the paper says:

The fundamental goal of this project is to create a backpropagation neural network that can determine the optimal training parameters to train another neural network with a different goal based on only characteristics of the training data. Such a neural network has the capability of improving the accuracy and speed of the training process for backpropagation networks used for any application.

To accomplish this, Gil, Claus, and Anirudh researched neural networks, programmed their own network framework from scratch, and attempted to design a neural network that could optimize the parameters of other neural networks. Unfortunately, their result was negative: the intermediate network wasn’t able to optimize the parameters properly. In my opinion their approach still has merit, and whether or not they return to this particular project, I hope they weren’t too disappointed with the negative result. This was a very cool project that required a great deal of both math and Computer Science to pull off.
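To give a flavor of the idea (this is my own illustrative sketch, not the students’ framework): the “intermediate” network is just an ordinary one-hidden-layer backpropagation network whose inputs are characteristics of a training set and whose output is a suggested training parameter, such as a learning rate. Everything below — the dataset characteristics, the target mapping, and all constants — is synthetic and invented purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic examples: [dataset size, input dim, noise level] scaled to [0,1],
# mapped by a made-up rule to a "good learning rate" target in (0,1).
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = (0.5 * X[:, 0] - 0.3 * X[:, 2] + 0.5).reshape(-1, 1)

# Intermediate network: 3 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(3, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros(1)

lr = 1.0
losses = []
for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: gradients of mean squared error through both layers.
    n = X.shape[0]
    d_out = 2.0 * (out - y) / n * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0)

    # Plain gradient-descent update.
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In the real project the targets would have to come from expensive trial runs of the second network, which is part of what makes the problem hard.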



My students are awesome (part 1): CS149s


So one of the coolest traditions at Duke is CS149s. Officially, it’s called the problem solving seminar. Unofficially, it’s weekly preparation for the ACM ICPC competition. Students work hard in this course: every week they work through a problem set drawn from old ICPC competitions or TopCoder. It’s in no small part thanks to this class that Duke has gone to the ICPC international competition every year but one since 1994 (something like that anyway…don’t quote me on that).

Since I was coming to Duke for the first time and had never competed in ICPC myself, being handed the reins of this course was a little bit scary. But lucky for me I had two incredible TAs – Kevin Kauffman and Siyang Chen. They helped me a lot in understanding what kind of preparation would be important, and I like to think that I helped them with the pedagogical approach to teaching it. I definitely changed the course – I made topics more explicit, I required a minimum number of problems each week, and I added a new post-contest project.

This year, Duke continued its streak by placing first in its region. My winning team was Jie Li, Yuqian Li, and Joe Keefer. I would like to claim some partial credit for that, but I’m pretty sure it’s strictly their own natural awesomeness. What I am especially happy about is that every Duke team managed to get 3 problems this year – including teams with non-majors and folks in intro CS. Everyone worked this year and I think it showed.

Here are pictures of everyone and more about the competition.

The post-contest project we did was programming ants for the AI Challenge. If you’re curious, click here to watch the 8 Ant AIs we submitted duke it out in the Ultimate Ant Throwdown we held on the last day of class.