If an algorithm is itself a model for understanding how to accomplish a task, the difficulty lies in creating a model to represent something that is already a model, just as I found when trying to abstract an abstraction. Not surprisingly, the two skills of modelling and abstraction rely on each other, since a model is a useful abstraction. To accomplish this goal, I’ve decided to focus on the transition from algorithm to implementation, or, in computing, creating a program based on an algorithm. Part of this process is carried out by a person and the rest by a computer, so it lends itself well to computational thinking.
To gain a better understanding of how an algorithm is transformed into a program, I created a model that views the process across physical scale, from the large to the very small; across the transition from digital to physical; and across changes in logic, from the abstract to the specific. It elucidates what occurs behind the scenes when a program is executed, and shows how the layers of abstraction in a computer give humans the ability to instruct machines intuitively.
Along with the graphic on where algorithms fall on a problem-solving spectrum, I view these models as a way to give students and visitors context on where algorithms fit into the process of creating solutions. Until this assignment, I used the terms “algorithm” and “program” interchangeably, even after more than a month of focusing on this topic. By taking a closer look at the differences, in effect zooming in and out in level of detail, I was able to distinguish between the two, and I hope the model can help others do the same. Students I work with are often eager to get their hands on the tool we are using, whether physical or software, and slowing down long enough to create a plan (an algorithm, in this context) would benefit them.
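To make the distinction concrete, here is a small hypothetical example (not from the original post): the same algorithm for finding the largest number in a list, written first as plain-language steps and then as one possible Python program implementing those steps.

```python
# The algorithm, expressed as plain-language steps:
#   1. Assume the first number is the largest so far.
#   2. Look at each remaining number in turn.
#   3. If a number is larger than the largest so far, remember it instead.
#   4. When no numbers remain, the remembered number is the largest.

# The program: one concrete implementation of that algorithm.
def find_largest(numbers):
    largest = numbers[0]           # step 1
    for number in numbers[1:]:     # step 2
        if number > largest:       # step 3
            largest = number
    return largest                 # step 4

print(find_largest([3, 41, 12, 9, 74, 15]))  # prints 74
```

The steps are the algorithm; they could just as easily be carried out by hand or written in another language. The function is a program: one particular implementation, tied to the details of Python.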
I also find that students and the general public treat what happens inside a computer as a black box, beyond their ability to understand. I hope this model addresses that misconception as well. We use algorithms partly because they are expressed in a way anyone can understand, and because they avoid details tied to a particular implementation. Everything in the algorithm can be traced down through increasingly detailed instructions, so demystifying a computer is simply a matter of becoming familiar with its inner workings. Experienced programmers may focus on one level of abstraction when problem solving, but understanding the lower levels can also help when deciding how to implement an algorithm.
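As a hypothetical illustration of tracing one level down (again, not from the original post): Python’s standard `dis` module shows the lower-level bytecode instructions the interpreter actually executes for a short function, one layer beneath the code a person writes.

```python
import dis

# A one-line function at the level of abstraction a person writes.
def add(a, b):
    return a + b

# Disassemble it to see the more detailed instructions underneath.
# The exact instruction names vary by Python version (e.g. BINARY_ADD
# in older versions, BINARY_OP in newer ones).
dis.dis(add)
```

Each line of the disassembly (loading a value, adding, returning) is itself carried out by still lower layers, down to the hardware, which is the sense in which nothing in the stack needs to stay mysterious.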
Edit 11/25/14: Just a few days after posting this, I came across this excellent video from MIT on how computers work, using the Digi-Comp II from EvilMadScientist: