Wednesday, April 2, 2014

Sorting and Efficiency


Well, it's that time again. I'm glad to see that you've all survived the broiling maelstrom that was March. I would love to make small-talk about the weather, but I'm driven by a purpose today.
This week we’ve been focusing on sorting and efficiency. Sorting is an interesting problem, because it has a huge number of solutions despite its apparent simplicity. The simple algorithms that we learned about in 108 (bubble sort, selection sort, insertion sort) perform decently well when the list is close to sorted (at least bubble sort and insertion sort do; selection sort insists on making all of its comparisons regardless), but their running time grows quadratically, O(n²), in the average and worst cases. This is because these algorithms perform a number of redundant operations over and over again. Bubble sort, for instance, works by comparing adjacent elements and swapping them if they are in the wrong order. Even in small lists, this means that the algorithm makes several passes and repeats many of the same comparisons. The larger the list, the more redundant comparisons are made and the worse the performance.

More enlightened algorithms such as quicksort, heap sort and merge sort use different methods to avoid this quadratic growth when sorting large lists. Merge sort, for example, uses a divide and conquer approach, wherein it starts from very small (one-element) sorted sublists that are then merged. This is one method of circumventing redundant comparisons. The result is a good O(n log n) time complexity for both the average and worst case.

Quicksort has the same complexity for the average case, but something interesting happens in the worst case. Quicksort operates by choosing pivot values, and putting smaller values to the left of the pivot and greater values to the right. In many cases this proves to be faster than the other O(n log n) algorithms, but with a naive pivot choice (say, always picking the first element) a list that is already sorted or reverse-ordered is the worst case. Almost every element will be a pivot at some point and be compared to nearly every other element, resulting in a poor O(n²) complexity.
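To convince myself I actually understood the divide and conquer idea, I wrote out a rough merge sort of my own. This is just my sketch in Python, not anything from lecture, and real library sorts are far more refined:

```python
# A rough merge sort sketch: split the list in half, sort each half
# recursively, then merge the two sorted halves back together.
def merge_sort(lst):
    if len(lst) <= 1:                  # a list of 0 or 1 elements is already sorted
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])            # at most one of these still has items
    result.extend(right[j:])
    return result

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```

Each element takes part in roughly log n levels of merging, which is where the O(n log n) comes from.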
Anyway, this whole efficiency thing has made me reconsider some of my programming habits. While my “if it hurts, it works” way of doing things has been effective enough thus far, I imagine in the future I will have to focus more on how my program works, rather than just acknowledging that it works. So far I’ve been satisfied with achieving the right results, but I know that in the real world the efficiency with which my program operates will determine my degree of success. Or perhaps improvements in processing power and memory will be so great by the completion of my degree that efficient code will be obsolete; an artifact of a more austere past. Perhaps we will even have code that writes itself… sloppy, inefficient programs making sloppier, more inefficient programs. It’s truly a utopian dream.

Friday, February 28, 2014

Week 7 - Recursion



Recently there was a popular movie called Inception, starring some very famous people. I’m not going to talk about that at all, because it’s super obvious and it’s been done to death.

Now, recursion is a little unsettling at first. How can we have a definition that references itself? The Oxford English Dictionary would never define “pointless exercise” as “an exercise which has no point”. Our intuition tells us that we need to enlist the help of some outside information to produce a workable definition. Not so with recursion. All we need is a base case, or terminating condition, which prevents us from going too far down the recursive rabbit-hole.
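To make that concrete, here is the most clichéd example I could think of (my own illustration, not anything from lecture). The recursive case restates the problem as a smaller version of itself, and the base case is where the self-reference finally stops:

```python
def factorial(n):
    if n == 0:                        # base case: stop recursing here
        return 1
    return n * factorial(n - 1)       # recursive case: a smaller version of the same problem

print(factorial(5))                   # 120
```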

Admittedly, it takes something akin to a leap of faith to write that first self-referencing function. And the first time it works it’s a little bit magical. But with a little bit of tracing it’s not so hard to understand how recursive solutions work, and to see that they are both powerful and elegant. This stems from the fact that many of the problems we face in computer science are inherently composed of similar nested sub-problems that can be reduced to base cases. The tool we used when confronted with these repetitive problems in the past was iteration, which is only a tenable solution when the nesting is reasonably shallow and the conditions predictable. When the nesting becomes deeper and more branching, an iterative solution becomes prohibitively complex, and recursion is much more favourable. In the case of binary trees, for instance, a recursive solution allows us to traverse the data therein without knowing in advance exactly how deep the nesting goes, which an iterative solution would need to know. See What's Gabbening for an exciting and enlightening picture of such a tree. As long as a recursive function has a correct base case at which it terminates, it could perform operations on an almost arbitrarily deep and branching tree, with computing power being the only limiting factor.
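Here’s the kind of thing I mean: a toy binary tree and a recursive function that adds up its values without caring how deep the tree goes. This is just my own sketch using nested dictionaries, not the tree class from lecture:

```python
# A toy binary tree as nested dicts; None marks a missing child.
tree = {
    'value': 4,
    'left': {'value': 2, 'left': None, 'right': None},
    'right': {'value': 7,
              'left': {'value': 6, 'left': None, 'right': None},
              'right': None},
}

def tree_sum(node):
    if node is None:                  # base case: an empty subtree contributes nothing
        return 0
    return (node['value']
            + tree_sum(node['left'])  # recurse down, however deep it goes
            + tree_sum(node['right']))

print(tree_sum(tree))                 # 19
```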

Recursion has so profoundly changed my worldview that I’m even using it when I don’t have to. For instance in this week’s lab involving linked lists I proudly flaunted my recursive solution, only to find that apparently there was a non-recursive solution that was easier to implement.  I’m still a little indignant. 
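I don’t remember exactly which function the lab wanted, so here’s just the flavour of the contrast, on a bare-bones list class of my own invention. The recursive version reads nicely, but I’ll grudgingly admit the loop is nothing to be scared of either:

```python
class Node:
    """A minimal linked list node, purely for illustration."""
    def __init__(self, value, next_=None):
        self.value = value
        self.next = next_

def length(node):
    """My proudly recursive version."""
    if node is None:                  # base case: the empty list
        return 0
    return 1 + length(node.next)

def length_iter(node):
    """The apparently-easier iterative version."""
    count = 0
    while node is not None:
        count += 1
        node = node.next
    return count

lst = Node(1, Node(2, Node(3)))
print(length(lst), length_iter(lst))  # 3 3
```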

Anyway, I’m going to repeat this joke here because probably no one saw it a few weeks ago. I think it’s classic.

What’s Benoit B. Mandelbrot’s middle name?

Benoit B. Mandelbrot.

Friday, February 14, 2014

Week 5



 
Well, fellow Sloggers, I’m sure you need no introduction to the ordeal that was A1. Talk has turned now to the notions of trees and nodes. Although recent lectures have been a bit of a blur, owing mainly to the midterm-season malaise fogging my brain, one point has stuck: the mantra, “a node cannot have two parents”. If you’re in the Monday/Wednesday morning section at 10, you know what I’m talking about.

Yeah now I’m just stalling.

Anyway, I’ve been enjoying the labs more than I thought I would. This week we learned about two built-in functions, zip and filter, that I wish had been on my radar before now. I can remember numerous times that I had to write such functions longhand, with for loops and if statements and all that. Now it seems like a waste of time. The natural next step was to see what other sort of time-saving (line-saving?) functions already exist in Python for me to abuse and exploit. There is a function called “reversed” that walks a given sequence back to front. I’m sure that would have been handy at some point. Even more potentially useful is the function “map”, which applies a given function to every item of a given iterable. That could have saved me innumerable lines of code over my lengthy programming career. Probably, I don’t know, a dozen or two at least.
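Here’s the quick tour I typed out to convince myself (my own toy lists, obviously):

```python
words = ['slog', 'week', 'recursion']
lengths = [4, 4, 9]

# zip pairs up items from two iterables
print(list(zip(words, lengths)))                  # [('slog', 4), ('week', 4), ('recursion', 9)]

# filter keeps only the items for which the function returns True
print(list(filter(lambda w: len(w) > 4, words)))  # ['recursion']

# map applies a function to every item
print(list(map(str.upper, words)))                # ['SLOG', 'WEEK', 'RECURSION']

# reversed walks a sequence back to front
print(list(reversed(words)))                      # ['recursion', 'week', 'slog']
```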

Right, I started this by talking about trees and nodes. It seems like I’m never going to be able to get away from this recursion stuff. And honestly I’m not sure I’d want to. Looking at these recursive trees blossom is pretty enthralling. Not that I’d admit it to my friends. 

Uhhh....

Monday, February 3, 2014

Week 4



Well a whole week has passed since my last Slog thing. I can barely recognize the person I was back then. I have a feeling that staying regular with these things will require a herculean effort. But I’ll try. I sure will. Yup.

They say that we’re afraid of that which we don’t understand. And it’s true that, for a long time, exceptions both baffled and terrified me. But this week we pulled back the curtain a bit, and they’re not quite so menacing anymore. After being on the receiving end of their curt disapprobation for so long, it’s gratifying to finally be telling someone else that they messed up. It’s also good to have some sort of idea what the wall of text Python spits out at me means. I imagine in a statistical sense there are more wrong ways to use code than correct ways, so I’m glad exceptions exist to hold anarchy at bay.
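For the record, here’s a made-up example of both sides of the exchange: catching an error someone else raised, and raising one of my own so that I get to do the disapproving for once:

```python
def safe_divide(a, b):
    try:
        return a / b
    except ZeroDivisionError:         # being on the receiving end, gracefully
        print("You can't divide by zero, friend.")
        return None

print(safe_divide(10, 0))             # prints the scolding, then None

def withdraw(balance, amount):
    """Finally, I get to be the disapproving one."""
    if amount > balance:
        raise ValueError("Insufficient funds: you messed up.")
    return balance - amount

try:
    withdraw(20, 50)
except ValueError as error:
    print(error)                      # Insufficient funds: you messed up.
```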

Also it’s pretty exciting to finally learn what all this recursion stuff is about. A while ago if you’d asked me, I probably would’ve told you that a function that calls itself was impossible at best, and irresponsible to the stability of the universe at worst. At risk of collapsing space as we know it, I’m starting to see how useful a function with such behaviour could be. It reminds me of a joke I once heard:

What’s Benoit B. Mandelbrot’s middle name?

Benoit B. Mandelbrot.

Wednesday, January 22, 2014

Week 3 - Object-Oriented Programming



I’ve heard the term ‘Object-oriented programming’ thrown about for some time now, but I’ve never taken the initiative to actually research the concept. Everyone I knew in high school who took that sort of initiative in computer science seemed to end up at Waterloo for some reason. Take that how you will. But now that I’ve given it some effort (albeit under considerable duress), I’m starting to see the broader layout of programming unfold before me.
Wikipedia tells me that object-oriented programming is one of five major “paradigms”, or fundamental methods of structuring a computer program. Fun fact: the word ‘paradigm’ was languishing in obscurity until Thomas Kuhn resurrected it in his 1962 book The Structure of Scientific Revolutions. Since then it has been co-opted by a startling variety of other fields, with computer science obviously among their ranks.
But I digress. An object is an abstract data type that contains within it both data and behaviour. For instance, a class called Box could have a list of attributes corresponding to its contents, and some methods to add contents, steal contents, or anything you can imagine. It is the interaction of these objects that produces the function of the overall program. Encapsulating information in this way avoids “spaghetti code”, wherein “go to” statements that jump around within the program make the code unreadable. Furthermore, this compartmentalization helps ensure that changes to one object do not ripple into others.
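Something like this hypothetical Box class, which is entirely my own invention and not from any lecture:

```python
class Box:
    """A box that holds some contents."""

    def __init__(self):
        self.contents = []            # the data lives inside the object...

    def add(self, item):
        """...and so does the behaviour that manipulates it."""
        self.contents.append(item)

    def steal(self, other):
        """Take everything out of another Box."""
        self.contents.extend(other.contents)
        other.contents = []

mine, yours = Box(), Box()
yours.add('chocolate')
mine.steal(yours)
print(mine.contents)                  # ['chocolate']
print(yours.contents)                 # []
```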
Some people who know a lot more about this than I do criticize OOP for trying to reduce everything to classes. Steve Yegge likens it to putting too much emphasis on the nouns in English. Paul Graham says that OOP is popular in part because it prevents mediocre programmers from “doing too much damage”. As a mediocre programmer I can’t help but feel slighted by that.
Or so Wikipedia tells me. To be honest, I’m taking what I read in this article with a grain of salt. Although I have to give the author some credit for the use of some pretty florid language. I particularly liked the phrase “object-oriented technology often gets mired in the weeds”. I think the figurative language really highlights the symbolic nature of programming in general; things representing other things, right?
I’d like to take this opportunity to apologize for the generally sarcastic, tangential, and bombastic tone of this entry, and for any mental anguish caused to the reader. I think it stems from my deeper aversion to earnest self-expression, probably developed in early childhood. It’s not your fault. Thanks for reading.