Friday 4 April 2014

Week 12: Constant Time Access and Review

This week, we briefly went over constant time access on Monday, then did review on Wednesday. I also got back my mark for term test 2. Even with the 3-mark boost I got, I did not do as well as I would have liked. I wish I had spent more time on linked lists, because even though I understood the examples, I didn't write much of the code on my own. I need to practice writing recursive functions, especially ones involving linked lists, binary trees, and other ADTs.

It's interesting to see other students' takes on certain topics in the course, because sometimes just the wording, or the example they present, makes a concept easier to understand. For example, I was having difficulty understanding the lab involving binary search trees, so I looked through the sLOGs and found this one: http://codinginmotion.blogspot.ca/. This person did an overview of rec1, rec2, and rec3 and included the code for one of the functions. It didn't even use recursion! Sometimes I get so caught up in all of the new concepts we learn in class that I forget my basics. For instance, in exercise 4, we wrote functions for binary search trees, but we had to write them without using recursion. Modelling my functions after the given insert function, I was able to figure out e4a and e4b using a while loop for each.
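
For the record, an iterative insert looks something like this (a sketch from memory; the exercise's actual class and attribute names may have differed):

    class BTNode:
        """A node in a binary search tree."""
        def __init__(self, data, left=None, right=None):
            self.data, self.left, self.right = data, left, right

    def insert(root, data):
        """Insert data into the BST rooted at root, without recursion,
        and return the root of the resulting tree."""
        if root is None:
            return BTNode(data)
        current = root
        while True:  # walk down until we find an empty spot
            if data < current.data:
                if current.left is None:
                    current.left = BTNode(data)
                    return root
                current = current.left
            else:
                if current.right is None:
                    current.right = BTNode(data)
                    return root
                current = current.right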

I don't think I personally will be keeping a sLOG after this post, but I will definitely refer back to others' blogs to see their take on each topic. It will help me with studying and my future endeavors in computer science. With that said, it's been a fun ride, CSC148. Now to take on the final exam!

Monday 31 March 2014

Week 11: Sorting and Efficiency

This week in lecture and in the lab, we looked at sorting algorithms and their efficiency. The algorithms we looked at were insertion sort, selection sort, bubble sort, quick sort, and merge sort. We compared their efficiency to timsort (Python's built-in sorting method), but only by run time, since its code was not provided.
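
All of the timings below are rough wall-clock measurements. Here is a minimal sketch of the kind of harness that produces them (my own reconstruction, not the lab's actual code):

    import random
    import time

    def time_sort(sort, n=2000):
        """Return roughly how long sort takes on a shuffled n-item list."""
        L = list(range(n))
        random.shuffle(L)
        start = time.perf_counter()
        sort(L)
        return time.perf_counter() - start

With that in mind, here are some observations I've made about each of the algorithms: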
  
Insertion Sort
There were two versions of insertion sort, the second of which included a call to "del." This version ended up being faster than the first one: on a 2000-item list, insertion_sort_1 took 0.44 seconds while insertion_sort_2 took only 0.26 seconds. In insertion_sort_2, the algorithm looks for the insertion point, and once it is found, the item to be sorted is stored in a variable and deleted from the list. This makes the run time faster because, by the time the insert method is called, the list is one item shorter, so the item can be put in the right spot while moving fewer elements.
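
From memory, the del version went something like this (a sketch, not the exact lab code):

    def insertion_sort_2(L):
        """Sort list L in non-decreasing order."""
        for i in range(1, len(L)):
            value = L[i]
            # scan left through the sorted prefix for the insertion point
            j = i
            while j > 0 and L[j - 1] > value:
                j -= 1
            del L[i]            # the list is now one item shorter...
            L.insert(j, value)  # ...when insert puts the item in place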

Selection Sort
Selection sort is interesting because it is always O(n^2): it scans the entire unsorted section on every pass to find the smallest remaining item, whether or not the list is already sorted. Compared to the other algorithms, it is faster than bubble sort, but slower than insertion_sort_2. In the lab, my partner and I discussed ways of making this algorithm faster, and we discovered that since the while loop called "len" during each iteration, the run time was a little slower. To fix this problem, we assigned len(L) to a variable before the while loop so that "len" would only be called once.
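
A sketch of that fix, with len hoisted out of the loop (my reconstruction, not the lab's exact code):

    def selection_sort(L):
        """Sort list L by repeatedly selecting the smallest remaining item."""
        n = len(L)  # call len once instead of on every iteration
        i = 0
        while i < n:
            smallest = i
            for j in range(i + 1, n):  # scan the unsorted section
                if L[j] < L[smallest]:
                    smallest = j
            L[i], L[smallest] = L[smallest], L[i]
            i += 1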

Bubble Sort
Bubble sort was by far the slowest of the algorithms we tested. Although there were two versions of it included in the lab, both took approximately 0.7 seconds on a 2000-item list. Bubble_sort_1 simply makes swaps until the largest item is at the end of the list, which becomes the sorted section. Bubble_sort_2 checks whether there were any swaps on the previous scan of the list, and if not, the algorithm stops early. It was interesting to see that the second version was not actually faster, as I had expected. Looking over the code, I think the culprit is the extra check for swaps: it's one more step on every pass, and on a worst-case (reverse-sorted) list the early exit never triggers, so the check is pure overhead and this version ends up slower than bubble_sort_1.
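
Here is roughly what the early-exit version looked like (a sketch; the lab's code differed in its details):

    def bubble_sort_2(L):
        """Sort L, stopping early if a full pass makes no swaps."""
        end = len(L) - 1
        swapped = True
        while end > 0 and swapped:
            swapped = False  # the extra bookkeeping the worst case pays for
            for i in range(end):
                if L[i] > L[i + 1]:
                    L[i], L[i + 1] = L[i + 1], L[i]
                    swapped = True
            end -= 1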

Quick Sort
Quick sort is a fairly fast sorting algorithm, but unlike most sorting algorithms, it's slower on sorted or nearly sorted lists. Since it partitions smaller elements to the left of the pivot and larger elements to the right, picking the first element of an already sorted list as the pivot leaves everything on one side, so the two partitions end up badly unbalanced. The algorithm works much faster when the two halves it recurses on contain a similar number of elements. Of the two versions of quick sort from the lab, the second has a faster run time because it doesn't make copies of the sublists; instead, it works with the original list throughout. The difference is noticeable, with quicksort_2 being 0.3 seconds faster than quicksort_1.
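
A sketch of the in-place idea behind quicksort_2, using the first element as the pivot (my reconstruction, not the lab's exact code):

    def quicksort_2(L, first=0, last=None):
        """Sort L[first:last + 1] in place."""
        if last is None:
            last = len(L) - 1
        if first >= last:
            return
        pivot = L[first]
        # partition: items <= pivot to the left, larger items to the right
        i, j = first + 1, last
        while i <= j:
            if L[i] <= pivot:
                i += 1
            else:
                L[i], L[j] = L[j], L[i]
                j -= 1
        L[first], L[j] = L[j], L[first]  # drop the pivot into its final spot
        quicksort_2(L, first, j - 1)
        quicksort_2(L, j + 1, last)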

Merge Sort
Merge sort works recursively by splitting the list into two halves, sorting those halves, then merging them together. We worked with two versions of merge sort: mergesort_1 works by creating new lists, while mergesort_2 continually updates the original list. The second method is faster (0.009599 seconds vs. 0.010302 seconds for mergesort_1). Similar to quick sort, this is because creating new lists each time costs extra time and memory, whereas working only with the original list is more efficient.
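
A sketch of the copying version (mergesort_1-style, from memory); mergesort_2 did the same merge but wrote the result back into the original list:

    def mergesort_1(L):
        """Return a new sorted list with the items of L."""
        if len(L) < 2:
            return L[:]
        mid = len(L) // 2
        left, right = mergesort_1(L[:mid]), mergesort_1(L[mid:])
        # merge the two sorted halves into a brand-new list
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]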

Although there are six sorting algorithms mentioned above, many more exist to do the same simple task: sort a list of numbers in non-decreasing order. This is one of the key aspects of computer science. Even though there may be a best way to do a task (in this case, Python's built-in method was the most efficient), it's important to notice that there are multiple ways to look at a problem and solve it.

Sunday 23 March 2014

Week 10: Sorting Algorithms

This week I finally finished A2. I went for help during office hours, and it gave me a lot of insight into how to use recursion and how to approach these problems by breaking them down. It helped me to write the docstring examples first, before writing the code. I didn't get regex_match to work fully, so I will have to go back and work on it.

We did sorting algorithms in lecture, and honestly they're still a little fuzzy to me. I understand how merge sort and quick sort work, but the implementations seem complicated. I'm going to look over the material and study for test 2 at the same time.

Week 9: Binary Trees

This week was probably the toughest one material-wise. I had difficulty getting recursion to work on E3, and I'm unsure how to write the functions for A2 part 2. I definitely need to ask for help when I need it. I think part of it is that I haven't taken enough time to review the material on my own, and I also need to get clarification on certain parts of A2. There are so many cases to take into consideration, especially with the regex_match function. That will likely be the most difficult one to deal with.

Saturday 8 March 2014

Week 8: Linked Lists

This week we did more on linked lists. We had done a lab on linked lists in week 7, and this week's lab was a continuation of it. My partner and I had a lot of trouble figuring out the different linked list methods; even though we drew pictures, we still had difficulties. Clearly this is an important concept that I need to figure out, not just for future assessments (like the final exam), but for my own understanding of computer science. I'm so used to the mechanical coding we learned in 108. This course goes deeper into problem solving, and that is something I need to work on.

Since I didn't finish either lab 6 or lab 7 during tutorial time, I want to work on them on my own. I believe that if I can do the labs by myself, I will have a solid understanding of linked lists and recursion.
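
To hold myself to that, this is the kind of method I want to be able to write unassisted (a sketch of a typical linked-list method; the lab's class and attribute names may have differed):

    class LLNode:
        """A node in a singly linked list."""
        def __init__(self, item, link=None):
            self.item, self.link = item, link

    def length(front):
        """Return the number of nodes in the list starting at front."""
        count, current = 0, front
        while current is not None:  # walk node to node until the end
            count += 1
            current = current.link
        return count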

Thursday 27 February 2014

Week 7: Recursion

So I've written about recursion before, but now I think I understand it a lot better. After reviewing the recursive functions from class examples repeatedly, I can confidently break down what recursion actually means.

Recursion involves a function calling itself. However, there is always a base case, where there is no recursion involved. For example, take a look at the nested_depth function:

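It went something like this (a sketch from memory of the class example):

    def nested_depth(L):
        """Return the nesting depth of L: 0 for an int, and
        1 plus the maximum depth of the elements for a list.

        >>> nested_depth(5)
        0
        >>> nested_depth([1, 2, [3]])
        2
        """
        if isinstance(L, list):
            return 1 + max([nested_depth(x) for x in L] + [0])
        else:
            return 0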

Here, the else clause is the base case: if L is an int, no recursion is involved, and 0 is returned. Recursion happens when L is a list, so if I have L = [1, 2, [3]], the function goes through the entire list before returning a value. Since L itself is a list, it contributes 1. The first element, 1, is not a list, so its depth is 0; 2 is also not a list, so it also gives 0. The next element, [3], is a list, so its depth is 1, and that 1 is the maximum among the elements. Adding the outer list's 1 gives a total of 2, so once the function has gone through the whole list, it returns 2.

Tracing through examples, starting with the base case, is very helpful because once I know the output of a simpler example, I can apply it to a more complicated example.

Sunday 16 February 2014

Week 6: Tree ADTs

This week we looked at Tree ADTs, and I think they helped me better understand recursion. I'm still having a little bit of trouble writing recursive functions myself, but attempting to write functions for preorder, postorder, and inorder traversals helped me in two ways: I practiced writing list comprehensions, and I deepened my understanding of recursion.
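
For example, preorder and postorder with list comprehensions came out something like this (a sketch; our actual Tree class may have named things differently):

    class Tree:
        """A tree with a value and a list of child subtrees."""
        def __init__(self, value, children=None):
            self.value = value
            self.children = children if children else []

    def preorder(t):
        """Return the values of t in preorder (root before children)."""
        return [t.value] + [v for child in t.children for v in preorder(child)]

    def postorder(t):
        """Return the values of t in postorder (children before root)."""
        return [v for child in t.children for v in postorder(child)] + [t.value]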

What really helped was this week's lab. It was relatively short, but we still took the whole time because we really wanted to understand the recursion behind the functions.

I am still a little confused about namespaces and scopes, but there was a post on http://aheapofpythons.wordpress.com/ that helped me understand them a bit better. I think I work by myself too much at times. In 108, I did all the assignments on my own, and I always did the in-class worksheets alone. In 148, the only time I ever work with people is during the labs. I've realized that working together helps a lot, and now that I'm reading other people's blogs, I will definitely make an effort to ask other people for help if I need it.
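
For instance, this little example (my own, not from that post) is the kind of thing that made local versus global scope click for me:

    x = 'global'

    def show_scope():
        x = 'local'  # this assignment creates a new x in the function's namespace
        print(x)     # prints 'local'

    show_scope()
    print(x)  # prints 'global'; the x inside the function never touched this one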