Thursday 13 December 2012

Exam was epic

Well, I can officially say goodbye to CSC236 after a quite interesting but also challenging term. Thank you to all the TAs, and in particular Lila (who had really cool shirts) and Lalla, who were very helpful with extra help on assignments. Thank you to everyone involved with this course, and of course to Prof. Heap for his lecture abilities!

Monday 3 December 2012

Last week of class!

Wow! Where has this semester gone?? Second year has flown by for me.

Well, our final week of lecture content consisted of using state elimination to convert an NFSA into a regular expression. The technique, although very procedural, is quite long, and I suppose I will just have to take my time during the exam to make sure I get it right.
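To make the elimination step concrete for myself, here is a small Python sketch (the dict representation and the `eliminate` helper are my own invention, not from lecture): removing a state turns every in-edge/out-edge pair through it into a direct edge labelled (in)(self-loop)*(out).

```python
def eliminate(trans, s):
    """Remove state s from a GNFA whose edges are labelled with regex strings.

    trans maps (p, q) pairs to the regex on the edge p -> q. Every path
    through s becomes a direct edge labelled (in)(self-loop)*(out), which
    is the elimination step; parallel edges get unioned with '|'.
    """
    self_loop = trans.get((s, s))
    loop = "(" + self_loop + ")*" if self_loop else ""
    ins = [(p, r) for (p, q), r in trans.items() if q == s and p != s]
    outs = [(q, r) for (p, q), r in trans.items() if p == s and q != s]
    new = {edge: r for edge, r in trans.items() if s not in edge}
    for p, r_in in ins:
        for q, r_out in outs:
            piece = r_in + loop + r_out
            new[(p, q)] = new[(p, q)] + "|" + piece if (p, q) in new else piece
    return new

# Tiny example: A --a--> B --c--> C, with a self-loop b on B.
gnfa = {("A", "B"): "a", ("B", "B"): "b", ("B", "C"): "c"}
print(eliminate(gnfa, "B"))  # {('A', 'C'): 'a(b)*c'}
```

Eliminating B collapses the machine to a single edge carrying the whole regex, which is exactly what repeating the step until only start and accept remain would give.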

With assignment 3 behind us now, I found a very cool concept for Q1 called the Pigeonhole Principle. I gotta say, this question drove me bonkers before I knew about this mathematical principle, but once I was told to research it online I had that amazing rush of finally getting somewhere with the question! Haha. The question asked us to prove that no DFSA with fewer than 16 states could exist for a certain language. The whole idea was that the machine needs to keep track of all 16 different binary strings of length 4, so if there are fewer than 16 states available, two different strings must end up in the same state. In short, the answer followed from the fact that the machine then drives those two strings to a single state that would have to be both accepting and rejecting, which gave us our contradiction.

Some interesting examples I found that helped me understand how the Pigeonhole Principle applies to Q1 are:


  • Softball team

    Imagine five people who want to play softball (n = 5 items), with a limitation of only four softball teams (m = 4 holes) to choose from. The pigeonhole principle tells us that they cannot all play for different teams. At least 2 must play on the same team.

  • Sock-picking

    Assuming that in a box there are 10 black socks and 12 blue socks, calculate the maximum number of socks needed to be drawn from the box before a pair of the same color can be made. Using the pigeonhole principle, to have at least one pair of the same color (m = 2 holes, one per color) using one pigeonhole per color, you need only three socks (n = 3 items). In this example, if the first and second sock drawn are not of the same color, the very next sock drawn would complete at least one same-color pair (m = 2).

    (Cited from http://en.wikipedia.org/wiki/Pigeonhole_principle)
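The same counting idea drives the Q1 argument. As a sketch (the 3-state machine below is made up purely for illustration), feeding all 16 length-4 binary strings into a DFSA with fewer than 16 states must send two different strings to the same state:

```python
from itertools import product

def run(dfa, start, s):
    """Run a DFA, given as a dict mapping (state, symbol) -> state."""
    state = start
    for ch in s:
        state = dfa[(state, ch)]
    return state

# A made-up 3-state machine over {0,1}, just to watch the pigeonhole bite:
dfa = {(q, c): (q + int(c)) % 3 for q in range(3) for c in "01"}

seen = {}  # end state -> first string that landed there
for bits in product("01", repeat=4):
    s = "".join(bits)
    q = run(dfa, 0, s)
    if q in seen:
        print(s, "and", seen[q], "end in the same state", q)
        break
    seen[q] = s
```

With 16 strings (pigeons) and fewer than 16 states (holes), a collision is guaranteed no matter how the transitions are wired, which is what lets the proof force a state to be accepting and rejecting at once.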

Monday 26 November 2012

Epsilon: The Magic Maker?

This week in lecture we explored regular expressions more closely. We are now constructing entire languages through building simple patterns and logic!

But one thing that caught my eye was that measly epsilon symbol that seems to break all the rules! Because we are now studying NFSAs and not just DFSAs, there are major differences I will have to get used to, but epsilon still seems to just confuse things. Like, what tells the machine to take the epsilon transition? It seemed more like a probability class the way Prof. Heap showed it to us! Haha. But really, I guess the correct way to look at epsilon transitions is that we must examine all possible paths, given that these transitions may or may not be taken, and from that we can decide whether a string is accepted by the machine according to the language it represents.
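To convince myself how the "examine all possibilities" view works, here is a rough Python sketch (my own dict representation, not the lecture's notation) that tracks the whole set of states the machine could be in, closing it under epsilon transitions after every input symbol:

```python
def epsilon_closure(states, eps):
    """All states reachable from `states` using only epsilon transitions."""
    stack, seen = list(states), set(states)
    while stack:
        s = stack.pop()
        for t in eps.get(s, ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def nfsa_accepts(s, delta, eps, start, accepting):
    """Accept iff some path over s (epsilon moves included) ends in an accepting state."""
    current = epsilon_closure({start}, eps)
    for ch in s:
        moved = set()
        for state in current:
            moved |= delta.get((state, ch), set())
        current = epsilon_closure(moved, eps)
    return bool(current & accepting)

# Toy NFSA for the language a*b*: state 0 loops on 'a', an epsilon
# transition jumps to state 1, which loops on 'b'.
delta = {(0, "a"): {0}, (1, "b"): {1}}
eps = {0: {1}}
print(nfsa_accepts("aabb", delta, eps, 0, {1}))  # True
print(nfsa_accepts("ba", delta, eps, 0, {1}))    # False
```

So the machine never "decides" to take an epsilon transition; we just keep every state it could possibly be in, and accept if any of them is accepting at the end.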

I wish I could use epsilons to show my transitions to how I get answers on tests! HA... yea not funny.


Sunday 18 November 2012

Woo Fall Break! Sleep!

Fall break was nice and relaxing... finally I could sleep at a normal time and not worry too much about 236 assignments!

Anyways, we started into the automata and languages part of the course which I believe is the "final frontier" of 236. This is a pretty interesting concept and it ties nicely into our 207 class at this point since we just started studying regular expressions in Java.

An interesting example that Prof. Heap went over was the pop machine. How simple machines can be described using the concepts of states and transitions is really cool, and it's just another way to look at programming. Elevators and other machines we use in our daily lives run on computers with very limited CPU power and memory, so I understand why we try to design these machines with as few states as possible.
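A rough sketch of the pop machine idea in Python (the price and coin denominations are made-up values, not from lecture): the finite states are the possible running totals, capped at the price, so the machine only ever needs to remember five amounts.

```python
# Hypothetical pop machine: states are cents inserted so far, capped at
# the price, so the state set is just {0, 25, 50, 75, 100}.
PRICE = 100
COINS = (25, 100)

def pop_machine(inputs):
    state = 0
    for coin in inputs:
        if coin in COINS:                   # ignore anything that isn't a coin
            state = min(state + coin, PRICE)  # cap keeps the state set finite
    return state >= PRICE                    # accepting state: enough for a pop

print(pop_machine([25, 25, 25, 25]))  # True
print(pop_machine([25, 25]))          # False
```

Capping the total is what keeps the state set finite: the machine doesn't care whether you overpaid, only whether you've reached the price.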


Sunday 11 November 2012

Wait, what just happened?

So we wrote term test 2 on Monday, and to be completely honest I felt as though I wasn't really ready to write it, even after studying the entire weekend before. For some reason time complexity isn't clicking for me this year the way it did last year in CSC165; perhaps it's because we now use induction instead of the purely direct proofs from last year?

A question on the midterm that I know I didn't do well on was very closely related to the second question on A2, which I think would be valuable to go through now:

For some context, the part of the question relevant to my time complexity trouble was to show that the closed form for binary search is in O(lg n). Let's show that now...

A closed form of the recurrence for binary search is: T(n) = lg n + 1

To show that T ∈ Θ(lg n), one can demonstrate 1) T ∈ O(lg n) and 2) T ∈ Ω(lg n).

Using the fact that T(n) has been proved to be a non-decreasing function, I will use a lemma discussed in lecture (http://www.cdf.toronto.edu/~heap/236/F12/Lectures/W6/thursday-annotated.pdf, page 6): when n is not necessarily an integer power of 2, define ň = 2^⌈lg n⌉; then ň/2 < n ≤ ň.

So for part 1 I will show T ∈ O(lg n).
Claim: T(n) ≤ c·lg n for all n ≥ B.

T(n) ≤ T(ň) # because we proved T is non-decreasing and n ≤ ň
        = lg ň + 1 # closed form of T
        ≤ lg(2n) + 1 # because ň < 2n (from ň/2 < n, the fact above)
        = (lg n + lg 2) + 1 # by log rules
        = (lg n + 1) + 1
        = lg n + 2
T(n) ≤ lg n + 2

Thus T ∈ O(lg n): since lg n + 2 ≤ 2·lg n whenever lg n ≥ 2, we can take c = 2 and B = 4 (breakpoint).

Now for part 2, T ∈ Ω(lg n).
Claim: T(n) ≥ c·lg n for all n ≥ B.
T(n) ≥ T(ň/2) # because we proved T is non-decreasing and ň/2 < n
        = lg(ň/2) + 1 # closed form of T (equality here, not ≥)
        = (lg ň − lg 2) + 1 # log rules
        = lg ň − 1 + 1
        = lg ň
T(n) ≥ lg n # because ň = 2^⌈lg n⌉ ≥ 2^(lg n) = n, so lg ň ≥ lg n

Thus T ∈ Ω(lg n) with c = 1 & B = 1 (breakpoint).

Wow, now we're done!
By showing that T is in both O(lg n) and Ω(lg n), it follows that T ∈ Θ(lg n)!
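Out of curiosity, the bounds above can be sanity-checked numerically. This sketch assumes the usual binary-search recurrence T(1) = 1, T(n) = T(⌊n/2⌋) + 1:

```python
import math

def T(n):
    """Worst-case binary-search cost from the recurrence T(1) = 1, T(n) = T(n // 2) + 1."""
    return 1 if n == 1 else T(n // 2) + 1

# The sandwich from the proof: lg(n) <= T(n) <= lg(n) + 2 for every n >= 1.
for n in range(1, 5000):
    assert math.log2(n) <= T(n) <= math.log2(n) + 2

# At exact powers of two, the closed form lg(n) + 1 holds with equality.
for k in range(12):
    assert T(2 ** k) == k + 1
```

Seeing the inequality hold for thousands of values of n obviously isn't a proof, but it's a nice check that the algebra above didn't go sideways.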

I dunno why this simple kind of proof messed me up during the term test. I blame it on the time constraint, or maybe I was just having a bad day! haha

Saturday 3 November 2012

Golden Ratios... Golden Ratios Everywhere


Something I found very fascinating during my work on Assignment 2 was the appearance of the "golden ratio" in many different aspects of math, finance, music, architecture... the list goes on!

As mentioned in class, deriving this ratio is not that hard in this day and age, but according to many research documents its first usage can be traced back as early as Phidias (490–430 BC)!
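Since A2 had us derive the ratio, here is a quick numerical sketch of one well-known connection: ratios of consecutive Fibonacci numbers converge to the golden ratio, the positive root of x² = x + 1.

```python
# The golden ratio is the positive root of x^2 = x + 1.
phi = (1 + 5 ** 0.5) / 2  # approximately 1.6180339887...

# Ratios of consecutive Fibonacci numbers converge to phi.
a, b = 1, 1
for _ in range(40):
    a, b = b, a + b
print(b / a)             # approximately phi
print(abs(b / a - phi))  # tiny: the ratios home in fast
```

After 40 steps the ratio agrees with φ to machine precision, which is why the closed form for Fibonacci-style recurrences ends up full of golden ratios.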


I believe that because our A2 is now past due, I can discuss some of the methods I used to derive the golden ratio relationship in question 1 of the assignment:


At first glance at the OMCOFS, I just started listing all the possibilities, beginning with the empty binary string of length 0 ("") and continuing until a pattern began to form in my head:

Length (n)   OMCOFS of that length
0            {""} = 1
1            {0} = 1
2            {00, 11} = 2
3            {000, 011, 110} = 3
4            {0000, 0011, 0110, 1100, 1111} = 5

It is clear from this list that something fishy is going on with how the number of OMCOFS grows as the string length grows by 1. After some careful observation, I claimed that the growth is basically the Fibonacci sequence offset by 1. After the first two base cases, the count for a given length is the sum of the counts for the previous two lengths; for example, for length 4, H(2) + H(3) = H(4). Developing a recurrence from this point was pretty trivial, and it is basically the Fibonacci sequence with different base cases: H(0) = 1, H(1) = 1, and H(n) = H(n−1) + H(n−2) for n ≥ 2.
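To double-check the claim, here is a small brute-force sketch. Big caveat: I'm guessing at the OMCOFS definition from my lists above (every maximal block of 1s has even length); the handout's actual wording may differ.

```python
from itertools import product

def brute_count(n):
    """Count length-n binary strings whose maximal blocks of 1s all have
    even length -- my reading of OMCOFS, matching the table above."""
    total = 0
    for bits in product("01", repeat=n):
        blocks = "".join(bits).split("0")  # pieces between 0s are the 1-blocks
        if all(len(b) % 2 == 0 for b in blocks):
            total += 1
    return total

def H(n):
    """The recurrence: Fibonacci with base cases H(0) = H(1) = 1."""
    return 1 if n < 2 else H(n - 1) + H(n - 2)

# Brute force agrees with the recurrence for every small n.
for n in range(12):
    assert brute_count(n) == H(n)
print([H(n) for n in range(8)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```

Under that reading, the bijection behind the recurrence is clean: a valid string either ends in 0 (drop it, length n−1) or ends in 11 (drop both, length n−2).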


Now, after this moment of triumph, I began to have some issues unwinding the recurrence: every time I unwound it, it began to EXPLODE with terms that I couldn't reduce enough to find the connection to Fibonacci. Only after some guidance from the wonderful Computer Science Aid Center TAs was I shown that perhaps we shouldn't unwind blindly, and should instead try unwinding one of the recursive calls at a time... That was enough for me to eventually unwind far enough and find the Fibonacci connection!

Once I had unwound my recurrence and shown that there certainly is a closed form for the H(n) function, namely the Fibonacci sequence offset by 1, proving it was basically a piece of cake, since this was done both in class and in the lecture notes!!


Overall I had a much better experience with this assignment than with the first; I felt more prepared for this one. I believe I just needed to be "warmed back up" into logic courses, since the last time I did something like this was my first semester of university last year. This assignment, although only two questions, required abstract thinking ("All I need to mention is how abstract the definition of an OMCOF is! haha"), and I believe it was very beneficial to my understanding of the course content.


Well, thank you for sitting through my quick overview of my take on question 1 of Assignment 2. I sure hope that others share my approach to it, since I believe I was right... maybe I was wrong? Who knows until the marks come out :P

Tuesday 23 October 2012

Moving on up

I titled my previous post "Well into the "Good Stuff"". What I mean by the "Good Stuff" is almost exactly what happened near the end of CSC165: we have begun to move into actually proving the correctness of programs and their running-time complexity. I always have an issue near the beginning of these logic courses, where I lose track of what the end game is... optimizing our programs and recognizing places for improvement!

I look forward to learning more about Big O and other time complexity material, as this is what I found most interesting in CSC165, and now I realize why induction is stressed so heavily at the beginning of the course.

All the best!