Tuesday 2 December 2014

Other slogs to check out

I know I don't usually do this, and to be 100% honest, it's for the feedback marking criteria. Anyways, definitely check out these slogs, because they've actually been really useful to read throughout the year, despite me not sharing them on this slog until now haha.

http://qhycsc165slog.blogspot.ca/ (I find that he explains things very clearly)
http://165choi.blogspot.ca/ (Also explains questions very well, however very long posts)
http://thesnailslog.blogspot.ca/ (Informative & rather entertaining to read)
http://sunnywangthebrilliantone.blogspot.ca/ (Brings up interesting things to consider)

Saturday 29 November 2014

Week 11.1 + 12

Just to add onto week 11, I finally understand how reduction works! Say we want to show some function is non-computable. The trick is to show that if that function *were* computable, we could use it as a building block to implement halt. Since we already know halt is not computable, the function we built it from can't be computable either. In other words, halt reduces to that function, and that drags the function down to non-computability with it.

Now for week 12, the good thing is that countability and induction are not on the exam, but we should learn them anyways because why not. Anyways, for countability, I found it interesting that such a precise definition allowed the set of natural numbers to be equal in size to the set of even natural numbers. This is simply because for each natural number there is a matching even natural number (n pairs with 2n), and the pairing goes on forever, so the two sets are the same size. I found this confusing at first, as I always thought that some infinities are greater than others. For example, there are more real numbers between 0 and 1 than there are integers, according to random credible sources I googled online. Anyhow, it all made sense once Professor Larry gave the formal definition of a mapping.
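
To convince myself, I wrote a tiny Python snippet (my own toy demo, not from lecture) showing the pairing n → 2n:

```python
# My own toy demo (not from lecture): the pairing n -> 2n matches
# every natural number with exactly one even natural number.

def pair(n):
    """Match the natural number n with its even partner."""
    return 2 * n

naturals = range(10)
evens = [pair(n) for n in naturals]
print(evens)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]

# Going backwards also works: every even number e comes from exactly
# one natural number, namely e // 2. Nothing on either side is left
# over, which is why the two sets count as the same size.
assert all(pair(e // 2) == e for e in evens)
```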

Now for induction, I find that it reminds me of recursion: prove one base case, then use the inductive step to carry the truth forward, and every case after the base case is proven as well. A good metaphor is the domino effect: if the first domino falls, and each falling domino knocks over the one after it, then they all fall.
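
Here's a little Python sketch of my own (not course material) showing how the recursive structure mirrors the proof, using the claim 0 + 1 + ... + n = n(n+1)/2:

```python
# My own sketch (not course material): the recursive function's base case
# and recursive call line up with the base case and inductive step of the
# proof that 0 + 1 + ... + n == n * (n + 1) // 2.

def total(n):
    if n == 0:                   # base case: the sum of nothing is 0
        return 0
    return total(n - 1) + n      # inductive step: assume it works for n - 1

# Knock over a bunch of dominoes to check:
assert all(total(n) == n * (n + 1) // 2 for n in range(100))
print(total(10))  # 55
```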

Anyhow, I think this is the end of CSC165 and my SLOG. And despite not having that many posts, I've always tried to catch up by combining multiple posts into one. Now it's off to the exam, so wish me good luck!

Wednesday 12 November 2014

Week 10+11 - big-theta and computability!

Oh man, I think I'm really behind, where I think I understand what's going on in class, but the second I walk out of class, I have no clue what I'm doing. Anyways, this week we learned about proving big-theta. Similar to proving big-O, you under-estimate the function to get a lower bound and over-estimate it to get an upper bound, sandwiching it between two constant multiples of the simpler function. Furthermore, we should always keep the definitions of big-theta and big-O in mind, and remember to consider the breakpoint past which the functions settle into their different growth rates; that helps us select the constants for the proof, where we pick B and c based on the over- and under-estimates we made.
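
To make this concrete, here's a quick numeric sanity check I wrote (my own example; checking numbers is of course not a proof, just a way to see the sandwich):

```python
# A numeric sanity check of my own (NOT a proof, just a picture).
# Claim: f(n) = 3n^2 + 2n is in Theta(n^2).
#   Under-estimate: 3n^2 + 2n >= 3n^2          for all n >= 0, so c1 = 3.
#   Over-estimate:  3n^2 + 2n <= 3n^2 + 2n^2   for all n >= 1, so c2 = 5, B = 1.

def f(n):
    return 3 * n**2 + 2 * n

c1, c2, B = 3, 5, 1
for n in range(B, 1000):
    assert c1 * n**2 <= f(n) <= c2 * n**2
print("c1*n^2 <= f(n) <= c2*n^2 held for every n checked")
```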

Now onto harder, but more interesting things: computability. Computability is about whether any program can exist that always returns the correct result for a function. By analyzing the algorithm, we can predict what a computer is able to compute. However, not all things are computable. For example, suppose there existed a function halt that could decide whether any program halts on a given input. Then you could write a program that asks halt about itself and does the exact opposite, which is a contradiction. Therefore it is actually impossible to write a single function that decides halting for every program. Furthermore, if you're thinking "what if we just run the function and see when it halts?", well, if the function never halts at all, you would never get a result back!
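
Here's how I picture the contradiction in Python (a sketch of my own; the halt function is hypothetical, which is the whole point, since no real implementation can exist):

```python
# A sketch of my own of the classic contradiction. The function halt below
# is hypothetical: the whole argument shows it CANNOT actually be written.

def halt(f, i):
    """Pretend this magically decides whether f(i) halts."""
    raise NotImplementedError("no such function can exist")

def confused(f):
    if halt(f, f):       # if f(f) would halt...
        while True:      # ...then loop forever
            pass
    return "done"        # otherwise, halt right away

# Now ask: does confused(confused) halt?
# If halt says yes, confused(confused) loops forever, so halt was wrong.
# If halt says no, confused(confused) halts right away, wrong again.
# Either way halt gives a wrong answer, so halt cannot exist.
```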

And so, what are reductions? This part I don't understand too well just yet, but I'll see if my explanation is correct. Assume there are two functions f and g. If f can be implemented by using g as a building block, we say that f reduces to g. Therefore if g is computable, f is computable too. And by the contrapositive, if f is non-computable, then g is non-computable! Through this we can prove functions are non-computable by showing that halt reduces to them, i.e. that halt could be built out of them!
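
Here's a little sketch (my own hypothetical example, with a made-up function returns_true) of what such a reduction looks like in Python:

```python
# My own hypothetical example of a reduction. Suppose someone claims to
# have returns_true(f, i), deciding whether f(i) halts and returns True.
# (returns_true is made up; the reduction shows it cannot actually exist.)

def returns_true(f, i):
    raise NotImplementedError("hypothetical")

def halt(f, i):
    # Build halt out of returns_true: wrap f so that whenever f(i)
    # halts (no matter what it returns), the wrapper returns True.
    def wrapper(x):
        f(x)             # runs forever exactly when f(i) runs forever
        return True
    return returns_true(wrapper, i)

# halt reduces to returns_true: if returns_true were computable, the halt
# above would be too. Since halt is non-computable, so is returns_true.
```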

Anyways, that's all I've got for today. This course is getting very interesting but unfortunately very hard. I hope I can end up with a 75% to apply for the computer science minor, but with how mind-boggling this stuff is getting, I think I'm going to have to put in more effort.

Sunday 9 November 2014

Week 9 - TERM TEST 2

Sorry for being slightly late to post this slog, I had in fact almost totally forgotten about it until now.
Anyhow, I feel rather confident about the test, and this time it's hopefully justified. I was able to prove the first and third questions; the second one was a bit tricky, but I think I pulled through by assuming the negation and deriving a contradiction. Anyhow, just like my prof says: I don't always take a term test at 6 pm, but when I do, I have 2 lectures right after.

This week we went through the big-O of polynomials and non-polynomials, as well as limits. Personally, I found it a bit hard to focus due to the stress of the term test, but I tried my best to understand the lecture and have gone back to review the slides. One issue I have with proving the big-O of polynomials is that the definition of big-O is so long that I tend to get lost easily. But overall, I realized that picking B and c is important, and that the choices must actually be used to show the bounding relationship. Also, for polynomials, you can figure out which is big-O of which by comparing the highest-degree terms, since constant factors don't matter; e.g. n^1984 is in O(2n^1984), and vice versa, because both have degree 1984.
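
For example, here's a quick spot-check I wrote (my own numbers; again, checking values isn't a proof) for one polynomial big-O claim:

```python
# A quick spot-check of my own (not a proof) for a polynomial big-O claim:
# 5n^3 + 3n is in O(n^3).
# Over-estimate the small term: 3n <= 3n^3 for n >= 1,
# so 5n^3 + 3n <= 8n^3, which suggests picking c = 8 and B = 1.

c, B = 8, 1
for n in range(B, 1000):
    assert 5 * n**3 + 3 * n <= c * n**3
print("5n^3 + 3n <= 8n^3 held for every n checked")
```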

Furthermore, with non-polynomials, we can use limits to help us prove big-O. If the ratio f(n)/g(n) approaches infinity as n approaches infinity, that means f(n) grows faster than g(n), so g(n) is in O(f(n)) but f(n) is not in O(g(n)).
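
A quick Python peek (my own demo) at what that ratio looks like for f(n) = 2^n and g(n) = n^2:

```python
# My own quick demo of the ratio idea, with f(n) = 2^n and g(n) = n^2.
# The ratio f(n)/g(n) just keeps growing without bound, so 2^n is not
# in O(n^2) (but n^2 is in O(2^n)).

for n in [5, 10, 20, 40]:
    print(n, 2**n / n**2)
```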

Anyhow, hopefully I can wrap my head around the definition of big-O, which currently seems to be an issue. This is perhaps solely because I get confused rather easily when expressions get long, but I suppose I'll survive. See you guys next time! (Lookin' at you, TAs, since you're probably the only ones who read this)

Saturday 1 November 2014

Week 8 - Oh no we still have to do proofs?? + Problem Solving Wiki

All along I've been hoping to stray away from proofs and start going through programming algorithms, but of course, they had to combine both. This week we mainly worked on algorithm analysis. We went through what an upper bound O(n^2) and a lower bound Omega(n^2) mean when analyzing an algorithm. The upper bound is a limit which the actual running time of the program will never exceed (for large enough inputs), and the lower bound is the opposite: a time the program can never beat. What I found surprising was that you don't need to pin down the function's exact running time; you're allowed to over-estimate when solving for the upper bound and under-estimate for the lower bound. However, I still find parts of it difficult to understand.
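
To see the bounds in action, here's a toy step-counter I wrote (my own example, not from lecture):

```python
# A toy step-counter of my own (not from lecture): count the steps of a
# nested loop and check they sit between two multiples of n^2.

def count_steps(n):
    steps = 0
    for i in range(n):        # outer loop: n iterations
        for j in range(i):    # inner loop: i iterations
            steps += 1        # count each inner step
    return steps

# The exact count is n*(n-1)/2. Over-estimate: that's at most 1*n^2.
# Under-estimate: it's at least (1/4)*n^2 once n >= 2.
# So the step count is both O(n^2) and Omega(n^2).
for n in range(2, 200):
    assert n**2 / 4 <= count_steps(n) <= n**2
print(count_steps(10))  # 45
```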

PROBLEM SOLVING. Penny piles!
There are two drawers: the left drawer has 64 pennies, the right drawer has zero. You may transfer half of one drawer's pile to the other, but only if that drawer holds an even number of pennies. Is it possible to make it so that one drawer has 48 pennies?

Working as a group, we all tried our best to understand the problem as much as we could. We drew a tree diagram to help us illustrate the possible moves. Through trial on paper, we first saw that it was indeed possible to make one drawer contain 48 pennies. However, how could we prove that you can make any number from 0-64?

We continued to look for a pattern across different kinds of numbers. We tried odd numbers, even numbers, prime numbers, and through analyzing the steps we used, we found one. Instead of seeking a specific number directly, we should consider the number together with its partner that sums to 64. For example, to reach 63, the other drawer must have 1 penny, and the only way to obtain that combination is by halving a drawer of 2 pennies. Therefore the next combination backwards is (2, 62). Using the same method of backtracking, we get (4, 60), then (8, 56), then (16, 48), then (32, 32), which leads us back to the beginning. Using this algorithm of doubling the smaller number, we found that any number from 0-64 can be reached. This is because any odd number doubled gives an even number, which belongs in the tree of penny combinations built from the powers of 2.

Now the question is, what if you started with a different number of coins? Odd starting numbers would not work, as you cannot split an odd pile at all, so you're stopped at step 1. Furthermore, the starting number must be a power of 2; otherwise, at some point the algorithm gets stuck before reaching the last number. Therefore any power of 2 allows you to make any number from 0 to that total, e.g. starting with 8, you can make any number from 0-8. We found this through trial and error with different totals. First we tried 100, but it fails because a few steps in you reach (75, 25), where both piles are odd. Looking back at why 64 worked, it was because its tree of combinations encompasses all the powers of 2, and because the moves halve piles, you can always connect back to that tree and trace a path from the start. If I had more experience with Python and were less busy, I would try to make a program to simulate all the possible numbers from 0-64 to confirm the findings. Nonetheless, this was a fantastic problem exercise and quite enjoyable to solve. That's all for this week; hopefully I am ready for the test next week. Wish me luck! (I still have trouble remembering these damn proof structures! But at least I have the cheat sheet!)
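
Well, here is roughly the simulation I had in mind, sketched in Python (my own attempt, so take it with a grain of salt):

```python
# The simulation I said I'd want to write, sketched by me: explore every
# split of the pennies reachable from (total, 0) by moving half of an
# even pile to the other drawer.

def reachable(total):
    """Return every pile size that can appear in either drawer."""
    seen = {(total, 0)}
    frontier = [(total, 0)]
    while frontier:
        left, right = frontier.pop()
        moves = []
        if left % 2 == 0 and left > 0:      # halve the left pile
            moves.append((left // 2, right + left // 2))
        if right % 2 == 0 and right > 0:    # halve the right pile
            moves.append((left + right // 2, right // 2))
        for state in moves:
            if state not in seen:
                seen.add(state)
                frontier.append(state)
    return {pile for state in seen for pile in state}

print(sorted(reachable(8)))    # every amount from 0 to 8 shows up
print(len(reachable(64)))     # all 65 amounts from 0 to 64 are reachable
```

Running it does confirm our finding for 64 (and for 8), and trying a non-power-of-2 total like 100 leaves gaps.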

Saturday 25 October 2014

Week 7 - Aha! Now I see why this is a comp. sci. course!

This week, I believe we've finished going through all the different types of proof structures, and boy are they hard to remember. Most recently, we finished proof by cases, which I found interesting. It felt very unintuitive to use at first, but it became very useful once I understood that the cases must cover all possible scenarios. Imagine proving a statement about the integers one value at a time: you'd be making an infinite number of cases! A few exhaustive cases, like even and odd, cover everything instead.

Anyhow, this course has just gotten very interesting: now we're on algorithm analysis and asymptotic notation. Algorithm analysis is very interesting, as we count the number of steps an algorithm uses, and it's also very surprising that we don't care about the exact number of steps, but about how the step count changes as the input grows. The course has suddenly become very computer-science based and I'm starting to like it even more! My only issue is I have trouble remembering the structures or which symbols to use in proofs, and whenever I forget, I get stuck partway through. More memorization is required, I suppose, and I'll get to the problem solving sLOG... eventually. My midterms aren't over yet, but it seems like they just keep coming and coming and coming... Until next time!

Wednesday 15 October 2014

Week 6 - Pls midterm y u do this

First things first, I've received my midterm back, and boy do I feel stupid. 26.5 out of 38. Hoooorrrible, simply because the class average was in the 80s, meaning I'm way behind on the bell curve. But I feel it wasn't a lack of understanding of course concepts, but mainly the way I answered a couple of questions. When the question asked me to explain my reasoning, I simply restated the statement in normal words instead of symbolic form. Absolutely idiotic, and I suppose I've learned the lesson the hard way.

Anyhow, I'm going to delay the problem solving sLOG until after midterms, when things die down a little bit. I've been reviewing the slides for the past few lectures and I think I'm starting to get the hang of it. I'm still a bit unused to the format in which we write proofs, but I understand the reasoning and logic behind proving certain relationships (direct proof, contradiction, etc.). However, I am having some issues with the recent lecture about limits. I'm planning to check out Prof. Heap's lecture slides and ask questions during tutorial next week for more clarity.

One thing I found interesting, though, is that I never knew the floor function we recently learned was originally a math symbol. I always thought it was created by programmers, as I first learned of its use in programming languages such as Java. Apparently next week we go through a bit more proofs, but I'm more excited about learning algorithm complexity and such. I've always been interested in those types of things despite being a commerce major. Nonetheless, I shall continue to work hard and hopefully get around that 80% median, or at least qualify for the comp. sci. minor.
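
A quick demo of my own showing the floor function as it shows up in code:

```python
# My own quick demo: the floor function in math and in code.
import math

print(math.floor(3.7))   # 3
print(math.floor(-3.7))  # -4, since floor rounds toward negative infinity
print(7 // 2)            # 3, Python's integer division also floors
```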