Wednesday, December 29, 2010
It's Time
I spent all day watching Golden Girls. Don't judge me. I think it's time to go back to work.
Tuesday, December 28, 2010
The Joys Of The Craft
Perhaps my favorite programming quote is this one by Fred Brooks, author of The Mythical Man-Month:
"...There is delight of working in such a tractable medium. The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures..."
- Fred Brooks in The Mythical Man-Month
If you haven't read The Mythical Man-Month, I really recommend you do. It's a fantastic read for software engineers. It's funny that it was first published in 1975, and yet it's still completely relevant. The problems Brooks describes can still be encountered in the software development world today.
Saturday, December 25, 2010
Christmas Upgrades
I spent a good chunk of Christmas installing Windows 7 on my parents' ancient machine. In hindsight, putting Windows 7 on a machine this old was probably a bad idea. The Ethernet controller didn't have drivers for Windows 7, so we couldn't connect to the internet for a while. I had to do a lot of sketchy things with drivers to get it to work. After we got internet, Microsoft Update downloaded all of the other missing drivers, and that fixed most of the issues. Unfortunately, one of those updates also screwed up the boot disk image. :/ Not impressed. After a little more black magic from the recovery disk, I think we're back up and mostly working. Hopefully, I can figure out all the outstanding issues tomorrow. I hope my Ubuntu install goes smoother.
In other news, Dani visited with her family this morning for brunch. It was nice. :) I am going Boxing Day shopping with them at 7am tomorrow. Crazy, I know. Maybe I'll pick up a new USB stick or something.
Anyways, Merry Christmas! I hope you have a great holiday season!
Exceptions in Java
Java has two types of exceptions: checked and unchecked. Unchecked exceptions are the traditional exceptions you see in languages like C++ and C#. Checked exceptions are essentially more explicit versions of unchecked exceptions. Checked exceptions require that the developer explicitly declare what exceptions a method could throw. The compiler then forces methods higher in the call stack to either catch the exception, or declare it themselves and propagate it to a higher level.
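Here's a minimal sketch of the two choices the compiler forces on callers of a method that declares a checked exception. The class and method names here are my own invention, just for illustration:

```java
import java.io.IOException;

public class CheckedDemo {
    // Declares a checked exception; every caller must handle or re-declare it.
    static String readConfig(boolean available) throws IOException {
        if (!available) {
            throw new IOException("config not found");
        }
        return "ok";
    }

    // Option 1: catch the exception right here.
    static String readWithFallback() {
        try {
            return readConfig(false);
        } catch (IOException e) {
            return "default";
        }
    }

    // Option 2: propagate it by declaring it again.
    static String readOrFail() throws IOException {
        return readConfig(true);
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readWithFallback()); // default
        System.out.println(readOrFail());       // ok
    }
}
```

Leave off either the `try`/`catch` or the `throws` clause and the code simply won't compile — that's the whole mechanism.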
Checked exceptions are a feature specific to Java. They were introduced into the language as an experiment, and the general consensus among developers is that the experiment failed. This is probably why no newer mainstream languages have adopted them.
What's the problem with checked exceptions? They seem like a great idea in theory. They force the code to be explicit about all of its error handling, and make it impossible to forget to catch an exception. Unfortunately, checked exceptions come with a lot of drawbacks.
My main issue with them is that they clutter the code quite a bit. There is a lot of extra noise that goes into your code when you use checked exceptions. Not only does this make the code less readable, but it slows down development. For instance, you should not have to catch every single exception when you are writing test code or quick prototypes. A lot of the time, developers will just "swallow" the exceptions with empty catch blocks, because there is no intelligent way to recover. Does your tiny app really need to catch OutOfMemoryExceptions around every statement that might allocate memory? Even if you do catch it, how will you gracefully recover? The extra verbosity becomes noise that clutters the program's logic.
This argument reminds me of static typing vs. dynamic typing. If you choose static typing, you lose some flexibility, but gain some extra static analysis from your compiler. The same thing applies to checked exceptions; however, the benefit they offer isn't worth the inflexibility.
What are your opinions on checked and unchecked exceptions in Java?
Friday, December 24, 2010
Introduction to Genetic Algorithms
Here's a nice little introduction to Genetic Algorithms. Although the problem they solve (generating "hello world") is rather silly, it's a good introduction to the world of Genetic Algorithms. If you are interested in the topic, you should check it out.
This is another good article. Again, this problem (solving N-Queens) isn't the best. A deterministic algorithm can solve the problem much faster.
The best problems for Genetic Algorithms are ones that don't have known optimal solutions. Another important property is that partial solutions can be crossed over to make better solutions. Without this property, your genetic algorithm essentially degenerates to randomly guessing at the solution. In fact, I'm curious to see how much better some Genetic Algorithms perform when compared to a random algorithm. You can generate a lot of random guesses in the time a Genetic Algorithm takes to go through a generation.
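As a quick sanity check of that comparison, here's a toy sketch in Java: a tiny GA evolving a 64-bit string toward all ones (the classic "OneMax" problem), against pure random guessing given the same budget of fitness evaluations. All the names and parameters are made up, and the GA is deliberately simplistic:

```java
import java.util.Arrays;
import java.util.Random;

public class OneMax {
    static final int LEN = 64;
    static final Random RNG = new Random(42);

    // Fitness = number of ones in the genome.
    static int fitness(boolean[] g) {
        int ones = 0;
        for (boolean b : g) if (b) ones++;
        return ones;
    }

    static boolean[] randomGenome() {
        boolean[] g = new boolean[LEN];
        for (int i = 0; i < LEN; i++) g[i] = RNG.nextBoolean();
        return g;
    }

    // One-point crossover: partial solutions from both parents combine.
    static boolean[] crossover(boolean[] a, boolean[] b) {
        int cut = RNG.nextInt(LEN);
        boolean[] child = Arrays.copyOf(a, LEN);
        System.arraycopy(b, cut, child, cut, LEN - cut);
        if (RNG.nextInt(100) < 5) child[RNG.nextInt(LEN)] ^= true; // occasional mutation
        return child;
    }

    static int runGA(int evaluations) {
        int pop = 20;
        boolean[][] population = new boolean[pop][];
        for (int i = 0; i < pop; i++) population[i] = randomGenome();
        int used = pop;
        while (used < evaluations) {
            // Sort by fitness; replace the worst half with children of the best half.
            Arrays.sort(population, (x, y) -> fitness(y) - fitness(x));
            for (int i = pop / 2; i < pop && used < evaluations; i++, used++) {
                population[i] = crossover(population[RNG.nextInt(pop / 2)],
                                          population[RNG.nextInt(pop / 2)]);
            }
        }
        int best = 0;
        for (boolean[] g : population) best = Math.max(best, fitness(g));
        return best;
    }

    static int runRandom(int evaluations) {
        int best = 0;
        for (int i = 0; i < evaluations; i++) best = Math.max(best, fitness(randomGenome()));
        return best;
    }

    public static void main(String[] args) {
        System.out.println("GA best:     " + runGA(2000));
        System.out.println("Random best: " + runRandom(2000));
    }
}
```

Because OneMax has exactly the crossover-friendly structure described above (any two partial solutions combine into something at least as good), the GA should comfortably beat the random baseline for the same number of evaluations. On a deceptive fitness landscape without that property, the gap shrinks toward zero.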
It would be cool to make an app that allowed you to encode genetic algorithms without using a traditional programming language. Instead, the user would use a simple, domain specific language to encode the genetic algorithm. Then the program would be able to automatically calculate and display run statistics. Perhaps this should go on my Pet Projects queue.
Wednesday, December 22, 2010
End of Exam Plans
Tomorrow is my last exam. DB can die. Rawr!
Plans for Dec 22 6:30pm - Jan 03 9:00am
- Clean Roslin House.
- Play a lot of video games. :)
- Play with Toronto friends.
- New Years with Girl.
- Do Christmas-y type things with my parents.
- Play with GWT more. It would be awesome if I could apply it to some project, but I don't have any pet projects to work on right now. :/
- Finish that git book. Git seems pretty nice, after reading all the literature. Before I was blindly using it without knowing any of the theory.
- Finish that SE course.
- Find a nice little genetic algorithm challenge for Willis and me to do. We want to see who can create a better GA for some problem. Should be a fun little project during our work terms. Any ideas?
What are you guys going to do after exams? Presumably a lot of you are already done. Waterloo drags on with exams forever. *Sigh*
Sunday, December 19, 2010
Inheritance vs. Composition
For a while, I've heard people say things like, "always prefer composition to inheritance". Today I looked into this argument more. Turns out the real argument should have been "Don't use inheritance at the wrong times". Durr. Essentially, this boils down to the "Is-a" vs. "Has-a" relationships we all learned in whatever Object Oriented 101 course you took. If two things share an "is-a" relationship, you can use inheritance. If it's a "has-a" relationship, it should be included through composition. For example, a patient has an ID, and a patient is a person. Following this rule will get you out of most of the problems with using inheritance wrong.
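In code, the patient example might look something like this (hypothetical classes, just to make the two relationships concrete):

```java
// HAS-A: a patient has an ID, so the ID is a field (composition).
class PatientId {
    final String value;
    PatientId(String value) { this.value = value; }
}

class Person {
    final String name;
    Person(String name) { this.name = name; }
}

// IS-A: a patient is a person, so inheritance is appropriate.
class Patient extends Person {
    final PatientId id;
    Patient(String name, PatientId id) {
        super(name);
        this.id = id;
    }
}
```

The smell to watch for is the reverse: `Patient extends PatientId` would "work" mechanically, but a patient is not an ID, and the design falls apart the moment you need both relationships at once.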
Another, perhaps more important, thing to consider is the Liskov substitution principle. It basically says if I have class D derive from class B, I should be able to replace all instances of B with D anywhere in my program. The program should behave exactly as before. If it doesn't, D should not have inherited from B. An implication of the Liskov substitution principle is that "is-a" really means that every method in the base class applies to the derived class. That is, if all birds can fly(), and a penguin cannot fly(), it shouldn't derive from a bird. For development purposes, a penguin is not a bird.
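A common way to express the penguin example in Java is to split the hierarchy so that fly() only appears on types that can honor it. This is just an illustrative sketch with made-up names:

```java
// If Bird promised fly(), Penguin could not implement it honestly,
// and substituting a Penguin where a Bird is expected would break callers.
// Splitting the hierarchy keeps every subtype substitutable.
interface Bird {
    String name();
}

interface FlyingBird extends Bird {
    String fly();
}

class Sparrow implements FlyingBird {
    public String name() { return "sparrow"; }
    public String fly() { return "flap"; }
}

class Penguin implements Bird { // still a Bird, but never asked to fly()
    public String name() { return "penguin"; }
}
```

Any code written against Bird now works identically for sparrows and penguins, which is exactly what the substitution principle demands.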
Another place where inheritance is probably inappropriate, is when you create linear inheritance chains. This is simply there to anticipate change someday, but it complicates code unnecessarily today. Merging the classes would be simpler (and by extension, safer) today. You can break them up again when you actually need the hierarchy, and it will cost just as much then. The difference is that you pay the price (in both time and complexity) only when you need it. I've talked about this issue before.
Anyway, I should probably go back to studying for exams, instead of watching Software Engineering lectures at French universities. La la la.
Saturday, December 18, 2010
Health Analytics Challenge
The Heritage Provider Network is offering a $3 million prize for an algorithm that can predict future hospitalizations based on patient data. This seems like a very cool application of machine learning. If I knew more about machine learning (or anything, really :P), I would totally participate. This sort of application would be invaluable to the health care field. Imagine having the ability to predict future problems with a patient, years in advance. That would cause a huge improvement in the quality of life.
If any of you know anything about machine learning, you should try out this challenge. It's set to start around next year. You could make a huge difference in the quality of life of people. I suppose $3 million would be nice as well. :P
Friday, December 17, 2010
A Good Software Engineering Course
I found this SE course recently. It seems very good. I sincerely hope Waterloo's Software Engineering courses are this good. I guess I'll find out in my next two academic semesters.
You should take a look at the introduction lecture. It is a fantastic description of what Software Engineering is, and why it's important.
EDIT: "We will have a project, and we WILL be changing requirements on you." Well done. This is a fantastic way to learn about Software Engineering.
Thursday, December 16, 2010
Version Control Models
I've started reading Pro Git, a book on how to use git correctly. I figured it would be good to know all the ins and outs of git, since I'll be using it at Karos Health. The book does a pretty good job at summarizing the differences between the various version control methodologies. It certainly cleared up some things in my mind. Here's the page that talks about it.
To be honest, distributed source control systems seem like overkill, but that might be just because I haven't worked on projects big enough (i.e., nothing the size of the Linux kernel). At BBM we used TFS for source control and everything seemed to work smoothly. The project I worked on was only around 300 KLOC though.
In other news, I probably passed Physics. Here's hoping. Now I get to relax for a bit and do all my "easy" exams. :)
Do any of you guys have experience with git? How do you like it?
Wednesday, December 15, 2010
Crisis
I've decided my true calling is in the arts now.
I'm going to write a novel.
This post brought to you by PWU.
IBM AI Jeopardy
IBM has created a supercomputer AI system named "Watson" that will challenge all-star winners Ken Jennings and Brad Rutter in Jeopardy. This article talks a little more about it.
I am very curious as to how this will turn out. Apparently "Watson" will use natural language parsing to interpret the questions. If "Watson" wins, it will represent a huge leap in AI. We've already seen some very impressive natural language parsing from Wolfram Alpha. I want to see how much farther this concept can go with a budget from IBM.
So? Place your bets! Human or Computer? My money is on the AI killing the Jeopardy veterans. I guess we'll find out in February. What do you guys think?
Tuesday, December 14, 2010
Note To Self
Note to self: Do not completely neglect a course throughout the term just because it's a first year course and it seems much less challenging (and interesting) than your other courses. This makes exam period 204% less fun. Damn you, Physics! Why did you have to be so much less interesting than my other courses this semester? A lot of times the choice was "Go to physics" or "Work on OS for an extra hour". Guess what I chose to do most of the time. :P
In other news, algorithms went pretty well! The hardest course I've taken so far in my university career is over! Woo! They were nice on the exam. They could have killed us (much like they did on the assignments), but instead they gave us tons of marks for doing simple things like tracing algorithms we had to know.
If I survive Physics (Thursday), it should be smooth sailing for the rest of my exams. I am not too worried about the OS and DB finals. After that, I will have time to do some more preparation for my work term. Specifically, learning as much as possible about GWT, and learning the ins and outs of git using this book (Jesse should be happy :P).
Monday, December 13, 2010
Japanese Multiplication Method
I just saw this video through Gizmodo:
Very cool way to multiply numbers. Don't think it's more efficient than the traditional method we were taught, but cool nonetheless.
Anyone want to prove its correctness? :P
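Here's one way to see why it works: each line drawn for a digit of the first number crosses each line drawn for a digit of the second, so every digit pair contributes (digitA × digitB) intersections at place value 10^(i+j). Summing those contributions is exactly the distributive expansion of ordinary multiplication. A small sketch (hypothetical class name) that computes the product by "counting intersections" this way:

```java
public class LineMultiply {
    // Every digit of a crosses every digit of b, contributing
    // digitA * digitB intersections weighted by both place values.
    static long lineMultiply(long a, long b) {
        long result = 0;
        long placeA = 1;
        for (long x = a; x > 0; x /= 10, placeA *= 10) {
            long digitA = x % 10;
            long placeB = 1;
            for (long y = b; y > 0; y /= 10, placeB *= 10) {
                long digitB = y % 10;
                result += digitA * digitB * placeA * placeB;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(lineMultiply(123, 321)); // 39483
    }
}
```

The diagonal-summing step in the video is just this same grouping by place value, with carries handled at the end.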
Saturday, December 11, 2010
The Sound Of Sorting
This is a little old, but it's cool nonetheless. The sounds of various sorting algorithms.
I am in exam study mode. Bleh. Starting Tuesday, I have an exam every two days until the end (22nd). Unfortunately, my two hardest exams are first. I have to learn first year physics by Thursday. Woo! At least it should be smooth sailing after that.
Thursday, December 9, 2010
Facebook Hacker Cup 2011
Facebook just posted their Hacker Cup 2011 Competition. I was thinking of participating. I should have time, since I'll be on my work term by then. It's been a while since I participated in these sorts of contests. I had some fun with them in high school. If you want to work for Facebook (I don't), this seems like a great way to get their attention. I can imagine that the problems will be similar to the Facebook puzzles. Perhaps I should do some of those for practice.
Any of you planning on participating?
Hacking Google Interviews
This site talks about all sorts of interview questions Google likes to ask. It has educational materials and Do's and Don'ts. You should check it out, especially if you have interviews coming up soon. Enjoy!
Wednesday, December 8, 2010
Cyber Terrorism and Wikileaks
This is so dumb. For real. Attacking websites under the guise of protecting free speech and protesting is stupid. I sincerely hope these people get busted and charged, but I know that will never happen. I can only hope that they realize that attacking these sites is only going to make people dislike Wikileaks even more.
Sunday, December 5, 2010
Lessons Learned From OS
OS was defeated last night. It was amazing. And we only had to stay up to 5am and go through ~450 broken builds! As I mentioned earlier, OS teaches you a lot about programming. You will come out a much better programmer, though not because you know how an OS works.
Here are the things I learned during this crazy project:
- Planning out everything before development is a really good idea. You should not be programming before you know how all the parts will interact. If that's not possible, you should take time to make your implementation very flexible, since it might have to change (sometimes radically).
- Pair programming is a good idea. Especially at 4am when you are both only half awake. We caught a lot of potential errors this way. We found pair programming especially useful for A3, since the work didn't really divide up very well.
- Overworked developers do sketchy things. After the 13th consecutive hour of development and debugging, you will do just about anything to get the job done. We definitely ignored some engineering "best practices" during our development. There are certainly sketchy solutions in our code. If we were maintaining the OS later, now would be a good time to refactor everything.
- Debugging is hard and will take you double the time you allocate for it.
- A girlfriend that makes you an OS care package (with BACON!) at 2am is amazing.
- Finally getting everything done is very satisfying and worth the trouble.
- Assert those stupid things that you think can never happen. They will happen and cost you hours of frustrating debugging time.
- Test suites are invaluable. It's even better if you can run them all in under a minute.
- Bitmaps can do some things blazingly fast.
- Your physical memory manager probably shouldn't give its own frames to other processes. This will end poorly for your physical memory manager.
- Low level programming sucks.
- C sucks.
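On the bitmap point: a bitmap with one bit per physical frame lets you find and mark free frames using word-sized scans instead of walking a list. Here's a toy sketch of the idea in Java (real kernel code would of course be in C, and all the names here are made up):

```java
import java.util.BitSet;

// A toy physical-frame allocator: one bit per frame, set = in use.
// nextClearBit scans a 64-bit word at a time under the hood,
// which is why bitmaps feel blazingly fast for this job.
class FrameAllocator {
    private final BitSet used;
    private final int frames;

    FrameAllocator(int frames) {
        this.frames = frames;
        this.used = new BitSet(frames);
    }

    // Returns the index of a free frame, or -1 if physical memory is exhausted.
    int allocate() {
        int frame = used.nextClearBit(0);
        if (frame >= frames) return -1;
        used.set(frame);
        return frame;
    }

    void free(int frame) {
        used.clear(frame);
    }
}
```

The same structure also makes "is this frame free?" and "free a whole range" constant-time-ish bit operations, which matters when you're calling them on every page fault.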
What did you learn from OS or other crazy workload projects? Technical or otherwise.
Now it's time to finish up DB and sleep for 8294534529 hours. Wooo!
Common Interview Questions for Java
Here's a few more common Java interview questions for those of you going for interviews in the next couple months. I've been asked about half of them in the past.
Also, OS IS DONE! I was so happy last night! I'll make a post later on today about all the things that I learned. If this course is good for one thing, it's teaching you a lot about programming (and how you work under pressure). The OS knowledge seems secondary to that.
Friday, December 3, 2010
Place Your Bets!
Will Willis and I have to pull an all-nighter to finish OS?
Things left to do:
- Update our code to use our new physical memory management.
- Implement page replacement
- Write a 4 page design doc.
Hours left: 42 at the time of posting.
I really hope we don't have to. I now understand first hand why you are not supposed to overwork your developers. :/
Thursday, December 2, 2010
The Effectiveness of Test Driven Development
I've always been curious about Test-driven development (TDD). I've been told that it speeds up development time, because you can reduce the number of bugs hiding in your code. With TDD, you are more likely to find bugs right away, and not months later when it's harder (and more costly) to fix them. Although I agree completely with unit testing as much of your code as possible, I don't think that TDD is the best way to produce these tests. Apparently, I'm not the only one who feels this way, and there is a lot of debate going on in this area.
According to case studies done by Microsoft and IBM, TDD teams took 15-35% more time to develop the applications, but the bug density was reduced by 40-90% compared to more traditional projects. The problem is that some (most?) projects can't afford to delay feature development time by 35%.
The main idea behind test driven development is:
1) Write tests for a feature you are about to implement. These tests should fail at this point.
2) Write the minimum code required to satisfy these tests.
3) Re-factor the code, making sure that none of the tests break.
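To make the three steps concrete, here's a tiny hypothetical example (the feature and all names are invented). In step 1 you'd write the failing assertion first; step 2 is the minimum code that makes it pass:

```java
// Step 1 (written first, fails to compile/pass until Slug exists):
//   assert Slug.from("Hello World").equals("hello-world");

class Slug {
    // Step 2: the minimum code required to satisfy the test.
    static String from(String title) {
        return title.toLowerCase().replace(' ', '-');
    }
    // Step 3: refactor (e.g. strip punctuation, collapse whitespace)
    // while the test keeps passing.
}
```

The whole debate below is essentially about whether step 2's "minimum code" is a useful discipline or a detour on the way to the design you already had in mind.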
I think my main objection with TDD is step 2. I don't think writing the minimum code first and then refactoring is a good way to encourage good architecture. At the very least, it's more time consuming than it needs to be.
For me, the final design for something (especially small features) is pretty straightforward, but requires writing "extra" code. That is, I could implement the feature in a simpler way if I wanted to. However, implementing it the "stupid", simple way first, and then refactoring it seems like a waste of time to me.
Perhaps part of the problem is that I've never used TDD in a professional environment. Maybe I'll get a chance to at some point. Until then, I prefer to write my unit tests after my feature is implemented.
What do you guys think about Test driven development?
Wednesday, December 1, 2010
CT Imaging And Radiation
I just read this article about the benefits and drawbacks of CT scans. According to the article, CT use has increased by 16% a year since 1995. The problem with CT scanners is that they expose patients to more radiation than other means of medical imaging. Some people are wondering if the benefits outweigh the risks, although most of the medical community agrees that CT scans are worth the risk.
Certainly CT scans provide radiologists with much higher quality images.
The left is a CT scan somewhere in the abdomen. The right is a traditional chest x-ray. Clearly, the CT scan provides the radiologist with a lot more information, at the cost of more radiation. CT scans are one of the few effective ways to see that much information inside a human body.
Are there other options? Of course. MRI is another very powerful imaging technology that can provide even more detail than CT.
MRI is one of the only imaging techniques that can see soft tissue in detail. It's also one of the only imaging technologies that is completely non-invasive, since no ionizing radiation is used. So why haven't MRIs replaced CT scans? Money. MRIs are a lot more expensive than CT scans. I think one important research topic in the medical imaging field is how to bring down the costs of MRI. The increased availability of these machines could have a huge positive impact on health care quality. It would allow doctors and radiologists to see a great deal of information in the human body, without putting the patient at any risk.
It's interesting to see how these technologies will change in the future. Hopefully I'll get a chance to see how they work first hand during a work term down the line or something.
Tuesday, November 30, 2010
Over-Engineering
Over-engineering has started to become a pet peeve of mine. Sometimes I feel like my thoughts on architecture follow a pendulum. It swings from the extreme of abstracting like crazy and applying design patterns out the ass, to the other extreme of not thinking about generalizing at all. Eventually, I hope to settle down at a reasonable point in the middle. Give me a few years.
I think our current DB schema is a little over-engineered. Granted, we wanted to challenge ourselves to do something new with the project, but it ended up costing us a good deal in development time. Not so cool with OS in the way. In "real life", we would have saved time in the long run, since the project would probably stay around for 5 years. The time spent setting up our architecture would have been made up by speeding up all those years of future development. Unfortunately, our application has a lifetime of a month.
I think that a good rule of thumb is to write the simplest code possible, without coding yourself into a corner.
Do you need to unit test that class? No? Then don't hide it behind an interface. Changing this later isn't hard, but working with a system where every single class is an implementation of an interface is harder. Adding unneeded complexity to your code will just make your system more error prone.
Do you really need every function to be less than 10 lines? Well, is it hard to understand your code? No? Then don't bother. Reading through a file with hundreds of three-line functions is a nightmare. It's really cool when you can read the high level function as if it were pseudo code, but understanding how everything works becomes much more complex. And again, complexity will lead to more error prone code.
Do you really need to make that Singleton thread safe? No? Then why would you? It's trivial to make that change later, when you actually need it.
With that said, it's important that you don't code yourself into a corner. You shouldn't make a system that's hard to extend. It should be easy to extend when the need arises. Until then, leave it alone.
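The Singleton point is a nice example, because in Java the upgrade path really is cheap. A sketch with hypothetical class names: the naive lazy version is fine for single-threaded code, and when you actually need thread safety, the holder idiom gets it from the class loader for free:

```java
// The simplest lazy singleton: fine until multiple threads are involved.
class Config {
    private static Config instance;
    private Config() {}

    static Config get() {
        if (instance == null) {      // not thread safe: two threads can race here
            instance = new Config();
        }
        return instance;
    }
}

// The later, thread-safe version: the JVM guarantees Holder is
// initialized exactly once, with no locking in the fast path.
class SafeConfig {
    private SafeConfig() {}

    private static class Holder {
        static final SafeConfig INSTANCE = new SafeConfig();
    }

    static SafeConfig get() { return Holder.INSTANCE; }
}
```

Swapping the first for the second is a local, mechanical change, which is exactly why it's safe to defer until the need arises.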
Monday, November 29, 2010
Making Carts Using Genetic Algorithms
This genetic algorithm tries to generate an optimal cart with two loads to traverse some static terrain. Complete with a nice little animation. :) I want to look through the source, but I don't have enough time now. Hopefully I'll remember to check it out after this week is over.
In other news, the deadline for DB was pushed back to Monday. It's appreciated, though we are close to done anyway. Maybe the OS deadline will be pushed back as well. :P
Anyway, time to prove things are NP-complete. lalala
Saturday, November 27, 2010
Final Push
This week is going to suck...
The two projects are due this week. DB is due Friday, OS is due Sunday (and there's that pesky algorithms assignment on Thursday :/). DB is very close to being finished. We are in the testing phase right now. Hopefully we will write our last lines of code today. OS is much, much further from being completed. In fact, we have 0 features implemented, and we are still unsure about how everything works. We have some framework code working, but that really isn't impressive. Hopefully with DB out of the way, we can focus all our time on this and figure it out ASAP.
At least after this week is over, the crazy course load is done. I'm not expecting exams to be that bad. At the very least, it won't be a super busy time for me. This means I'll have time to sleep (and not have dreams about OS and DB)! That'll be the life!
How is your last week looking?
Thursday, November 25, 2010
Architecture Lessons at 3AM
I learned a nice architecture lesson last night: Think about error handling early. The wrong time to think about it is at 3AM after most of your functionality is built in.
Originally, our architecture relied on returning null on failure. The levels above would check for this and handle it appropriately. There are a few problems with this. The problem that impeded us the most is that we were unable to differentiate between the various kinds of failures for some methods. In hindsight, returning null isn't a very clean approach to error handling in general.
The first solution we tried involved an external error handling class. Our model (and data access code) would set flags in this static error handling class to indicate what went wrong. This meant that every time someone wanted to call on the model to do something, they would have to remember to check the error handling class to see if anything went wrong. This created a coupling between this error handling class and our models. If you forget to check that error flag, your code would fail silently or cause unexpected behavior. We eventually scrapped this idea in favor of exceptions.
Exceptions were better because any failure would force the programmers to deal with it immediately. If they forgot to catch the exception, the program would crash right away, and with a very clear reason (like SQLException). You could not simply ignore the error and continue running in ignorance like with the other static error handling technique.
After we decided on exceptions, we started talking about whether we wanted checked or unchecked exceptions. More fun debating there. All this happened after 2AM. I am always hesitant to make important design decisions at that hour.
We eventually decided to go with checked exceptions. This means changing almost all of our classes (including all our tests). Granted, the changes are fairly simple and straightforward, but it's still going to cost us some time to refactor everything. Had we thought about error handling straight away, we could have saved some time.
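A minimal sketch of the difference (hypothetical names, not our actual DB code): with a checked exception, the compiler forces every caller to handle the failure, whereas a null return or a static error flag is easy to forget:

```java
// Hypothetical data-access sketch: a checked exception makes the
// failure part of the method's signature, so callers can't ignore it.
class RecordNotFoundException extends Exception {
    RecordNotFoundException(String msg) { super(msg); }
}

class RecordStore {
    // Before: "String find(int id)" returning null on failure --
    // callers could forget the null check and fail silently later.
    // After: the compiler rejects any caller that doesn't catch
    // (or redeclare) RecordNotFoundException.
    String find(int id) throws RecordNotFoundException {
        if (id != 42) {
            throw new RecordNotFoundException("no record with id " + id);
        }
        return "record-42";
    }
}
```

Every caller now has to write a try/catch or add a throws clause, which is exactly the "deal with it immediately" property we wanted.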
Moral of the story: Think about error handling during your planning phase!
On an unrelated note, Willis and I got 97 on OS A2. Hell-freaking-Yeah!
Wednesday, November 24, 2010
The 7cubed Project
Check this out: http://7cubedproject.com/
7 UW programmers making 7 apps for 7 days. Very cool. Today is Day 5.
Day 1 was Quick Cite. With this mobile app, you can scan any book and the formatted bibliography will be sent to your email.
Day 2 was UWChat. This lets you check into UW rooms and have chat sessions with people in the same room as you.
Day 3 was Minehub, a dynamic Minecraft server.
Day 4 was supposed to be something called "AwesomeTracker", but unfortunately they were unable to ship it on time. They said that they'll get back to it after the week is over.
This seems like an interesting test of the limits of agile programming. The group says that a highly skilled team of developers can successfully follow an agile methodology where they measure releases on the order of hours. So far, it seems to be working quite well. Of course, there have been some hiccups. The failure to ship on day 4 is perhaps the biggest. The other big one is that during the rush to get Quick Cite out, they accidentally shipped code with the email functionality commented out. I think that these sorts of problems are almost unavoidable when you are developing this fast.
In any case, I'm really interested in what else they can create. This seems like a really cool experiment, and a lot of fun for the coders involved.
Tuesday, November 23, 2010
Sanity Progress
I was feeling quite overwhelmed yesterday with the amount of work I have to do. Third year is hard. :/ In first year, they were all like lalala "If you can adjust to first year, the rest is easy". In second year, people said lalala "If you can get through second year, you can do the rest". Bullshit. Third year has been orders of magnitude harder and more work. First and second year courses don't have projects requiring more than 100 man-hours of low level C programming. Perhaps it's my fault for taking 2 project courses and algorithms at once... Maybe the next couple semesters will be easier (especially if I don't take Compilers and Real-time).
I am feeling better today. Our OS prof was nice and went through a lot of the assignment requirements in lecture today. I feel like I actually know where to go now, instead of poking through the OS/161 kernel aimlessly. :P DB also seems to be progressing fairly well. Hopefully we'll be feature complete for tomorrow or Thursday. If this was Amnesia, I would be surrounded by white light and my sanity would go up a level.
However thanks to this crazy workload, I feel like I'm coming out a stronger programmer. Writing code for an OS is quite humbling.
Monday, November 22, 2010
How much voltage does your program need?
Apparently Intel thinks they can create CPUs with up to 1000 cores. It'll be interesting to see how much of a change this will spur in Computer Science and Software Engineering.
I know some schools are already teaching multi threaded algorithms and concurrency. I can see this becoming more and more important in the future. I am getting really excited for the concurrency course at Waterloo.
Also, the Intel cores contain hooks that let you change the voltage to the cores. This sounds like yet another crazy thing for programmers to consider in the future. I used to think that these issues were stupid. How much could they possibly reduce it by? Who cares how much voltage your CPUs are using? Google does. Saving 1% of CPU voltage will save the company thousands and thousands of dollars. It's pretty crazy.
Anyway, 4 assignments left in 2 weeks. Woo! OS and DB are both due in less than two weeks. Why does the end of the school year have to be so busy?
Sunday, November 21, 2010
Ukrainian Folk Short
My dad sent me this video this morning. It's nice. I totally remember visiting family in Ukraine, living in a village like the one in the video. It must look weird to other cultures. Why are there random flies in the house? Why are people singing at the dinner table? Are they using a scythe? What's with the clothing? It seems weird how I've seen all these things in person, and now I live in a very different place. It's a cool feeling.
Friday, November 19, 2010
Software Developer Interviews
There seems to be a problem hiring good developers nowadays. It is apparently hard to hire people with even basic coding knowledge. The first time I really encountered this was in Jeff Atwood's blog post "Fizz Buzz", however it seems that this problem is still around.
That second article talks about basic interview screening questions that most ("19/20") job candidates couldn't answer. The specific question was "Write a C function to reverse a singly-linked list". This seems unusual to me, since I've encountered harder questions during my co-op interviews.
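The question asks for C, but the same iterative idea works in any language. Here's a minimal sketch in Java (the `Node` class is made up for illustration): walk the list once, re-pointing each node's `next` at the previous node.

```java
// Iteratively reverse a singly-linked list in O(n) time, O(1) space.
class Node {
    int value;
    Node next;
    Node(int value, Node next) { this.value = value; this.next = next; }
}

class ListUtil {
    static Node reverse(Node head) {
        Node prev = null;
        while (head != null) {
            Node next = head.next; // remember the rest of the list
            head.next = prev;      // flip this node's pointer backwards
            prev = head;           // advance prev
            head = next;           // advance head
        }
        return prev; // new head (the old tail)
    }
}
```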
It seems weird that some companies seem to be grilling co-op students harder than full-time developers. Of course there are companies (like Google) who definitely put the full-timers through a lot to get a job. I've heard stories that it takes something like 6+ technical interviews to get a full time job at Google. This is certainly more than the hour and a half they spent with me when I interviewed with them this term.
I keep track of all the technical questions I've been asked in interviews. These questions have been collected over the course of two terms worth of interviews. Could you answer some of these? If so, then apparently you can have a comfy job as a developer.
1) What is a static function?
2) Can you have a static constructor?
3) Why would you use a static constructor?
4) What is a virtual function?
5) What's the difference between a thread and a process?
6) Fill an array of size 100 with the first 100 terms of the Fibonacci sequence.
7) What is the difference between a stack and a queue?
8) What is the difference between an exe and a dll?
9) What is object oriented programming?
10) Explain encapsulation, inheritance, polymorphism.
11) What is abstraction?
12) What is the difference between an abstract class and an interface?
13) What is the difference between pass by value and pass by reference?
14) What is a pointer?
15) Does Java/C# have pointers?
16) What is the difference between a pointer and a reference?
17) Implement modulus.
18) How would you overflow a stack? Heap?
19) Count the number of set bits in an array of n 16-bit numbers. Can you improve the constant factor?
20) Worst case for a hash table?
21) Implement multiplication.
23) Write Fizz Buzz.
24) What does final mean in Java? Static?
25) How would you unit test <this>?
26) Basic question involving Distance = time * velocity.
27) Design a program to find the largest word in a paragraph.
28) Design a program to extract unique values in a sorted list.
29) What is normalization?
30) Get the second largest salary in a DB using SQL.
31) What's the difference between GET and POST?
32) What's the probability of flipping 3 coins and getting heads all 3 times?
33) What is synchronization in the context of concurrency?
34) What is dependency injection? How would you implement it? How would you use it?
35) Name some differences between Java and C#.
36) How do bindings work for polymorphism?
37) How do you read a file in Java? (Stupid memorization question is stupid)
38) Given 7 "letter tiles" and a dictionary of valid words, return the set of words that can be generated using those tiles.
39) Given a file with expansion data like this:
“foo” expands to “foo” “bar” “biz”;
“buzz” expands to "blarge" “*foo”
where *foo means expand foo, so expand(buzz) = "blarge foo bar biz"
Write a function that expands strings based on such a file.
40) Write code that will always deadlock.
41) What is a buffer overflow? How is it exploitable?
42) What is DEP? What is ASLR? How do they help prevent buffer overflows?
43) What is a botnet? What are some of the malicious things it can do?
44) What is XSS?
45) What is a SQL injection?
46) What would you do if you found a security exploit?
47) When (if ever) is it okay to release the security exploit to the public?
48) What is an exploit? What is a vulnerability?
49) If you could enforce one programming rule to improve security, what would it be?
50) What is the difference between verification and validation?
51) What makes a good requirement? “The DB needs to be fast”. Is that a good requirement?
52) What goes into a good bug report?
53) Tracing DFAs.
54) What would you consider when designing a DB?
55) Find the memory leak in the some code.
56) What is MVC?
57) What are the 4 pillars of OOP?
58) Implement inheritance in a database.
59) Use joins (then don't use joins) to get the data from the above tables.
60) What do you do when you can’t beat a deadline? (impossible deadline)
61) StringBuilder vs String in Java. When to use which?
62) What are JS prototypes?
63) What is AJAX?
64) What is JSON?
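For a taste of what an answer looks like in practice, here's question 6 sketched in Java. Note the hidden trap: the 100th Fibonacci term overflows even a 64-bit long, so BigInteger is the safe choice (worth mentioning to the interviewer).

```java
import java.math.BigInteger;

class Fib {
    // Fill an array of size 100 with the first 100 Fibonacci terms.
    // This uses F(0) = 0, F(1) = 1; some interviewers start at 1, 1 --
    // worth asking before writing anything.
    static BigInteger[] first100() {
        BigInteger[] fib = new BigInteger[100];
        fib[0] = BigInteger.ZERO;
        fib[1] = BigInteger.ONE;
        for (int i = 2; i < 100; i++) {
            fib[i] = fib[i - 1].add(fib[i - 2]); // F(i) = F(i-1) + F(i-2)
        }
        return fib;
    }
}
```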
Hopefully this is helpful to people doing interviews next semester.
Wednesday, November 17, 2010
Wolfram And Natural Language Programming
I just read Stephen Wolfram's most recent blog post on programming with natural languages. He talks about some really cool features of Mathematica 8 and Wolfram|Alpha. The examples he shows are quite impressive.
Wolfram talks about a future where programmers would be able to communicate with a computer using natural languages like English. The programmer would specify the requirements in English, and the computer would synthesize a program to actually satisfy these requirements. Although this sounds like a really interesting idea, I can see a lot of problems for them to overcome before this becomes practical.
Perhaps the largest of these problems is that natural languages are very imprecise. The amount of text required to specify a requirement accurately in pure English is huge. Massive programs with really complicated requirements would create a huge wall of text when encoded in English. Maintaining and expanding programs that are encoded like this sounds like a nightmare.
Further, English is an inconsistent language. That is, you can derive many contradictions and silly results using the English language. For example,
Consider transitivity. If A > B and B > C, then we can conclude that A > C.
Now,
let A = "A Cheeseburger"
let B = "Nothing"
let C = "True Love"
So now,
A Cheeseburger is better than nothing (A > B)
Nothing is better than True Love (B > C)
so it should follow that
A Cheeseburger is better than True Love.
Silly result is silly, but it just shows that English sucks at being precise. This silly proof was originally shown to me by Professor Shai Ben-David in Logic class. He was explaining why we can't use English as a language for Logic. The reasons are essentially the same as the reasons why we can't use English for programming.
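One way to formalize where the "proof" cheats (a sketch, not Professor Ben-David's exact notation): the two premises never share a middle term, so transitivity simply doesn't apply.

```latex
% Premise 1, "a cheeseburger is better than nothing", compares the
% burger with the state of having nothing:
\text{burger} \succ \varnothing
% Premise 2, "nothing is better than true love", is a quantified
% statement -- no x beats true love:
\neg \exists x \, (x \succ \text{love})
% Transitivity needs a shared middle term B with A \succ B and
% B \succ C. Here "nothing" is an object in premise 1 but a
% quantifier in premise 2, so no such B exists: the syllogism is
% invalid (a fallacy of equivocation).
```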
We could solve some of these issues by creating a restricted English, but figuring out a working subset of English would be a huge task in itself.
It is interesting to see how software engineering techniques would change if this were to become popular. MVC would translate to separate paragraphs talking about views, models, and their interactions. Most design patterns could be specified with an additional paragraph. Would these design patterns even be useful anymore?
In any case, the progress in free-form linguistic parsing is really interesting, but we will probably see it applied to other fields effectively before we see it as a substitute for traditional programming languages.
Tuesday, November 16, 2010
Java's Demise?
This article talks about all the problems facing Java in the upcoming years. It'll be interesting to see how Oracle deals with all this.
I see this as a great opportunity for .NET to gain a lot of market share. According to TIOBE, Java is still ranked first, and C# is ranked 5th. I can definitely see them swapping in the next 5 years. I would be happy if C# at least swapped with C++ (currently 3rd). I wonder how C++0x and C1X will fit into the rankings whenever they actually get released.
I've enjoyed using C# a lot more than Java in the past. Java seems to be really lagging behind .NET in terms of implementing new language features (like lambdas). .NET and Mono are both progressing quite fast. Hopefully I'll get a chance to work with .NET more in the future.
In other news, OS A3 has been released and work with DB is raging on. Busy busy.
Sunday, November 14, 2010
Planning
I think a large part of our problem for OS came from a lack of planning. We were making design and algorithm decisions during our slip day. Not good. :/
We should have sat down and planned out the project in its entirety before we started programming. We were trying to do things sections at a time, as if it were an agile project or something. Unfortunately, pretty much everything in the system was intricately intertwined so looking at some of those modules in isolation didn't really make sense a lot of the time.
I think we'll make it a point to do just that for A3. That is, I would like to have the project fully planned out and designed before we write a single line of code. Hopefully this will make the project go much smoother than A2.
There's three weeks of classes left, and A3 hasn't been assigned yet. This is going to be fun.
On a related note, our next DB project has been fairly well planned out. I think that we are making great progress, and our architecture is solid. I am actually a little proud of our dinky design, even though it's just a simple MVP architecture. With three weeks until the deadline, I think we won't be pulling any late nights with this one.
Saturday, November 13, 2010
Language Expertise.
Sometimes I read articles about languages and I realize how little I actually know about these languages. I feel like I should read a nice technical book from front to back. Something like Head First C#, rather than Code Complete. I eventually want to read both, but I feel like I should know my languages in more detail before I learn more about engineering techniques.
Not to mention I know very little about some fields of development, like web development. Hopefully I will get a nice crash course in web development during my work term.
My goal is to get through a lot of books this Winter. I should be able to do it since I won't be worrying about OS or DB all the time. :P Hopefully I won't be too distracted by Roslin house/video games.
On my to read list (in order):
- Head First Java
- Re-read Clean Code
- Code Complete 2
I will be happy if I can get through those in 4 months, but more would be nice.
Anyway, I should probably do something productive. Maybe I'll fail at Algorithms for a bit. :/
Thursday, November 11, 2010
Web Technologies
I just got back from meeting my co-workers. Everyone seems nice. It's a pretty small office (about a dozen people). It's also an open office, so it seems really fun. I wasn't a big fan of the cubicles at my last job.
I get to use GWT to make web-based applications in Java. I'll be using Eclipse running on Macs. This will be my first real exposure to web development, and I'm pretty excited. Hopefully I will have some time to get to know GWT before I start in January. I should be able to find some time after exams or something.
Anyway, I'm off to spontaneous sushi with Girl. :)
Wednesday, November 10, 2010
Maclean's "Too Asian" Article
Maclean's published an interesting article entitled "Too Asian" (Edit: It seems that Maclean's has taken down the article. Hopefully they will put it back up at some point. Check the comments for a kinda-sketchy repost of the full article.). Waterloo gets more than a few mentions, as well as UofT and UBC.
I think some of these pre-frosh are just being stupid. If you want to go to University just to drink and party, I promise you will find like-minded people at any university. It's not like nobody drinks and parties in Waterloo. Just look down King street on Saturday night, or the line up outside Mel's (RIP) at 2am. You will not have to go far if you are living in UWP and are looking for a party. I have several high school friends that lived there and tell me all about "Wasted Wednesday" and "Man-down Mondays".
I also think it's silly to go to university just to party. If you want to drink all day, there are certainly cheaper ways to do this. It's just that mommy and daddy won't pay for you to live in a studio apartment and drink all day instead of working.
Here's my favorite quote:
"The division is perhaps most extreme at Waterloo, where students have dubbed the MC and DC buildings—the Mathematics & Computer Building and the William G. Davis Computer Research Centre, respectively—“mainland China” and “downtown China,” and where some students told Maclean’s they can go for days without speaking English."
That last bit sounds plausible, though I can't say that I've seen it personally. I actually heard these nicknames in 3A for the first time.
Tuesday, November 9, 2010
Life After OSA2
With OS done, I finally have some spare time. Woo! I am assignment free for a whole week! I feel like I have so much free time now. I am going to be an Arts student for a week and do nothing productive.
So, I randomly took up Starcraft. lol. I won my first game today, with the help of Girl. Woo! I miss having time for games. I look forward to going back to gaming during my co-op term.
Speaking of co-op terms, I am meeting my co-workers on Thursday! Ahh! We are going to talk about all the projects I could work on and the related technologies. Hopefully I will have time to actually look into some of those things. I will get a lot of practice with Java with the upcoming DB project. We are (most likely) using Java, with JDBC and JUnit. I am actually pretty excited for the DB project. We have a lot of freedom. We get to choose pretty much any technologies involved (other than the DB itself. We are stuck with DB2 for that :|). Jesse and I also get to do some architecture, something we both want to get into later. :) Weee! Our professor has a software engineering background, so hopefully we can impress him with our beautiful and pragmatic design.
I am going to work on that now actually (I guess I won't act like an Arts student after all...)
Monday, November 8, 2010
The Defeat of OS A2!
Wooooo! Gwaba Gwaba Gwaba!
We finally finished OS A2! It took about 800 kernel and countless man-hours, but we are finally done! That was a nightmare. We spent the last 3 days straight programming. We didn't go to sleep until 4am every night. We were on the brink of insanity, but it was totally worth it.
I am now going to not worry about OS for a long, long time (read: until A3). Maybe I should play some Starcraft or something.
Gwaba Gwaba Gwaba Gwaba Gwaba!
Saturday, November 6, 2010
Code Reuse Myth
Allan Kelly makes some interesting points about code reuse. He argues that code reuse is a bad development goal, since most code designed for reuse is never actually reused. Often, developers want to create reusable code not because they think the code will be reused, but because the final code will be better engineered. Specifically, the code will have properties like low coupling, modularity, and testability. However, creating "reusable" code costs a lot; Kelly estimates it's something like three times more expensive.
I think that code reusability should not be a direct goal of software architecture. It's silly for it to be a direct goal if it's never a requirement, especially given the cost. However, the code properties that accompany reusability (like low coupling) should be architectural goals. Pursuing them has real benefits for everyday development, and has the side effect of producing code that can be easily reused with minor modifications. This would probably cost a lot less, since not every component of the system needs to be abstracted into some generic, reusable section. To make things worse, planning for everything to be reused is often over-engineering. The resulting code is a nightmare to understand: code that abstracts too much becomes much less clean, and as a result, much harder to work with.
Lesson: Plan to write code that is testable and well engineered, and you get reusable code. Don't plan to write reusable code that happens to have positive software architectural properties.
Also, hi
Friday, November 5, 2010
Genetic Algorithms and Mario
Genetic Algorithm for Mario AI
So genetic algorithms are getting really cool. I wish I had more time to play with these things. I wonder how much more they can do with these kinds of algorithms. Maybe if this area gets more popular, Waterloo will offer a course on it. I would take it.
Hopefully I will be able to take the Machine Learning course at some point. It's one of those 'special offering' courses that get offered at very arbitrary times, so I have to hope I get lucky. Apparently the wheels of bureaucracy are turning to make it an official course. I hear those wheels turn very slowly though... :/
I wonder if it's possible to do machine learning through genetic programming, like having a genetic algorithm find spam rules or something. Maybe I'll ask Shai Ben-David if I ever see him on campus. Or if I can catch that course.
In other news, OS is coming along nicely. Todo checklist:
- Fork
- Waitpid
- Exit
- Execv
- Miscellaneous testing and bug fixes
Thankfully, those first 3 points are well underway and will, hopefully, be done today.
*Sigh* Back to work.
Wednesday, November 3, 2010
Minecraft on OS/161.
Woo! Willis and I are making progress with OS. A good chunk of the file system calls are done. There is still some work to do there (like solving some synchronization concerns), but I think we can make the deadline.
I can run 2 player Tic-Tac-Toe now! Woo. Next step: Minecraft. That better be worth bonus marks or something. I wonder how hard it would be to get the JVM running on OS/161. Perhaps I should get basic files working properly first...
I am getting more and more excited for Winter term. A bunch of us will be in Waterloo this time, so it should be a lot of fun. I've never had a work term in Waterloo, so I'm excited to see all the antics I can get into with university friends. :P
In other news, I saw Red a few days ago. It was phenomenal. It reminded me a little of Kick-ass, though I'm not sure why. Both were outrageous and hilarious.
Okay, I'm off to figure out how to deal with EOFs in sys_read. Exhilarating, I know. :)
Monday, November 1, 2010
Job!
I officially got matched with Karos Health for next semester. I feel really bad, since a lot of the other jobs I had offers for went unfilled. Hopefully they will all find other candidates to fill the roles in the continuous phase or something.
I'm so excited. I want to just start my co-op term now. Instead, I have algorithms and OS to finish. :/
So battle plan:
- Finish Algorithms. There is one more little bug to iron out in my Levenshtein distance, and I should be done.
- See a movie with Girl. Maybe The Social Network, maybe Red. I am cool either way.
- OS OS OS OS. I have to finish all the file stuff, then execv and exception handling. There's exactly a week left and the pressure is building. This is going to be a busy week.
Sunday, October 31, 2010
Real Software Engineering.
Thanks to Willis, I stumbled on this talk on "real software engineering". The main thesis is that software engineering isn't really engineering, since it doesn't "work" like the other engineering disciplines. That is, the processes we call software engineering do not ensure success the way processes in other disciplines do. I'm not sure I would call this the major distinction between software engineering and other engineering.
He also talks about how software systems are often a lot more complex than the products of other engineering. This makes ensuring quality much harder. Using mathematical models is impractical, so testing is the only real way to ensure quality. Further, this testing has to be done early and often in order to minimize testing costs.
He makes an interesting distinction between the engineering processes of software engineering and of something like structural engineering. In structural engineering, the engineer designs the final product and encodes that design in a document. This document then goes to laborers who actually create the final product.
Software development follows a different process. It seems unlikely for this workflow to apply to software engineering, since we haven't found a good way to express a design completely and unambiguously. Instead, the source code acts as the design document. It is, after all, a very specific document encoding the exact project requirements. With this, the language compilers become the laborers who build the final product given a design.
I think he missed an important difference between software engineering and other engineering - changing requirements. I don't think any other engineering has to design systems that must be flexible enough to withstand customers adding, removing, and changing requirements at any time during the project.
I think the origin of the Waterfall Method is hilarious. In 1970, Winston Royce published a paper about the Waterfall Method. His main point was that projects using Waterfall were "doomed" to failure. However, the paper was poorly written, and it is easy to misread him as recommending Waterfall. This is exactly what happened. Too many managers simply looked at the diagrams, which seemed to make a lot of sense, and completely disregarded the warnings. In fact, on the second page, Royce presents the first ever Waterfall diagram. The very next line reads: "I believe in this concept, but the implementation described above is risky and invites failure." I guess they all missed that part. *facepalm*
My favorite quote (on page 1!): "An implementation plan to manufacture larger software systems, and keyed only to these steps, however, is doomed to failure."
So lesson: If you are going to implement some new idea from a paper, read the full paper!
Friday, October 29, 2010
Teaching real-world programming
MIT has a very cool course on real-life software development. The course involves 4 major group projects, as well as code reviews by senior developers from industry. After the coding is done, the groups sit down with real-life developers and talk about code style and clarity in 60-90 minute sessions.
I think this is a fantastic idea for a course, and it'd be great for Waterloo to adopt something similar. Getting input from real-life developers is invaluable, and I am starting to think a lot of our professors can't offer that. For instance, in CS 246 ("Object-Oriented Software Development"), our professor said something along the lines of: make your own linked list every time you need one, because then you know exactly how it works. Really? This course was supposed to teach the fundamentals of software engineering, and I don't think ignoring existing code is one of them. To make things worse, it's the only mandatory course CS students take that talks about developing quality software for the real world. It was bad enough that we spent most of the course learning the stupid nuances of C++, with only two lectures on design patterns.
I think Waterloo could easily adopt such a program. With hundreds of tech companies in Waterloo, I'm sure there would be no problem finding volunteer developers to do code reviews.
On an unrelated note, the Jobmine Gods have been very nice to me this semester! I will be working with Karos Health in Waterloo for my Winter term. I'm super excited. Time to brush up on my Java!
Thursday, October 28, 2010
First!
I now have a blag for my random thoughts. I have no idea how often I will use this, or what I will write about. Probably something relating to software development and my life. Perhaps I can use it as a portfolio as well. We'll see.
Maybe I should be getting ready for my OS midterm instead of setting up a random blog. I should get on that.