Friday, June 24, 2011

"Good morning, Vietnam"

"Adrian Cronauer: Goooooooood morning, Vietnam! Hey, this is not a test! This is rock and roll! Time to rock it from the Delta to the D.M.Z.!"

Hello critters!
Long time, no see! The past few days I've been a bit busy, but here is the next little thought dump.
First on the agenda is a long-standing issue with the mathematics surrounding computer science (or computer science using mathematics, whichever you prefer). Some guy wrote a book arguing that modern computer scientists don't need mathematics, or that, and I quote,
"The notion of the algorithm, simply does not provide conceptual enlightenment for the questions that most computer scientists are concerned with." 
Needless to say I am STRICTLY AGAINST that point (and generally that idea), and I will try to summarize & generalize my arguments in a few brief points:

  1. The most basic notion and idea that comes to mind first is that you cannot truly grasp a concept about any subject without understanding the underlying principles and methods that lead to the creation and support of that idea. 
  2. One cannot adequately and fully present an idea which builds, or somehow relies, upon those principles. 
  3. Algorithm proofs & analyses. (not so general)
  4. The statement that mathematics isn't needed in computer science in itself implies that one limits oneself to only a small part of the arsenal available, or, put another way, one becomes a coding monkey. (Thanks to the CAD monkeys for the term :)
  5. The aforementioned limit also applies to mental development in those fields.
  6. etc. etc. 

Now don't get me wrong, I'm not saying that a computer scientist should be able to prove the Four Colour Theorem. But graph theory, for example, is very heavily used in all aspects of computer science, and a more in-depth knowledge of it can prove vital in many situations: it could turn a good solution into a great one. Then again, proving so would also require mathematics, and then showing its limits as well, and so forth. Many similar examples could be given for other branches of mathematics involved in computer science. There are a few related links at the end.

Second on the agenda is shorter and more technical: an interesting branchless solution to the integer comparison problem. There are numerous solutions on-line for it, but this one is a bit different and goes like this:
 Let's have two integers X and Y. We want to be able to take 2 different "paths" in our program based on the following criterion: if X >= Y -> path 1, otherwise -> path 2, but we want to do it branchless (no conditionals).
Solution: Define a 2-element jump array (or any other type of array needed, based on the criterion); let's call it jump_table[2]. Then just execute the following: 
jmp jump_table[((X-Y)>>31)&0x1];
Simple proof (that it works): X-Y in integer notation is either positive, zero or negative (trichotomy). For a 32-bit integer the sign bit is bit 31 (counting from 0), so:
 1. if X-Y is positive or zero: the sign bit == 0 (and we take path 1) 
 2. if X-Y is negative: the sign bit == 1 (and we take path 2)
One caveat: X-Y can overflow when the operands have opposite signs and large magnitudes, so the trick is only safe when the difference fits in the integer type.
After testing it against the conditional version for 1,000,000 iterations, it proved to be ~16% faster overall. But the most important property is that it is branchless, and in some cases that matters much more.

Third on the agenda: I've been really agitated by the multi-core craze that's been going on lately. Now people have started counting their cores instead of their clock speed (the previous misdirection). I have always disliked (that's me being polite) people who try to show off without any valid reason to do so (aka posers/fakes). I think that humility is a virtue, and even when one has reason to show off, it should be left to the public to decide so. Bragging shows simplicity and eagerness for recognition rather than eagerness for the subject itself.

Now to the point: multiple cores are not real multiprocessing, because they have to share the same bus, memory lanes and, of course, socket. They _WILL NOT_ provide linear speed-up in anything, for those (and many other) simple reasons. More to the point, more cores doesn't mean more "speed", higher "speed" or anything of the like. Such notions are more complicated and actually depend on many variables on multiple levels (hardware design, software design etc.). If manufacturer X has announced a 16-core chip, it doesn't necessarily mean it will perform better than manufacturer Y's 4-core chip. More cores can better saturate a bus (if used/designed correctly, of course) and can provide very good speed-up for CPU-intensive applications (which again depends on the software/hardware design working together). I don't want to get more technical; I think you get the point by now. In my next post I'll probably write down a few thoughts about virtualization and the "cloud". 

'til next time!

P.S. In the next posts I'll give a brief overview of my new way of load-balancing between multiple peers, which uses relatively simple notions from combinatorics. One day it should become a paper.. (laziness)
If there are any mistaken/illogical sentences - I couldn't get a good night's sleep, get over it!

The links:
Computer Science Mathematics - Some of the mathematics background required for computer science.
Theoretical Computer Science Talk - Given by Akamai's chief of technology, Tom Leighton
Want to be a computer scientist? Forget maths (Article) - About the book I talked about earlier.
From rigor to rigor mortis - A bonus. Take a look inside, you may like it.

Wednesday, June 15, 2011

Day hard.

Offspring on (check)
Green tea with brandy (check)

So hello, ineffaceable people! It was an amusing day after a sleepless night (once again). Most of you have experienced that feeling after a long, long (looong...) day, and night for that matter, of work followed by an "early" wake-up. It is like you are in one of those movies where the main character just wakes up, everything is blurry and there's almost no recollection of what happened last night. You're hit pretty fast by the strong cup 'a' coffee you've just made for yourself with the ingenious idea that it will straighten you out, but alas you find that concentration and focus are not even remotely present. (By the way, did you know that there are missiles called ALAS? I sure did not.)
 So let's get to it then, shall we? (Parts are a bit technical; skip if bored to death.)
There are 2 parallel stories that developed: one involving my iPhone doing things that probably even Apple wouldn't believe, and a second involving one nefarious bug in a small piece of software.
Let's begin with the iPhone story. I'll try to structure it as much as I can, since I think the states the phone went through could easily be represented by a graph which could logically explain why/what made it go POO.
 I started by jailbreaking my girlfriend's phone and thought "hmm, why should mine still be locked? Just because I have an iPad baseband on my iPhone and a twisted version of iOS which was barely put together shouldn't deny me the pleasure of playing with Cydia. Oh, and yes - my wi-fi is out of order (hardware problem) too.." Of course everything broke down: iTunes wouldn't recognize that there was a device connected; as a matter of fact, none of the jailbreak/recovery mode/DFU mode programs would recognize it.. But let's not get our hopes down. I started fiddling with the bootstrap software and was finally able to reconfigure it to boot into Recovery mode again, after which iTunes started spilling errors (1600 - unknown error [?!]). Because of the iPad baseband I couldn't just update/fix it through iTunes, so I had to fix it using the "pirate" software (you try to think of another term at this hour..). It had already loaded 2 different exploits which should have allowed jailbroken firmware to be uploaded, so I went out on a limb and did the iPad baseband reinstall procedure with a new custom firmware (making use of the new exploits) and was able to go from Dead mode -> DFU Mode -> Recovery Mode -> Standard firmware -> DFU Mode -> Jailbroken/custom firmware -> incompatible Cydia/iOS with packages on top as a result.. The plan was to use some tool called Cyder to install Cydia software through the PC (without the need for wi-fi), but of course with incompatible versions it all went to a very dark place very fast. I'll stop here, because I've started to bore myself with my own story, and continue with the second one, which developed in parallel (in the next window on the taskbar)..

 The bug with the software was trivial, but manifested in a rather interesting way. Here's the quick overview: a multi-threaded program which services some descriptors, a long-forgotten static descriptor variable (unused, hence zero-initialized) and a select syscall. If this doesn't ring any bells yet, here's what happens next: the software doesn't handle the select syscall well, and upon seeing a descriptor with id = 0 it just selects on it. Since it doesn't use STDIN, if you've started it in the background and are still logged on, there is no problem. BUT once you log out and descriptor 0 gets closed, select fails with EBADF (according to the manual of select(2), and /usr/include/errno.h says error 9 == EBADF). Now put that in a room with improper select return value handling and you've got yourself software which behaves normally when started, but upon your logout goes to 100% CPU usage (good thing it's only 1 thread doing the selects). Of course it's an easily "debuggable" issue, if you even get to a debugger, and the cause is pretty logical, but it's rare to hit such a bug nowadays, which makes it fun :-) Moral of the story? Don't leave unused variables behind, or one day they'll haunt some programmer dealing with your mess (or you, for that matter)!

 'nough said, going to glue together some parts for a design show model now,
Haunt you soon!

P.S. For the sleeping beauties: Day hard is mildly related to Die Hard. Now the question is, have you watched it? If not, you are not of age to read this blog, so shoo!

Tuesday, June 14, 2011

Hello, world! (and goodbye privacy :)

Hello! Generally I am against these types of communication/logging, unless one has something really new/important to say. So I'll try 'n stick to this simple guideline; hopefully it won't be the complete waste of time I expect it to be.
There are times when one just needs to express his frustration/admiration about something he has experienced, so I guess this is as good a place as any.
Enough said, be back soon. Goodnight!

P.S. I guess you can consider this another way to write the infamous "Hello, World!". Fancy, isn't it?