

The R5 (Rant, Rave, Rant, Rant, Rave) Blog

Saturday, April 16, 2005

IUnknown vs. System.Object

The dedication of Andrew Troelsen's "COM and .NET Interoperability" reads:

"This book is dedicated to Mary and Wally Troelsen (aka Mom and Dad). Thanks for buying me my first computer (the classic Atari 400) so long ago and for staying awake during my last visit when I explained (in dreadful detail) how System.Object is so much better than IUnknown. I love you both."

Every time I dip my toes into learning .NET, I wind up dismissing it as a hopelessly complex and overengineered POS, but then I encounter something that leaves enough of an impression to make me want to reconsider. This is one of those things... LOL!

Tuesday, April 12, 2005

OOP: The False Religion

Edsger Dijkstra said: "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

One look at most people's OO code and I feel like applying the phrase "mentally mutilated beyond hope of regeneration" to OOP practitioners as well. Congealed, booger-like masses of heavily coupled classes (defeating the whole reason for using them in the first place!!!) and superfluous inheritance hierarchies are the norm.

Good old functions are a perfectly good abstraction in many, many cases, but clearly, a new generation of programmers has been trained to believe they are committing a mortal sin if they do not immediately think in terms of classes and objects when trying to solve a problem.

[ They should ponder this fact over and over again: Linux, the most successful open source project of all time - a C codebase of several million lines, teeming with daily activity from thousands of contributors - does not make use of classes and objects. ]

Good OO design is so hard that, for all but the most experienced programmers, I believe the correct design decision is actually to refrain from designing your own classes. I've come to realize that classes, because they are intended to be heavily reused, require design skill, taste, and subtlety on a level approaching that required for language design, and are thus not for mere mortals to dabble with. Unfortunately, most of today's popular languages, lacking better alternative high-level facilities, force everyone working in them to become class designers.

Happily, there are environments, like Python, which allow average programmers to effectively reuse reifications of abstractions (classes included) done by those with the necessary design skills, largely dispensing with the need to do such work themselves. A good thing, since the end results of most such efforts are U-G-L-Y, those based on classes consistently being the worst offenders. In fact, one of the marks of the better-designed Python 'objects' is that they save you from having to engage in the gobbledygook of object think and talk, delivering not only on the promise of "usage without knowledge of implementation details", but usage without having to master an entire paradigm. This is the promise of scripting taken to a higher level.
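
To make that concrete, here's a minimal sketch (the filename is hypothetical): Python's dictionaries and file objects are expert-designed abstractions that you can reuse productively without writing - or even understanding - a single class.

    # Using well-designed built-in objects with zero class design on our part.
    # The hash table behind the dict and the buffering behind the file object
    # are completely invisible to us. ("prices.txt" is a made-up filename.)
    prices = {"apple": 0.50, "mango": 0.75}
    prices["banana"] = 0.25               # no need to know how dicts work inside
    f = open("prices.txt", "w")
    for fruit in sorted(prices):
        f.write("%s %.2f\n" % (fruit, prices[fruit]))
    f.close()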

So, to all budding Python programmers out there: please keep in mind that you can do a lot of powerful things in this language sticking to just functions and for loops, and try to avoid inflicting your class designs on others until you're up to the task of designing elegant ones.
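
A small illustrative sketch of what I mean (the filename is made up): a word-frequency counter built from nothing but a function, two for loops, and a dictionary.

    # Word-frequency counting with nothing but functions and for loops.
    def word_counts(filename):
        counts = {}
        for line in open(filename):       # file objects are iterable, line by line
            for word in line.split():
                counts[word] = counts.get(word, 0) + 1
        return counts

    # Print the counts in alphabetical order. ("speech.txt" is hypothetical.)
    for word, n in sorted(word_counts("speech.txt").items()):
        print("%s %d" % (word, n))

No class, no hierarchy, and anyone can read it.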

Also realize that there are much neater non-OO abstractions in the language, such that classes, and specifically class hierarchies, should be considered an evil to be resorted to only out of sheer desperation. The increasing popularity of interfaces and similar mechanisms nowadays proves that inheritance - IS-A relationships - is only useful in a minority of situations, and nowhere near as universally applicable as the OO snake oil salesmen of yore (the mid-to-late 90s) liked to tout.
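
For instance, here's a hedged sketch (all names invented for illustration) of how first-class functions replace what an OO-heavy design would turn into a Strategy-style class hierarchy:

    # First-class functions instead of an AbstractShippingPolicy hierarchy.
    def flat_rate(amount):
        return 5.0                        # fixed fee, whatever the amount

    def percentage(amount):
        return amount * 0.02              # 2% of the order total

    def checkout(amount, shipping_fn):
        # the "interface" is simply: a callable taking an amount
        return amount + shipping_fn(amount)

    print("%.2f" % checkout(100.0, flat_rate))    # 105.00
    print("%.2f" % checkout(100.0, percentage))   # 102.00

No base class, no virtual dispatch, no IS-A relationship anywhere.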

Expect more traditional OO concepts to crumble or mutate as time passes, to the point where what you think of as object-oriented today becomes completely unrecognizable. Come to think of it, 'traditional OO' could be an oxymoron... OO is a circus: friend classes, abstract classes, refactoring, patterns, "cross-cutting concerns", interfaces, members, methods, messages, yada yada yada. The monkey-mind comes up with a new concept every six months...

Friday, April 08, 2005

Date format rant: Use ISO YYYY-MM-DD already!

When are people going to stop using the MM/DD/YYYY or DD/MM/YYYY formats (I'm referring to the date conventions used on the webpage, not to IBAdmin / Firebird per se) and finally switch over to the ISO 8601 YYYY-MM-DD standard?

Non-IT people can be forgiven somewhat for sticking to what they are used to. But for those of us working in a profession where ambiguity is deadly poison, to insist on one of its worst examples when a perfectly usable alternative exists is just unconscionable.
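
To show how little effort the right way takes, here's a minimal sketch using Python's standard datetime module:

    # ISO 8601 dates are unambiguous AND sort chronologically as plain strings.
    import datetime

    print(datetime.date.today().isoformat())     # e.g. 2005-04-08

    d = datetime.date(2005, 4, 8)
    print(d.strftime("%Y-%m-%d"))                 # 2005-04-08
    # Is 04/08/2005 April 8th or August 4th? Who knows. YYYY-MM-DD never asks.

    dates = ["2005-04-08", "2004-12-31", "2005-01-15"]
    print(sorted(dates))                          # already in chronological order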

Monday, April 04, 2005

Folklore.org: trip down nostalgia lane

Take a trip down nostalgia lane, back to a time when programmers had three orders of magnitude less memory and CPU power than today, and read about the techniques they developed to handle problems in such an environment. It was a whole different world! The site is rendered using lovely Python CGI written by Andy Hertzfeld of Apple and Eazel fame.

"... we'd often push parameters on the stack out of order, sometimes four times in a row, because we had a value in a register that we would need later, and we didn't want to fetch it again..."

"... the errant code in question was in ROM, which was already frozen in immutable silicon... But Larry Kenyon had already figured out a sneaky technique to fix ROM bugs, by patching system traps..."

So it was Apple's fault! If he had gotten chastised more for his monstrosity of a notation back then, I might actually be doing Win32 programming today. :-P

"... Wendell Sander ... did a small custom chip that crammed all the functionality of Woz's disk controller into a single chip. It was called the "IWM" chip, which stood for the "Integrated Woz Machine", since Woz's disk controller is really an elaborate state machine ... ."

Linux

Love the OS, Hate the [twerpy] advocates.



[image borrowed from rmh]

Friday, April 01, 2005

First post

First test post. Love this Dots Dark template. Go disco!

Reminder: Consider switching to the Bluebird template in the future. We may want a simple look that fits in with, and doesn't distract from, the (planned) technical content of this blog.