Latest articles

Star Trek 2009/XI/Crappy-Non-Googleable-Name First Impressions

Originally posted: May 9th, 2009

Just what the internet needs, another random Joe's worthless opinion on it:

(Disclaimer: I've always liked Star Trek, fuck, I even liked the animated series. So it's not like I'm a trek-hater who, unsurprisingly, hated it.)

Saw the first two scenes. Roughly ten minutes. That's it. That's all I could actually take.

First Scene: Worst directing and camera work in movie/film/television history. Seriously. Words like that get tossed around a lot. But I really, truly mean it. In fact, I can say, without resorting to hyperbole, that the old Hanna-Barbera cartoons had better directing. I can't believe I can say that, but it's true.

Even Paul Greengrass (Of The Bourne Supremacy/Ultimatum Butchering) knows how to properly frame up a subject. People hate Michael Bay, but he can do it too. Hell, even Uwe Boll can fucking do it. But somehow JJ Abrams can't. Instead, he sets up the absolute worst shots humanly possible (this, in addition to the same paint-mixer-as-a-tripod-syndrome Greengrass suffers from), does it all deliberately, and still tries to call himself a director. What the hell is going on at Paramount that this guy has managed to secure a job?

Second Scene: BAM! Product placement in the face! In Star Trek. Yes, that's right: product placement in Star Trek. And you thought Minority Report was fiction. Take a good guess what company I'm never buying a phone from...

Icing On The Shit Cake: Granted: I love "Sabotage". Question: What the fuck is it doing in Star Trek?


This is what years of web-development and tax-season stress have done to me...

Originally posted: April 14th, 2009

After falling into a lake 5 miles southeast of the much more well-known Jusenkyo spring, I now turn into a foul-mouthed anthropomorphic walnut every time I get splashed with a three-week-old coffee/martini blend that was mixed by a near-sighted ambidextrous midget wearing a striped goatskin bikini and chanting "I'd like to see a ferret on THAT" in Swahili. Needless to say, this happens CONSTANTLY, and I'm frankly growing quite tired of it.


Code Snippets Vs DRY

2010.10.24: Update: Umm, yeah, so it turns out Haxe does have a much simpler syntax if you just need a default getter/setter. I didn't realize that when I wrote this. But I do still think people should be careful not to let code snippet support become a substitute for proper DRY.
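For reference, that simpler form uses Haxe's built-in property access identifiers instead of a custom getter function (a sketch from memory, so double-check against the Haxe docs):

```haxe
// Readable from anywhere, but only assignable from within the declaring class.
// No backing field or hand-written getter function needed.
public var assetBaseUrl(default, null):String;
```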

Originally posted: March 25th, 2009

Code snippet support in an editor can be very useful, but it does seem to carry a certain risk of discouraging advancements in DRY practices. For example, I was just working on a Haxe class (I refuse to use the goofy "haXe" capitalization) that has 14 (and that number is growing) accessors like this:

private var _assetBaseUrl:String;
public var assetBaseUrl(get_assetBaseUrl, null):String;
private function get_assetBaseUrl():String {
    return _assetBaseUrl;
}

Might not seem like much to someone who's accustomed to Haxe, but that's an enormous amount of code just for a single privately-writable, publicly-read-only member variable (for the record, Haxe has the ugliest, most verbose accessor definition syntax I've ever seen). Considering I have 14 of these (so far), just in this one class alone, that's a lot of mess.

Of course, this is where the code snippet users come in and say, "Just use a code snippet!" I don't know for certain, of course, but I can't help suspecting that the existence of code snippets is part of why Haxe's creators allowed the accessor syntax to be so verbose in the first place. And considering that most of those accessors differ only by name and type, well shit, so much for the idea of "Don't Repeat Yourself"!

Contrast that with the same code in C#:

private string _assetBaseUrl;
public string assetBaseUrl
{
    get { return _assetBaseUrl; }
}

Or better yet, D programming language:

// getter is defined in a utility module I wrote
mixin(getter!(char[], "assetBaseUrl"));

And holy crap, all of a sudden we have actual DRY!
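For the curious, the `getter` template used above could look something like this. This is a hypothetical reconstruction — the actual utility module isn't shown in this post — but it illustrates D's string-mixin technique for this kind of code generation:

```d
// Hypothetical sketch of a getter-generating template (D1-era style).
// getter!(char[], "assetBaseUrl") evaluates to a string of declarations,
// which mixin(...) then compiles in place.
template getter(T, char[] name)
{
    const char[] getter =
        "private " ~ T.stringof ~ " _" ~ name ~ ";\n" ~
        "public "  ~ T.stringof ~ " "  ~ name ~ "() { return _" ~ name ~ "; }";
}
```

With something like that in a utility module, each new accessor becomes one line instead of four.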

Problem is, code snippets tend to get used as an excuse to ignore DRY (they're really only about one small step above outright copy-and-paste coding). If the early programmers had used editors with code snippet support, we probably wouldn't even have functions today.

Of course, I'm not suggesting that we get rid of code snippets, or stop using them. We just shouldn't be considering them a proper substitute for real DRY.


Putting the "Engineering" back into "Software Engineering"

Originally posted: September 16th, 2008

(I wrote this a while ago, but didn't post it for some reason. So I'm posting it now.)

I've been very vocal about my distaste towards many of the features of various dynamic programming languages (at least for the purposes of any non-trivial program). Recently, while reading the first chapter of "Practical Cryptography" (by Niels Ferguson and Bruce Schneier), it occurred to me that the authors' explanation of "The Evils of Performance" was very relevant to my opinions of these languages.

In that first chapter, the authors stress the importance of security being made the single top priority. They make this point by looking at security as an engineering discipline. As they point out, good engineering has always been about making safety and reliability the primary concerns. No other concern should ever be optimized to a point where it could interfere with safety or reliability.

The authors go on to state that, in the same way, the computer industry needs to prioritize security far ahead of efficiency. As they put it: "We already have enough fast, insecure systems. We don't need another one."

I'd argue that in computer programming, reliability deserves a similar status ahead of efficiency. Just as in other forms of engineering, though, this doesn't just mean placing reliability ahead of the actual product's efficiency. It also means placing it ahead of the efficiency of the development process itself. We already have enough unreliable software being churned out.

Which brings me back to dynamic programming languages: The reason I so strongly dislike many (albeit, not all) of the characteristics of these languages is that they treat short-term programmer productivity as a holy grail (sometimes even stating it as the single primary goal), while allowing good engineering principles like reliability to fall by the wayside. The real irony, though, is that maintaining an unreliable program is itself a drain on programmer productivity, thus wiping out any long-term productivity gains. So for any non-trivial project, these languages would have been more productivity-friendly by going the engineering route: making design decisions that focus on aiding the creation of reliable software at a reasonable speed, rather than potentially buggy software at rapid speeds.


The Problem With Implicit Variable Declarations

Originally posted: September 16th, 2008

Implicit variable declarations (ex: Visual Basic without "Option Explicit") can seem like a good idea if you look at them a certain way. I can understand what language designers are probably thinking when they decide to implement implicit declarations.

"Let the computer do the work for you."

That's a favorite mantra of mine, one of my cardinal rules. To a programmer, a computer with a compiler is like having an army of robots at your disposal. Why do things manually when you can send off a drone or a function or a script to do it for you while you get on with other things? So whenever possible, automate whatever you can. Efficiency: it's nice.

So I think I can understand what language designers are thinking when they decide to include the feature of implicit declarations. They're thinking "Hey, why should I have to explicitly point out that I'm going to use a variable? If I'm using the variable somewhere in the code, the computer should be able to figure out on its own that the label is supposed to be a variable. Saves me the bother." Sounds good. Same result, less effort. Efficiency. Nice.

But there's an ugly assumption hidden in that reasoning. What that developer is really saying is "Hey computer, anytime you come across an undeclared label being used as a variable, just go ahead and assume it's a new variable". We all know what happens when we make assumptions, right? "ASSumptions will make an ass outta ya."

What do you really want to happen when you mistype a variable name? (Oh, sure, you'd never do anything like that? Riiight. It happens to the best. Deal with it.) There are two choices: A. The compiler grabs you, points at your error, and yells "Hey! You screwed this up! Fix it!" Or B. The compiler pretends everything is ok, processes the bad code, and leaves you with a bug which, only if you're very lucky, will manifest itself immediately and in a way that makes the exact nature and location of the problem obvious. Hmm, which is a better way to write code...?
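VB aside, this failure mode is easy to demonstrate in any implicitly-declaring language. Here's option B playing out in Python (a made-up example, obviously):

```python
# With implicit declarations, a typo'd name silently becomes a *new* variable
# instead of triggering a compile-time error -- exactly option B above.
def total_price(prices):
    total = 0
    for p in prices:
        totle = total + p  # typo: meant "total", so the sum never accumulates
    return total

print(total_price([10, 20, 30]))  # prints 0, not the intended 60
```

No error, no warning at runtime — just a quietly wrong answer that you get to track down later.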

"But I might have really wanted it to be a new variable!" Ok, fine. How about you go rewrite your "rm" command to never ask for confirmation, because, well hey, we can't let the potential downside of accidentally losing files force us to endure a much, much lesser inconvenience when we really do want to delete. Doesn't make much sense, does it? Point being: Assumptions are bad. Bad enough even to outweigh convenience.

Language design, and heck, API design in general, are more than just programming. There is code involved, yes, but there's a large amount of psychology that needs to go into it as well. When you're writing an ordinary function, you're basically thinking "automation" (ie, that "army of robots"). But when you're designing an interface for programmers, even yourself, the really important thing suddenly becomes "How can I prevent the programmer from messing up?". It's like designing any interface: the weakest link is always the human factor. Even the best programmer in the world, handed a highly error-prone language, will end up looking like a giant pile of shoddy engineering. So it's your job, as the language/interface/API designer, to do whatever you can to minimize that risk of programmer error.

Another way to think of the issue is in terms of "good redundancy versus bad redundancy". There tends to be a lot of value placed on eliminating redundancy. Often this is good. But sometimes redundancy can improve reliability, which is a very important concern. Mandatory explicit variable declarations are one form of "good redundancy" that improves reliability. Walter Bright explains it best:

"Variable declarations are one [example of good redundancy in language design]. But since the compiler can figure the need for declarations from the context, declarations seem like prime redundancies that can be jettisoned. This is called implicit variable declaration. It sounds like a great idea, and it gets regularly enshrined into new languages. The problem is, the compiler cannot tell the difference between an intended new declaration and a typo - and the poor maintenance programmer can't tell, either. After a while, though, the lesson about redundancy is learned anew, and a special switch will be added to require explicit declaration."

- Walter Bright: Redundancy in Programming Languages

Visual Basic developers learned this lesson a long time ago. Even though VB supports implicit variable declarations, it's extremely rare to come across a professional VB developer who doesn't strongly recommend turning them off (with "Option Explicit") and who doesn't religiously do so in their own code. It's a shame there are so many newer languages that haven't learned from this.
