Software/Device Development and Design

Fuck JavaScript...

...and all you moron kids who can't comprehend the distinction between "newer" and "better".



Only an Idiot Would Misrepresent Opinions on Validation

I recently came across Raymond Chen's nearly-two-year-old article "Only an idiot would have parameter validation, and only an idiot would not have it". It was interesting, and so were the comments, but I think Chen's account of the situation is flat-out wrong.

After mentioning that people's reaction to Win3.1's new feature of parameter validation was "It's about damn time", he says:

"But nowadays, parameter validation is out of fashion again. If you detect an invalid parameter and return an error code, then all you're doing is masking a latent bug in the application. It should crash and burn with a big red fat'n'ugly blinking exception. "

Unless there's some real doofuses that I haven't come across, parameter validation is not out of fashion (and wasn't two years ago, either). What's out of fashion is responding to those failure conditions with a mere return code instead of something like an exception.
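
In case that distinction isn't clear, here's a minimal sketch (a made-up set_volume function, not anything out of a real API) of the same validation delivered two ways - the quiet return code versus the loud exception:

    #include <iostream>
    #include <stdexcept>
    #include <string>

    // Old-school style: validate the parameter, then hand back an error code
    // that the caller will probably ignore. The latent bug gets masked right here.
    int set_volume_legacy(int level) {
        if (level < 0 || level > 100)
            return -1;
        // ... actually apply the volume ...
        return 0;
    }

    // The style that replaced it: still validates, but a bad argument blows up
    // loudly instead of being quietly swallowed.
    void set_volume(int level) {
        if (level < 0 || level > 100)
            throw std::invalid_argument("volume out of range: " + std::to_string(level));
        // ... actually apply the volume ...
    }

    int main() {
        set_volume_legacy(9000);   // error code silently discarded - bug lives on
        try {
            set_volume(9000);      // same bug, but now it shouts
        } catch (const std::exception& e) {
            std::cerr << "caught: " << e.what() << '\n';
        }
    }

Both versions validate. The only thing that changed fashion is how the failure gets delivered.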

Getting rid of parameter validation does not amount to "crash and burn with a big red fat'n'ugly blinking exception". It can, but not reliably so - only if you happen to be lucky. And maybe it's just that I don't get out enough, but I'm not aware of anyone who does consider ignoring error conditions to be a reliable way of making an app shout "Failure, failure!" from the rooftops. Maybe it really was back in Win3 as Chen suggests, I don't know, but it's not the case now.

Although, what possibly has flip-flopped (and perhaps this is where Chen got mixed up) is the idea that returning a "bad params" return code is better than forging on and letting shit hit the fan (FWIW, I'm not sure which way I lean on that). I don't think that change in popular opinion happened for entirely unreasonable...umm, reasons. It sounded good at the time, and then we learned it doesn't work out as well as we had hoped in actual practice. So some people are on the other side of the fence now. Live and learn.

BTW, in the title of this, I'm not calling Raymond Chen an idiot. I'm making a play on words with his article's title. So there. Idiot.



Pushing "Cloud" Proves You're an Idiot Trend Whore

Initially, I had a hard time wrapping my head around the concept of "Cloud" computing. Until one thing occurred to me: It's exactly the same as two concepts we already had, and with names that were already in widespread use: "Hosted" and "Web App".

Gee, I guess it's not web hosting they do after all, it's "Cloud" servers. I guess that must mean it's better. Oh, I see, Microsoft's new DBMS isn't a hosted database, it's a "Cloud" database. Wow, that's so progressive and high tech. And Google Docs isn't a web app, it's "Cloud" computing. Well, fuck me.

It's like the idiotic word "tween". A group of people were too dumb to know the word "preteen" already existed, so they contrived and spread a moronic alternative. And they weren't even as original or clever about it as they thought: Animators had already been tweening for years. Which, of course, just sounds like pedophilia now.

To those pushing and marketing this "Cloud" nonsense: You're like the fashion industry except worse, because being in technology, you of all people should know better than to be a bunch of mindless trend whores.

Suck my cloudsack.



Decentralized Like My Ass

The web-2.0-app/web-service/cloud-computing crowd seems to enjoy giving a lot of lip service to "distributed" versus "centralized". Now, I agree with the benefits of distributed architectures over centralized ones just as much as they do. But it's rather...interesting to hear it coming from them because 9 times out of 10 their crap is more centralized than traditional software.

Yea, that's right, "Heresy!" I know. So burn me at your stake. But it's true.

I have a copy of an excellent old VGA DOS game called "God of Thunder". The companies involved are long since out of business, which is unfortunate, but you know what? That doesn't affect me. I can still play it, back it up, access my old saves, whatever. If I lose my copy, chances are there's someone else out there with it that I can still get it from. No central node's been broken, because there's no central node to be broken.

Now take your typical web 2.0 "cloud" app - even one of the forward-thinking ones with a public API. The company goes down, or even more likely, just loses interest, or changes it all around, or switches to something else. Guess what? You're fucked. Yea, they might have been nice enough to give you a chance to grab your data before changing its format or pulling the plug. And you can argue that you don't care about any of that anyway. But the fact remains: You lost the central node and can't use it. Some decentralization. So tell me, traditional software or cloud computing, which one is more centralized?
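
To make that concrete, here's a minimal sketch (made-up file name, not any real save format) of why the old model keeps working with zero cooperation from anyone:

    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <string>

    // Traditional-software case: the save file sits on the user's own disk.
    // Reading it depends on no vendor, no server, no central node.
    bool load_local_save(const std::string& path, std::string& out) {
        std::ifstream in(path, std::ios::binary);
        if (!in)
            return false;  // file's missing, but nothing upstream can revoke it
        out.assign(std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>());
        return true;
    }

    int main() {
        std::string save;
        if (load_local_save("got_save1.dat", save))
            std::cout << "Loaded " << save.size() << " bytes. No vendor required.\n";
        else
            std::cout << "No local save here.\n";
        // The "cloud" equivalent would be an HTTP GET against the vendor's endpoint,
        // which stops existing the day the vendor does. Nothing local is left to
        // read, and nothing is left to circumvent.
    }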

Torrent is decentralized computing. Modern network routing is decentralized computing. Hell, a private render farm is arguably decentralized computing on a smaller scale. But unless your web 2.0 app is open source and operating on multiple servers managed by multiple independent owners, then my 25-year-old copy of AppleWorks for Apple II is more decentralized than your fancy piss-cloud.

And yea, DRM does throw a wrench in the decentralized benefits of traditional software. But developers don't need to use it (even if they think they do), and it can always be circumvented (no thanks to the Orwellian DMCA, of course). "Cloud" software, on the other hand, is centralized by default. Conscious choice and effort must be made to decentralize it, and if that's not done, then once it's gone, it's gone - there's nothing to circumvent.

My point here is not to say that traditional software is better (although that is my opinion; it's just not the point I'm trying to make). The point here is that if you equate web 2.0 or "cloud" with "decentralized", then you've been filled full of marketing bullshit.

If you're wondering where I got that "9 times out of 10" statistic at the beginning, I got it from the same reliable source where the heads-in-the-cloud-computing folks got their ideas about what "decentralized" means. See this article's title for specific contact details.



Good New Techs: What Would Corporations Do?

HTML and CSS, frankly, suck. But what if someone created a good alternative? Here's what I think would happen:

Microsoft would never know it existed.

Google would re-invent a crappy version of it and pretend it was all their idea.

Apple would put a note in their developer-license-agreement prohibiting it.

Sun would release a whitepaper attempting to explain why it wasn't needed, but in their attempt they would accidentally make it clear it was a good idea after all.

Oracle would create a not-terrible-but-not-great version of it and have their salesmen spend a couple million apiece convincing middle and upper managers to each pay twenty million for it. Most of them would fall for it.

Sony would investigate the feasibility of introducing DRM capabilities into it.

No one would ever notice if IBM did or didn't do anything with it.

Hobbyist developers would flock towards a newly-created alternate version that seemed simpler at first glance, but was much slower and really just made it easier to introduce subtle bugs.

W3C would form a committee to standardize it. Their early recommendations would combine the worst aspects of all the various versions. The final draft would be nearly identical to the early drafts, but wouldn't be finalized until the original committee's grandchildren were in retirement facilities.

Adobe would create a mediocre, bloated, yet passable child-window-fiesta-of-an-app to deal with it and charge hundreds for it. It would be enormously popular.

The people formerly from JASC would create a great alternative to Adobe's offering at a reasonable price, and after no one bought it they would kill it off by selling the rights to the dying carcass of some formerly-relevant corporation.

Corel...ah ha ha ha ha! Corel...That's a joke that doesn't need a punchline.

Hasbro Interactive would buy the rights to one of the older versions, and sue any individuals and small businesses that had anything similar. Then they would sell their rights.

Steve Yegge would have something to say about it, but no one would know or care what it was, because by the time they finished reading his post the universe would have ended. But he'd still maintain that his long-winded approach was "good marketing".



V12s And The 640k Show Horse

"I've been commissioned to design a roadway for the city, and I've come up with a great design! It assumes that everyone has V12 cars...But come on, V12s have been around forever. Isn't it way past time that all those 4-cylinder owners finally upgraded? I'll be dammed if I'm going to compromise my wonderful design and take slightly more development time just to cater to the few people still living in the stone age."

Hypothetical, obviously. But it demonstrates exactly why programmers who trot out the "640k should be enough for everyone" show horse to defend their consumer-whoreism approach to development piss me off. (Well, that, and the fact that Gates never actually said it.)

I'll certainly grant that there are legitimate uses for 64-bit and multi-core. But this whole attitude of "Something that doesn't emit 64-bit is useless" and such has gotten ridiculously out of hand. Most people and programs don't need 64-bit or multi-core. Sure, a few do. And sure, many things can be better with 64-bit or multi-core - but they don't fucking need it. The notion that they do is a load of high-octane V12 bullshit.

This is the point where I inevitably get a bunch of crap about "But that's all the stores sell!" So what? Is that all that's in common use? Of course not. I don't know about you, but I develop for the hardware that people have, not hardware they might get if and when they decide to go buy something new (never mind the second-hand market...you know...like eBay...maybe you've heard of it?). And when I optimize something to run well on the lower end, guess what? It'll still run even better on the V12s. Even more so, since mine isn't one of the inevitable three or more programs on the user's system that all simultaneously believe their optimizations can rely on having at least a couple of cores all to themselves.
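
For what it's worth, here's a rough sketch (made-up worker-pool sizing, nothing more) of what I mean: ask the machine what it actually has instead of hard-coding your wishful thinking, and leave some headroom for everything else that's running:

    #include <algorithm>
    #include <iostream>
    #include <thread>
    #include <vector>

    int main() {
        // Ask the machine what it actually has. A result of 0 means "couldn't tell",
        // so assume a single core rather than a V12.
        unsigned reported = std::thread::hardware_concurrency();
        unsigned cores = std::max(1u, reported);

        // Don't claim every core for ourselves - we're rarely the only program running.
        unsigned workers = std::max(1u, cores / 2);

        std::cout << "reported cores: " << reported
                  << ", spawning " << workers << " worker(s)\n";

        std::vector<std::thread> pool;
        for (unsigned i = 0; i < workers; ++i)
            pool.emplace_back([] { /* ...do a slice of the real work... */ });
        for (auto& t : pool)
            t.join();
    }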

And of course there's embedded software. You know, that stuff that the self-centered "waste loads of resources, because my time is more important" programmers always seem to forget exists. Embedded 32-bit and/or single-core devices are going to be around for quite a while longer. Even the ones that don't stay 32-bit or single-core are still typically going to lag behind desktops, laptops and servers. And even aside from that, there's still power drain and battery life. All of which leads me to another reason for software developers to cut the consumer-whore crap:

True story: A certain OS developer kept making each version more bloated than the last. They did it because they were Moore-worshipers, plus the bloat led to more hardware sales, which, 90% of the time, came pre-packaged with their OS. Then they continued that with OS "Big Panoramic View 6", which completely fucked up their ability to compete in the emerging netbook and tablet markets: i.e., devices which were, guess what? Low-powered! Ah ha ha! Stupid fucks. So...are you behaving like Microsoft?
