Usually this time of year you start reflecting on things that have happened over the last year. Review the good and the bad. Things that sucked and things that ruled. Things that made you laugh and cry.
Well, these days I can't keep up with anything. I can't even remember if that iPhone thing came out this year or last year. I remember buying the new Harry Potter book at 1 am and I think it was cold outside. Or maybe it wasn't. Was that this year?
My memory is terrible sometimes.
Regardless, the other day I was thinking about all the technologies that I have learned (and forgotten) over the last 25 years. I'm pretty sure there are more than the ones listed below. Maybe someday I'll remember another one and write about it if I can remember enough. But in the meantime here are some of the highlights and lowlights of my software development career:
1980: This Oregon Trail game on the Apple ][ is really cool! But I think it needs some modifications. After a quick check with the Apple Basic manual I punch in LIST, then 20580 PRINT "YOUR OX PEES IN RIVER. LOSE TURN SEARCHING FOR CLEAN WATER.", then 20590 GOTO 10. Ahh, so much better. Then SAVE OREGON just so everyone else can enjoy my little modification. Hope the teacher doesn't find out.
1985: It's the BBS heyday. After having pirated GBBS and run it on my 300 baud modem I got bored with it and wrote my own multi-forum BBS. A friend of mine wrote the low level driver for the keyboard entry. He even wrote a nifty text-based tennis game for me. Hey Jordan Lampe, if you're out there drop me a line. This is King Arthur. Ahh, those were the days.
1986: Apple Pascal RULES. It's the coolest thing on this planet. I taught my high school Pascal teacher how to program. He gave me a C. Ingrate. Seriously. I wrote a freaking adventure game IN CLASS while he taught everyone else how to code up an addition problem. What? You're supposed to be writing an addition problem instead of an adventure game? Doh! Maybe I deserved that C after all.
1986: College. Woot! Upgrade to an 80286 PC and sell my Apple ][ to a farmer. Then take Art History, Calculus, Latin American Studies, and English Literature. No programming, but I get to use that awesome Word Perfect word processor and then copy my documents to a 5.25" floppy so I can take them to the computer lab to print them on a high quality dot matrix printer.
1987: Macintosh computers RULE. Sure wish I could afford one.
1988: Start taking Pascal classes. Sleep through Pascal I and II. Get A's. Take Fortran. Why? Because the scientific community uses it a lot for its awesome math libraries. And they're working on the new Fortran specification. It could be huge. Go to class three times -- first day, mid-term, final. Get an A. Wish I could do that for my Shakespeare classes.
1990: Ok, so upper level Computer Science courses get hard. Really hard. I can't skate by and skip class anymore. But they're at least teaching C++ and object oriented programming now. That's fun. Wonder how this is going to affect that new Fortran specification?
1991: Take computer graphics class and learn how to do cool 3D programming. Coolest thing I've ever done. Write my own 3D object displayer/rotator. Of course, not knowing crap about the Macintosh GUI and event driven programming I create my own buttons with 3D effects (hah! take that Apple - I went 3D buttons on a Mac before you did! plblblblblbllbbl....and, yeah, it was a ginormous waste of time). Write it in Apple Pascal. Pascal kinda sucks.
1992: Re-take assembly language class. God I hate assembly. WTF invented this devil spawn bastard child from hell? Yeah, it's better than punch cards but COME ON.
1992: GRADUATION!!! Booyah. Computer Science degree from the University of Minnesota. I'm living large now, baby.
1992: This job market sucks. Thanks, Bush. Why did I vote for you again?
1992: YES!! I got a job. And HELLS YEAH I get to use OS/2 instead of that girly Windows application. Get immediately put on a project to write a DLL for a program that will run on that girly Windows application. Oh the irony.
1993: Borland C++ is pretty cool. They really thought things out when they made it. Windows programming is soooooo easy. Making a DLL? Piece of cake! Making a GUI? No problem! This is going to revolutionize Windows development.
1993: This new Visual Basic thing is kinda slow but you can put together a GUI in about 5 minutes. Click, click, click, drag, click, drag, type some text. Voila! Instant GUI. Making a GUI in Borland C++ kind of sucks now. If you need some speed just put the code into a DLL and call it from your VB app. No problem. Everyone's happy!
1994: Borland? Who's that? We do everything in Visual Basic.
1996: God Visual Basic is slow.
1997: There's this new thing called Forte. It uses an object oriented language called TOOL. You can distribute your code! You can write a "service object" and just say "run this on the server" and it will run on the server. Your client automatically finds it and uses it! SWEET. This is going to revolutionize enterprise Windows development the same way that one company did back in 1993.
1998: Heard anything about that new Fortran spec lately?
1999: Still using Forte. The IDE sucks, the debugger kind of blows, but you know what? It's pretty decent at this distributed system thing. Can you imagine trying to do this with .NET? Life. Is. Good!
2000: Well, not so good. The web app piece of Forte isn't very good. Matter of fact, our web app is so big it's simply not going to work. This thing called Java is getting a lot of press and a lot of companies are starting to use it. Let's take a chance and rewrite our web app with it. Not sure it's going to work out or be the right thing but there really aren't many other good options.
2001: Ok, so how do we get our Java servlets to talk to our Forte services? CORBA! Forte will dump out a file that we can import into Java so that we can easily write some CORBA middleware to transfer objects between Forte and Java. Cool! This Java stuff rocks.
2002: We're going to Windows XP. Was there something wrong with Windows 2000? This GUI sucks. It looks like I'm in Teletubby land. Can I go back to the classic look? BTW, Java kicks ass.
2002: Well, somewhere around 2002. Sun buys Forte. Why? Are they on crack?
2003: Wow, it's been 10 years since I used OS/2. The only reason anyone ever used that is because Windows was such a pile back then. GPFs, Blue Screen of Death...You got none of that with OS/2. It was rock solid. Then along came NT 3.5 and the whole world shifted. NT was such a great product. Lean and mean and a kernel that never ever crashed on me. I went years without having it crash. Matter of fact, between NT 3.5, NT 4, 2000 Pro, and XP Pro, I don't think my PC has crashed due to anything but a hardware failure.
2004: Sun announces they are dropping support for Forte in 2006. Everyone has 2 years to convert all their stuff. And, by the way, we have this other product called Java that is a real good thing to convert to. Good luck!
2004: We look into converting our Forte apps to Java. All 3,000,000 lines.
2005: Turns out that the new Fortran spec was released in 1992 along with minor revisions in 1995 and 2003. Who knew? Now where did I leave that Java certification study guide?
2006: What a waste of time. I spent all that time studying for the Java Certification only to realize that so many people have one it's practically meaningless. And I really didn't learn anything. Anybody want a Java Certification study guide for $5? $3?
2007: Well, we're paying for Forte support from Sun now. We still have 3,000,000 lines of code to convert. At least we have a plan in place. There's a company that has a tool that will convert all your code. It's slick. Now all we need is management to approve the year it will take to convert it and fix the broken stuff. Ever try to tell management that you're going to work for a year without giving them anything new? It's going to have to happen though -- we can't even pay for support in 2009.
2007: I learned how to make cool web pages without using tables. All you have to do is div div div div div div div div div div div div and give each one a CSS class. Then spend a whole bunch of time making it work in all the browsers. I can't count the number of times I was 1 pixel off in one of the three major browsers. But hey, it's the technology of the future. And once it works it's SLICK. Makes it all worthwhile.
2007: Does anyone give a crap about Vista? I don't... I just upgraded my Ubuntu to Gnarly Gorilla or whatever it is. Couple quick questions, a little download time, reboot...BANG! It's done. And the AMP in my LAMP all fired up flawlessly after the upgrade. Suck on that, MS.
So there you have it. My life as a developer over the last 25 years. I'm sure I missed a bunch of products I've used or didn't use that I wanted to. Some of it sucked, some of it ruled -- but that's how everything goes. You just hope that more things rule than suck. And since I'm still with that company that hired me way back in 1992 I can say that I've definitely worked on more things that ruled than sucked.
See y'all next year. :-)
Thursday, December 20, 2007
Thursday, November 01, 2007
My Vista Conspiracy Theory
Put on your tin hats, boys and girls, I have a great conspiracy theory!
By now, most of you know Windows Vista is a big pile of garbage. It's as slow as molasses in January in Minnesota, it uses so much memory that it makes Fat Bastard look like Richard Simmons, it has more bugs than a New York City sewer, its new security measures are worse than a rectal exam at the airport, and there are so many versions of Vista to choose from that it'll make you dizzier than drinking an entire bottle of Russian vodka in one sitting.
Yep, Vista sucks like the black hole at the center of the galaxy. They should have called it Windows ME 2.0.
How could a company awash in cash and developer resources produce such an obvious blunder? Why would they unleash such a rabid beast on the unsuspecting public?
Is it gross incompetence? Is it unbridled hubris? Were the employees at Microsoft threatened with flying chairs and forced to release Vista before it was ready?
No, I say! NO! It's all part of a brilliant Microsoft plan! Here's my theory.
The 2008 elections are looming, and after the outrageous ineptitude displayed by the Republican Party these last several years, there's a very good chance we'll end up with a government dominated by Democrats.
Microsoft has gotten a free pass during our Republican dominated years. Heck, Microsoft has practically been encouraged to do whatever it wants, including stealing candy from crying babies.
But with the real possibility of Democrats taking control, the Department of Justice might once again try to, you know, dispense some actual justice. That means Microsoft might once again be punished for little transgressions like illegal monopoly maintenance.
Microsoft can dodge some legal bullets more easily if they can point to some competitors that aren't completely irrelevant. Apple may be their closest competitor, at least when it comes to operating systems.
So here's my theory of Microsoft's brilliant plan: They release a craptacular operating system -- Vista -- on purpose! They know Vista is a clusterfubar of monumental proportions, but that's exactly what they needed to release in order to purposefully boost sales of Macs and Mac OS X. By the time a new President is in the White House (which could very well be a Democrat), Apple's market share will have grown by leaps and bounds!
Then, the next time someone points at Microsoft and claims "illegal monopoly maintenance!!!" or the Department of Justice decides to start doing their job again, Microsoft can point to Apple and say, "But look! Apple is a serious threat to us now! They're a huge competitor gaining market share! Leave us alone to innovate! We need to innovate!!!"
Those people at Microsoft are geniuses!
Does Apple Hate Java?
On October 26, 2007, Apple released Leopard, the latest and greatest version of the excellent operating system, Mac OS X.
Even though I haven't owned a Mac since the 1980s, I've been enjoying reading about Mac OS X online. The temptation to buy a Mac has never been greater, and I was very nearly ready to give in to that temptation .... until I realized, much to my dismay, that Apple hates Java.
You see, these days, I'm mostly a Java developer. Oh sure, I still use C, and dabble with Python, and do a bit of HTML and JavaScript too, but I'm primarily a Java developer.
Java does a very good job with "Write Once, Run Anywhere." Not perfect, mind you -- but still, a very good job. And I thought to myself, "Self, you could probably switch to a Mac pretty easily, since Apple supports Java!"
Well, Apple does support Java ... poorly. Leopard, for some reason, didn't ship with Java 6. Okay, that alone is a huge disappointment, given that Java 6 has already been out for other operating systems for about a year. But surely Leopard ships with the latest version of Java 5! Right?
Nope. A lot of rumors, from several reliable sources, have been saying that Java 5 on Leopard is buggy and broken. What the hell, Apple?
Given the army of Java developers out there, you would think Steve Jobs would try to make Mac OS X one of the best development platforms for Java in the world. Unfortunately, he can't even be bothered to ship a decent Java runtime for Mac OS X.
This baffles me. Presumably, a lot of Java developers chose Java over competitors like .NET / C# because they dislike Microsoft (for whatever reason). The battle to win the hearts and minds of those developers is already half won! Why not go the rest of the distance and make Java on the Mac a priority, Apple? You could easily win the hearts and minds of a lot of Java developers, who would then presumably write software for Mac OS X, which would in turn entice more people to use Macs.
Needless to say, this major stumbling block makes my decision to not buy a Mac an easy one.
Come on, Apple. Get with it.
Wednesday, October 10, 2007
Umm, yeah
So it's been pretty close to forever since I last wrote something. Shoot me.
Plain fact of the matter is I really haven't had much to rant about lately. Things have been going pretty smoothly in my life. Sure, I could rant about Halliburton, the sad state of the upcoming 2008 election, hooking up a Wiimote to your computer, how good Pringles are (and how to make a cantenna when you're done), or why I just can't seem to play guitar just like Joe Walsh, but do I really have anything of value to add to those topics?
Probably not.
Well, ok. I don't sound like Joe Walsh because I play a Fender through a 40w Crate amp and he plays a Gibson through a Marshall Stack. And there's a slight talent difference. But beyond that, what's there to say? Nada.
So just think... since you're not sitting here reading some useless garbage you have enough time to crack open a Stewart's and play a game of Mr. Jack with a friend. Woot!
Enjoy!
Wednesday, July 04, 2007
Firefox and the Flash Plugin
Firefox is my web browser of choice. By itself, it's a best-of-breed web browser. With a set of carefully chosen plugins, it simply can't be beat.
However, there's one plugin I refuse to install on Firefox, and that's the Flash plugin.
It's not that I have anything against Flash itself. It's a fantastic tool that can make beautiful, media rich web pages. Unfortunately, 99% of the time, it's used for a more nefarious purpose: to inundate you with obnoxious ads. The ads usually bounce, jump, shake, and flash (pathetic pun intended) until you're either annoyed enough to close your web browser entirely, or until you start having an epileptic seizure.
My plan of attack to handle this situation is simple: I don't install the Flash plugin on Firefox. On those occasions when I need to visit a Flash enabled web site for a legitimate reason, I use Internet Explorer instead, where I've installed the Flash plugin. (And, in fact, I don't leave Firefox to do it. I simply use the IE Tab plugin for Firefox to use Internet Explorer right inside Firefox!)
There's one slight flaw in the plan, though. Every single time I visit a web site with Flash ads, which is way too many of them, Firefox displays the following message at the top of the web page:
Additional plugins are required to display all the media on this page.
Perhaps I'm just too sensitive, but that message drives me nuts. Sure, it's a lot less annoying than looking at obnoxious Flash ads, but it's still annoying. I purposefully avoided installing the Flash plugin on Firefox to avoid annoyance.
Luckily, there's a work-around. Open a new tab, browse to about:config, and type plugin.default_plugin_disabled into the Filter field. Right click the one and only attribute that should then be displayed, and Toggle the value to false. Restart Firefox, and...voila! No more annoying messages about additional plugins being required.
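By the way, if you'd rather not click around in about:config at all, I believe you can set the same preference by adding a single line to a file named user.js in your Firefox profile directory (Firefox applies it at startup):

user_pref("plugin.default_plugin_disabled", false);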
Happy browsing!
Tuesday, June 26, 2007
Why Is My Computer So Slow?
The list of computers I've owned and used over the years almost sounds like some kind of computer museum. It includes (but isn't limited to) the following:
- TRS-80 Color Computer
- Commodore 64
- Fat Mac (512k)
- Amiga 500
- 25 MHz 80386
- 90 MHz Pentium
- 600 MHz Pentium III
- AMD Athlon 64 3200+
My early computers (the TRS-80 Color Computer and Commodore 64) were great machines, but even I'll admit they only had enough power for one person to run one application at a time (generally speaking). (Yes, I'm aware there were multi-user, multi-tasking operating systems available for the TRS-80 Color Computer and the Commodore 64, but I'm aiming this article at a more general audience.)
The Fat Mac and Amiga were another story. They easily had enough power for one person to run multiple applications at the same time, those applications being reasonably modern (GUI based). They also had enough power for dozens of users to run simple (likely text based) applications at the same time (though you would need to install an alternative operating system on the machine).
Jump to the 25 MHz 80386 and 90 MHz Pentium, and we're entering a new realm of power. These machines, when using the right operating system, could handle multiple users running multiple applications at the same time, those applications being reasonably modern (GUI based). In fact, the 90 MHz Pentium has enough power to handle the needs of most modern users: web browsing, email, word processing, even listening to music. (To be fair, though, decoding MP3 audio requires enough power that it would cease to be a good multi-user machine).
Enter the 600 MHz Pentium III, more than enough machine for, most likely, 95% (or more) of the computer using population. Decode MP3 audio? No problem. Watch DVD quality video? No problem. Unless you're editing your own movies, this machine probably has all the power you need (and then some).
Last, but not least, the AMD Athlon 64 3200+. By today's standards (at the time this article was written, June 2007), this machine is already starting to show its age, but it's a beast with some serious power. With the right operating system, it could easily support dozens of users running multiple applications at the same time, those applications being very modern (GUI based, with all the bells and whistles).
And yet, too often, the performance disappoints. I type a few letters into my word processor, and at that exact moment, the operating system (Windows XP Professional) decides I'm less important than something happening in the background, and I see a noticeable delay before my typing appears on the screen. Perhaps I restore a program that's been minimized and idle for a while, and wait while the hard drive grinds and brings my application back up (and, in case you're interested, I never exhaust the physical RAM in this machine, which has 2 GB). Or perhaps I decide to listen to some music in iTunes, and watch it slowly boot up. There are many other examples of the computer being just plain lethargic.
What's going on here? We all have the equivalent of supercomputers on our desks, with far more power than we really need. How could it be that we ever wait so long for such mundane things to happen on our computers?
A big part of the problem, probably the primary problem, is the operating system (usually Windows). It should be smarter. The computer is our electronic servant, and should always give its owner a higher priority than, well, anything else. For example, I should never see a delay while typing in my word processor. I should never encounter a delay while restoring an application that's been idle for a while (unless I exhaust physical RAM and it has to be restored from disk, but once again, I never exhaust physical RAM). Microsoft is obviously doing something terribly wrong inside Windows.
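To be fair, a well-behaved background program can help out voluntarily. Here's a tiny Python sketch of the idea -- just an illustration, and Unix-only, since os.nice isn't available on Windows:

import os

# A background job can voluntarily lower its own priority so the
# interactive user keeps getting the CPU first. 19 is the lowest
# priority, and lowering your own priority needs no special rights.
os.nice(19)

# ...then grind away at the heavy background work, for example:
total = 0
for i in xrange(10000000):
    total += i
print "done:", total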
Another part of the problem is software development. As computers have gotten faster, software developers have used increasingly easier computer programming languages. There are a lot of advantages to using languages that are easier to use and faster to develop with, but each and every user of your application pays a performance penalty. Perhaps too much of a performance penalty in some cases? I love the safety and ease of use of these modern languages as much as the next software developer, but frequently, I can't help but think that perhaps my users would be better served if I was using a more efficient language, even if it made my job harder.
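You can actually measure that penalty with Python's standard timeit module. Here's a quick sketch (the exact numbers will vary from machine to machine) comparing a loop the interpreter has to step through against the built-in sum(), which does the same looping down in compiled C -- the same kind of gap shows up between whole languages:

import timeit

# Add up the numbers 0..9999 two ways: an interpreted Python loop,
# and the built-in sum(), whose loop runs in C.
loop = timeit.Timer("total = 0\nfor i in xrange(10000): total += i")
builtin = timeit.Timer("sum(xrange(10000))")

print "interpreted loop:", loop.timeit(1000)
print "built-in sum:", builtin.timeit(1000)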
If you'd like to get a taste of the kind of speed and responsiveness you could see from your computer if you were using more efficient software, consider giving Damn Small Linux a test drive. It's a very small, efficient operating system, but also very capable and powerful. I've got it booted right this minute, with a few applications running (including the Firefox web browser), and the entire thing is consuming just 50 MB of RAM. In fact, the entire OS itself (with all sorts of handy applications) is just 50 MB -- about 1/14th the size of a single CD.
The next time you ask yourself, "Why is my computer so slow?", you should already know the answer: Because the people who wrote your operating system and your applications decided their time is more important than yours. You might want to consider using more efficient operating systems and applications.
Friday, April 13, 2007
TV on a Cell Phone?!
I guess I first really noticed this while watching American Idol on my real TV. They were advertising that you could watch video clips on your cell phone. Lately I've seen and heard commercials for Verizon's VCast videos. They're talking about it like it's the greatest new technology on Earth. Hooray! You can take video with you! See what you can watch 48 hours in advance!
I find the whole thing ridiculous.
Not only are cell phone screens way too small to be useful as streaming video players, but is it really necessary to have video available to you at all times? Has our society gotten so dependent on not being bored that we have to have something entertaining every second of every day no matter where we are? We gotta have iPods in our ears at all times, cell phones that can surf, IM, and play videos, GameBoys and PSPs to play games, and Blackberrys so we can do business anytime, anywhere. Ever wonder why, as a society, our attention spans are getting shorter and we seem to have much less patience?
I don't. We're training ourselves.
Tuesday, April 03, 2007
When is the last time you had fun programming?
Ask yourself that question. Be honest with the answer.
If your answer is something similar to, "It's been a long time since I had fun programming," you owe it to yourself to check out the Python programming language.
Even though I have my doubts about dynamically typed programming languages in general, I have to admit that learning to program in Python, and programming in Python, is just plain fun. It's easy to get started: Just download and install Python, and hop into the interactive interpreter. There's an excellent Python tutorial to help you get started, and even a free, high quality online book.
The Python interactive interpreter is a big part of what, for me, makes Python so much fun. It reminds me of programming BASIC on the Commodore 64 in the early 1980s. The Python interactive interpreter gives you that kind of instant gratification. (With suitable apologies to anyone under 30 years old, who probably has no idea what I'm talking about.)
For example, let's say you're starting to learn about Python lists. Instead of flipping back and forth between an editor and running your Python program to check the results, you can experiment with Python lists right in the interactive interpreter. Here's an example:
>>> mylist = ['one', 'two', 'three', 'four']
>>> print mylist
['one', 'two', 'three', 'four']
>>> mylist[1] = 'TWO!'
>>> print mylist
['one', 'TWO!', 'three', 'four']
This kind of experimentation in the interactive interpreter saves a lot of time. It's also a good way to learn how different functions behave in Python. Here's another example:
>>> myfile = file("hello.txt", "r")
>>> line = myfile.next()
>>> print line
hello world
>>> myfile.close()
You can even ask the interactive interpreter for help on objects and methods:
>>> help(file.close)
Help on method_descriptor:
close(...)
close() -> None or (perhaps) an integer. Close the file.
Sets data attribute .closed to True. A closed file cannot be used for
further I/O operations. close() may be called more than once without
error. Some kinds of file objects (for example, opened by popen())
may return an exit status upon closing.
The standard Python library is also quite comprehensive. You'll find support for just about anything you might need to do. For anything not built into Python by default, you can probably find a Python library that does what you need.
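To give you a tiny taste of that breadth, here are a couple of standard modules poked at straight from the interactive interpreter -- no downloads, no installs required:

>>> import re
>>> re.findall(r'\d+', 'Python 2.5 shipped in 2006')
['2', '5', '2006']
>>> import datetime
>>> datetime.date(2007, 4, 3).weekday()  # Monday is 0, so 1 means Tuesday
1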
The community support for Python is impressive, too. If you have a question about Python, type your question into Google and you're likely to get a relevant result in your first search results page. You can also ask the friendly people at comp.lang.python on Usenet (though, admittedly, some of them are misguided and solidly in the "Python can do no wrong!" camp, or the "Why would you want to do that anyway?" camp -- you can identify and ignore them pretty easily, though).
In case you're worried or wondering, Python is plenty buzzword compliant. Dynamically typed, garbage collected, object oriented, etc. It's also very portable: You can run Python on your Java Virtual Machine (thanks to Jython) and on your .NET runtime (thanks to IronPython), as well as running the standard Python interpreter. Projects that let you compile Python to native code are also well underway and making nice progress.
If your corporate software development has you singing the blues, and you feel like bringing back some of that magic you used to feel when writing software, give Python a test drive. The worst that can happen is you waste a few hours, and the best that can happen is you'll discover a new, favorite programming language.
Friday, March 02, 2007
Libraries: Java vs. C and C++
Java is bloated. I hear it over and over. I hear it from colleagues; I hear it from C programmers; I hear it from C++ programmers; I hear it from Java programmers. I even hear it coming from my own lips (or typed from my own fingers), even though I generally like the Java programming language!
And, to be honest, the fact that Java is bloated is becoming increasingly irrelevant (though not always -- sometimes you really do need all the performance you can muster and/or the smallest executable you can achieve).
However, having been a software developer as long as I have, I can still remember the Assembly vs. C wars, and the C vs. C++ wars. The wars stay the same, only the names change: Now it's usually C++ vs. Java, with the C++ advocates on the side that always seems to lose.
With Great Bloat Comes Great Power!
One of the reasons Java is bloated is actually one of its greatest strengths: the standard Java library.
I had a (very brief) debate with someone on Usenet regarding libraries. He was trying to convince me that the fact that C came with a very spartan standard library wasn't a disadvantage. He felt it was advantageous: It keeps C small and agile, and gives you a lot of third party libraries to choose from, so you can be sure to choose the very best. I'll admit that there's some small truth to his argument, but the disadvantages outweigh the advantages in most application domains, as I'll attempt to show here.
When Will I Actually Get To Start Coding?
To complete any non-trivial application in C or C++, you need to involve a lot of libraries. You may end up needing GUI libraries, database libraries, scripting language libraries, arbitrary precision math libraries, collections libraries (this being more true of C than C++), networking libraries, security libraries... The list goes on and on.
In the world of C and C++ development, each of those libraries may, in turn, depend on other libraries. You pretty quickly end up in a hell of spaghetti dependencies. Just look at any non-trivial Linux application. The dependency tree on many of them will blow you away.
Of course, if you're being diligent, you'll want to evaluate multiple libraries in each category, right? You need a GUI? Okay, no problem: Do you use Qt? How about Gtk+? Have you considered wxWindows? Maybe FLTK? <Insert a dozen more GUI libraries here!>
Qt looks nice, but... Oops, your budget is small and Qt is expensive, really expensive. No problem, we'll use the free (GPL licensed) version of Qt. Oops, the onerous restrictions of the GPL don't meet the requirements of your company? That's a shame. Let's try Gtk+.
Oh dear, the latest version of Gtk+ has a security advisory out against it? Wait, it's not the Gtk+ library itself, but a library the Gtk+ library itself depends on? Better keep track of all those dependencies...
Bah, let's try wxWindows. Happy days, it looks like the wxWindows library will meet your needs!
Aw, darn it. wxWindows doesn't support all the platforms you're targeting. Back to the drawing board.
Now repeat this exercise for the dozen or more libraries your application may require. Good luck finding some time to actually, you know, start coding your own application. (And, yes, I'm exaggerating a little bit to make my point. Sue me.)
Java, on the other hand, comes standard with Swing for your GUI; JDBC for your database connectivity (or even Derby if you need a lightweight database); a JavaScript interpreter for scripting (Rhino); arbitrary precision math APIs; collections APIs; networking APIs; security APIs; etc.
All of a sudden, the burden of evaluating libraries, researching licensing issues, tracking dependencies, ensuring platform support, and keeping abreast of security vulnerabilities has dropped to zero. Your work has been reduced to making sure an appropriate Java Runtime Environment is installed.
Every developer probably already knows this, but Sun also offers Java with an extremely generous license (free as in beer). This includes the development kit and the runtime environment.
Starting with Java 7, Sun even plans to release Java under a license that should make most Free Software advocates happy. There won't be any barriers to anyone shipping Java with their operating system, whether it be Windows, Mac OS X, Linux, BSD, etc.
Conclusions
In any case, I hope I've made my point. The rather spartan standard libraries that come with C and C++ are a huge disadvantage in many application domains, putting a much greater burden on the developer. I'm convinced C and C++ still have their place and will still be in active use decades from now, but a huge number of applications are better built on higher level languages with massive (read: bloated) standard libraries.
Go Bloat!
Monday, February 26, 2007
Where are all the fancy schmancy graphics?
And where are the real links? We mention a lot of stuff here and never seem to link any of it up.
No graphics. No links.
I would love to add a bunch of sweet looking eye candy to this blog but the reality is that I suck at graphics. I could spend hours putting graphics together but I'd much rather spend that time doing something else. Like writing here. Or playing guitar. Or playing a board game. Or watching the Osca...er...wait. I'd rather make graphics than watch the Oscars. But not American Idol! I'm a sucker for that show.
As for the links? I'm still not sure I want to litter my blog with tons of links. They're distracting in a paragraph and usually take something away from the meaning of it. They focus the reader on the links rather than other parts you may want to emphasize. I'll try to make them more subtle with the CSS and if that doesn't work I may put the links at the end of the text in each blog entry.
Friday, February 23, 2007
Decaffeinated
A couple weeks ago I had a life changing event. I was home alone with the kids and completely out of Pepsi. I don't drink tea or coffee so I tried to go without the caffeine. I was doing alright until about 1 o'clock when I started getting the dreaded lack-of-caffeine headache. At that point I should have packed the kids up, gone to the store, and gotten some caffeine. But no, I decided to stay the course and wait for my wife to come back with some Pepsi.
Bad plan.
The headache kept getting worse. By the time I finally decided to take some ibuprofen for it I realized we didn't have any of that either. Shortly after that my wife arrived with some Pepsi which I quickly opened up and gulped down. Then I went to lie down for a while and wait for the headache to go away.
Wrong.
It kept getting worse. And I had to go to the bank. I grabbed another Pepsi and went up to the store to buy some ibuprofen. My headache was so bad at that point my vision was starting to deteriorate. Fortunately the grocery store is only a few blocks away. I got the painkillers, went back out to my car, took 3 or 4 and slammed down some more Pepsi. Then I proceeded to the bank -- painfully. When I got there I felt like throwing up but they didn't have a public restroom (small town bank). So I finished my business there and went back home. When I got home I lay down for a short time and the nausea went away.
After a while I was finally able to sleep. A couple hours later I awoke to a very minor headache and an incredibly valuable lesson learned.
There's really no punchline here. It was the most miserable day in recent memory. I don't get migraines, but I suspect what I got was much like one. I don't think I've ever felt pain that bad in my life. The lesson?
Caffeine wreaks havoc on your body.
On that day I vowed to myself that I would never again have that kind of reaction to a lack of caffeine, and the only way to guarantee that was to break my addiction. Was it a combination of factors? Possibly. But it certainly wasn't worth taking that risk.
So I quit. It took a long time to do it and it wasn't without moments of minor headaches. I had to slowly wean myself off of it. Every day I'd drink less caffeine and wait longer between feeding it to my body. Eventually my body adjusted, grudgingly.
It's Friday and I had a Coke this morning. Before that I had one Wednesday afternoon. Before that it was Tuesday morning. I didn't drink one this morning because I was getting a headache, or because I'd get one if I didn't. I drank it because I like the taste. But I realize that if I have another one today I could fall into bad habits. This isn't going to be easy. When this 12-pack is gone (only 2 left) I'm going to get decaffeinated cola drinks. That'll be good.
The caffeine is still pulling at me. I'm craving one right now. But I'm not going to risk getting that kind of headache again. Not ever.
Monday, February 12, 2007
Memory Management: C vs. Java
Like many other companies, the company I work for decided to move from C to Java. There were a lot of good reasons for the switch. One of the biggest reasons was that we could ditch manual memory management (malloc(), free(), etc.) for automatic memory management (garbage collection).
Garbage collection is usually marketed as having the following advantages:
- No memory leaks
- Less development time spent on memory management
Let's examine these claims in more detail.
No memory leaks?
It's not entirely true that garbage collected languages have no memory leaks; it depends on your exact definition of "memory leak". Here are a few ways you can still leak memory in garbage collected languages like Java:
- Accidentally keeping object references in collections.
Garbage collected languages induce a false sense of security in many developers. I've seen code that leaks memory by the megabyte because developers forgot to remove object references from collections when they were done with those objects.
- Forgetting to release non-memory resources which themselves have allocated memory "under the covers".
Many APIs, such as AWT, Swing, and JDBC are partially implemented in native code. That native code often allocates memory from the operating system. If the application fails to release one of these kinds of resources, memory is leaked more often than not.
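To make both of these concrete, here's a contrived sketch (hypothetical names throughout, deliberately dumbed down): a collection that quietly pins objects forever, and the try/finally habit that keeps native-backed resources from leaking.

import java.sql.*;
import java.util.*;

public class LeakDemo {
    // Leak #1: everything added here stays strongly reachable for the
    // life of the program, so the garbage collector can never reclaim it.
    private static final List<byte[]> CACHE = new ArrayList<byte[]>();

    public static void remember(byte[] data) {
        CACHE.add(data); // nothing ever removes entries
    }

    // Leak #2, avoided: JDBC objects hold resources outside the Java heap.
    // If you never close() them, that memory is gone no matter how hard
    // the garbage collector works.
    public static void runQuery(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement();
        try {
            ResultSet rs = stmt.executeQuery("SELECT 1");
            while (rs.next()) { /* ... */ }
        } finally {
            stmt.close(); // closing the Statement releases its native resources
        }
    }
}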
It's worth pointing out that I'm not suggesting garbage collected languages are just as likely as non-garbage collected languages to leak memory. I'm just trying to show that garbage collected languages can also leak memory.
Less development time spent on memory management?
There's no doubt that it takes more time to write software that does manual memory management. After all, those developers have to take the time to think about when memory resources can be safely disposed.
There's also no doubt that developers will likely spend less time fixing memory leaks in garbage collected programs. That's because there should be fewer memory leaks to track down and fix in the first place.
However, there's another issue that's often swept under the rug...
Before switching to Java, the company I work for used to sell products written mostly in C. Not surprisingly, these products had the occasional memory leak. It usually caused quite a stir around the office. One or more developers would spend anywhere from a few hours to several days hunting down and fixing the leak. Manual memory management was wasting our time! Garbage collection would free us from this onerous task!
Fast forward to the present. Hardly a day goes by that I don't hear about our JVMs running out of memory, or our products not scaling well because they consume such a massive amount of memory. It seems like developers are constantly trying to determine where the memory is going and how to use it more efficiently.
All of a sudden, the occasional memory leak doesn't sound so bad. Personally, I'm quite convinced we have more memory problems now than we did before.
Conclusion
To be fair, I can't say with any certainty that massive memory consumption is a common issue in garbage collected languages in general, though it does seem to be a problem with Java in particular. The company I work for has a lot of really smart architects: too smart, if you know what I mean. They're ready and eager to deploy complicated patterns and over-engineered solutions for every problem, which most of us mere mortal developers find really hard and time-consuming to maintain.
Apparently, someone forgot to tell them that more time is spent on maintenance than on new development. Writing easily maintainable code should always be priority #1. Using every pattern imaginable isn't nearly as important.
But I can say with certainty that we seem to be spending inordinate amounts of time worrying about the shocking memory requirements of our applications these days. It seems garbage collection isn't quite the panacea that we were sold.
Thursday, January 04, 2007
Using Seams for Test Cases
I promised I'd write something techie one of these days, so here it is. It's a technique I've been using to write unit tests against code with problematic dependencies. Often the dependency is a third-party system, or a database with a variable data set that you can't write a good assertion against. Michael C. Feathers describes this as a Seam in Working Effectively with Legacy Code (a fantastic book -- buy it).
So let's say you have a class something like this...
public class Report {
    public Integer[] fetchDataSet() {
        Integer[] lDataSet = null;
        // Go to the DB and fetch the data set (use your imagination here...lol)
        return lDataSet;
    }

    public int sumDataSet() {
        Integer[] lDataSet = fetchDataSet();
        int sum = 0;
        for (int i = 0; i < lDataSet.length; i++) {
            sum += lDataSet[i].intValue();
        }
        return sum;
    }
}
...and you need to write a test for the sumDataSet() method. Normally you would write a JUnit test something like this:
public void testSumDataSet() {
    Report report = new Report();
    int sum = report.sumDataSet();
    assertFalse(sum == 0);
}
The two main problems with this are that it has a dependency on the database, and that you might not know what the values in the database are going to be, so it's hard to make a proper assert. There may be cases where a sum of 0 is a valid sum, so the assert in the example isn't a valid assert.
Fortunately, this problem is easily solved. All you need to do is subclass your Report class and override the fetchDataSet() method with one of your own that returns a set of known values. It looks like this:
public class ReportTest extends Report {
    public Integer[] fetchDataSet() {
        Integer[] dataSet = new Integer[2];
        dataSet[0] = new Integer(1);
        dataSet[1] = new Integer(5);
        return dataSet;
    }
}
Now you can write your test case like this:
public void testSumDataSetWithoutDB() {
    Report report = new ReportTest();
    int sum = report.sumDataSet();
    assertTrue(sum == 6);
}
And voila!! A true test of your sumDataSet() method that has no dependencies and a set of known values you can write a true assert against. It's a real unit test! Yes!!
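One variation I sometimes use (my own habit, not something out of the Feathers book): override the seam with an anonymous subclass so the fake data lives right inside the test.

public void testSumDataSetInline() {
    Report report = new Report() {
        public Integer[] fetchDataSet() {
            return new Integer[] { new Integer(2), new Integer(3) };
        }
    };
    assertTrue(report.sumDataSet() == 5);
}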
If you have any questions please refer to the Feathers book. Every group that has legacy code should have a copy of it. Buy it, read it, live it.