MacGyver and programming

Originally posted in my old blog at My Opera

The other day I thought about using MacGyver as an example when talking about programming.
Why MacGyver?

In the TV series, he builds clever tools from common everyday things. I think this is why MacGyver makes such a good programming analogy, in both good ways and bad.


Programming is often about solving specific problems with the same set of tools: the functions, constructs, objects, and other features of the programming language and its associated libraries.

When solving a programming problem, you can find a good solution to it. That in itself isn't special in any way, but you can also come up with a clever solution, something MacGyver does in every episode of the series. Instead of taking the obvious route, you could come up with a new, better way of doing things, perhaps a design pattern or a new algorithm.

This is of course very good: the application or whatever you are working on can become better, and the code cleaner and more robust. But you have to be careful.
If you aren't, you can shoot yourself in the foot: you come up with a new way of doing something, and instead of being a clever solution, it turns out to be a hacky one.

It could be a clever hacky solution, but it's still hacky. By hacky I mean something that is confusing, complicated, or doesn't work properly. Many of the things MacGyver does can be considered hacky, and he is often lucky that his solution works at all. A lot of it could obviously only happen in a TV series: it just happens that everything he needs to solve the problem is within arm's reach. Your new way of doing something in code could end up being a lucky fluke in the same way. It could fail next week when something else is different, just as MacGyver could end up dead if he tried the same trick tomorrow, when the owner of the matchbox hadn't left it lying around.
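To make this concrete, here is a small hypothetical sketch in Python (not from the original post): a "clever" one-line config parser that happens to work on today's input but breaks the moment a value contains an equals sign, next to a plain version that states its assumptions and survives the edge case.

```python
# Hypothetical example: parse simple "key=value" configuration lines.

def parse_config_clever(lines):
    # The "clever" one-liner: works on today's input, but silently relies
    # on every line containing exactly one '='. A value like
    # "url = http://example.com?a=b" splits into three parts and dict() fails.
    return dict(line.strip().split("=") for line in lines if line.strip())

def parse_config_plain(lines):
    # The plain version: splits only on the first '=', skips blanks and
    # comments, and trims whitespace around keys and values.
    config = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

if __name__ == "__main__":
    lines = ["host = example.com", "url = http://example.com?a=b"]
    print(parse_config_plain(lines))          # handles both lines fine
    try:
        print(parse_config_clever(lines))
    except ValueError as exc:
        print("clever version broke:", exc)   # the lucky fluke runs out
```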

This is where experience helps. More experienced programmers know more ways to do something, and they know when to skip the new approach and go with a proven solution they might have used elsewhere. This is where design patterns come from: proven best practices.

But without MacGyver programmers there would be no innovation, no new methodologies, no new algorithms… For example, we wouldn't have Object-Oriented Programming without people who wanted to try new things and new ways of solving old problems.

I think it's important to be innovative and to try to think of new, better ways of solving things, but this should be done separately from the important code. If you have something that has to be foolproof and work well, use a method you know. Don't risk trying some fancy new approach you don't know very well yet (for example, another programming language) and ending up with a pile of unusable code. I'm guilty of this myself: I've tried using Perl for some things and ended up with a Perl script that didn't even work, and then had to rewrite it in a language I know better.

Use new things in something that's not of absolute importance, where it doesn't matter if it ends up not working. If it's something that has to work, stick to best practices.