Philosophy of coding
The name itself should make you a little
suspicious. I mean, this is an engineering-ish endeavor. Some wacky
postmodernist might get carried away and start talking about "The
Semiotics of Analog Design", but nobody would take it very seriously.
When you write code, however, it is essentially an act of creation,
and a lot of assumptions underlie that endeavor that are nice to make
explicit.
So this little rantish screed of a treatise will be dealing with two
issues. First, various philosophies on how to code. Then I get into
a rant about the assumptions that underlie most code. Obviously, the
first is more applied and hands-on, but many people have argued pretty cogently that the second is more
important in the long term.
How to write good code
The only way to really know how to solve a problem correctly is to
have already solved it in the past. Other engineering disciplines
know this, and people build on each other's designs making only
incremental improvements. The end product may look different, but it
is all pretty traditional. A bridge is a bridge is a bridge.
Software design is a little different, because software consists
almost solely of ideas, so it is tempting to do the computing
equivalent of "I'm entitled to my opinion", because you have valid
reasons for that opinion. Unfortunately, many things are the way they
are because they are the best solution. This is not always
true, but when you find yourself taking the road less traveled, try
to find out why it hasn't been taken before. It's quite possible that
it was, and then it was abandoned for very good reasons. When coding,
try to find out what others have done to solve a similar problem. If
you see obvious flaws in their design that make it inapplicable to
your situation, it is totally okay to bust out into new territory, but
don't reinvent the wheel. In particular, when working with
networks, don't reinvent TCP/IP. It's there, just use it. When
working with displays, PLEASE don't reinvent X or Windows
messaging. They've both been done, and they both suck for completely
different reasons. If you have a truly new and innovative idea, go
for it. But be aware that it will make your life harder in every way,
so know why you are doing it.
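To make "just use it" concrete, here is a minimal sketch (in Python, with a made-up host, port, and little greeting protocol, none of which come from the text above): the standard sockets library hands you a working TCP connection, so the hard transport problems are already solved for you.

```python
# A minimal sketch of "just use it": leaning on the existing TCP/IP stack
# through the standard library instead of rolling your own transport.
# The host, port, and greeting protocol are made-up placeholders.
import socket

def fetch_greeting(host: str, port: int) -> bytes:
    """Open a TCP connection, send one line, and read the reply."""
    # create_connection does the whole TCP dance for us: the handshake,
    # retransmission, ordering, and flow control all come for free.
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(b"HELLO\r\n")
        return conn.recv(1024)

if __name__ == "__main__":
    # Hypothetical echo-style service; substitute a real host and port.
    print(fetch_greeting("example.com", 7))
```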
Speaking of difficult lives, if you don't stick to standards, your
life will suck more. Seriously. It is often kind of a pain to make
something standards-compliant, but once it is, if things don't work
well with it, it is not your problem. You've done your part.
Standards exist for a reason, and they are usually designed by
engineers for the good of everyone. To find out what's gone on in the
world, you'll need to hook up with someone more experienced in a
mentor-apprentice kind of way. Computer science may be
high-falutin', but coding is like a really high-paid vo-tech job.
It's a craft, as well as a science, and getting the knack only comes
with time, practice, and guidance.
How do you write something well, if you can't find another
solution to study? Now we're talking philosophically. Some people,
notably the waterfall design method people, say that you should
carefully write lots and lots of design documents before you start.
Other people, notably the Extreme Programming people, say that you
should implement something crappy, and then use that knowledge to
build something nice the second time. There is also a third camp, which
is the Unix way - build lots of little modules, and make sure each
module works, then combine them. [1] In this, I have to side with
the XP people. The only way to know what you need to write is to
have already written one. Or, put more cynically, "plan to throw
one away - you will anyway." This technique has the
added benefit that you get to start hacking at a problem right away
rather than doing tedious design, and when you get a steaming pile of
crap that only mostly works, you can feel okay about throwing the crap
out the window and starting afresh. Yes, you'll write code that isn't
used. Yes, that can suck. But in the long run the system that comes
out will be better and more fun to work with. If you really want to do some design up front, then go for the Unix method. Waterfall design is too painful to be fun, and when coding is fun, the end product is better.
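To make the Unix camp concrete, here is a minimal sketch (the word-counting task and every name in it are mine, purely for illustration): write small pieces, check each one on its own, then glue them together.

```python
# A minimal sketch of the "lots of little modules" style. The task
# (tallying word frequencies in a file) and all of the names here are
# illustrative; nothing about them is prescribed by the text above.

def read_lines(path):
    """Piece 1: pull raw lines out of a file."""
    with open(path, encoding="utf-8") as handle:
        return handle.readlines()

def split_words(lines):
    """Piece 2: turn lines into individual words."""
    return [word for line in lines for word in line.split()]

def count_words(words):
    """Piece 3: tally how often each word appears."""
    counts = {}
    for word in words:
        counts[word] = counts.get(word, 0) + 1
    return counts

# Each piece can be checked in isolation...
assert split_words(["a b", "b c"]) == ["a", "b", "b", "c"]
assert count_words(["a", "b", "b"]) == {"a": 1, "b": 2}

# ...and only then combined into the finished program.
def word_frequencies(path):
    return count_words(split_words(read_lines(path)))
```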
Keep having fun. Some people really like writing device drivers.
Some people like writing fractal generators. Do what you want to
do, this is supposed to be fun, at least when it's not your job. And
if it is your job, try to have fun then, too. The end product, as
well as your quality of life, will be a lot better.
Speaking of making life better, one good way that you, as a coder,
can make life better for those who want to use your program is to make
it readable. Use comments, but more importantly use descriptive
variable names and don't write obfuscated code. Never ever ever
ever optimize your code. I promise that, unless you really know
what you are doing, your "optimizations" will only make your code
harder to read. Just follow the rules for optimization, and you'll
be set.
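Here is a small, made-up illustration of the readability point; both functions compute the same thing, and all the names are mine rather than anything from the text above.

```python
# Two versions of the same routine. The example is invented; it only
# averages some scores, but the second version is the one a stranger
# (or future you) can actually read.

# Terse names, clever one-liner, no hint of intent.
def f(a):
    return 0 if not a else sum(a) / len(a)

# Descriptive names and a docstring that says what "average" means here.
def average_score(scores):
    """Return the mean of the scores, or 0 if there are none."""
    if not scores:
        return 0
    return sum(scores) / len(scores)
```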
Code for the ages. Code gets used. That's its nature, and the
whole point of writing anything. Many of my professors were employed
in aerospace in the early seventies, and the fact that the code they
wrote is still being used in modern satellites freaks the
hell out of them. They fully admit that what they wrote was a pile of
crap, but it worked. If they had known it would be used this much, they
would have done it better. They would have made it extensible and
readable. When you write code, you are essentially showing off your
geek cred, so dress to impress.
Code and life, the universe, and everything
So, Donald Knuth covered this much better than I in a series of
lectures entitled Things a Computer Scientist Rarely Talks About.
Go read them to muse on programming as an act of creation, and the
idea of playing god in toy universes and creating and destroying
self-propagating patterns. I want to talk about code's relation to the
world around us.
When you write code, there are a bunch of hidden assumptions
underlying what you write. The biggest and best example of this is
the internet. The protocols and architecture of the internet promote
freedom and openness. This can also be seen in how difficult it
is for people to secure their systems from break-ins. The
internet was not explicitly designed as this open architecture
free-for-all of friendliness. The designers had specific goals about
inter-domain routing, and robustness, and they wanted to make all the
protocols relatively easy. They weren't worried about Intellectual
property rights, they were worried about Internet Protocol stacks.
But, because they were in an academic environment, they ended up
making a system that was also very open to abuse in some specific ways
from malicious users, because theirs was a scholastic culture of trust. Similarly, the protocols never check whether a particular piece of data would be illegal to share, because the whole point of academia is knowledge sharing.
Unfortunately, code has irrevocably entered the mainstream. Everyone (of a
certain socioeconomic status) has a computer, and everyone has heard
about software piracy, hackers, cyber-whatever, and DDOS
attacks. Everyone knows about bugs, and everyone is starting to
have an opinion on cryptography and intellectual property. Thus,
while code has not become political - very few people "code for
democracy!" - it can impact politics and vice-versa. So, think about
how you might give away the code. Think about giving it away for free. It is,
after all, only ideas, and ideas are cheap. Or expensive. Or
something. But whatever you decide, do it with open eyes. Consider
your options. Read Code and Other Laws of Cyberspace, and the GNU
Manifesto. Read In the Beginning... Was the Command Line. Read stuff by Bill
Joy, Ray Kurzweil, Cliff Stoll, and others. We've moved past the day
where the geek world can do whatever it wants in a vacuum (if, indeed, those
days ever existed). Do whatever you want, but make sure you realize the
ramifications.
[1] Thanks go out to Simpleton, for pointing out that I had neglected to mention this method.