Tuesday, February 22, 2005
PHP is not Perl
Sometimes, when I can't bring myself to work on Thinking in Java 4e anymore, I learn a little about PHP. Sounds kind of sick, I know, but it's actually refreshing in a kind of "green fields, unlimited possibilities" way.
I skimmed through a couple of books before settling on John Coggeshall's "PHP 5 Unleashed." His voice is good, and the book has a feeling of being carefully crafted.
I had somehow gotten the impression that PHP was a kind of Perl derivative, probably from the '$' before the variables. As I learn about it, however, it seems more like C than anything else. C with '$'s in front of the variables.
It also seems very consistent and well thought-out. Language features seem to follow logically from each other and so far I haven't found anything particularly surprising -- no special cases. Of course, I am getting on board at PHP 5, which seems to have worked out all the kinks, and added objects, which seem to be an amalgam of C++ and Java (mostly Java) with '->' instead of '.' for member selection.
It feels slightly weird to say this, as if, for some reason, I shouldn't like PHP. But I do. It looks like they learned well from other languages, and it doesn't seem hacky at all (again, my perspective is PHP5).
Of course, I'm aware of the problems of mixing presentation with model, and that at some scale this will probably start causing problems. But I also see the value of it -- if you stay below a certain size and complexity, mixing the two makes it much easier to program. And for me, web stuff is just something I need to get done so that it works, in the most expedient fashion. So I think, when I start building dynamic pages, that I will give PHP a try. I don't know that this is where I'll end up, but it seems promising. Plus it has a boatload of people who are using it, and apparently lots of good libraries. The fact that it is supported on most web hosting providers has a lot of appeal, as well.
I'm not giving up Zope, though. For what I know how to make it do, it works fine. For now, however, I am giving up on making it past the first elbow of the Z-shaped learning curve. I've actually been around that particular bend several times, and slightly up the curve, but I've come to the conclusion that if you don't live and breathe Zope development, it's too complex to hold in your brain between one bout and the next. So I'll use Zope (and maybe graduate to Plone) for as much as I'm able, and switch to another technology like PHP or one of the Python web frameworks for building more dynamic content.
I'm not so sure this is a bad model. The argument for purity is that it's easier to use only one language. But if Zope works for me up to a point, and past that point it becomes much harder to build a page that, for example, stores its fields than it would be in PHP -- and if PHP is reasonably well-designed, so that it stays in my brain between bouts of web programming -- then I'll be more effective with a hybrid of technologies than I would be by remaining pure for the sake of being pure.
The other factor is that I've realized that in my case, I don't build applications on my server that are going to run into scalability problems. But I do run into a big roadblock when trying to develop web applications using the more "pure" approach, and as a result I've been stymied more often than I've been successful.
MindView Home Page
Monday, February 21, 2005
Servant or Disciplinarian
One of the missing dimensions in this discussion about static and dynamic typing is where a language fits on the continuous spectrum of servant at one end and disciplinarian at the other. People who complain about having to spend too much time arguing with the compiler are wanting more servant and less disciplinarian, and those who feel that more static type checking will be helpful are asking for more discipline.
Different situations need different positions on this spectrum. I think the arguments have been ignoring this, and the all-important missing statement in brackets that should precede every declaration is: "[in my situation] more (static|dynamic) properties are better."
The situations are usually more about people than they are about programming. And for some reason programmers don't like hearing this, but the second missing dimension in this discussion is that programmers are different. There's a huge difference between novice programmers and the mysterious "5% who are 20x more productive than the other 95%."
This is probably a threatening thing to say because it can be tied up with things like money and social acceptance and the perceived quality of our (as Woody Allen said) "second-favorite organ." But why should someone who can swing a hammer believe that they have the same experience or ability as a master finish carpenter?
I remember coming out of school with a Master's degree in Computer Engineering (my undergraduate degree was in applied physics). I actually did know some things, and I could figure other things out. But my cubicle-mate, Brian, could build software. He had spent enough time thinking about it and doing it that he had perspective on the problem. That certainly didn't make me useless -- I created valuable things while I worked at that company. But at the time, we were just beginning to use pre-ANSI C for hardware programming, and I could have used more type checking than it gave me. Pre-ANSI C was not much more than a glorified assembly language; there was hardly any type checking at compile time, and none at runtime. This might be part of the confusion about dynamic languages: it seems as if you might be thrown back into working without a net, as we were doing then.
On numerous occasions I've heard Smalltalk programmers say that if a programmer starts with Smalltalk, it fundamentally shapes how they approach all programming (they also talk about unlearning the damage done by other languages in order to learn Smalltalk). I think this is probably true; most of the people I know who started with Smalltalk have a much better grasp of the fundamental concepts, and an ability to see what is simply the cruft of a particular language, than people who start with languages that are closer to the metal.
I started with assembly, then Pascal & C, then C++, then Java, then Python (ignoring a number of flirtations with scattered other languages like Prolog, Lisp, Forth, and Perl, none of which really took). So my experience with languages started with the cruft and the important concepts mixed together. Not only could I not tell one from the other (so they all seemed equally important) but the higher-level concepts were not initially available.
In C++, I started by creating my own containers because there was no STL for many years (the STL was actually added rather late in the standards process, and its initial goal was as a set of generic algorithms for functional-style programming, and not as a set of containers which is where it gets most of its use). All I wanted, initially, was a holder that would expand itself -- to a C programmer, stuck with fixed-size arrays, this seems rather revolutionary. And so it didn't seem strange that Java would have a separate library for containers. But Python doesn't even bother with fixed-size arrays. You just use a list, which is always there without importing any library, as common as air, as are dictionaries (Maps to C++ and Java programmers). Now, even sets are first-class objects. In C++ and Java, collections are intermediate level concepts, and so some programmers don't use them. It's quite difficult to know that some libraries are more important than others; in my initial experience with C, my tendency was just to write all my own code and not rely on library functions, so it was quite awhile before I understood that malloc() and free() had special importance, and the whole possibility of dynamic memory allocation (I really did come up the ladder from hardware). I now know intimately how lists and hashtables work, but I also wonder how different I might have been had I started with a language where dynamically-sized lists and dictionaries were just part of the fundamental toolset. My experience now, and what I see with those who started with Smalltalk, is that you take those concepts with you into languages like C++ and Java, and you are able to manipulate those languages more effectively as a result.
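A minimal Java sketch of that contrast (the class name and values here are purely illustrative): the dynamically-sized list and dictionary that Python provides "as common as air" must be imported from a library in Java, which is part of why they read as intermediate-level concepts rather than fundamental ones.

```java
import java.util.*;

public class Containers {
    // A holder that expands itself -- revolutionary to a C programmer
    // stuck with fixed-size arrays, but an imported library class in Java.
    public static List<String> growable() {
        List<String> list = new ArrayList<String>();
        list.add("a");
        list.add("b");   // no fixed size declared anywhere
        return list;
    }

    // The equivalent of a Python dictionary: also a library import.
    public static Map<String, Integer> dict() {
        Map<String, Integer> map = new HashMap<String, Integer>();
        map.put("answer", 42);
        return map;
    }

    public static void main(String[] args) {
        System.out.println(growable().size() + " " + dict().get("answer"));
    }
}
```

In Python the same two structures need no import at all: `list` and `dict` literals are part of the core syntax, so beginners absorb them from day one.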
My point is that programmers are all on different positions on their own learning curve, and there's some intersection of that curve and the level of discipline that they and their team need. This is an issue that is based heavily in the human dynamic of the team. That is, it depends on who you have on your team and how they work together. Some teams benefit from the extra structure provided by a static language, others need the rapid abilities of a dynamic language.
However, much of the argument around static vs. dynamic languages seems to be about finding bugs. I will make two observations about this. First, both static and dynamic languages still produce bugs. And no matter how many static type checks a language includes, bugs still happen. I think there is a belief in some camps that we simply don't have enough static type checks in Java, and if we had more then eventually the compiler could guarantee provably-correct programs. And this leads to the second point: Java has a lot of dynamic checks. And it benefits from a lot of dynamic abilities, such as reflection. In fact, I think it could be safely argued that Java straddles the static and dynamic language worlds. For that matter, Python does a little bit of type checking during its compilation phase (and there is talk of adding more).
In the end, languages need, I think, to be opportunistic about when they can discover errors. And the best time to discover these errors is not always clear. Sometimes it's quickly obvious: it seems unlikely that array-bounds-checking at compile time is feasible, for example, so Java does it dynamically, at runtime (and C++ doesn't do it at all). Many of the arguments seem to be about when type correctness is established. The static approach is to force explicit declarations, the dynamic approach is to perform checks at runtime. In between is type inference, where you don't have to explicitly declare types but they're checked at compile-time anyway. An even more interesting approach would be a combination of type inference and dynamic checking, which produces the best of both worlds.
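The array-bounds case can be seen directly in Java (a hypothetical example, not from any particular codebase): an out-of-range index compiles without complaint, and the check only happens when the code runs.

```java
public class BoundsCheck {
    // Returns true if the runtime check rejected the index.
    public static boolean outOfBounds(int[] a, int i) {
        try {
            int x = a[i];  // compiles fine; the bound is checked only at runtime
            return false;
        } catch (ArrayIndexOutOfBoundsException e) {
            return true;   // Java discovers the error dynamically
        }
    }

    public static void main(String[] args) {
        int[] a = new int[3];
        System.out.println(outOfBounds(a, 5));
    }
}
```

C++ would simply read past the end of the array here; Java opportunistically moves the discovery of the error to the earliest point where it is feasible -- which, for array bounds, is runtime.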
Perhaps what I most value about the dynamic language experience is that I feel it has opened my mind a bit more, indeed, as learning any language has. Even if much of your day-to-day programming is with a language like C++ or Java, I believe that learning a dynamic language will not only add another very valuable tool to your set -- and one which may allow you to solve "side problems" much more quickly -- but it allows you to go back to your "main" language with a new perspective, and one that will help you solve problems in a more effective and elegant fashion.
Friday, February 18, 2005
Not as They Seem
Alfie Kohn has an essay about how rewards are counterproductive. He's written numerous books on similar topics.
Malcolm Gladwell (author of The Tipping Point) has just published Blink, which also talks about things that we assume to be one way, and are actually completely different. I referred to a speech he gave which you can listen to online here.
For some reason I find this kind of thing fascinating. It's something about having our "natural" assumptions turned completely upside down. I guess it regularly reminds me that the world is not so predictable/boring as we come to expect.
This is also the motivation for questioning many of our assumptions about programming. I've had personal experiences where my assumptions have been wrong -- where I've discovered that something that I thought was helping was holding me back.
My friend Daniel commented:
I read something a while back about giving a reward BEFORE you start. That someone gave small bags of M&M's to groups before they started, and they performed much better, because they were happier. They don't have to be big things, just something small and pleasant sets the brain in a different mode. I now cannot find that article, even with my desktop google!
Wednesday, February 16, 2005
One of the difficult challenges when discussing the productivity of dynamic languages is to make any sort of proof about said productivity. I believe this is because it's one of those synergistic, emergent systems, where everything taken together produces a surprising or unexpected result. But this makes it hard to come up with any kind of proof, and as a result we have a bunch of people who are primarily just speaking about their personal experiences trying to convince people who haven't had such experiences that it's a Good Thing ("Hey man, you gotta try some of this stuff!"). In our business, we are constantly being shilled into trying new products, technologies, etc., most of which have been a waste of time, some of which have been a colossal waste of time AND money. So we've learned to ignore most of these claims.
Oliver Steele offers his theory about why dynamic languages are more productive, folding in issues about testing, in Test versus Type.
Tuesday, February 15, 2005
Destructors in GCed languages
Based on a conversation with Walter Bright, the creator of the D Language, a summary of my thinking on destructors in GCed languages:
1) If you have scoped objects, they can automatically call destructors at the end of the scope, because it's deterministic. However, this could cause confusion with:
2) Heap-based objects should not have destructors called automatically, because garbage collection is not deterministic. If cleanup other than memory release is necessary, the programmer must decide when it happens, presumably with finally clauses. This leads to:
3) The value of the destructor for GCed objects is that it automatically calls base-class destructors in the proper order (most-derived first). Java abdicates any responsibility for cleanup other than memory release, which forces the programmer to generate their own "cleanup" methods and to be diligent about proper order.
4) The biggest remaining problem is member-object cleanup. Member objects need to be cleaned up in reverse order of initialization. A destructor should automatically call member-object destructors, and if you don't expect the GC to call the destructor (which doesn't work anyway), you can assume that all member objects are still alive when the destructor is called. Therefore it is safe for the member object destructors to be called. Without this the solution is incomplete.
5) The one thing I could see adding to D is functionality to verify that destructors have been called if they exist. This would close the loop and solve the problem of destructors with GCed objects. The key is that the GC can't do it (as the Java designers learned with the multi-year finalize() debacle). But the GC can verify that destructible objects have had their destructors called by the time the GC is releasing the object.
Note that Python is able to get away with automatic destructor calls because it uses reference counting and thus the moment the reference count goes to zero the destructor is called. I.e.: it's deterministic in Python.
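A sketch of the discipline that points 2 through 4 impose on the Java programmer (all the class and method names here are invented for illustration): without destructors, the "most-derived first" ordering of base-class and member cleanup has to be spelled out by hand, and triggered deterministically from a finally clause rather than left to the GC.

```java
public class Cleanup {
    static StringBuilder log = new StringBuilder();  // records cleanup order

    static class Member {
        String name;
        Member(String name) { this.name = name; }
        void dispose() { log.append(name); }
    }

    static class Base {
        Member m1 = new Member("1");
        // No destructor in Java, so cleanup is a hand-written method:
        void dispose() { m1.dispose(); }
    }

    static class Derived extends Base {
        Member m2 = new Member("2");
        void dispose() {
            m2.dispose();     // derived-class members first (reverse of init)...
            super.dispose();  // ...then the base class: most-derived first
        }
    }

    public static void main(String[] args) {
        Derived d = new Derived();
        try {
            // use d
        } finally {
            d.dispose();  // deterministic, unlike finalize()
        }
        System.out.println(log);
    }
}
```

Everything the compiler would sequence automatically in C++ -- member cleanup in reverse order of initialization, then the base class -- is the programmer's responsibility here, which is exactly the burden points 3 and 4 describe.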
Monday, February 14, 2005
I was looking at Sun's Core Java Technologies Tech Tips for January 4, 2005, and I saw that Gilad Bracha is described as a computational theologist. In Wikipedia, theology seems to be about religion. Does this mean that Gilad is someone who tends to get into religious debates about programming? Or is this a pay grade at Sun? Perhaps someone can explain this term to me.
I think I see what Gilad is trying to say. One of the things I respect about the Jewish tradition is that they are taught not to accept the holy books at face value, but instead to question and argue about them. I think he means that he brings this approach to the design of programming languages. The risk in using such a term is that it could be misunderstood all over the place.
Friday, February 11, 2005
Gosling on SWT
In a presentation to an Australian user group, James Gosling said some things about SWT that I thought were a little far out, so I asked Chris Grindstaff (who helped me with the SWT section in TIJ4) about it, and he allowed me to publish his comments:
"From memory, there were some OTI Smalltalkers 7-8 years ago who did try to convince Sun they were going down the wrong path with AWT/Swing. Swing is very analogous to VisualWorks Smalltalk's approach of emulating the widget toolkit. OTI had been building native UI toolkits in Smalltalk for many years and thought it was a better approach. A lot of those folks are responsible for SWT today. They thought emulated was the wrong way to go and they still do.
"Gosling says AWT == SWT. That's sort of true, but less true than more. The big difference between the two is that AWT is very much least-common-denominator across all platforms. SWT isn't. The other significant difference is that AWT chose to hide the emulation layer in C. In other words, java.awt.Button is the same on all platforms, while the native peer differs on each platform. One of the consequences of this is that porting is harder: some things are in Java, some aren't. It also makes for a larger footprint, because a java.awt.Button has fields for its size, bounds, etc. that the OS also has.
"In SWT, the org.eclipse.swt.Button Java class is different on each platform. The SWT lib does nothing but stuff methods straight to the OS. One toolset and less duplication.
"Gosling also claims SWT is way simple. That's not true either. It's not as full-featured as Swing but that's also by design.
"The criticism that the API matches Windows is somewhat fair, but that's much preferable to AWT's least common denominator. Where something doesn't exist on a platform, an emulated version is created. But again, that's better than not implementing a tree, table, or notebook, for example, just because they don't exist on one of the targeted platforms.
"Porting and consistency aren't nightmares on other platforms. One of the beauties of moving the widget toolkit code from C to Java is your platform gurus can program the widget toolkit in Java using Eclipse. So when you look at the SWT source for a org.eclipse.swt.Button on Windows it makes a lot of sense to a Windows guru, likewise for the widgets on the Mac, they make lots of sense to a Carbon guru.
"The proof is in the pudding. You rarely see an AWT application, even most Swing apps are ugly and OS strangers. You can get close but never close enough. For example when MS added theme support in Windows XP, SWT got those for free. There are more and more SWT built applications appearing. In general, why struggle to emulate pixel by pixel what Microsoft, Apple, and all the Linux developers are doing for you? Don't reinvent, use.
"You can also check out the article I wrote for Linux magazine, although I probably don't say anything I haven't said here: SWT: Eclipse's Secret Weapon"
Wednesday, February 09, 2005
Static Versus Dynamic Attitude
A reply to Bill Venners about his Static Versus Dynamic Attitude posting:
I think your point about the language "leading you in a particular direction" is key. I've found that each language tends to make me think about doing things in a particular way, and they often seem like they actively prevent me from thinking about other possibilities. Witness the C++-ish influences on the designs in GoF.
But if I learn a new language that has a different way of thinking, then I can go back to a previous language and apply that way of thinking, just as you are now able to think about tuple-ish things in Java whereas it may not have occurred to you before. It's the beginning of a differentiation between saying "everything is an object" and "there are different kinds of objects."
On the other hand, your statement about the tuple reflects more of your static-mindedness: "From a safety perspective, a tuple seems even more error prone than a Map, because I have to get the order correct on both sides, not just the names." When you think about it as a Python tuple or a Java container of Object, your static-mind says "the compiler can't check it, so any object can go in any slot, and I could get it wrong." But if you realize that in both Java and Python (but not C++, which may be where the original roots of this "problem" lie -- however, with C++ templates you can create a type-checked tuple), type checking also happens at run time, you realize that the first time you try to use one of these objects that you've put into the wrong slot, you'll find out about it.
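Here's a small Java illustration of that point (the pair-of-Objects "tuple" is hypothetical, just the pre-generics idiom): putting a value in the wrong slot isn't caught by the compiler, but the first attempted use triggers a ClassCastException -- the runtime type check the static-mind forgets is there.

```java
public class TupleCheck {
    // A two-slot "tuple" as a container of Object, the pre-generics idiom.
    public static Object[] makePair(Object first, Object second) {
        return new Object[] { first, second };
    }

    // Demonstrates that a wrong-slot value is caught at first use.
    public static boolean wrongSlotCaught() {
        Object[] pair = makePair(42, "name");  // slots accidentally swapped
        try {
            String s = (String) pair[0];  // first use of the bad slot...
            return false;
        } catch (ClassCastException e) {
            return true;  // ...and the runtime check reports it immediately
        }
    }

    public static void main(String[] args) {
        System.out.println(wrongSlotCaught());
    }
}
```

So the error isn't silently accepted; its discovery just moves from compile time to the first use at runtime, which is exactly the shift the static-mind finds hard to accept.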
I think the biggest problem when thinking about this is that static-mind is very deterministic and unforgiving, and says "if I can't find out about it at compile time, all is lost." In reality, you can only find out about some problems at compile time, anyway. Dynamic languages shift the discovery of problems more towards runtime. If you can accept that there are plenty of problems that can only ever be detected at runtime anyway, it's possible to look at this issue with a little more perspective, and to calm down the stridency of the static-mind a bit.
Sunday, February 06, 2005
The Economy of the Small
The term "Economy of Scale" has always been used to mean "Economy of Large Scale," but things have been happening -- primarily due to the Internet -- that have changed this. Ebay is an excellent example: it has made money from the beginning, all the way through the dot-com crash and beyond, as if it works in an orthogonal playing field. Which it does: millions of successful businesses, some of them lasting only for a single transaction but successful nonetheless, most of the longer-term ones operated as part-time ventures. All stable and thriving.
The big-economy media has only been able to report this through the window of Ebay as a big business; the smaller aspect is occasionally noted but not really seen as an important data point. Because of its skew, the big-economy media has gained its own small-economy bugaboo: bloggers. People are creating their own newspapers using feed readers, selecting their own columnists. "News" is no longer fed only through the controlled pipe of newspapers and television, and we can only hope it will get worse for the big-economy media, who have never had any competition except others playing the same game. This is only getting started; feed readers are still in their infancy and most people haven't started using them yet.
The first time the phenomenon of "the Internet has fundamentally changed things" hit me was at the Software Development conference, where I had created and chaired both the C++ and Java tracks for a number of years. At some point I could see it coming: the trade show portion of the conference (where they made the bulk of their money) made no sense anymore in the face of the Internet -- especially because this was software, which people could find out about by going to a web site, and usually downloading a demo. The cost of renting a space and sending a team to man the booth was enormous (especially in terms of lost productivity), and if you could have a 24-hour booth presence on the Internet, why bother? Other conferences were failing, and I did not want to stand by and watch the SD conference, where I had put so much time, effort and emotion, auger in. So I left.
After several years, I checked it out again, and discovered last year that it had not only survived the bloodbath but that I had a really good time. It may be that, because most other conferences of this kind (general, not product-specific) have vanished, everyone is coming to SD.
A central organization can be a good thing in some cases, and whether or not Ebay has an impact on Walmart, I think the two serve different enough purposes that Ebay is not going to threaten Walmart (even if Walmart had any conceivable way to fight back). Another example is Linux, and although Microsoft is doing its best to fight back it seems like they are operating in different spaces, and so, short of a completely fascist government outlawing open source (not entirely impossible in the current climate), it would seem that Microsoft must eventually adapt or die. Adaptation is certainly possible; Microsoft has value worth paying for and that value just needs to shift. Not all software can be free, because some of us need to pay expenses.
In any case I'm speaking at SD again this year, and really looking forward to it. As a speaker, I'm able to attend the pre-conference tutorials as well as the sessions, and everything looks fascinating. You can find details on the calendar.
After having some experiences organizing private conferences based on Open Space technology, this year I'm holding two small public conferences. Like Ebay compared to Walmart, these are quite different from what SD even could do, I think: they are topic-focused, inexpensive ($300 for a three-day conference), and a completely participatory experience. Open Spaces are so energizing and engaging that they don't fit in a more formal conference like SD. Obviously I think the formal conferences are still important (as I said, I'm really looking forward to SD), but my experience of Open Space conferences is that the energy you come away with is completely different, and that's why I want to hold them -- I get at least as much energy and excitement out of the conferences as everyone who attends, because it really is a group learning and sharing experience instead of one person giving information to a group (which still has its place). You can't really believe it until you've experienced it, because it's kind of counterintuitive. But in every single Open Space event I've participated in or held, people come in saying "I don't believe it," and come out saying "this is one of the best things I've ever done!"
You can see the topics on the calendar: Building Better Software and LAMP Patterns & Practices, but you'll also see that I have no links or signup forms yet. That will happen sometime after TIJ4 is completed.
Thursday, February 03, 2005
More powerful than C++, too
There has been some notice of my comment that Java is now approaching the complexity of C++. I should note that there are things that I can do with Java that I wouldn't dream of doing in C++. For example, I'm now trying to finish the Annotations (new J2SE5 feature) chapter, and this topic has gotten me involved with bytecode engineering, which is rather astounding. Kind of a back-end macro facility (in the sense of Lisp macros, which I have only a vague sense of). I've been using Javassist, which, while not trivial, is way easier than the alternatives.
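Javassist itself is hard to show in a few self-contained lines, but the annotation side can be sketched with nothing but the JDK (the @Tracked annotation and its methods are invented for illustration): define a runtime-retained annotation, then read it back through reflection. Bytecode engineering tools pick up from there and rewrite the class itself.

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;

public class AnnotationDemo {
    // A hypothetical marker annotation, retained so it's visible at runtime.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface Tracked {
        String value() default "none";
    }

    @Tracked("important")
    public void flagged() {}

    public void plain() {}

    // Reflection reads the annotation back off the method at runtime.
    public static String tagFor(String methodName) {
        try {
            Method m = AnnotationDemo.class.getMethod(methodName);
            Tracked t = m.getAnnotation(Tracked.class);
            return t == null ? "untracked" : t.value();
        } catch (NoSuchMethodException e) {
            return "missing";
        }
    }

    public static void main(String[] args) {
        System.out.println(tagFor("flagged"));
    }
}
```

Reflection can only inspect; a tool like Javassist can take the same annotation as a trigger and insert or modify code in the class file, which is what makes the combination feel like a back-end macro facility.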
I would still want to use C++ for certain types of programming: things that have to be close to the hardware and/or fast, but not so big or complex that I would have to start worrying about memory management. For problems that involved C++ in this way I would probably start by writing the system in Python, then creating components in C++ that had to be close to the hardware or fast (although for fast, Pyrex seems to be the up-and-coming way to solve that problem).
But for all my carping about Java (most of it still well-founded, I think) I have to admit that J2SE5 has been a breath of fresh air over previous versions, mostly because it does feel like they've been trying to make it "friendlier to the programmer" with things like the foreach loop, autoboxing, and the new enumerated types. The possibilities of annotation and bytecode engineering are quite mind-bending (annotation support for Active Objects keeps bouncing around in my head). Now that I've finally gotten up most of the learning curve for generics and can at least push them through hoops (note that they don't jump by themselves), I've found them to be at least tolerable -- but I do have the same feeling with them that I did when I finally mastered operator overloading in C++ (and understood temporary objects): I get this kind of smug, aren't-I-clever-because-I-know-all-these-tricks feeling which we techies love but is usually a bad sign because it means you've mastered something that is just arcane, and that doesn't have anything fundamentally insightful involved (like, I think, design patterns do).
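A small sketch of those J2SE5 conveniences together (the names here are just illustrative): the foreach loop, autoboxing/unboxing, and a real enumerated type, each of which removes a piece of boilerplate from earlier Java.

```java
import java.util.*;

public class Friendlier {
    enum Suit { CLUBS, DIAMONDS, HEARTS, SPADES }  // real enumerated types at last

    public static int sum(List<Integer> values) {
        int total = 0;
        for (int v : values)   // foreach, plus auto-unboxing of Integer to int
            total += v;
        return total;
    }

    public static void main(String[] args) {
        List<Integer> nums = new ArrayList<Integer>();
        nums.add(1);  // autoboxing: int quietly becomes Integer
        nums.add(2);
        nums.add(3);
        System.out.println(sum(nums) + " " + Suit.values().length);
    }
}
```

None of this changes what the language can express, but each feature eliminates a ritual (explicit iterators, manual wrapping, int-constant "enums") that used to sit between the programmer and the intent.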
Java attempts to straddle the gap between statically-typed languages like C++ and dynamic languages like Python, Smalltalk, Ruby, etc. In fact, I think that Java's biggest contribution may be as a bridge to dynamic languages, just like C++'s contribution was a bridge from procedural to OO.
I've seen people get very uncomfortable about the dynamic aspects of the language; the possibility that an exception may be thrown seems to be the same as an error to these folks. But I think that the dynamic nature of Java is its greatest strength; some C++ projects break apart because they try to make the language do dynamic things when it wasn't designed for it, and in those cases Java can, fairly easily, make the transition.
Wednesday, February 02, 2005
jre/lib/ext no more?
I vaguely remember hearing about a security issue that involved the jre/lib/ext directory. I do have the latest J2SE5 installed, and at least one program that used to work now runs into problems when trying to load something from a Jar file that is in the jre/lib/ext directory, but not in the classpath. It seems to work when I add the Jar to the classpath.
So I'm guessing that the days of saying "just put the jar in the jre/lib/ext directory" are no more. Can anyone confirm/deny this?
[Later] If I do a Java installation, I seem to get two jre/lib/ext directories. I have a directory called "ProgTools" where I put Java, and a jre/lib/ext appears there, and that's where I've been putting the extra Jars. There's a second one that appears under "Program Files" regardless of where I install Java. Oddly, Ant seems to find the ProgTools jars during compilation, but if I try to run a program by hand (without Ant), I have to explicitly put the Jar in my classpath to get it to work. What I haven't tried yet is taking all the extra entries out of my classpath and moving the Jars from the ProgTools to the Program Files jre/lib/ext directory.
The thing is, I'm pretty sure it was working before I upgraded to J2SE5.01. There may be a configuration variable somewhere that says which jre/lib/ext directory to use.