Monday, January 31, 2005

Technical arguments

I thought this was pretty funny. And it helps keep things in perspective.

MindView Home Page

People use the tools that work for them

This is a brilliant essay. He talks sensibly about many things, and explains why my intuition has been telling me all this time that things like SOAP and their ilk are just wrong, somehow.

For some reason, partway through reading it, I had a sense of remorse for wasted college years. I had an Asian roommate once who had a study group. At the time I thought it was "an Asian thing," and perhaps it was, something brought from a culture that can live so closely together (although Confucius, I think, had a lot to do with that). But when I saw "The Paper Chase," I gained another data point about study groups, and attending a meeting of the Silicon Valley Patterns Group I saw a lot of value in working together. Jeremy Meyer came out to Crested Butte and helped me put together the Annotations chapter for Thinking in Java 4e, and I'm sure we got much more done together than we could have separately. Synergy exists, and I want more of it.

Perhaps it's because I'm trying now to schedule a couple of Open Space conferences (this Summer in Crested Butte) to produce this kind of synergy that I think back on what I might have accomplished in college if I had taken the step of forming study groups. Instead of working in isolation, we could have at least discussed and struggled with the ideas presented in class. Learning could have been collaborative. But are we guided away from collaboration by the idea that we must "do our own work?" And for that matter, do most college study groups devolve into test-preparation sessions?

I still think it would have been beneficial. Hey, why isn't this something that professors try to help the students do? At least ask if people want to form study groups, and give them time and chalkboard space to do so? That would be free for the university, but I think it could greatly benefit the experience (yes, I know there are the TA-led sessions, but that's not the same. The study group puts the responsibility on the individuals, and is thus a much better learning experience).

If you're still a student, try forming a study group, and tell me how it works out.



One of the messages produced by Lint4J is this:

Don't hardcode newline characters, use System.getProperty("line.separator") instead.

I started to change these, then began wondering if it was really necessary. I thought "what do they do in the JDK?" I found System.getProperty("line.separator") called six times in the entire JDK, whereas \n was used over 1400 times.

I'm going to go out on a limb here and guess that, if System.getProperty("line.separator") was important at one time, something has changed so that \n is now safe to use, at least for building strings. Cross-platform input scanning might still need line.separator.
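To make the distinction concrete, here's a minimal sketch using nothing beyond the standard library (the class name is my own invention):

```java
// "\n" is a fixed character; line.separator is whatever the host platform
// uses ("\r\n" on Windows, "\n" on Unix). Class name invented for this sketch.
public class Separators {
    public static void main(String[] args) {
        String hardcoded = "one\ntwo";
        String sep = System.getProperty("line.separator");
        String platform = "one" + sep + "two";
        // Readers like BufferedReader.readLine() accept \n, \r, or \r\n,
        // which is part of why a hardcoded \n is usually harmless:
        System.out.println(hardcoded.split("\n").length);  // prints 2
        System.out.println(platform.startsWith("one"));    // prints true
    }
}
```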

Neither PMD nor FindBugs suggested this fix. In fact, it was quite interesting to see the differences in the kinds of things suggested by the three different tools; they were all pretty different. It seems like we could benefit from a common database of guidelines that any tool could use. Then the tools could compete on quality of implementation (if you've used these tools you know that there's generally a bit of weird messing around just to get them to run) and features like auto-fixing, rather than on their rule sets.


Thursday, January 27, 2005

Does OOP help?

From a reader:

...I myself have been in the process of questioning my beliefs with regards to programming. I discovered object-oriented programming in 1997 using first Turbo Pascal For Windows and then Java. When I look back now, I cannot help but wonder if the reason I was attracted to OOP was not so much because it was a better way to program, but simply because it was new. I had never seen it before and relished the challenge of mastering a new skill. Seven years later, however, I have seriously begun to question the entire OOP paradigm.

Does OOP really help us to write better programs? The problem with starting with that question is that it concentrates on the technology. I believe we should start with ourselves. What do we need to know to program well? What does a competent programmer look like? What is the best way(s) to think about programs? It is difficult to recall any discussions which have tackled these questions. Yet it seems that we need to answer them satisfactorily before we can then take the next step and actually decide what programming paradigms help us to program well.

Others have had doubts about OOP every now and then. Here's an opinion that was just published, for example. But all of that author's books are on machine code, so it seems doubtful that he knows OOP well enough to be able to give serious comment on it. People who are in the machine code world tend also to scoff at C.

OOP is an organizational tool, and yes, I've found that it -- especially when combined with design patterns, and a knowledge of OO design (disclaimer: things that I write and teach about, but I wouldn't if I wasn't fascinated with them and didn't believe in them) -- does help tremendously when organizing code and projects. But depending on where you're coming from -- like the machine-code guy -- it may not make sense right away, or be initially harder to use if you come to it with a particular mindset. I also started in electronic engineering, mostly focusing on chips and then adding more and more (assembly) code until the code started getting unruly, and that's what pushed me towards higher-level languages. But with that background, my brain tended to think about what the machine was doing rather than about what problem I was solving (that is, I was dwelling in the Solution Space rather than the Problem Space), and it definitely took time and struggle to get my head around ideas that others (especially those darn Smalltalkers) seemed to bandy about effortlessly. Indeed, it feels to me like only in the last few years, with the aid of languages like Python (and yes, even Java when compared to C++), that push me away from thinking about the low level stuff (rather than simply allowing me to think at a higher level if I put in the effort, as C++ does), have I really begun to be able to think naturally in Objects. And usually, when I start to believe that this is what I'm doing, something fundamental will pop up and slap me, such as this, which will make me realize that there's yet another element that I was either unaware of, or took so much for granted that I wasn't teaching it.

I suppose the best way to put it is that "you can learn the features of a language, but the way you use those features is an endless series of life lessons." Without State, or Strategy, or Factories, etc., polymorphism is indeed a bore. But with knowledge about interesting ways to assemble these features, the world becomes exciting and powerful.
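As a tiny illustration of what I mean, here's a Strategy sketch; all the names are invented for this example:

```java
// Minimal Strategy sketch: polymorphism gets interesting when you can
// plug in different behaviors at runtime. All names are invented here.
interface Compressor {
    String compress(String input);
}

class NullCompressor implements Compressor {
    public String compress(String input) { return input; }
}

class RunLengthCompressor implements Compressor {
    public String compress(String input) {
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < input.length(); ) {
            char c = input.charAt(i);
            int run = 0;
            while (i < input.length() && input.charAt(i) == c) { i++; run++; }
            out.append(c).append(run);
        }
        return out.toString();
    }
}

class Archiver {
    private Compressor compressor;  // the pluggable strategy
    Archiver(Compressor c) { compressor = c; }
    String store(String data) { return compressor.compress(data); }
}

public class StrategyDemo {
    public static void main(String[] args) {
        // Same Archiver code, different behavior, chosen at runtime:
        System.out.println(new Archiver(new RunLengthCompressor()).store("aaabbc"));
        // prints a3b2c1
    }
}
```

Without the ability to swap the Compressor, Archiver would be welded to one behavior; with it, polymorphism starts paying rent.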

Is OOP the end of the line for programming? Certainly not. It's just a building block on the path, just as procedural programming was. The procedural style of programming was clearly a good thing, and it gave programmers who were formerly chained by the limits of complexity of assembly language great leverage and the ability to accomplish far more with much less effort. But eventually it ran out of "reach," and OOP appeared. The reason these questions come up is not, I think, that OOP has "failed," but rather that we are beginning to see the edges of what OOP is capable of, even as the majority of folks are still coming up the learning curve of what has become thought of as the dominant, accepted paradigm.

Where does OOP fall short? Concurrency, certainly, which promises to become a major issue as multiprocessors become commonplace. Objects just don't seem to conform too well to concurrent programming; the only thing I've seen with any promise is the Active Objects design pattern, which basically serializes method calls to an object which is being driven by a single thread and queue (quite similar to most GUI programming systems such as Swing, but more formalized so that the programmer doesn't have to work as hard, which I think is key). Many have suggested functional languages, which tackle the problem from the other extreme by doing nothing that requires serialization. Personally I don't see the benefits of OOP being discarded in order to solve the concurrency problem; instead, I suspect that a hybrid of ideas will be incorporated into OOP.
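A rough sketch of the Active Objects idea, using the J2SE5 concurrency utilities (the class names are invented; this is not any particular library's API):

```java
// Method calls become tasks on a queue consumed by a single worker thread,
// so the object's state is only ever touched by that one thread.
// All names here are invented for illustration.
import java.util.concurrent.*;

class ActiveCounter {
    private int count = 0;  // only the worker thread touches this
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    public Future<Integer> increment() {
        // The caller gets a Future immediately; the work is serialized:
        return worker.submit(new Callable<Integer>() {
            public Integer call() { return ++count; }
        });
    }

    public void shutdown() { worker.shutdown(); }
}

public class ActiveObjectDemo {
    public static void main(String[] args) throws Exception {
        ActiveCounter c = new ActiveCounter();
        for (int i = 0; i < 100; i++)
            c.increment();  // fire-and-forget from many call sites
        System.out.println(c.increment().get());  // prints 101
        c.shutdown();
    }
}
```

No locks appear in ActiveCounter, because the single-threaded queue does the serializing for you.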

I also think that improvements could be made with object packaging systems. One of the great benefits of Objects is the ability to create and reuse libraries, which I do believe may be the greatest leverage we can get out of them. And yet the library packaging and reuse mechanisms are still, to my mind, in a kind of infancy. C and C++ librarians have always been a platform-specific arcana, and Java, while consistent across platforms, involves figuring out Jar files and the never-ending miasma of Classpaths. And as much as I know about Python, the library creation and installation process is still a mystery to me (every time I go to figure it out, it just seems like too much effort, which is a bad sign). This seems to be the place where all the language designers lose interest (perhaps Ruby will have something valuable to offer in this area). But it isn't something that requires a great new paradigm, but rather just some focus on making the packaging issue effortless for both the programmer and the end user (unlike Java, which imposes the nightmare of Classpaths on both the programmer and the customer).

Your questions are appropriate in that they place the puzzle in the realm of ourselves and our psychology rather than assuming that OOP is broken and/or we need some different technology. Indeed, I believe that the benefits of OOP come from the fact that it is closer to the way we think about problem solving than procedural programming is. It's also valuable to notice that parts of every problem involve procedural thinking, and to realize that the initial pitfall was in thinking that (therefore) everything was procedural. What we've discovered, instead, was that procedural thinking is a subset of the bigger picture, just as we are perhaps discovering that OOP is also a subset of a bigger picture, which involves other issues like concurrency. What we may discover is that we collect over our lifetimes a bag of approaches (kind of a Chain of Responsibility pattern), along with ways to know when to apply them. Perhaps our languages may come to resemble this structure, adding things like backward chaining and neural networks to the mix.

Your questions have also sparked another thought for me. I've been reading Harrison Owen's "Open Space Technology" book. I've already had several very successful experiences using Open Spaces, and have been thinking about holding a conference or two this summer based on that technique. But Owen emphasizes that for an Open Space conference to work, you must have an overarching question that establishes the theme of the conference. I realized that the previous Open Spaces I've been involved with have had that question, although it was usually implied and not explicit. Your questions, however, could in fact establish a theme for a conference (in fact, I was involved in something like this in the past, the Writing Better Code summit, but we did not use Open Spaces and I think it would have made all the difference if we had). It's also the kind of conference where everyone has valuable experiences, and sharing those experiences is all that is necessary to impart that value to other participants. And the product of such a conference (a Wiki of notes from the sessions) would be valuable to everyone.

So I propose a 3-day conference called "Building Better Software," which will use Open-Space technology and be held sometime this summer here in Crested Butte. Size will be limited to 70 participants, and cost will be $300.


Wednesday, January 26, 2005

Thinking in Java Seminars this Summer

I'm beginning to plan the summer seminars and events. I've had two recent experiences which have made me rethink the approach I've been taking for seminars. One was last Fall's seminar for Sandia Labs, where I was forced to throttle back from my usual attempt to fit too much into a given amount of time. The result was that everyone seemed to have a much better experience. We didn't cover as much, but everyone learned it much better, and therefore they learned more.

The second experience is the process of writing the fourth edition of Thinking in Java. This language, which was once hailed (admittedly, by the PR flaks at Sun itself) as being "much easier than C++," really isn't anymore. Sure, there are lots of improvements and your efforts are usually much better spent, but it's still both complicated and complex in many cases. It's become clear that an introductory course will only be able to use some basic generics. It would just torture people to try to give any depth to Annotations or Concurrency or any of the more sophisticated ideas that are developed in the book. Indeed, the book itself has been separating into an introductory portion followed by more advanced (albeit necessary) topics.

So we're really talking about two seminars here. The introductory one, for people who are new to Java, Objects and the like, which will also work for non-C programmers (previously I would assume you knew basic C syntax, and the Thinking in C multimedia seminar is being reworked in Flash for internet distribution, so that should help even if C is no longer required). And the intermediate one will cover things like Type Information, Generics, Collections in Depth, Concurrency, Annotations, Enumerations in Depth, GUIs, Discovering Problems, Introduction to Analysis and Design, and will introduce design patterns along the way (I've been adding more of these to TIJ4). This one will be useful even for experienced Java programmers who want to catch up to the new features of J2SE 5 or to get more depth on some of these issues.

My question concerns scheduling. I can imagine that there might be some people who would like to take both seminars back-to-back, but I could be completely wrong about that. Work schedules etc. might make 2 weeks off completely unreasonable, and even if someone wanted to take both seminars, they would want them separate so that it didn't keep them out of the loop at work for more than a week. On the other hand, Europeans, Australians and other overseas-travelers might really like a 2-week stint.

Please add comments as to your preferences. Thanks.

Monday, January 24, 2005

Java Brain Drain

I can't be the only one noticing this. Calvin Austin has just left Sun, following on the heels of Josh Bloch and Neal Gafter. These are not easily replaceable persons; these are leaders. The fact that they are all splitting within a few months of each other doesn't seem like coincidence. Something's going on. I've never had particularly good contacts within Sun -- anyone know the scoop?

Flash Animation on Generics

You have to see this, it's combined nerdism (the next Star Wars + Java Generics). See it before he gets sued by someone.

I would like to see an example of "Autoboxing leading to NullPointerException," as Yoda asserts at the end of the piece. Can anyone post a code fragment in the comments?
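Here's my guess at the kind of fragment he means; the map and key names are made up:

```java
// Unboxing a null Integer throws NullPointerException at the point of
// the hidden conversion. The map contents and names are invented.
import java.util.*;

public class AutoboxNPE {
    public static void main(String[] args) {
        Map<String, Integer> scores = new HashMap<String, Integer>();
        scores.put("known", 42);
        Integer boxed = scores.get("missing");  // returns null; no error yet
        try {
            int n = boxed;  // auto-unboxing calls boxed.intValue() on null
            System.out.println(n);
        } catch (NullPointerException e) {
            System.out.println("NPE from unboxing null");
        }
    }
}
```

The nasty part is that the failing line looks like an innocent assignment; the method call that throws is invisible.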

Saturday, January 22, 2005

OpenOffice: not yet

I occasionally fantasize about giving up Word and moving to OpenOffice, and someday even being able to live on a Linux box. But I push Word pretty hard and am able to do many interesting things with it, so unless OO gives me more and better, it won't happen. One of the appealing things about OO is that it stores files in a true XML format, so I could imagine writing Python programs to do transformations on my books. The last time I looked at the XML format that Word used, it seemed like it was some strange proprietary thing, or that it could only be stored and not recovered, or something else that was too much of a limitation.

I told my friend Gary about OpenOffice, and he promptly went and tried it, and reported this:

Unfortunately, OpenOffice is quite inferior to Word, et al. I just uninstalled it and returned to Word. OpenOffice is woefully slow; it makes Word look like a high flyin' text editor, and it's not smart about memory. So if you load more than a couple of big files (the biggest I tested was 100+ pages, 8mb), it gets confused on the memory swapping. Finally, it did some repagination on a document, which I suspect doesn't mean much, but it does make me suspicious enough, and I have to stay compatible. So thanks, but I must stick with those bad guys in Seattle. Fun checking it out though.

While OO is wonderful as a free system for most folks to look at small documents that aren't too complex, it sounds like I'll have to wait awhile before I can hope to use it for my book development system. I'll keep watching and hoping -- I think there are lots of things about Linux that will eventually coalesce and make it a very compelling system for the average user.

I've even thought of a Mac laptop -- that instant-on, instant-off thing is so compelling, especially on a laptop. But the fact that J2SE5 is not yet available on the Mac and there's no word on when it will be, that kind of kills it for me (especially after Jobs claimed he was going to make the Mac the premier platform for Java. Oops, please Apple don't sue me for saying something about you in a blog).

Tuesday, January 18, 2005

PHP !?!

Over the past few months I've had this impulse to look at PHP. Perhaps it came from seeing the ".php" extension in a lot of places. But usually I could just lie down for awhile and this impulse would pass. After all, PHP is the web language for the unwashed masses, right?

I just skimmed through the PHP tutorial, mostly looking at Objects, exceptions, that kind of thing. Classes in PHP5 have a syntax that's sort of a weird amalgam of Java, C++ (at least, the scope resolution operator), Python and Perl.

I discovered that I didn't run screaming in disgust from PHP. I didn't find it offputting; the fact that they lifted syntax from these other languages was not only reassuring (they didn't feel the need to invent everything from scratch, but instead stole proven syntax from elsewhere), but made me think that the time-to-productivity with this tool could be quite short because I know the syntax from elsewhere. The fact that PHP is designed to solve the web problem, and just the web problem (although I've heard that people have used it to create desktop applications), is also appealing. I'm sure I'll continue to run Zope for its basic functionality, and for things I can do with simple Python Scripts, but I just don't seem to be able to keep all the arcana necessary for Zope programming in my head, and I'm convinced only full-time Zope programmers have actually waded through the slings and arrows of the ill-kept Zope documentation and really know what's going on, because they use it on a day-to-day basis. I'm also interested in looking at Quixote, but as many have pointed out, no really standard Web system for Python has arisen; on the contrary, more seem to appear on a regular basis. And for my needs, I just want something that's a straightforward solution to creating interactive web pages, and I don't want to have to remember a completely different world view every time I have to go back and build new web stuff (this is why Quixote is intriguing -- it's basically Python with help).

The only thing that gave me a slight lurch in the PHP syntax is the Perlish $ in front of variables. But my concern was that it was going to be all Perl from then on, and that didn't happen. It seems to be a fairly isolated case, and in the absence of the rest of the Perl syntax it really isn't an issue.

So I'm intrigued, and will probably want to experiment with it. Once the Java book is done. Enough distractions, back to it.

Monday, January 17, 2005

No more getContentPane()!

I just discovered (and tested, on my codebase) that in J2SE5 Swing, you no longer have to get the content pane in order to add something to another component. This was always weird, because you could call add() and -- even though it was acceptable to the compiler -- nothing would happen; instead you had to call getContentPane().add(). (There's probably something in here about the failure of static type checking to solve the problem of calling the wrong method.)

At the same time (and this may have been the case for awhile; I might easily have missed a meeting about Swing), all the examples in the Swing Tutorial now have main()s that look like this:

public static void main(String[] args) {
    // Schedule a job for the event-dispatching thread:
    javax.swing.SwingUtilities.invokeLater(new Runnable() {
        public void run() { createAndShowGUI(); }
    });
}
That is, now you are not supposed to start up a Swing UI directly from main(), but instead you must add a task to the event queue.

After all these years, apparently someone discovered a race condition from starting it directly in main(). I wonder how many people were bitten by this? (And how many new fundamental issues like this will show up over time?) Well, Swing is a large, complex library that has bitten off a huge challenge, so there are bound to be issues like this. But after spending the last few days with SWT, that library seems more straightforward in a lot of ways, probably because it's trying to leverage the underlying OS rather than replace it. In general, SWT feels like it's trying to "just build GUIs" rather than "create a whole new world for GUIs." I suspect that each has its place depending on what you're trying to do, but I'd be tempted to use SWT unless there was some compelling need for Swing, if only to provide a more familiar experience for the end user. I've seen a number of designers say "we think the standard UI (usually Windows) is bad, so we've reinvented it," without discovering whether the end user wants their UI to be reinvented. The point of a standard UI is that you don't have to go up a learning curve every time you run a new application. People want every application to look roughly the same, because "figuring out your cool new UI" is not the problem they want to solve.

Sunday, January 16, 2005

So much for Eclipse 3.1 Compliance

As an experiment, I imported the current code base for Thinking in Java 4e into the latest Eclipse 3.1 beta, since everyone has been saying how very compliant it is to J2SE5. After fooling around a bit to get rid of classpath and package import issues, my code base, which builds without problem, reported 1021 errors in Eclipse. A lot of these errors came from other errors, and one of the show-stopper starting errors is this one:

The type List is not generic;
it cannot be parameterized with arguments <Character>

For the perfectly legitimate line:

List<Character> chars = new ArrayList<Character>();

It accepts the code if I change the line to:

ArrayList<Character> chars = new ArrayList<Character>();

I think it's safe to say that, with a bug that fundamental, it's going to be awhile before Eclipse 3.1 is J2SE5 compliant.

Too bad; the other parts of it looked promising.

Sunday, January 09, 2005

Thinking in Ruby ... not

People have been bugging me about Ruby again, suggesting I write "Thinking in Ruby." Here are some fragments and replies from one conversation. The writer is talking about available books:

The first is the "Pragmatic Programmers Guide to Ruby" ... This book is known as the "PickAxe Book" and is the standard reference most Rubyists keep at the ready... The online version covers Ruby version 1.6.7. A newer version of the book has been released recently and is updated for the current version of the language (1.8.2).

I know Dave and Andy, and we had a conversation about this -- like this reader, Andy said that he had tried Python and it never made sense to him. I think there is a set of people for whom that is true, and I'm starting to see that Ruby is a good alternative for them, since otherwise they'd probably just be stuck in Perl.

The next reference material would be "Why's Poignant Guide to Ruby". This, by far, is the most unconventional piece of reading I've ever found, but it has enough humor interjected to keep people reading. It's a bizarre piece of work, but you may find it to your liking.

Yes, I saw this when it was just coming out. This guy is brilliant, and I hope that someday he might translate his work to Python. In fact, his work is one reason that I wouldn't try to compete on an introductory level. The only thing that could happen would be (maybe a couple of years down the road) a "Thinking in Patterns with Ruby" but that would only be after the Java, Python, and probably C# versions.

The last of the "standard" tutorials that I would refer you to (of the ones I've read myself) would be the Ruby tutorial that comes in the standard install. A copy is available through the Ruby Online Documentation website:

Having another world class author that got fired up and started releasing books would be wonderful ( The thought of a 'Thinking in Ruby' book is just too much ... )

I just don't see it happening, other than (only possibly) the patterns book, which would not be introductory (and patterns would be the level at which I would be interested in exploring the language, anyway). I'm very happily ensconced in the Python community, and every encounter I have had with Ruby -- so far -- has not made me see the power in it.

I find that it has a less-than-elegant syntax because of its Perl influence (but if you like Perl, perhaps Ruby is the right fit for you). Again, to a Perl programmer this might be a great improvement. Look at one of the first examples from "Why's Poignant Guide," where he's asserting that Ruby is "the language of our thoughts":

5.times { print "Odelay!" }

Or this:

exit unless "restaurant".include? "aura"

This makes sense if you used to be a Smalltalk (or perhaps Forth) programmer, and I know one who started with Python and has moved to Ruby. It also makes sense if you grew up Pennsylvania Dutch, where they say things like "Throw Papa down the stairs his hat," and "Throw the horse over the fence some hay." (This theory has been confirmed by a Pennsylvania Dutch Ruby Programmer.)

Of course I only get bits and pieces, usually from people who don't know Python trying to convince me that Ruby is better, but so far as I've seen it doesn't have the more interesting features or libraries that Python has. There's only one programmer I know -- the aforementioned former Smalltalker, which I think was the major influence -- who might have had a really good understanding of Python (I don't really know if he did) and has moved to Ruby.

Nonetheless, someday when a seminar is being taught at a time when I am available (ideally one of Dave and Andy's), I will take it. I think that would be the best and easiest way to get a grasp on it. I find that learning new languages is always good for new insights into programming.

And it's certainly possible that I might like it. Hey, the first time I picked up a Python book (after two intense months with Perl had come crashing down upon discovering how lame references and objects were, which I suspect was a major impetus in the creation of Ruby), I threw it down, saying "Indentation to determine blocks? That's the most ridiculous thing I've ever seen!" I eventually picked up the book again. A couple of years later, I was responsible for that year's Python Conference T-shirt "Life's better without braces!" (cartoon of a smiling kid who had just gotten his braces off).

So who knows? But don't hold your breath; my plate is very full right now and Python is the most functional language I use, so at the moment it's sort of "when they pry my cold, dead fingers off my Python documentation."

But hey, if Ruby pushes the right buttons for you, great. It's probably the tool that will make you most productive right now, and that's what you should use. It doesn't really matter whether I am a fan (yet).

Thursday, January 06, 2005

Finding out what the customer really wants

IT Conversations has a great speech by Malcolm Gladwell (author of "The Tipping Point") about marketing and customer feedback, and how people don't tell you what they really want. This destroyed some of my preconceptions and really made me think. He starts by telling the story of the Aeron chair (apparently the most successful office chair in history) and how all the focus groups universally hated it.

I've long struggled with a variant of this problem: how to know that a public seminar will have a certain number of attendees before deciding to hold it. If I could do that I'll bet I could give a lot more seminars. I think it requires some kind of upside-down inversion-of-control thinking to solve the puzzle. Such as: "groups form around various ideas, and when they reach a certain number of committed members they can commission a seminar." A big problem is that it's not a casual commitment since it's a traveling-and-full-immersion experience, and people have date conflicts etc. But the upshot is that if there's a way that a group of people can form and say to me "we have this many people willing to pay this much for a seminar on one of these dates," we'd work something out. I would imagine that the early committers would get the benefit of lower prices, as incentive to commit, and those that came after the seminar was officially announced would have to pay more.

Sunday, January 02, 2005


I think my fascination in the past few years has been in productivity and how (sometimes very subtle) things can increase and decrease productivity. Also the psychology behind this -- how we can often cling to things we are convinced are productivity enhancers which are actually reducing productivity.

And is there any way to awaken ourselves when we get stuck? Or is it like premature optimization -- regardless of how often we say "first make it work, then make it fast" to ourselves, we will always be inexorably drawn into the trap of thinking "well, I could do it that way but I know it would be slower."