Java versus C
Java’s greatest strength was the marketing machine propelling it at launch. Around the time of its release, its promoters whipped up a frenzy amongst the technically inclined. Java was going to usurp all current popular languages because it would make programming easy and safe. You would only have to write code once, and your program would run on any platform. Vast libraries were going to be available.
Universities switched to teaching Java; I think even mine succumbed in the end. Marketing might made right. It must have seemed industry was poised to switch to Java, so they had better start teaching it to students.
I felt alone, or at least, in a small minority. The hype was seductive. Hardly anyone seemed to be evaluating the language on technical grounds.
The most amusing reminder of that atmosphere is JavaScript, so named for promotional purposes. The idea was to piggy-back on the fame of Java, which web browsers were scrambling to support so they could do amazing client-side feats. JavaScript was just icing on the cake, a toy for simple off-the-cuff special effects even beginners could use.
Of course, today JavaScript is the language used for amazing client-side feats; Java-intensive sites are rare sights. An inevitable outcome. Despite its many flaws, JavaScript is lighter and easier to develop with than Java.
If a 2003 memo is genuine, Sun’s engineers recommended avoiding Java on their own systems.
Language flaws aside, Java should be shunned on the web for security reasons.
- Apr 2010: A Java plug-in exploit lets malicious websites launch arbitrary applications on Windows, and possibly other platforms. Accordingly, Mozilla disabled Java in Firefox.
- Oct 2010: A wave of Java malware exploits.
- Jan 2011: Parsing certain floating-point values can cause Java to hang, and apparently it’s been this way for about 10 years.
- Sep 2011: Firefox disables Java because of security flaws.
- Jan 2012: Security (and other) issues.
- Feb 2013: 50+ security issues patched… but wait, there’s more!
- Aug 2014: A meta-security issue: Java programs may be killed off for non-technical reasons.
- Nov 2015: A Java deserialization vulnerability.
All or nothing
During these times, I was an Eiffel zealot, so I instantly saw flaws in Java because its object model was less pure. For example, there was a clash between basic types and classes, resulting in types like int being shadowed by classes like Integer. Shouldn’t everything be an object in an object-oriented language? Autoboxing has since papered over the clash, but how dare they omit it and still claim Java did objects right.
I was also a fan of multiple inheritance so I faulted Java for its single inheritance. Today, I’m a fan of zero inheritance so I still fault Java!
Less facetiously, I have always believed that objects cannot be neatly pigeonholed into one category. Thus when I was an objects cultist, of course I insisted on multiple inheritance. To simulate this with Java, one can inherit from one class and "inherit" the others via interfaces. This raises the question: why not be consistent and always use interfaces? Why force ourselves to designate one as the true superclass?
Let us pursue this train of thought. When there are several "is-a" classes, the most consistent scheme is to use interfaces exclusively. One may reserve standard inheritance for classes with exactly one parent, but this is inconsistent precisely because we are singling out the case when a class has exactly one parent. In other words, we should use interfaces all the time, even when there is only one parent. So why do we need an object-oriented language? I can already maintain a bunch of related function pointers in C.
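To make that concrete, here is a minimal sketch (names invented purely for illustration) of an "interface" in plain C: a struct of related function pointers, with client code written against the struct rather than against any particular implementation.

#include <stdio.h>

/* The "interface": a bundle of related function pointers. */
struct shape_ops {
  double (*area)(const void *self);
  void (*print)(const void *self);
};

/* One concrete implementation. */
struct circle { double r; };

static double circle_area(const void *self) {
  const struct circle *c = self;
  return 3.14159265358979 * c->r * c->r;
}

static void circle_print(const void *self) {
  printf("circle of radius %g\n", ((const struct circle *)self)->r);
}

/* One table per implementation; every circle shares it. */
static const struct shape_ops circle_ops = { circle_area, circle_print };

/* Client code sees only the interface. */
static void describe(const struct shape_ops *ops, const void *obj) {
  ops->print(obj);
  printf("area: %g\n", ops->area(obj));
}

int main(void) {
  struct circle c = { 2.0 };
  describe(&circle_ops, &c);
  return 0;
}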
Interestingly, the chief designer of Java questions the wisdom of inheritance, and ponders whether an interface-only approach is practical. (Spoiler: it is.)
Reliably unreliable
Other problems lasted much longer. For example, a succession of widget libraries continued to produce ugly interfaces, underscoring the language’s immaturity, and directly conflicting with what marketers were saying. Great-looking Java applications are possible today, but again, how dare they make that claim in the beginning.
The libraries that purportedly made life easier for programmers are another example. Take the Thread class: it contains deprecated stop(), resume() and suspend() methods. By 1995, many thread libraries had been released, and furthermore, researchers had amassed a substantial body of theory on threading, so how did Java manage to get it wrong? If only they had diverted a few resources from marketing to engineering.
Here’s another anecdote from Java’s earlier years. In an ACM programming contest, one question involved computing the number of days between two given dates. A rival team tried to use a Java library call for this. A seemingly wise choice, as they can then avoid writing fiddly leap-year computation routines. Unfortunately for them, the Java library was buggy so they wound up having to solve the problem the hard way after being penalized for an incorrect submission.
The libraries are in better shape today, but Java was unreliable for years after its initial release.
Virtual insanity
In those days, a researcher from Bell Labs happened to be on sabbatical at my university, so I was exposed to Inferno, an operating system where applications were typically compiled to bytecode and run on a virtual machine. I witnessed its speed first-hand. While the most trivial of my Java applications would crawl, the Inferno applications felt as if they were running native machine code.
I later discovered Java’s sluggishness wasn’t because early JVM implementations had inefficiencies that would soon be ironed out. It was because the JVM is stack-based. Inferno’s virtual machine is register-based, so it translates efficiently to native machine code. Stack-based machines, though simpler, are hard to simulate quickly.
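A toy illustration of the difference, using made-up opcodes rather than real JVM or Inferno instructions: the stack machine shuffles every operand through an in-memory stack, while the register machine works on named slots that a translator can map straight onto CPU registers.

#include <stdio.h>

/* Made-up opcodes for two toy machines; not real JVM or Inferno bytecode. */
enum { PUSH, ADD_S, HALT_S };   /* stack style    */
enum { LOADI, ADD_R, HALT_R };  /* register style */

/* Stack machine: every operand passes through an in-memory stack. */
static int run_stack(const int *code) {
  int stack[16], sp = 0;
  for (;;) switch (*code++) {
  case PUSH:   stack[sp++] = *code++;            break;
  case ADD_S:  sp--; stack[sp - 1] += stack[sp]; break;
  case HALT_S: return stack[sp - 1];
  }
}

/* Register machine: operands sit in named slots that a translator
 * can map directly onto CPU registers. */
static int run_reg(const int *code) {
  int r[16];
  for (;;) switch (*code++) {
  case LOADI:  { int d = *code++; r[d] = *code++; } break;
  case ADD_R:  { int d = *code++, a = *code++, b = *code++; r[d] = r[a] + r[b]; } break;
  case HALT_R: return r[*code];
  }
}

int main(void) {
  int scode[] = { PUSH, 2, PUSH, 3, ADD_S, HALT_S };
  int rcode[] = { LOADI, 1, 2, LOADI, 2, 3, ADD_R, 0, 1, 2, HALT_R, 0 };
  printf("%d %d\n", run_stack(scode), run_reg(rcode));  /* prints "5 5" */
  return 0;
}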
I was flabbergasted. Why were they pushing a language that had the performance of a scripting language yet was more pedantic than a typical compiled language? Surely nobody would fall for that?
But it must have been too late to make changes. The forces of Java ploughed ahead, and eventually resorted to on-the-fly compilation (except they called it just-in-time compilation). By doing so, they sacrificed simplicity, removing the reason for choosing a stack-based VM in the first place.
Dalvik is an interesting wrinkle. It is a register-based VM for Java, though only supported on a handful of platforms.
For me, "Write Once, Run Anywhere" is more suitable for C than Java. I encounter platforms with C compilers more frequently than platforms with JVMs. It’s easier to write a C compiler than a JVM with JIT compilation.
Bad blood
I still harbour resentment towards Java. While I accept that marketing a language relentlessly is a valid strategy for making it successful, I feel it is a disingenuous practice that hampers progress. When evaluating a language, more weight should be given to its technical merits than to its perceived popularity.
I also worry that those who primarily learn Java are ignorant of low-level details, such as assembly and machine code. While thinking at a high level is often good, only thinking at a high level is bad. See also this interview with a former computer science professor at New York University.
A related concern affecting professionals is that details are so numerous and well-hidden that thinking at a high level becomes our only option. Or as "The Practice of Programming" puts it: the pile of system-supplied code gets so big that one no longer knows what’s going on underneath.
I was not present when C was unleashed upon the world, but I doubt there was much fanfare. It worked well even in its first days. It is a smaller, simpler system: for example, bugs in libraries, compilers and CPUs are relatively easy to identify and isolate. Imagine doing the same for Java. I suspect C mostly spread by word of mouth (and keyboard) via impressed engineers, the way a good honest language should.
Dysfunctional programming
Functions are second-class citizens. For the equivalent of function pointers, one has to define an abstract class with a virtual method. Closures are similar. This annoying overhead, though small, has tangible effects: sometimes I use an inferior design because I tire of making one-off classes for function pointers.
For instance, the following GNU C code enumerates all the divisors of a number given its prime factors and their multiplicities, calling a given function for each divisor found:
void forall_div(void (*fun)(int div), int k, int *prime, int *mult) {
  void f(int div, int i) {
    if (k == i) fun(div);
    else for(int j = 0; j <= mult[i]; j++) {
      f(div, i + 1);
      div *= prime[i];
    }
  }
  f(1, 0);
}
How do I get similar code in Java? The best I’ve seen is to define an inline anonymous class implementing an interface such as Runnable. Function calls become ugly because of .run() warts, and I must ensure captured variables are declared final. It’s as fun as doing taxes.
The overhead is enough to discourage me. I’d duplicate code, use an iterative algorithm or better yet avoid Java. It makes no sense: why discourage short elegant routines?
"Execution in the Kingdom of Nouns" discusses these problems with an amusing parable.
One could take the opposite extreme and consider a language where only functions are first-class. Unlambda is deliberately unwieldy, but even a much friendlier version would feel strange. Languages are best when they recognize the importance of both verbs and nouns.
Like C++, Java has been making amends for past excesses: recent versions support lambdas.
Language issues
The goto statement is banned for religious reasons, despite legitimate use cases. (Then again, since Java is so slow, perhaps saving a dispatch when implementing a state machine is worth very little.) Exceptions are encouraged, which I find contradictory since they are even trickier than goto.
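For instance, here is a contrived sketch of the sort of legitimate use I mean: a tiny state machine where each state jumps directly to the next with goto, instead of looping around a switch on a state variable.

#include <stdio.h>

/* A contrived word counter: two states, and goto jumps straight from
 * one state to the next instead of returning to a dispatch loop. */
static int count_words(const char *s) {
  int n = 0;

between:                        /* state: outside a word */
  switch (*s++) {
  case '\0': return n;
  case ' ': case '\t': case '\n': goto between;
  default: n++; goto inword;
  }

inword:                         /* state: inside a word */
  switch (*s++) {
  case '\0': return n;
  case ' ': case '\t': case '\n': goto between;
  default: goto inword;
  }
}

int main(void) {
  printf("%d\n", count_words("  the quick  brown fox "));  /* prints 4 */
  return 0;
}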
The synchronized threading feature is difficult to use. I’m mystified they did not choose a human-friendly threading model such as Hoare’s CSP.
Verbosity appears to be a virtue. Code is hard to fit on 80-character lines. They say Java is the new COBOL. Is it because keywords are lengthy? Or that we need so many of them? Or that library classes and methods have long names? Or all of the above?
Literal arrays are unsupported.
The designers pandered to the C++ crowd, perhaps to foster migration. As a result, Java possesses some of the same flaws, such as idiosyncratic syntax, inheritance, and mixing of interface and implementation.
In C, we can substitute a different implementation for the same interface at link time. For example, I often replace the default memory allocation routines with faster versions by linking with TCMalloc instead.
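The mechanism could hardly be simpler. A toy version with invented file names: declare the interface in a header, implement it in separate source files, and pick one on the link line.

/* logger.h: the interface */
void log_msg(const char *msg);

/* logger_stdout.c: one implementation */
#include <stdio.h>
#include "logger.h"
void log_msg(const char *msg) { puts(msg); }

/* logger_null.c: a drop-in replacement */
#include "logger.h"
void log_msg(const char *msg) { (void)msg; }

/* main.c: written against the interface only */
#include "logger.h"
int main(void) { log_msg("hello"); return 0; }

/* Pick the implementation when linking; main.c never changes:
 *   cc main.c logger_stdout.c    # chatty build
 *   cc main.c logger_null.c      # silent build
 */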
Each Java class compiles separately to bytecode so we could still substitute alternative implementations without recompiling all dependent code. However, at least some Java programmers prefer to invent yet more machinery such as dependency injection.
The package mechanism is a bright spot, a much better system than include files.
Garbage collection is nice, but hardly an innovation. And it was initially a hindrance because the collector used to bring the system to a crawl. To renowned programmer Jamie Zawinski, garbage collection is a killer feature, but he also states he’s “back to hacking in C, since it’s still the only way to ship portable programs”. C is the desert island language.