I used to drink the kool-aid, and it had a nice taste, but the more time passes the more I find myself agreeing with Bart, my mentor of old: objects are a big pile of fail. The Rubyists and the Pythonistas are coming now, with their pitchforks and baling wire. But they need not worry; they will be the last against the wall. But to the wall they still will go.
Let us start off by all agreeing that C++ is an abomination. Now, I know most of you will agree with me on that count, but chances are a few renegade cells are still hanging on (masochists, if they put up with me). If somehow the vastness of the FQA doesn't convince you, even Amazon knows C++ is the dumbest language on earth. One of C++'s greatest claims to fame is that it managed to convince everyone that an "object" is a struct with a vtable. Do you hear that sound? That screaming comes from all the stillborn projects where a dynamic message-passing/actor model of OOP would have saved years of work and millions of dollars. And rather than fix this broken model, C++ layered feature upon feature, each more broken than the last, in an attempt to cover up the smell.
The single greatest failing of objects can be traced back to this failure to even embrace the object-oriented perspective in the first place. If the original idea (or the original post-Simula idea) had been embraced instead of being bludgeoned in a back alley by a PDP-11 assembler, then the OOP paradigm might have stood a chance. It might have had a chance because it's a paradigm, an ideal of organization and style, not a bundle of language features. Language features help, don't get me wrong, but Perl has all you need, as does Haskell, as does C if you do the name mangling yourself. But without people harkening back to the elder days of CLOS and Smalltalk, they will never know what they lost.
But C++ did have one thing going for it: People Hate C++. People hated it so much they invented Java to begin hammering the final nails into the coffin. Java sucks, but Java sucks like a pre-frontal lobotomy. If you hate C++ you have verve and determination in your hatred. No one seems to be able to give enough of a shit to really work up a good hate for Java. As Jamie Zawinski says, at least Java doesn't have free(); but that can be said for any language invented in the last 28 years.
But there are good reasons to work up a nice heartwarming hatred for Java. For one, explain Java's memory model to me. No, seriously. Perl uses five times less memory across the board, and it's interpreted. Ruby? Python? SWI-Prolog? All of them use three to five times less memory. Prolog, ferchrissakes! And don't even get me started on Haskell, Clean, OCaml, or Eiffel. Let's face it, Java's memory model sucks. And even with the JIT it's only marginally faster than these compiled languages.
One of the places Java's memory model is broken has to do with its single-minded view of objects: that everything is one (except when it isn't) and that all objects live on the heap. Oh, where to even begin. Let's just say that adding lightweight tuples ("records" or "structs" to you imperative folks) as primitives to the language would increase performance by 20-50%.
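To make the complaint concrete, here is a minimal sketch (class and method names are mine, not from any library) of the kind of workaround Java programmers resort to in the absence of lightweight tuples: packing two ints into a single long so that a pair of values never touches the heap at all.

```java
public class PackedPair {
    // Pack x into the high 32 bits and y into the low 32 bits of a long.
    // No object header, no heap allocation, no pointer chase.
    static long pack(int x, int y) {
        return ((long) x << 32) | (y & 0xFFFFFFFFL);
    }

    // Arithmetic right shift recovers the (signed) first component.
    static int first(long p)  { return (int) (p >> 32); }

    // Truncation recovers the (signed) second component.
    static int second(long p) { return (int) p; }

    public static void main(String[] args) {
        long p = pack(42, -7);
        System.out.println(first(p));  // 42
        System.out.println(second(p)); // -7
    }
}
```

This is exactly the sort of contortion a primitive tuple type would make unnecessary; notably, Java did eventually grow records (JDK 16), though those still live on the heap.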
While we're harping about performance, consider that setter and getter methods take 300 times longer than direct field access. Not percent. Times. Why is this not optimized away? Who knows, but it probably has something to do with the Java security model, which means you can't do tail-call elimination or inlining. Of course, you can; in fact you can do it and still have a stack-inspection security model, just not the model Java uses. It's funny that a language so concerned with security offers no efficient mechanism for giving read-only access to fields, nor any deep read-only view of objects. Of course the whole Java ideology revolves around refactoring into small functions; it's a shame the function-call overhead is so great. Want to know another hidden performance hole? Iterators. Don't believe me? Try writing an iterator for arrays instead of using the C-style for-loop. Want to know another one? Strings. Never, ever use + on strings. Heck, as Jamie Zawinski says, you'd be much better off just using char if you care about memory; and we all know how well that worked out for C.
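A hedged illustration of the strings point: repeated + on a String copies the entire accumulated string on every iteration (quadratic overall), while a StringBuilder appends into a growable buffer (linear). The class and method names here are mine; both functions produce the same string, just at very different costs.

```java
public class StringConcat {
    // Quadratic: each + allocates a new String and copies everything so far.
    static String viaPlus(int n) {
        String s = "";
        for (int i = 0; i < n; i++) {
            s = s + "x";
        }
        return s;
    }

    // Linear: StringBuilder appends into a growable internal buffer.
    static String viaBuilder(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("x");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(viaPlus(5).equals(viaBuilder(5))); // true
    }
}
```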
But we're not here just to talk about the failure of C++ to have a defined semantics, or the failure of Java to make the bare minimum of optimizations. We're here to talk about the failure of OOP. Java inherited C++'s struct-based model, so it's barely salvageable in the first place. The fact that it's an IDE-only language with too much boilerplate to be readable by human eyes is beside the point. Both of these languages have failed on a catastrophic level to manage the complexity inherent in large system design. But wasn't OOP supposed to be the silver bullet, with modularity, encapsulation, and reuse? Well, that's the point. If OOP was supposed to be so good at those things, then why are we still here? Both of these languages enshrine the public conception of OOP, and both provide too low-level a view to be useful for any kind of reuse. As was said in the OOPSLA debate, they are languages for creating bricks, but bricks do not a city make.
The struct-based notion of objects undermines the very principles of OOP. The problem is that these languages fail to decouple the implementation (the struct in memory and the bytes of code) from the interface that implementation represents (the methods and semantic behavior of the object). Semantically, it is often the case that a subtype is more specified and so has less freedom than its supertype. If we have pairs (X,Y) as the supertype, then (42,Y) is a subtype of it. And yet, with the implementation-inheriting struct model we are forced to have objects be of monotonically non-decreasing size as we travel down the type hierarchy. That is, even though X is always 42 in the subtype, objects of that subtype are forced to be at least as large as objects of their supertype. Languages like Python and Perl just use dictionaries to encode an object's fields, so they don't suffer from this struct problem, though many have leveled complaints against such free-wheeling approaches. Smalltalk had it right in that you could just gut the inheritance tree and claim to be any type you want, so long as you have the methods.
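A minimal sketch of the monotonic-size point in Java terms (the Pair and FortyTwoPair names are hypothetical): even though the subtype's first component is semantically a constant, the struct model forces every instance to carry the inherited field anyway.

```java
// Supertype: a pair of two ints, laid out as a struct in memory.
class Pair {
    int x;
    int y;
    Pair(int x, int y) { this.x = x; this.y = y; }
}

// Subtype: x is semantically always 42, yet the x slot is still
// allocated in every instance; the subtype cannot shrink below
// its supertype's layout.
class FortyTwoPair extends Pair {
    FortyTwoPair(int y) { super(42, y); }
}

public class SubtypeSize {
    public static void main(String[] args) {
        Pair p = new FortyTwoPair(7);
        System.out.println(p.x); // always 42, but stored per object anyway
    }
}
```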
So here's another problem, one that'll knock the smug look off the Pythonistas' faces. If you've ever done a serious project in an OO language then you'll almost certainly have run into the fragile base class problem. The most glaring example of this problem is the idea of having an Object class from which all other classes inherit. It's okay, Python, all the other kids were doing it; I don't blame you. Java has one of the most monumental examples of why this approach is entirely wrong: it's called the Cloneable interface. The idea of a singly rooted inheritance tree is the second most blatant failure undermining the very notion of OOP. If "everything is an object" then you simply design the language so that everything is an object. Your type system says that everything is an object, with whatever that entails. Full stop. Inheritance is a nice little thing, but it is not the be-all and end-all of language design. What makes something an object is not a collection of methods that every value in the language supports; in dynamic languages you can remove those methods anyway. Being an object means simply: being. An object. Trying to have a single class that everything inherits from introduces a bottleneck into the system which guarantees you will be bitten by the fragile base class problem. Smalltalk at least had the good sense to push the singly rooted hierarchy all the way up to MetaObject.
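The Cloneable complaint is easy to check for yourself: Cloneable declares no methods at all, and clone() actually lives on Object as a protected method, so implementing the interface by itself exposes no public clone(). A small demonstration (the class name is mine):

```java
public class CloneableDemo {
    public static void main(String[] args) throws Exception {
        // Cloneable is a bare marker interface: it declares zero methods.
        System.out.println(Cloneable.class.getDeclaredMethods().length); // 0

        // clone() is declared on Object, and it is protected there, so
        // "implements Cloneable" alone gives callers nothing to call.
        int mods = Object.class.getDeclaredMethod("clone").getModifiers();
        System.out.println(java.lang.reflect.Modifier.isProtected(mods)); // true
    }
}
```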
Anyone who's been following the development of Java over the last decade will begin to notice a pattern compared to C++. And as any good OO developer knows, when you see a pattern you should abstract it out. Only the OO paradigm does not give you the versatility to do so; instead you end up with "design patterns". But when 16 out of 23 design patterns are invisible or simpler in a functional language, that makes you wonder whether these patterns are really missing language features. And despite all these design patterns, you also end up with a whole lot of "features". Just consider Java's "simple" OO system: member classes, anonymous classes, local classes, and nested top-level classes. Oh, and language-wide support for monitors. Don't forget the final keyword, which means as many things as static does; oh, and don't forget your annotations. Don't get me wrong, I like a fully featured language, but these are not it. These languages are designed to make the programmer feel smart for memorizing reams of worthless minutiae. Whatever happened to the concept of a small number of powerful, orthogonal features?
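As a hedged illustration of a pattern dissolving into a language feature, here is the Strategy pattern written classically and then as a plain function value, using the first-class functions Java itself eventually grew in Java 8 (the example names are mine):

```java
import java.util.function.IntUnaryOperator;

public class StrategyDemo {
    // Classic Strategy: an interface plus a named class per behavior,
    // an object allocated just to carry one method around.
    interface TaxStrategy { int apply(int price); }
    static class FlatTax implements TaxStrategy {
        public int apply(int price) { return price + 10; }
    }

    static int checkoutClassic(int price, TaxStrategy s) {
        return s.apply(price);
    }

    // With first-class functions the "pattern" disappears:
    // a strategy is just a value of function type.
    static int checkout(int price, IntUnaryOperator tax) {
        return tax.applyAsInt(price);
    }

    public static void main(String[] args) {
        System.out.println(checkoutClassic(100, new FlatTax())); // 110
        System.out.println(checkout(100, p -> p + 10));          // 110
    }
}
```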
So back to the OOPSLA debate, because they said it all six years ago. The future of programming lies in organic programs that can fluidly evolve and heal themselves rather than failing hard. Object-oriented programming, much as I enjoyed its promises a decade ago, cannot cope with such requirements. The OOP peddled by C++ and Java, like so much of computer science, is trapped within the shackles of modernism and the false idol of the Ur-narrative. The OOP peddled by dynamic languages like Perl, Python, Ruby, and Squeak is just as confined by postmodernism and pomo's failed rebuttal of modernism. Pomo is dead, and it has been dead for years.
Modernism believes in a world where there is only a single variety of self, the monoculture. Postmodernism exploded this notion and offers a multiplicity of self, but it is just as self-centered as modernism ever was. The post-postmodernism is performativity, which holds that there is no self, there is only the enactment of a self, and since our actions are ever-changing so too is the self which is constructed by those deeds. It is our very actions which define a state of being, it is our interactions with others which defines our existence. Truth comes from the world, not from within. In the last five years performativity has moved from a sideline philosophy and has now infiltrated most of the social sciences. And so too will it be with computer science. The programs of the future will not be about the program itself, they will be about the programs' ability to interact with other programs and to interact with the world. A computer in isolation is meaningless. Our meaning, our content comes over the wires when we connect with one another. We do not play games alone anymore. We do not play them from behind computer screens. The world has infiltrated our electronic spaces and the programs of the future will exist not in computers but in that world itself. The future is event-driven, interactive, it is a dialectic about the Other, about what is outside of the Self, it is a search for consensus. The future is not about the programs that exist, it is not a time, it is a process, it is about forever becoming.