I never claimed C++ had any intention of doing pure OO. What I did claim is that C++'s struct+vtable approach has contributed to the decline of OO, because many people equate "objects" with struct+vtable and have completely missed the ideas of pure OO. Regardless of the successes of C++ and Java, this suggests that the paradigm behind OO (that is, the pure OO paradigm) has not succeeded as well as hoped. This sentiment is not mine alone; it comes up in the OOPSLA debate when the strong-typists are cited as undermining OO. The fact that the struct+vtable paradigm or the composite-oriented paradigm is out there says little about the OO paradigm, other than that they are taking mindshare from it.
As someone who uses Java for governmental research, I do not agree that it is "v. v. fast". It is much faster than non-compiled languages, I'll grant. But among compiled languages it is not as noteworthy as you make it out to be. Fortran and C stand out among compiled languages; Java is a middle runner, with much higher-level languages quickly closing the gap. The fact that Java is unwilling to adopt optimizations such as lightweight tuples, fast field read access, tail call elimination, or reduced per-object memory overhead leads me to suspect that those higher-level languages will continue to close the gap and will surpass Java rather soon. OCaml is already there, Clean is close behind, and Haskell continues improving in leaps and bounds. All three languages also greatly reduce the amount of programmer time required, which is generally the larger expense.
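To make the lightweight-tuples point concrete, here is a minimal Java sketch (my own illustration, with made-up class and method names): because Java has no tuple values, returning two results from a method forces you to define a wrapper class and pay for a heap allocation on every call, where a language with lightweight tuples would let you just write the pair directly.

```java
// Minimal sketch (my own illustration, not from the original post):
// returning two values in Java requires a named wrapper class, and every
// call allocates a fresh heap object (with header overhead) just to carry
// two ints back to the caller.
final class MinMax {
    final int min;
    final int max;
    MinMax(int min, int max) { this.min = min; this.max = max; }
}

public class TupleOverhead {
    // Each call allocates a MinMax object solely to return two results.
    static MinMax minMax(int[] xs) {
        int lo = xs[0], hi = xs[0];
        for (int x : xs) {
            if (x < lo) lo = x;
            if (x > hi) hi = x;
        }
        return new MinMax(lo, hi);
    }

    public static void main(String[] args) {
        int[] data = {3, 1, 4, 1, 5, 9, 2, 6};
        MinMax m = minMax(data);
        System.out.println(m.min + " " + m.max);
    }
}
```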
You claim that memory overheads are not important, but this is false propaganda. For research-grade programs, the amount of memory determines the size of the models you can run, which determines the adequacy of your results. If you don't have enough memory to use a good model, your results are worthless. If your model is too small to include all the data you've gathered, then all the expense of gathering that extra data is wasted. Swapping out to disk for lack of memory is not an option either, because disk latency dwarfs any language overhead (for compiled languages). The absolute amount of memory used by a language is very important indeed for large classes of programs.
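As a back-of-envelope illustration (my own numbers, assuming roughly 16 bytes of object header and padding plus an 8-byte reference on a typical 64-bit JVM; the exact figures vary by JVM and settings), storing 100 million three-component samples one object per sample roughly doubles the footprint compared to flat primitive arrays, and that can be the difference between a model that fits in RAM and one that doesn't.

```java
// Rough, back-of-envelope sketch (my own assumed numbers): how per-object
// overhead inflates the memory needed for 100 million 3-component samples.
public class ModelFootprint {
    static final long N = 100_000_000L;

    public static void main(String[] args) {
        // One object per sample: 3 doubles (24 bytes) + ~16 bytes of header
        // and padding, plus an 8-byte reference held in the containing array.
        long perObjectBytes = N * (24 + 16 + 8);

        // Structure-of-arrays: three primitive double[] arrays, negligible overhead.
        long flatArrayBytes = N * 3 * 8;

        System.out.printf("object-per-sample: ~%.1f GB%n", perObjectBytes / 1e9);
        System.out.printf("flat arrays:       ~%.1f GB%n", flatArrayBytes / 1e9);
    }
}
```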
The other big reason that memory matters is that garbage collection takes time. If a language creates too many objects, then it takes a lot of time to crawl through the heap figuring out what to keep and what to dispose of. The benchmarks game gives no indication of how much memory is consumed by per-object overhead versus object churn, but churn is disastrous for time performance just as per-object overhead is disastrous for the reasons discussed above. GCed languages are many orders of magnitude better than non-GCed languages in terms of programmer time, but that does not mean that GC should be taken lightly.
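Here is a small sketch of what I mean by churn (again my own illustration, nothing from the benchmarks game): the boxed version below allocates a short-lived object per element that the collector then has to trace and reclaim, while the primitive version allocates nothing inside the loop.

```java
// Sketch of object churn (my own illustration): sumBoxed creates one
// short-lived Double per element via autoboxing, generating GC work;
// sumPrimitive performs the same computation with zero allocations.
public class ChurnDemo {
    // Autoboxing allocates a new Double on every iteration: heavy churn.
    static double sumBoxed(double[] xs) {
        Double total = 0.0;
        for (double x : xs) {
            total = total + x;   // unbox, add, box a fresh Double
        }
        return total;
    }

    // Same loop on primitives: no allocation at all inside the loop.
    static double sumPrimitive(double[] xs) {
        double total = 0.0;
        for (double x : xs) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        double[] data = new double[10_000_000];
        java.util.Arrays.fill(data, 1.0);
        System.out.println(sumBoxed(data));
        System.out.println(sumPrimitive(data));
    }
}
```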