
On 17/10/11 23:25, Russell Coker wrote:
On Mon, 17 Oct 2011, Chris Samuel <chris@csamuel.org> wrote:
As for RAM - well, Java is not known for its light memory footprint; there's one Java application run at VLSCI that can fail if it gets less than 30GB or 60GB of RAM, but we believe that's because it's doing stupid things with its file I/O. :-(
There are also Java implementations on smart cards with RAM measured in kilobytes.
RAM use is determined by what the application does and also by the garbage collection system. People often tune their JVM to run GC less frequently, saving CPU time at the cost of using more RAM.
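The kind of tuning described above is usually done with HotSpot command-line flags rather than anything exotic; a minimal sketch (the flag names are standard HotSpot options, but the values and `app.jar` are just illustrative):

```shell
# Trade RAM for CPU: a larger heap means GC runs less often.
java -Xms512m -Xmx4g \        # initial and maximum heap size
     -XX:+UseParallelGC \     # throughput collector: fewer, longer GC pauses
     -XX:NewRatio=2 \         # old-to-young generation size ratio
     -jar app.jar             # app.jar is a placeholder for your application
```

Shrinking -Xmx does the opposite: the collector runs more often, burning CPU to keep the footprint small.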
One thing that's worth noting is that in many environments Java code is written by people who don't know much about how computers work at a low level (they are the people who can't write a C program that doesn't SEGV regularly), so they tend not to be efficient with memory use. I think that Java gets a bad reputation because of this.
Agreed. For instance, my mobile phone runs what is essentially a tuned version of Java (well, Dalvik via Apache Harmony), and it runs many simultaneous processes, quite fast, with each process using only a dozen or so megabytes of RAM. Likewise, I did some experimentation with OpenJDK and was able to run some small Java or Scala apps in a dozen or so megabytes of RAM as well. However, the default Java implementation on 64-bit machines assumes that it can grab a gigabyte or so up front, and generally does so! :/ You can set some environment variables in your .bashrc to fix that, though. Or just turn Linux's memory overcommit feature up to infinite, although I don't particularly like that method.

-Toby
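For what it's worth, a sketch of both approaches mentioned above (the environment variable names are real ones HotSpot reads; the heap sizes are just example values):

```shell
# In ~/.bashrc - cap the default heap for every JVM started from this shell.
# JAVA_TOOL_OPTIONS is the standard variable; _JAVA_OPTIONS also works on HotSpot.
export JAVA_TOOL_OPTIONS="-Xms16m -Xmx64m"

# The overcommit alternative (as root) - lets the JVM reserve address space
# it may never touch:
#   echo 1 > /proc/sys/vm/overcommit_memory   # 1 = always overcommit
```

Note the big up-front grab is mostly reserved virtual address space, not resident RAM, which is why the overcommit route works at all.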