Posted By: Anonymous
I am getting this error in a program that creates several hundred thousand HashMap objects, each with a few (15-20) text entries. All of these Strings have to be collected (without breaking them up into smaller batches) before being submitted to a database.
According to Sun, the error happens “if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown”.
Apparently, one could use the command line to pass arguments to the JVM for
- Increasing the heap size, via “-Xmx1024m” (or more), or
- Disabling the error check altogether, via “-XX:-UseGCOverheadLimit”.
The first approach works fine; the second only ends up in another java.lang.OutOfMemoryError, this time about the heap.
So, the question: is there any programmatic alternative to this, for the particular use case (i.e., many small HashMap objects)? If I use the HashMap clear() method, for instance, the problem goes away, but so does the data stored in the HashMaps! 🙂
The issue is also discussed in a related topic on StackOverflow.
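For concreteness, the allocation pattern described in the question looks roughly like this sketch (scaled down to 1,000 maps; submitAll is a hypothetical stand-in for the actual database submission):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AccumulateMaps {
    // Hypothetical stand-in for the real database submission.
    static void submitAll(List<Map<String, String>> all) { /* ... */ }

    public static void main(String[] args) {
        List<Map<String, String>> all = new ArrayList<>();
        // In the real program this loop runs hundreds of thousands of times,
        // so every map stays reachable until the single final submit.
        for (int i = 0; i < 1000; i++) {
            Map<String, String> m = new HashMap<>();
            for (int j = 0; j < 15; j++) {
                m.put("key" + j, "value-" + i + "-" + j);
            }
            all.add(m);
        }
        submitAll(all); // only after this point could the maps become garbage
        System.out.println(all.size());
    }
}
```

Because nothing is released until the end, the garbage collector finds almost nothing to reclaim while the heap fills, which is exactly the condition the “GC overhead limit exceeded” check detects.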
You’re essentially running out of memory to run the process smoothly. Options that come to mind:
- Specify more memory like you mentioned; try something in between, such as “-Xmx512m”, first
- Work with smaller batches of HashMap objects to process at once, if possible
- If you have a lot of duplicate strings, use String.intern() on them before putting them into the HashMap
- Use the HashMap(int initialCapacity, float loadFactor) constructor to tune for your case
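As a sketch, the last three suggestions can be combined: process the maps in batches, intern duplicate strings, and size each map up front. BATCH_SIZE and submitBatch are illustrative assumptions here, not part of the original answer:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TunedBatches {
    static int batchesSubmitted = 0;

    // Hypothetical stand-in for the real database call.
    static void submitBatch(List<Map<String, String>> batch) {
        batchesSubmitted++;
    }

    public static void main(String[] args) {
        final int BATCH_SIZE = 1000; // illustrative; tune to the available heap
        List<Map<String, String>> batch = new ArrayList<>(BATCH_SIZE);

        for (int i = 0; i < 5500; i++) {
            // 15-20 entries fit in an initial capacity of 32 without rehashing
            // (load factor 0.75 gives a resize threshold of 24 entries)
            Map<String, String> record = new HashMap<>(32, 0.75f);
            for (int j = 0; j < 15; j++) {
                // intern() lets duplicate key strings share a single instance
                record.put(("key" + j).intern(), "value-" + i + "-" + j);
            }
            batch.add(record);

            if (batch.size() == BATCH_SIZE) {
                submitBatch(batch);
                batch.clear(); // submitted maps become unreachable and collectable
            }
        }
        if (!batch.isEmpty()) {
            submitBatch(batch); // flush the final partial batch
        }
        System.out.println(batchesSubmitted);
    }
}
```

Clearing each batch after submission is what makes the difference: the garbage collector can then actually reclaim the maps, instead of scanning a heap where everything is still live.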