Asked in Java by (2.6k points)

I am getting a java.lang.OutOfMemoryError: GC overhead limit exceeded in a program that creates several hundred thousand HashMap objects, each with a few (15-20) text entries. These Strings all have to be collected (without breaking them up into smaller batches) before being submitted to a database.

According to Sun, the error is thrown "if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown."

Apparently, one could use the command line to pass arguments to the JVM for

  • Increasing the heap size, via "-Xmx1024m" (or more), or
  • Disabling the error check altogether, via "-XX:-UseGCOverheadLimit".

The first approach works fine; the second just defers the failure to another java.lang.OutOfMemoryError, this time about heap space.
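When experimenting with -Xmx values, it can help to confirm what heap ceiling the JVM actually received. A minimal sketch (the class name HeapCheck is just for illustration):

```java
// Run with e.g.:  java -Xmx1024m HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the largest heap the JVM will attempt to use,
        // which reflects the -Xmx setting (approximately, minus JVM overhead).
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Printing this at startup makes it obvious whether the flag was picked up or silently dropped by a launcher script.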

So, question: is there any programmatic alternative to this, for the particular use case (i.e., several small HashMap objects)? If I use the HashMap clear() method, for instance, the problem goes away, but so do the data stored in the HashMap! :-)

1 Answer

Answered by (46.1k points)

You're running out of memory before the work completes. Some options that come to mind:

  1. Give the JVM more memory, as you suggested; try something in between first, such as -Xmx512m.
  2. Process the HashMap objects in smaller batches if practicable, submitting each batch to the database and discarding it before building the next.
  3. If many of your strings are duplicates, call String.intern() on them before putting them into the HashMap, so duplicates share a single pooled instance.
  4. Use the HashMap(int initialCapacity, float loadFactor) constructor to size each map for its 15-20 entries and avoid rehashing.
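Points 3 and 4 can be sketched together. A minimal example, assuming duplicate values arrive as distinct String objects (the keys and the "status=OK" value are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class InternDemo {
    public static void main(String[] args) {
        // Pre-size for ~20 entries: capacity 32 with the default 0.75 load
        // factor means no resize is needed for 20 mappings (32 * 0.75 = 24).
        Map<String, String> row = new HashMap<>(32, 0.75f);

        // Simulate duplicate values arriving as distinct String objects.
        String a = new String("status=OK");
        String b = new String("status=OK");

        // intern() returns the one pooled instance, so duplicates across
        // hundreds of thousands of maps share a single underlying String.
        row.put("k1", a.intern());
        row.put("k2", b.intern());

        // Same pooled object, compared by identity:
        System.out.println(row.get("k1") == row.get("k2")); // prints "true"
    }
}
```

With hundreds of thousands of maps holding overlapping values, interning can cut the live-string footprint substantially, at the cost of the intern-pool lookup on each insert.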
