64-bit JVM limited to 300GB of memory?

I can think of a couple of possible explanations:

  • Other applications on your system are using so much memory that there isn’t 300GB available right now.

  • There could be a resource limit on the per-process memory size. You can check this using ulimit; a programmatic check is sketched after this list. (Note that according to this bug, you will get the error message if the per-process resource limit stops the JVM from allocating the heap regions.)

  • It is also possible that this is an “over-commit” issue; e.g. if your application is running in a virtual machine and the system as a whole cannot meet the demand because there is too much competition from other virtual machines.
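
If you want to check the resource limit from code rather than from the shell, here is a minimal sketch using getrlimit. RLIMIT_AS is the same limit that `ulimit -v` reports (the shell shows it in 1024-byte units, getrlimit returns bytes). A finite RLIMIT_AS below ~300GB would explain the JVM failing to reserve its heap.

    /* Minimal sketch: query the per-process address-space limit (RLIMIT_AS)
     * and the data-segment limit (RLIMIT_DATA). */
    #include <stdio.h>
    #include <sys/resource.h>

    static void print_limit(const char *name, int resource)
    {
        struct rlimit rl;
        if (getrlimit(resource, &rl) != 0) {
            perror(name);
            return;
        }
        if (rl.rlim_cur == RLIM_INFINITY)
            printf("%s: unlimited\n", name);
        else
            printf("%s: %llu bytes\n", name, (unsigned long long) rl.rlim_cur);
    }

    int main(void)
    {
        print_limit("RLIMIT_AS   (virtual memory)", RLIMIT_AS);
        print_limit("RLIMIT_DATA (data segment)  ", RLIMIT_DATA);
        return 0;
    }

Run this as the same user (and in the same environment, e.g. the same service unit or container) that starts the JVM, since limits can differ between sessions.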


A couple of the other ideas suggested are (IMO) unlikely:

  • Switching the JRE is unlikely to make any difference. I’ve never heard of or seen arbitrary memory limits in specific 64-bit JVMs.

  • It is unlikely to be caused by a lack of contiguous memory. Contiguous physical memory is certainly not required. The only possibility might be contiguous space on the swap device, but I don’t recall that being an issue for typical Linux OSes.


Can anyone please suggest a solution/workaround?

  • Check the resource limits with ulimit, in particular ulimit -v (the virtual memory limit).

  • Write a tiny C program that attempts to malloc lots of memory and see how much it can allocate before it fails (a sketch is given after this list).

  • Ask the system (or hypervisor) administrator for help.
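
A minimal sketch of such a probe program is below. It assumes a Linux-like system with 4 KiB pages and allocates in 1 GB chunks; adjust as you like. Be aware that with memory over-commit enabled, malloc may keep succeeding and the process may instead be killed by the OOM killer when the pages are touched, so don’t run this on a busy production box.

    /* Probe how much memory a single process can actually get:
     * keep malloc'ing 1 GB chunks, touch one byte per page so the
     * kernel really commits the memory, and report the total when
     * allocation finally fails. */
    #include <stdio.h>
    #include <stdlib.h>

    #define CHUNK (1024L * 1024L * 1024L)   /* 1 GB per allocation */

    int main(void)
    {
        long gigabytes = 0;
        for (;;) {
            char *p = malloc(CHUNK);
            if (p == NULL)
                break;
            for (size_t off = 0; off < CHUNK; off += 4096)
                p[off] = 1;                 /* touch one byte per page */
            gigabytes++;
            printf("allocated %ld GB so far\n", gigabytes);
            /* intentionally never freed: we want the total limit */
        }
        printf("allocation failed after %ld GB\n", gigabytes);
        return 0;
    }

If this program also stalls well short of 300GB, the problem is in the OS or hypervisor configuration rather than in the JVM.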
