Monday, October 22, 2007

Out of memory...

The last week or so has been busy. We are working on our next release, which seems to work well on a functional level. But we keep running into OutOfMemoryError problems...

There are different causes for this, but the main one is Hibernate. The part that causes the most trouble is where we read quite a lot of data and generate data for the online part of the system. The data is stored as BLOBs in the database, and Hibernate is used throughout. We want to be sure that everything or nothing is processed, so it is all done in one transaction.

Now it seems that even though we only write to the table with the BLOBs and never read from it, Hibernate keeps every record in the session cache, filling up memory. To me this seems very strange. We don't need this cache; once the records are written to the database, they should be removed from it. In my experience it is a very common scenario that you never read back the records you created in the same process. In fact, it is good practice to gather all the data before inserting, so you do not need to read the same record again.
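For reference, the batch-insert pattern described in the Hibernate documentation is to flush and clear the session at regular intervals, so the session cache never holds more than one batch of records. A rough sketch of that pattern; the LobRecord entity, the SessionFactory variable, and the batch size of 50 are just illustrative:

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import java.util.List;

    public class BatchInsert {
        // Should match the hibernate.jdbc.batch_size property.
        private static final int BATCH_SIZE = 50;

        public static void insertAll(SessionFactory factory, List<LobRecord> records) {
            Session session = factory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                int i = 0;
                for (LobRecord record : records) {
                    session.save(record);
                    if (++i % BATCH_SIZE == 0) {
                        session.flush();  // execute the pending inserts
                        session.clear();  // evict them from the session cache
                    }
                }
                tx.commit();              // still everything-or-nothing
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }
    }

Setting hibernate.jdbc.batch_size to the same value also lets Hibernate send the inserts to the database in JDBC batches.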

It is also difficult to understand why Hibernate does not guard against out-of-memory conditions. A good cache should not grow without limit; it should evict data when memory usage gets too high. This is not something you should have to handle in your application code.

The problem seems to be solved for now, but since data volumes in the system will keep growing, it will probably reappear. We will have to do something more permanent for the future, probably bypassing Hibernate for the BLOB inserts and doing them with plain JDBC calls.
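A rough sketch of what that JDBC insert could look like; the table and column names are made up, and in Hibernate 3 the Connection can be borrowed from the session with session.connection(), so the insert still takes part in the same transaction:

    import java.io.ByteArrayInputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class BlobInserter {
        // Insert one BLOB directly, bypassing the Hibernate session cache.
        public static void insert(Connection conn, long id, byte[] data)
                throws SQLException {
            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO generated_data (id, payload) VALUES (?, ?)");
            try {
                ps.setLong(1, id);
                // Stream the BLOB instead of keeping a mapped entity around.
                ps.setBinaryStream(2, new ByteArrayInputStream(data), data.length);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }

Since the connection is the one backing the session, commit and rollback stay under the control of the surrounding transaction, which keeps the everything-or-nothing guarantee.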

1 comment:

lepidus said...

Did you try using an external cache provider like ehcache? It is easily configurable.