I have a problem with an entity from which I need to delete a large number of records. The problem is that I constantly run out of Java heap space, and I cannot split the microflow up any further (it already deletes only the records of a specific entity for a given day). How could I fix this without increasing my heap space (currently 512 MB) and within the constraints of version 2.5.6 (Community Commons v2.1)? Regards, Ronald
Did you take a look at the delete behaviour of the entity? Maybe that is causing the memory issues. Alternatively, you could use a batch delete in a Java action; that's what we do in a comparable situation.
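To illustrate the batch-delete idea: the point is to never hold all records on the heap at once, but to retrieve and delete a fixed-size chunk per iteration. The sketch below is a plain-Java illustration of that pattern, not the actual Mendix Core API — the in-memory `store` list and the `runBatches` method are stand-ins for what would be an XPath retrieve (limited to the batch size) and a delete/commit inside your Java action.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the batching pattern; not Mendix API code.
public class BatchDeleteDemo {

    // Deletes records in fixed-size batches so that only one batch is in
    // memory at a time; returns the number of batches processed.
    static int runBatches(int totalRecords, int batchSize) {
        // Stand-in for the database. In a real Java action you would
        // retrieve each batch with a query limited to batchSize instead.
        List<Integer> store = new ArrayList<>();
        for (int i = 0; i < totalRecords; i++) store.add(i);

        int batches = 0;
        while (!store.isEmpty()) {
            int end = Math.min(batchSize, store.size());
            // "Retrieve" one batch...
            List<Integer> batch = new ArrayList<>(store.subList(0, end));
            // ...and "delete" it from the store.
            store.subList(0, end).clear();
            batch.clear(); // batch goes out of scope; heap use stays bounded
            batches++;
            // In Mendix you would commit here so the deleted objects can be
            // released before the next batch is retrieved.
        }
        return batches;
    }

    public static void main(String[] args) {
        System.out.println(runBatches(1050, 100));
    }
}
```

With a batch size of a few hundred, heap usage depends only on the batch size, not on the total number of records, which is what keeps you under the 512 MB limit.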
If Community Commons does not fit: you can mark the records as 'to be deleted' and filter the visible data on that field. Then, in a scheduled event, retrieve the marked records and delete them in a loop, quitting the loop after 100 (or some other number of) deletions.
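A minimal sketch of that scheduled-cleanup idea, again in plain Java rather than Mendix API code: the `Record` class with its `toBeDeleted` flag and the `cleanupRun` method are hypothetical stand-ins for your entity and the microflow/Java action behind the scheduled event. Each run deletes at most a fixed number of marked records and then quits; the next run picks up the remainder, so no single run can exhaust the heap.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of a capped cleanup run; not Mendix API code.
public class MarkedCleanupDemo {

    // Stand-in for the entity, with the 'to be deleted' marker attribute.
    static class Record {
        final boolean toBeDeleted;
        Record(boolean toBeDeleted) { this.toBeDeleted = toBeDeleted; }
    }

    // One scheduled-event run: delete at most maxPerRun marked records,
    // then quit. Returns how many were deleted in this run.
    static int cleanupRun(List<Record> store, int maxPerRun) {
        int deleted = 0;
        Iterator<Record> it = store.iterator();
        while (it.hasNext() && deleted < maxPerRun) {
            if (it.next().toBeDeleted) {
                it.remove(); // "delete" the marked record
                deleted++;
            }
        }
        return deleted;
    }

    public static void main(String[] args) {
        List<Record> store = new ArrayList<>();
        for (int i = 0; i < 250; i++) store.add(new Record(true));
        // Three runs of at most 100 deletions each clear the backlog.
        System.out.println(cleanupRun(store, 100));
        System.out.println(cleanupRun(store, 100));
        System.out.println(cleanupRun(store, 100));
    }
}
```

Because the unmarked records are filtered out of the UI, users never see the backlog, and the scheduled event chips away at it a bounded amount per run.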