In our case it usually helps to place calls to EndTransaction and StartTransaction from CommunityCommons between batches.
Another option would be to use the ProcessQueue module to process your data.
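The per-batch transaction idea can be sketched as follows. This is a plain-Java illustration, not the CommunityCommons implementation: `endTransaction()` and `startTransaction()` here are hypothetical stand-ins for those actions, and the per-item work is a placeholder.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCommitSketch {
    // Hypothetical stand-ins for CommunityCommons' EndTransaction/StartTransaction.
    static int transactionsCommitted = 0;
    static void endTransaction() { transactionsCommitted++; } // commits work, releases locks
    static void startTransaction() { /* opens a fresh transaction for the next batch */ }

    static int processInBatches(List<Integer> items, int batchSize) {
        int processed = 0;
        for (int start = 0; start < items.size(); start += batchSize) {
            int end = Math.min(start + batchSize, items.size());
            for (Integer item : items.subList(start, end)) {
                processed++; // placeholder for the real per-item work
            }
            // Commit after each batch so locks and retained objects are released,
            // then open a new transaction before the next batch.
            endTransaction();
            startTransaction();
        }
        return processed;
    }

    public static void main(String[] args) {
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < 10; i++) items.add(i);
        int processed = processInBatches(items, 3);
        System.out.println(processed + " items in " + transactionsCommitted + " transactions");
    }
}
```

The point is that each batch commits before the next one starts, so locks are not held across the whole run.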
I have noticed this happening in situations where an entity X or its (1:n) associated entities Y are updated, and the before-commit event(s) make changes to (other objects of) X or Y. Everything gets locked by the updates, causing rapidly increasing processing times.
I solved this by first determining all changes and storing them in a temporary attribute on X and/or Y, and then processing those changes into the associated entities in a second batch process.
This is only applicable to batch processes. If it happens in interactive processes, reconsider your data model.
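The two-pass approach above can be sketched in plain Java. This is an illustration only: `EntityX`, `pendingDelta`, and the computation are all hypothetical stand-ins for the real entity, the temporary attribute, and whatever the before-commit event would otherwise do.

```java
import java.util.ArrayList;
import java.util.List;

public class DeferredChangeSketch {
    // Hypothetical entity X with a temporary attribute holding the pending
    // change, so the first pass does not cascade updates into associated objects.
    static class EntityX {
        int value;
        Integer pendingDelta; // the "temporary attribute" storing the computed change
        EntityX(int value) { this.value = value; }
    }

    // Pass 1: determine the change and stash it, without touching other objects.
    static void stashChanges(List<EntityX> objects) {
        for (EntityX x : objects) {
            x.pendingDelta = x.value * 2; // placeholder computation
        }
    }

    // Pass 2 (the second batch process): apply the stashed changes.
    static void applyChanges(List<EntityX> objects) {
        for (EntityX x : objects) {
            if (x.pendingDelta != null) {
                x.value += x.pendingDelta;
                x.pendingDelta = null; // clear the temporary attribute
            }
        }
    }

    public static void main(String[] args) {
        List<EntityX> xs = new ArrayList<>(List.of(new EntityX(1), new EntityX(2)));
        stashChanges(xs);
        applyChanges(xs);
        System.out.println(xs.get(0).value + "," + xs.get(1).value);
    }
}
```

Separating "compute" from "apply" means each pass only writes to one side, so the updates never chase each other's locks.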
In this case, I would look at using a Queue or ProcessQueue (from the App Store) to break the processing up. If you do, each block will run in its own transaction, so you won't have as much data in memory on each run.
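A rough sketch of the queue idea, in plain Java rather than the actual Queue/ProcessQueue modules: each dequeued chunk is treated as its own task with its own (simulated) transaction, so only one chunk's data is in memory at a time. The transaction counter and chunk contents are illustrative assumptions.

```java
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

public class QueueChunkSketch {
    static int transactions = 0;

    // Each dequeued chunk runs as its own task in its own transaction,
    // mirroring how a Queue/ProcessQueue task is executed.
    static int drain(Queue<List<Integer>> queue) {
        int total = 0;
        while (!queue.isEmpty()) {
            List<Integer> chunk = queue.poll();
            transactions++;                        // fresh transaction per task
            for (int item : chunk) total += item;  // placeholder work
            // the task's transaction commits here; this chunk can be garbage collected
        }
        return total;
    }

    public static void main(String[] args) {
        Queue<List<Integer>> queue = new ArrayDeque<>();
        queue.add(List.of(1, 2, 3));
        queue.add(List.of(4, 5));
        System.out.println(drain(queue) + " in " + transactions + " transactions");
    }
}
```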