I can’t answer your question directly, but here are the strategies I use to avoid the problem.
In the Marketplace module Community Commons you can find the ORM section: https://docs.mendix.com/appstore/modules/community-commons-function-library/#37-orm
It contains these actions:
StartTransaction – Starts a transaction. If a transaction is already started for this context, a savepoint is added instead.
EndTransaction – Commits the transaction, which ends it, or removes a savepoint from the queue if the transaction is nested.
I use them to start a new transaction per iteration over a partial scope (a batch). See the example of a Microflow where I use it. In your case, I would use this pattern to break the list of files down into individual import batches.
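The batching idea can be sketched in plain Java. This is only an illustration of the control flow, not the Mendix Runtime API: the comments mark where the Community Commons StartTransaction and EndTransaction actions would sit in the Microflow, and `importInBatches` is a hypothetical helper name.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImport {
    // Processes records in fixed-size batches, one transaction per batch,
    // so memory for a committed batch can be reclaimed before the next one.
    static int importInBatches(List<String> records, int batchSize) {
        int transactions = 0;
        for (int start = 0; start < records.size(); start += batchSize) {
            List<String> batch =
                records.subList(start, Math.min(start + batchSize, records.size()));
            // StartTransaction (Community Commons) would go here
            for (String record : batch) {
                // import one record
            }
            // EndTransaction commits this batch
            transactions++;
        }
        return transactions;
    }

    public static void main(String[] args) {
        List<String> files = new ArrayList<>();
        for (int i = 1; i <= 10; i++) {
            files.add("file" + i + ".xml");
        }
        System.out.println(importInBatches(files, 3) + " transactions committed");
    }
}
```

The key point is that each iteration commits and releases its own work, instead of holding the whole import in a single long-running transaction.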
The second thing I would implement is to assign the import process (one task per file) to the Task Queue: https://docs.mendix.com/refguide/task-queue/
Each file will then be imported in its own context, like an individual background job.
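To illustrate the isolation you get from one-task-per-file, here is a minimal Java sketch using a plain `ExecutorService` as a stand-in for the Mendix Task Queue (which you would actually configure in Studio Pro, not in code); `importAll` is a hypothetical helper name.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class FileImportQueue {
    // Submits each file as its own background job with bounded parallelism,
    // so one large or failing file does not take down the whole import.
    static int importAll(List<String> files) throws InterruptedException {
        AtomicInteger imported = new AtomicInteger();
        ExecutorService queue = Executors.newFixedThreadPool(2);
        for (String file : files) {
            queue.submit(() -> {
                // a real task would parse and commit this file in its own context
                imported.incrementAndGet();
            });
        }
        queue.shutdown();
        queue.awaitTermination(10, TimeUnit.SECONDS);
        return imported.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(importAll(List.of("a.xml", "b.xml", "c.xml")) + " files imported");
    }
}
```

The design point carries over directly: each queued task has its own lifecycle, retries, and memory footprint, instead of one request holding everything.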
Go Make IT,
(If it worked for you, please accept the answer)
Usually we limit imports to 1000 records at a time by restricting the user to that batch size. Beyond that, you need to increase the memory of the environment to handle such loads.
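Such a cap can be enforced with a simple pre-import validation. A minimal sketch, assuming a 1000-record limit as in this answer (`withinLimit` is a hypothetical helper, and the cap should match your environment's memory):

```java
public class ImportLimit {
    // Cap taken from this answer; tune it to the environment's memory.
    static final int MAX_RECORDS = 1000;

    // Returns true when the upload is small enough to import safely.
    static boolean withinLimit(int recordCount) {
        return recordCount <= MAX_RECORDS;
    }

    public static void main(String[] args) {
        System.out.println(withinLimit(500));   // accepted
        System.out.println(withinLimit(2500));  // rejected, ask the user to split the file
    }
}
```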