I want to import and export about 70,000 rows of data in my application, but I run into the following errors.

- Import: I import a CSV file using the 'Flat & delimited file import' module from the Mendix App Store, committing after every fixed number of rows. Once the file exceeds a certain size, the response slows down abruptly and processing never finishes, or the Mendix console logs frequent Critical-level messages and the import aborts.
- Export: I export to Excel using the XLSReport module. After creating the data in a persistent entity, I pass it to XLSReport via an input object associated with the data. At that point a 'GC overhead limit exceeded' error occurs.

I believe both symptoms mean there is not enough memory for the processing. How do you design an import or export process so that it does not run out of memory on large files?
I would process the import in batches running in the background, and likewise generate the export file in a background job, then give the user a link to download the document once it is ready.
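To illustrate the batching idea, here is a minimal sketch in plain Java (not Mendix-specific; `BATCH_SIZE`, `commitBatch`, and the comma-split parsing are illustrative assumptions). The point is that only one batch of parsed rows is ever held in memory, instead of the whole 70,000-row file:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class BatchImport {
    // Illustrative batch size; tune to the memory available to the runtime.
    static final int BATCH_SIZE = 1000;

    // Read the CSV line by line and commit in fixed-size batches, so at most
    // BATCH_SIZE parsed rows are retained in memory at any moment.
    static int importInBatches(BufferedReader reader) throws IOException {
        List<String[]> batch = new ArrayList<>(BATCH_SIZE);
        int total = 0;
        String line;
        while ((line = reader.readLine()) != null) {
            batch.add(line.split(","));
            if (batch.size() == BATCH_SIZE) {
                total += commitBatch(batch);
                batch.clear(); // drop references so the rows can be garbage-collected
            }
        }
        if (!batch.isEmpty()) {
            total += commitBatch(batch); // commit the final partial batch
        }
        return total;
    }

    // Hypothetical stand-in for "commit and flush": in Mendix this would be
    // committing the created objects and clearing them from the microflow scope.
    static int commitBatch(List<String[]> batch) {
        return batch.size();
    }

    public static void main(String[] args) throws IOException {
        // Simulate a 2,500-line CSV held in a string.
        StringBuilder csv = new StringBuilder();
        for (int i = 0; i < 2500; i++) csv.append("a,b,c\n");
        int imported = importInBatches(new BufferedReader(new StringReader(csv.toString())));
        System.out.println(imported); // prints 2500
    }
}
```

The same pattern works for the export: build the Excel file batch by batch in a background job, then hand the finished FileDocument to the user as a download link.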
I hope this helps you achieve your goal of importing the 70,000 records and exporting them as well.
You would be re-inventing a lot of functionality by building this yourself with the Community Commons module. I would advise forking the 'Flat & delimited file import' app on GitHub, solving the problem there, and creating a pull request for it. That way, we all benefit.
How to solve the problem? Have a look at Jeffrey's tips.