Hi, is there any limit on how much data can be stored in an entity/database table, and on the data that can be shown in a data grid? I am pushing data to an entity via scheduled events on a daily basis. The data has now crossed 50 lakh (5 million) records, and when we checked today, the new data had not been populated. I assumed there might be an issue in the logic microflow called by the scheduler, but when I checked the logs, the scheduler microflow had executed properly.
There is no set maximum as far as I know, but if you do extensive querying on an entity, it slows down after 800k+ records in my experience.
Jop ter Horst
Hi Dinesh, did you set any attribute's length to unlimited in the entity?
You are asking about two different things.
How many records can a table hold? That depends on your database engine and your entity structure. 5 million records is not a very big table. The structure (data types and lengths; for example, always set an explicit string length) and the indexes are decisive for the speed of reading records and showing them in your (paginated) data grid. Indexes can speed up reading, but they slow down writing, so you need to decide which is more important for you.
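To make the pagination point concrete: a paginated grid never loads all 5 million rows, it only fetches one page at a time (typically via a limit/offset query under the hood). A minimal sketch of that arithmetic, with illustrative names (`offsetFor`, `totalPages`) that are not part of any Mendix API:

```java
// Sketch: how a paginated grid fetches only one page of a large table.
// pageSize and pageIndex are illustrative names, not Mendix identifiers.
public class Pagination {
    // Number of rows to skip before the requested page starts.
    static int offsetFor(int pageIndex, int pageSize) {
        return pageIndex * pageSize;
    }

    // Total number of pages needed to show all rows (ceiling division).
    static int totalPages(long totalRows, int pageSize) {
        return (int) ((totalRows + pageSize - 1) / pageSize);
    }

    public static void main(String[] args) {
        long totalRows = 5_000_000L; // ~50 lakh records, as in the question
        int pageSize = 20;
        System.out.println(offsetFor(3, pageSize));          // 60
        System.out.println(totalPages(totalRows, pageSize)); // 250000
    }
}
```

So the grid itself scales fine; what matters for response time is whether the sort/filter columns of that page query are covered by an index.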
I don’t know your microflow or your logging setup. The absence of error logs does not mean your data was imported correctly. I suggest adding ‘Information’-level log messages for each step, so you can collect more detailed information about whether your flow is actually working.
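As a rough illustration of step-level logging (here as a plain Java sketch using `java.util.logging`; in Mendix you would use Log Message activities in the microflow instead, and the step names and skip rule below are made up):

```java
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch: log at the start and end of each step of an import job, so a
// silent failure in any step becomes visible in the logs instead of the
// job appearing to "execute properly" while committing nothing.
public class ImportJob {
    private static final Logger LOG = Logger.getLogger("DailyImport");

    static int run(List<String> rows) {
        LOG.log(Level.INFO, "Import started, {0} rows received", rows.size());
        int committed = 0;
        for (String row : rows) {
            if (row == null || row.isEmpty()) {
                // An unlogged skip like this is a classic cause of
                // "the scheduler ran fine but no data appeared".
                LOG.log(Level.WARNING, "Skipping empty row");
                continue;
            }
            committed++; // the real flow would create/commit an entity here
        }
        LOG.log(Level.INFO, "Import finished, {0} rows committed", committed);
        return committed;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("a", "", "b"))); // 2
    }
}
```

Comparing "rows received" against "rows committed" in the log immediately tells you whether data is being dropped inside the flow.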
In addition to this: you can split your import into batches to run smaller jobs.
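The batching idea can be sketched as follows (plain Java; the batch size of 1000 is an assumption, not a Mendix default, and in a microflow you would commit each batch separately before starting the next):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: split a large list of records into fixed-size batches so that
// each commit stays small and a failure only affects one batch.
public class Batching {
    static <T> List<List<T>> split(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            // subList creates a view of one batch; the last batch may be shorter
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < 2500; i++) ids.add(i);

        List<List<Integer>> batches = split(ids, 1000);
        System.out.println(batches.size());        // 3
        System.out.println(batches.get(2).size()); // 500
    }
}
```

Smaller batches also keep memory use flat during the nightly run, which matters once the table is in the millions of rows.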