Hi there,

We've got a system with an entity called Application, which has twenty attributes. I've exposed a microflow as a web service; this microflow takes a list of Applications as a parameter. We are running on an Oracle database.

We also have a flat file that gets read by a C# application. The C# application imports the Mendix web service, builds an array of Application objects from the flat file, and then calls the Mendix web service, passing the array of Applications to Mendix.

My problem: we are sending an array of fifty thousand Applications to the web service, which is small relative to the four hundred thousand we would ultimately like to send. It has been running for about two hours now and is still processing the applications one by one.

Is there a quicker, more effective way of doing this?

Regards,
Frikkie
The problem with the approach you're taking is that all of the imported objects have to be held in memory at once: the web service request is deserialized in full, because it immediately acts as the input parameter to the microflow.
Judging by your question, this is not what you want. You probably just want to import the data really quickly.
In that case, I'd suggest turning it around: have the C# app publish a web service that returns the objects, and let Mendix call it. The runtime is really good at importing data from a published web service and will flush data to the DB as quickly as possible, so you avoid the situation where you need to keep 400k objects in memory.
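The core idea, independent of the Mendix or C# specifics, is to hand the data over in bounded batches instead of one giant array, so only one batch is ever in memory on either side. A minimal sketch of that batching shape (illustrated here in Python for brevity; the record count and batch size are made-up numbers, not anything from your system):

```python
from itertools import islice

def batches(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`,
    so the full data set never has to sit in memory at once."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical illustration: 400,000 records consumed 5,000 at a time.
# The consumer (here just a length count, in your case the importing
# side) processes and discards each batch before asking for the next.
if __name__ == "__main__":
    records = range(400_000)
    batch_sizes = [len(b) for b in batches(records, 5_000)]
    print(len(batch_sizes))  # 80 batches
```

Whether the batching lives in a paged endpoint on the C# side or in how the flat file is read, the design choice is the same: the producer serves one bounded chunk per request, and the importer commits each chunk before fetching the next.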