Microflows have a setting: “Disallow concurrent execution”. Set it to “Yes” to prevent the microflow from running twice at the same time.
I had this issue with a long-running microflow (it would run twice, which resulted in duplicate objects being created). After spending a great deal of time debugging the microflow, I changed it to run asynchronously instead of synchronously, and that fixed it.
“When it reaches around 15,000, the same microflow is run again.” What causes it to run again?
Is your production server running multiple instances?
In my observation, whenever an error occurs in a microflow, it runs twice before terminating and throwing the error message; I have seen this while debugging. Please check why your microflow is generating a warning — I suspect that will solve the issue.
Be careful: “Disallow concurrent execution” applies across all users, so if two or more users try to run the same microflow at the same time, only one execution will be allowed with that option enabled.
I have learned that the warning “Not all bytes were read from the S3ObjectInputStream” is caused by reading a file from a Java action without reading it to the end. See https://forum.mendix.com/link/questions/96546 .
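To illustrate the point above: one way to avoid that warning is to read the stream fully to EOF before closing it. Below is a minimal sketch in plain Java; the `drainFully` helper is a name I made up, and a `ByteArrayInputStream` stands in for the real S3 object stream so the example is self-contained.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class DrainStream {

    // Read the stream to EOF so the underlying connection (e.g. the one
    // behind an S3ObjectInputStream) sees every byte consumed before close.
    static long drainFully(InputStream in) throws IOException {
        long total = 0;
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the S3 stream in this sketch.
        byte[] data = new byte[20_000];
        try (InputStream in = new ByteArrayInputStream(data)) {
            System.out.println(drainFully(in)); // prints 20000
        }
    }
}
```

If you intentionally read only part of the object, the AWS SDK for Java (v1) also offers `S3ObjectInputStream.abort()`, which drops the connection instead of draining it — check the SDK docs for the version your Java action uses.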