With amounts like this (>10k records) you need a microflow that processes the data in chunks. Have a look at this blog
You must use batches to do this. See the documentation here: https://docs.mendix.com/howto40/retrieve-and-manipulate-batches-of-objects
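The batching pattern from that how-to can be sketched in a few lines. This is a language-agnostic illustration, not Mendix API: `process_in_batches` and the in-memory list stand in for the limited retrieve and commit activities you would model in the microflow.

```python
BATCH_SIZE = 1000  # typical chunk size; tune to your entity size and memory

def process_in_batches(records, batch_size=BATCH_SIZE):
    """Handle the records one fixed-size chunk at a time, so only one
    chunk needs to be in memory (and committed) at once."""
    offset = 0
    processed = 0
    while True:
        # stand-in for a retrieve with a limit and offset
        batch = records[offset:offset + batch_size]
        if not batch:
            break
        for record in batch:
            # example change; replace with your actual update
            record["Total"] = record["Price"] * record["Quantity"]
        # commit the whole batch here: one commit per batch, not per record
        processed += len(batch)
        offset += batch_size
    return processed

records = [{"Price": 2, "Quantity": 3} for _ in range(2500)]
print(process_in_batches(records))  # → 2500
```

The key point is that the retrieve is limited: the microflow never asks the database for all records at once.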
Strangely, the Mx7 update of this how-to is not findable, but it still holds.
As the other answers already state, batching is the way to go here. For computations this large I would even recommend considering the ProcessQueue module, so you can also run the batches asynchronously: https://appstore.home.mendix.com/link/app/393/Mendix/Process-Queue
Batches are indeed needed, as mentioned above. But with this number of records, batching alone will not solve your issue. If everything is done within one microflow, Mendix will hold all 170,000 records in memory until the microflow finishes, because it must be able to do a full rollback if something goes wrong. A way to free up memory is to run each iteration in a separate Java transaction: add the CommunityCommons.EndTransaction and CommunityCommons.StartTransaction Java actions after each iteration to make sure memory is released.
Of course, only do this if it is really needed: a full rollback is no longer possible in this scenario.
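The effect of the end/start-transaction trick can be illustrated with a toy model (plain Python, not the Mendix or CommunityCommons API): after each batch the open transaction is closed, releasing the objects it retained, and a fresh one is started, so memory usage peaks at one batch rather than the whole set.

```python
class FakeContext:
    """Toy stand-in for a transaction context; tracks what the
    currently open transaction retains."""
    def __init__(self):
        self.held = 0          # objects retained by the open transaction
        self.transactions = 1  # one transaction open at the start

    def end_transaction(self):    # cf. CommunityCommons.EndTransaction
        self.held = 0             # retained objects are released on commit

    def start_transaction(self):  # cf. CommunityCommons.StartTransaction
        self.transactions += 1

def process_with_transactions(ctx, batches):
    """Process batches, closing and reopening the transaction after each."""
    peak = 0
    for batch in batches:
        ctx.held += len(batch)   # processing retains this batch's objects
        peak = max(peak, ctx.held)
        ctx.end_transaction()    # commit and release memory for this batch
        ctx.start_transaction()  # fresh transaction for the next batch
    return peak

ctx = FakeContext()
print(process_with_transactions(ctx, [[0] * 1000] * 5))  # → 1000
```

Without the two transaction calls, `held` would grow to 5000; with them, no more than one batch is ever retained. The trade-off is exactly the one described above: each batch commits for good, so a failure halfway leaves earlier batches committed.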
Another option, though much more time-consuming to set up, is adding the ProcessQueue module. Create a QueuedAction for every iteration and let the ProcessQueue engine trigger it. Each QueuedAction also runs in a separate Java transaction, hence not clogging the memory.
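Conceptually, the ProcessQueue approach turns each iteration into an independent unit of work picked up by a worker, so no single transaction spans all records. A minimal sketch using plain Python threading (this is only the shape of the idea, not the ProcessQueue module's actual API):

```python
from queue import Queue
from threading import Thread

def worker(q, results):
    """Worker loop: each dequeued item is one independent 'queued action'."""
    while True:
        item = q.get()
        if item is None:   # sentinel: no more work
            break
        results.append(item * 2)  # each action does its own small unit of work
        q.task_done()

q = Queue()
results = []
t = Thread(target=worker, args=(q, results))
t.start()

for i in range(5):
    q.put(i)       # conceptually: one QueuedAction per iteration

q.join()           # wait until every queued action has been handled
q.put(None)
t.join()
print(sorted(results))  # → [0, 2, 4, 6, 8]
```

Because each unit commits on its own, a failure affects only that one action, and the queue can be drained asynchronously while the rest of the app keeps running.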
To learn more about data and especially batches, try the Working with Data course:
Section 3 focuses specifically on batches.
A quick-and-dirty solution (since this microflow will only be used once or twice):
Add a boolean attribute to your entity - call it Updated or Processed or something similar.
In your microflow:
- Retrieve a limited number of records (say a few thousand) with an XPath constraint on the new attribute, e.g. [Updated = false].
- Loop over the list, make your change, and set the boolean to true.
- Commit the changed objects.
Now you can run this microflow as many times as needed to update all of your records. You can run it from a web page or via a scheduled event.
Once all the records are updated, you can remove the boolean attribute from your entity.
Note: best practice is to commit the whole list of objects after the loop is complete instead of committing each object in the loop. You may already be doing this.
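The flag-based approach above can be sketched as follows. Names like `Processed`, `Value`, and `run_once` are illustrative, not Mendix API; each call to `run_once` corresponds to one run of the microflow, and the driving `while` loop plays the role of the scheduled event.

```python
BATCH_SIZE = 100  # how many unprocessed records one run handles

def run_once(records):
    """One microflow run: grab a limited batch of unprocessed records,
    update them, flag them, and commit the list after the loop."""
    batch = [r for r in records if not r["Processed"]][:BATCH_SIZE]
    for r in batch:
        r["Value"] += 1          # the actual update you need to make
        r["Processed"] = True    # flag so the next run skips this record
    # commit the whole batch list here, after the loop
    return len(batch)

records = [{"Value": 0, "Processed": False} for _ in range(250)]
runs = 0
while run_once(records):         # e.g. driven by a scheduled event
    runs += 1
print(runs)  # → 3 (100 + 100 + 50)
```

Because progress is recorded in the flag, the microflow is safely re-runnable: if a run fails, the next run simply picks up the records that are still unflagged.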
Hope that helps,