This problem has been bothering me for a long time. I start by reading a CSV containing the outlines of 10 million polygons, split it into 1000 parts, and then for each part rebuild the polygons one by one and run zonal_stats on them.
As you can see at line 68 in the screenshot, memory grows by a dozen MB or so during the zonal_stats call and is not released by the end of the loop iteration (it still shows 1694.4 MB when entering the next iteration, the same as at the end of the current one).
My machine's memory cannot cope with this growth over the 10,000 iterations and the process gets killed (at over 140 GB). I also tried splitting the data into 100 or 10,000 parts instead, but the problem persists and the process is still killed.
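For reference, a minimal sketch of the loop described above (the CSV path, the WKT column name, the raster path, the statistic, and the chunk handling are placeholders, not the original code). The explicit del and gc.collect() are added here only to make the cleanup attempt visible; memory still grows across iterations of the zonal_stats call:

```python
import gc

import pandas as pd
from shapely import wkt
from rasterstats import zonal_stats

CSV_PATH = "polygons.csv"      # assumed: one WKT polygon outline per row
RASTER_PATH = "raster.tif"     # assumed raster
N_CHUNKS = 1000                # the post splits the data into 1000 parts

df = pd.read_csv(CSV_PATH)
chunk_size = len(df) // N_CHUNKS + 1

for start in range(0, len(df), chunk_size):
    chunk = df.iloc[start:start + chunk_size]
    # rebuild shapely polygons for this chunk only
    polygons = [wkt.loads(s) for s in chunk["geometry_wkt"]]
    # memory grows here and is not released at the end of the iteration
    stats = zonal_stats(polygons, RASTER_PATH, stats=["mean"])
    # ... store or write out `stats` ...
    del polygons, stats
    gc.collect()
```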
What is the reason for this?
