I have a simple script that makes some edits to all the contour files in a directory. It runs fine on a test set of 100-odd files, but in a directory of about 500 files it stops after roughly 120.
Is there any way to release memory after each file is processed?
import processing, os, glob

path = r'F:\Input\Contours\25_cm'
outdir = r'F:\Output\Contours'
os.chdir(path)

for file in glob.glob("*.TAB"):
    filename = file[:-4]  # strip the .TAB extension
    processing.runalg("modeler:contour_edits",
                      filename + ".tab",
                      outdir + "\\contour_5m\\" + filename + "_5m.tab",
                      outdir + "\\contour_25cm\\" + filename + ".tab",
                      outdir + "\\contour_1m\\" + filename + "_1m.tab")
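As an aside, the hand-built backslash paths above can be replaced with os.path calls, which sidesteps escaping mistakes. This is only a sketch of that refactor; build_outputs is a hypothetical helper name, not part of the QGIS Processing API:

```python
import os

def build_outputs(tab_name, outdir):
    """Hypothetical helper: derive the three output paths used by the
    contour_edits model from an input name like 'tile.TAB'."""
    base = os.path.splitext(tab_name)[0]  # strip the .TAB extension
    return (
        os.path.join(outdir, "contour_5m", base + "_5m.tab"),
        os.path.join(outdir, "contour_25cm", base + ".tab"),
        os.path.join(outdir, "contour_1m", base + "_1m.tab"),
    )
```

The tuple returned here could be unpacked straight into the processing.runalg call in the loop.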