If I run the code below, I would expect the second assignment to array_1 to free the memory that the old array_1 used, but it appears a copy is kept even after forcing garbage collection. Example:
import numpy as np
import psutil
import os
import gc

BIG_ARRAY = 1280*3*960

process = psutil.Process(os.getpid())

def make_frames(num):
    frames = []
    for i in range(int(num)):
        frames.append(np.arange(BIG_ARRAY).astype(float))
    return frames

def print_current_memory_usage(msg=''):
    process = psutil.Process(os.getpid())
    print('Using {:.2} GBs of mem for {}'.format(process.memory_info().rss/1E9, msg))

print_current_memory_usage('Initial mem')
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after making large object')
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after replacing old large object (expect mem to stay same)')
gc.collect()
print_current_memory_usage('Mem gc (expect to change if this is just a gc timing issue)')
I consistently get the following output:
Using 0.028 GBs of mem for Initial mem
Using 3.0 GBs of mem for Mem after making large object
Using 5.9 GBs of mem for Mem after replacing old large object (expect mem to stay same)
Using 5.9 GBs of mem for Mem gc (expect to change if this is just a gc timing issue)
It's as if the initial array_1 was never freed. Can someone explain why this happens?
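In case it helps, a variant that explicitly drops the old reference and forces a collection before building the replacement would look like the sketch below (it reuses make_frames and print_current_memory_usage from above; I have not included its output here):

# Sketch: drop the old frames before allocating the new ones, so the two
# sets of arrays never exist at the same time (reuses the functions above).
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after making large object')
del array_1      # remove the only reference to the old frames
gc.collect()     # force a collection before the replacement is built
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after replacing old large object')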
Note that this only happens when the appended object is a numpy array; if I append a plain list instead, i.e. frames.append([i for i in range(BIG_ARRAY)]), I get the expected result:
Using 0.028 GBs of mem for Initial mem
Using 4.1 GBs of mem for Mem after making large object
Using 4.1 GBs of mem for Mem after replacing old large object (expect mem to stay same)
Using 4.1 GBs of mem for Mem gc (expect to change if this is just a gc timing issue)
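For reference, the list version only changes the append inside make_frames; roughly like this (the name make_frames_list is just for illustration):

def make_frames_list(num):
    # Same as make_frames above, but appends plain Python lists
    # instead of numpy arrays.
    frames = []
    for i in range(int(num)):
        frames.append([i for i in range(BIG_ARRAY)])
    return frames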