I'm not sure how it cleans memory, but here is one possibility:
Here we have a loss of time with these methods. The main purpose of a cache in any device is to decrease access time. The larger the cache hit ratio, the better the prefetching works, the faster the access time, and the quicker the application/instruction runs. This may hold some clues for us.
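The relationship between hit ratio and access time can be sketched with the standard effective-access-time formula. The latencies below are illustrative placeholder numbers, not measurements from any real system:

```python
# Hedged sketch: effective access time as a weighted average of cache hits
# (served at cache latency) and misses (served at RAM latency).
# t_cache and t_mem are illustrative latencies in nanoseconds.
def effective_access_time(hit_ratio, t_cache=1.0, t_mem=100.0):
    """Average latency = hit_ratio * cache latency + miss ratio * RAM latency."""
    return hit_ratio * t_cache + (1.0 - hit_ratio) * t_mem

# A higher hit ratio drives the average latency toward the cache latency.
print(effective_access_time(0.50))  # ~50.5 ns
print(effective_access_time(0.95))  # ~5.95 ns
print(effective_access_time(0.99))  # ~1.99 ns
```

Even a few percentage points of extra hit ratio cut the average latency sharply, which is why cache behavior dominates perceived speed.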
You have some RAM address ranges reserved for hardware I/O mapping in Windows by default, and others reserved for kernel-level code. On top of this, extra ranges get taken up by basic system processes, and these tend to accrue and not be released from memory even when the address range is no longer needed. Thus you end up with a section of memory that is unaddressable for no good reason.
When you force the "cleansing" by something which will either a) defrag the RAM or b) force it empty, such as a bigger process requiring the full RAM, Windows naturally empties all the unneeded extra address ranges that were reserved by other applications and starts filling the cache back up, prioritizing real-time programs if you've configured Windows that way. FWIW, the HDD pagefile is also a cache, managed at the Windows kernel level. Then, rather than the RAR files remaining in memory as many applications' data would, they are released immediately after the copying finishes, and that part of memory becomes freely available to everything subsequently. I "suspect" the type of file matters (RAR files), but I'll test it more thoroughly soon.
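The "bigger process requiring the full RAM" trick can be sketched like this: allocate a large buffer and touch every page so the OS has to evict cached/standby pages, then let it go out of scope so the range is freed. The buffer size here is a small illustrative figure; a real attempt would allocate close to total RAM, and the page size is an assumption (4 KiB is typical but not universal):

```python
# Hedged sketch of forcing memory pressure: a large allocation that is
# touched page by page, then released. SIZE_MB is illustrative only.
SIZE_MB = 64

def pressure_memory(size_mb):
    """Allocate size_mb MiB and write one byte per page to force the OS
    to actually commit the pages (evicting cached data to make room)."""
    buf = bytearray(size_mb * 1024 * 1024)
    page = 4096  # assumed page size; one write per page commits it
    for offset in range(0, len(buf), page):
        buf[offset] = 1
    # When this function returns, `buf` goes out of scope and the pages
    # become free again - much like the copied RAR data being released
    # as soon as the copy finishes.
    return len(buf)

freed = pressure_memory(SIZE_MB)
print(f"allocated and released {freed // (1024 * 1024)} MiB")
```

This only simulates the pressure; the actual eviction policy (what Windows drops first, standby list vs. working sets) is up to the kernel.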
That's as far as I can see into whether the memory increase plays a role, and how. For me, this all comes down to the various caches and prefetch algorithms, since that is their well-known function: to increase speed and decrease latency.