Ex-NewsLeecher user

I'm glad I was told about Newsbin.
I used NewsLeecher for years, but stayed with 3.9 because I didn't like the caching on the download directory.
With 3.9 I had the temp directory mapped to a 4 GB RAM drive. This worked fine for caching the parts until a file was complete and could be written out.
I have since upgraded to a 500 Mbps line, and I couldn't get speeds above 350 Mbps with NewsLeecher.
With the following settings I now see peaks of 530 Mbps with Newsbin:
ChunkCacheSize=1000
MemCacheLimit=2000
I do see the chunk buffer drop back to 200 when I download large posts (50 GB discussions).
The RAR files in these posts are up to 1 GB, sometimes with only 1 or 2 blocks.
Can someone tell me whether I'm assuming correctly that:
MemCacheLimit is in MB, so 2000 means 2 GB?
ChunkCacheSize is the number of (384 KB?) Usenet chunks to store, so 1000 would be 384 MB (assuming 384 KB is the correct chunk size)?
If so, I can then increase the buffers to make better use of my available RAM (12 GB under Windows 7).
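A quick back-of-the-envelope check of the assumptions above. Note that the chunk size (384 KB) and the MemCacheLimit unit (MB) are my guesses, not confirmed Newsbin internals:

```python
# All values are assumptions from the questions above,
# not confirmed Newsbin behavior.

CHUNK_SIZE_KB = 384        # assumed size of one cached Usenet chunk
chunk_cache_size = 1000    # ChunkCacheSize setting
mem_cache_limit_mb = 2000  # MemCacheLimit setting, assumed to be in MB

# Estimated memory used by the chunk cache (decimal units).
chunk_cache_mb = chunk_cache_size * CHUNK_SIZE_KB / 1000

# Combined cache footprint versus the 12 GB of system RAM.
total_gb = (chunk_cache_mb + mem_cache_limit_mb) / 1000

print(f"Chunk cache: {chunk_cache_mb:.0f} MB")  # 384 MB
print(f"Total cache: {total_gb:.3f} GB")        # 2.384 GB of 12 GB
```

If those assumptions hold, the current settings only use about 2.4 GB of the 12 GB available, which is why raising both values looks attractive.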