Hi Anvil,
I believe you are right that "it's not an issue as long as ATTO is performing the same number of IOs for that exact testfile size..." - although I suspect that many (perhaps most) folks assume the total number of I/O operations performed will be exactly the number required to access the entire testfile (presumably once), as indicated by the selected "Total Length" value.
In any case, a variation between ATTO runs in the total number of I/O operations (for a given "Transfer Size" and the same associated testfile size) means, of course, a variation in the total amount of data transferred for that particular "Transfer Size" test.
And varying the total amount of data transferred under such circumstances (i.e., given the same transfer size and testfile size) in a benchmark test whose focus is on "data transfer throughput" (specifically "maximum sequential MB/s") seems suspect to me.
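Just to make the arithmetic concrete, here is a minimal sketch (in Python, purely for illustration; the function names and example figures are mine, not anything taken from ATTO) of the relationship I have in mind between "Total Length", "Transfer Size", the I/O count, and the total bytes actually moved:

```python
# Hypothetical illustration (not ATTO's actual internals): how the I/O count
# and the total data moved relate for one "Transfer Size" step.

def expected_io_count(total_length_bytes: int, transfer_size_bytes: int) -> int:
    """I/Os needed to cover the whole testfile exactly once."""
    return total_length_bytes // transfer_size_bytes

def total_data_transferred(io_count: int, transfer_size_bytes: int) -> int:
    """Total bytes moved for a given I/O count at one transfer size."""
    return io_count * transfer_size_bytes

GiB = 1024 ** 3
KiB = 1024

total_length = 2 * GiB    # selected "Total Length" (example value)
transfer_size = 4 * KiB   # one of the "Transfer Size" steps (example value)

full_coverage = expected_io_count(total_length, transfer_size)   # 524,288 I/Os
print(full_coverage, total_data_transferred(full_coverage, transfer_size))

# If one run happens to issue fewer I/Os than another at the same transfer
# size, the total bytes moved differ between the two runs:
print(total_data_transferred(full_coverage - 10_000, transfer_size))
```

With a 2 GiB testfile and 4 KiB transfers, full coverage works out to 524,288 I/Os; any run that issues fewer (or more) I/Os at that transfer size has, by definition, moved a different total amount of data.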
I can see that when a larger testfile size (e.g., 2 GiB) is selected, ATTO might want to reduce the total number of I/O operations performed (and, correspondingly, the amount of data transferred) for the smaller "Transfer Sizes", in order to reduce the overall time the benchmark takes to run.
But, as I think you would agree, such a reduction in the total number of I/O operations performed should be consistent (i.e., the same) from one ATTO run to the next.
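For what it's worth, here is one way such a consistent reduction could look in principle: a fixed, deterministic cap on the I/O count per transfer size, so that every run with the same settings performs exactly the same number of I/Os. This is only a hypothetical sketch (the cap value and function names are my own invention), not a claim about how ATTO actually shortens its runs:

```python
# A deterministic shortening scheme (hypothetical): cap the I/O count per
# transfer-size step so runs are faster, yet identical settings always yield
# the identical number of I/Os and the identical total data moved.

MAX_IOS_PER_STEP = 65_536  # hypothetical fixed cap

def planned_io_count(total_length_bytes: int, transfer_size_bytes: int) -> int:
    """Deterministic I/O count: full coverage, but never more than the cap."""
    full_coverage = total_length_bytes // transfer_size_bytes
    return min(full_coverage, MAX_IOS_PER_STEP)

GiB = 1024 ** 3
for size_kib in (4, 64, 1024):
    ios = planned_io_count(2 * GiB, size_kib * 1024)
    print(f"{size_kib:>5} KiB transfers -> {ios} I/Os, {ios * size_kib} KiB moved")
```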
Hi Ao1,
Sorry that I have not yet replied to your post #222; I have looked it over and have several comments in mind, which I hope to post later today.