
"First failing byte: -1" error on completion with a custom byte total set #25

Open
xPXpanD opened this issue Jan 24, 2024 · 1 comment

xPXpanD commented Jan 24, 2024

Just received a new Kingston High Endurance 256GB card from a reputable shop, and figured I'd write ~33GB of data to it (set to 33000000000 bytes in the UI) as a quick sanity check without immediately filling it to the brim.

No issues while verifying the individual blocks, but I got a weird "First failing byte: -1" error at the end:
MediaTesterResults_2024-01-24_12-15-23_FAIL.txt

Tried again with 1GB (1000000000 bytes), same thing:
MediaTesterResults_2024-01-24_12-19-15_FAIL.txt

Tested a random 8GB Memory Stick Pro Duo I had lying around with both a fresh exFAT and a FAT32 format, and it did the same thing both times. It was fine when I tested the full capacity, so I'm guessing this is a bug?

Also, bonus round: setting 1000 bytes on the 256GB card led to some different weirdness, with the test running for about half a minute (way longer than expected) and the "Remaining" field going negative. Log:
MediaTesterResults_2024-01-24_12-20-03_FAIL.txt

(also got the same error in the end)

The 256GB card was tested with its factory exFAT formatting and has now gone into service, so I can't run further tests on it, but let me know if I can provide any other info. Version was v0.4.1.0 on Windows 10 x64 22H2.

Newtomic commented

I think the program has a bug: custom byte totals apparently have to be multiples of a binary gigabyte, i.e. a GiB, the annoyingly-named binary unit (1,073,741,824 bytes), not the decimal SI GB. 1 GiB also seems to be the minimum, so the smallest valid value is 1073741824 bytes; for 33 GiB in your case, try 35433480192.
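
If the limit really is GiB alignment (my guess from the behavior above; I haven't checked MediaTester's code), here's a quick Python sketch showing where those numbers come from, plus how an arbitrary request could be rounded up to an aligned total:

```python
GIB = 1024 ** 3  # 1 GiB = 1,073,741,824 bytes (binary unit)

# The value suggested above is simply 33 whole GiB:
print(33 * GIB)  # 35433480192

# Rounding the original 33,000,000,000-byte request up to the next
# GiB boundary would give a smaller but still aligned total instead:
requested = 33_000_000_000
aligned = -(-requested // GIB) * GIB  # ceiling division to a GiB multiple
print(aligned)  # 33285996544 = 31 GiB
```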

In any case, the dev should either fix this behavior or restrict custom amounts to multiples of a GiB.
