> Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.33731377391086426.
I'm on linux 4.7 + chromium 53 with Intel 4000 graphics
Hi, please report any issues to the repo at http://github.com/turbo/js. The more hardware combo test data I have, the sooner I can debug these faults. If you have any friends with exceptionally weird configs, please get them to test, too :-)
BTW: If you get the "Ah snap" error, there is some diagnostic data in the console. Please include that in any issue report. Cheers.
Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.3373333334457129.
console:
Values are:
0.00900000031106174
and
1.0210000006482005
Well, you're here now. What I did in the past when some of my projects got onto HN was just copy/paste those reports into the issue tracker on GitHub. A bit of work, but I got great reports on HN that I didn't want to waste.
> Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out. Please report your browser (+ version) and hardware configuration to us, so we can try to fix this. Deviation was 0.673000000262012.
Latest Chrome on a macbook air. Looks like at least one other person has had this issue, I'll report my experience as well. https://github.com/turbo/js/issues/1
It works on my Note 3, but not on my S7. The N3 results are about 1.5/3.0 for me, though they vary wildly between runs.
Edit: It works on my windows PCs (even in a qemu KVM instance using the std vga, but of course the emulated GPU is slower). However, it doesn't seem to like any browser on linux.
With regards to Linux (though Android's Chrome has chrome://flags and chrome://gpu as well, so maybe some of this applies there too):
It didn't work for me at first, but I got it to show me the "Ah snap! There was no error compiling the test kernel or running it, but the numbers don't check out." message at least. By changing the deviation var to e.g. 0 in the debugger you can also force it to show you the results (JS: 0.56, JS&Turbo.js: 1.74), but I don't know if those results mean anything.
Steps to maybe fix:
Does e.g. glxgears work?
On Chrome, check the following two pages:
chrome://flags/ <-- I enabled "Override software rendering list", "Experimental canvas features", "WebGL Draft Extensions" but don't know if all of those are necessary. Better change them back later.
chrome://settings <-- Advanced -> System -> Use hardware acceleration when available
chrome://gpu/ <-- Check if WebGL is enabled, etc. (also check log messages at the bottom)
My graphics hardware is a positively ancient Intel 965GM.
(On Windows on a much faster CPU and using Intel HD Graphics 4000, I get JS: 1.28, JS&Turbo.js: 3.82.)
Reading the benchmark script, and the explanation of it on GitHub, I think the deviation is there to account for floating point artifacts. I.e. where JS = 0.01499.. and GPU = 0.01500.., that's fine, but if the numbers (read: the results from the fractal function) deviate by more than that, it's an error.
Edit: Actually, "(JS: 0.56, JS&Turbo.js: 1.74)" seems to be a valid result from what I've seen. May just be that the chosen deviation value is an unfortunate edge case.
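To illustrate, here's a minimal sketch of the kind of tolerance check described above. This is my own reconstruction, not turbo.js's actual code; the names (`resultsAgree`, `MAX_DEVIATION`) and the tolerance value are assumptions for illustration only.

```javascript
// Assumed tolerance for float rounding differences between CPU and GPU;
// the real benchmark's threshold may differ.
const MAX_DEVIATION = 1e-4;

// Compare the plain-JS result against the GPU result and accept them
// only if they differ by no more than the tolerance.
function resultsAgree(jsResult, gpuResult, maxDeviation = MAX_DEVIATION) {
  return Math.abs(jsResult - gpuResult) <= maxDeviation;
}

// Fine: differs only in the last decimal places (float rounding)
console.log(resultsAgree(0.01499, 0.01500)); // true

// "Ah snap": the two console values above are ~1.012 apart,
// far beyond anything float rounding could explain
console.log(resultsAgree(0.00900000031106174, 1.0210000006482005)); // false
```

So a failure here means the GPU kernel genuinely computed different numbers than the JS reference, not just a rounding wobble.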