“The silent majority” was originally a euphemism referring to the dead, who outnumber the living. It only got its political meaning in the 19th century.
Are you talking about things like sensor noise and chromatic aberration? It would be interesting to see if downsampling the image beforehand affects the result.
However, it's hard to separate image patterns from camera structure insofar as linear projection is a result of camera structure.
I was thinking about the CFA mosaic and JPEG compression; I think these may introduce some axis-aligned artifacts. But maybe the authors took that into account (by using raw files?), or the effect just isn't relevant in this case.
Even in raw format, all digital cameras apply some amount of sharpening [1] even when the setting is "off" in the camera menu. Also, all raw format conversion software (Lightroom, Capture One, etc.) applies sharpening by default.
I could imagine that a sharpening algorithm could transform a random distribution into something with structure. That the authors appear to not reference camera or image sharpening anywhere in the paper is somewhat worrisome.
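To make that concrete, here's a toy numpy sketch (entirely my own, not anything from the paper): take i.i.d. noise, convolve it with a generic cross-shaped sharpening kernel, and compare neighbour correlations along the axes vs. the diagonal.

    import numpy as np

    rng = np.random.default_rng(0)
    noise = rng.standard_normal((512, 512))

    # Circular convolution with a typical cross-shaped sharpening kernel:
    #   [[ 0, -1,  0],
    #    [-1,  5, -1],
    #    [ 0, -1,  0]]
    sharp = (5 * noise
             - np.roll(noise,  1, axis=0) - np.roll(noise, -1, axis=0)
             - np.roll(noise,  1, axis=1) - np.roll(noise, -1, axis=1))

    def corr(a, b):
        return np.corrcoef(a.ravel(), b.ravel())[0, 1]

    print("horizontal neighbour:", corr(sharp, np.roll(sharp, 1, axis=1)))
    print("vertical   neighbour:", corr(sharp, np.roll(sharp, 1, axis=0)))
    print("diagonal   neighbour:", corr(sharp, np.roll(sharp, (1, 1), (0, 1))))

Horizontal and vertical neighbours come out strongly anti-correlated (about -10/29 ≈ -0.34) while diagonal neighbours are only weakly correlated (about 2/29), so the processing alone imposes axis-aligned structure that wasn't in the input. Real sharpening pipelines are more sophisticated than this, of course; it's just to show the mechanism.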
That rule doesn't hold for some languages. For example, a Python lexer needs to remember a stack of indentation levels to know whether a given line's indentation should be tokenized as an INDENT, a DEDENT, or no token at all.
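Something like this toy sketch (my own simplification, not CPython's actual tokenizer):

    def indent_tokens(lines):
        # Stack of currently open indentation widths; column 0 is always open.
        stack = [0]
        for line in lines:
            if not line.strip():          # blank lines don't affect indentation
                continue
            width = len(line) - len(line.lstrip(" "))
            if width > stack[-1]:
                stack.append(width)
                yield "INDENT"
            else:
                while width < stack[-1]:  # one line can close several levels
                    stack.pop()
                    yield "DEDENT"
                # width == stack[-1]: same level, no indentation token
            yield ("LINE", line.strip())
        # (a real tokenizer would also emit DEDENTs for whatever is left
        #  on the stack at end of input)

    src = ["def f():",
           "    if x:",
           "        g()",
           "    h()"]
    print(list(indent_tokens(src)))

That stack is exactly the state a plain regular-expression lexer can't carry around, which is the point above.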
True. Strictly speaking, that means it isn't context-free in the usual sense (right?), but it's a practical extension.
Matt Might uses Python's indentation as a lexing/parsing example in his ongoing "Scripting Language Design and Implementation" course (http://matt.might.net/teaching/scripting-languages/spring-20...), which is worth following if you're reading this thread.
now we just need to write google search, google maps, facebook, twitter, etc. for dosbox, plus some type of linking mechanism, have it installed on all computers by default, and we are sorted!
Most demo competitions do not assume you'll require internet connectivity at all, so that won't be required ;)
I get the irony, and I do agree that using today's standards is expected.
That doesn't take anything away from the OP's point, though: what we use today (apart from WebGL or other hardware-accelerated code) is several orders of magnitude behind what was already there 10 to 15 years ago.
I do hope, though, that we'll see more and more widespread/standard techniques for clawing back some of that speed rather than counting on Moore's law alone :)
Stephen Lavelle's Platonic Archetypes of Dice is a Pokemon-style Flash game based on this Rock-Paper-Scissors-like quality of differently distributed dice.
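For anyone who hasn't run into it, the underlying trick is just standard non-transitive dice (this example is the textbook one, not taken from the game): A beats B, B beats C, and C beats A, each with probability 20/36.

    from itertools import product

    A = (2, 2, 4, 4, 9, 9)
    B = (1, 1, 6, 6, 8, 8)
    C = (3, 3, 5, 5, 7, 7)

    def p_beats(x, y):
        # Probability that die x rolls strictly higher than die y.
        wins = sum(a > b for a, b in product(x, y))
        return wins / (len(x) * len(y))

    print(p_beats(A, B), p_beats(B, C), p_beats(C, A))  # each 20/36 ≈ 0.56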
I agree, but only up to a point. When a book is advertised as part of a series, that is in effect a promise that there will be a series. Otherwise it's false advertising.
There’s a (very) short story about a device exactly like this: https://www.nature.com/articles/436150a