Oddly enough I find that the #define macro soup detracts from the performance here. I was rather unimpressed at first because obfuscating C code by just #define'ing a bunch of code is a trivial and rather uninteresting way to write unreadable code. Of course you can make any arbitrary code look like a printf call with enough macros!
But it's actually a lot more clever than that.
I feel like this would be a lot more impressive if it were written in simple and clear C, since then you'd see that there really isn't any tic-tac-toe logic being explicitly implemented the way you'd expect.
The reason he had to use macros is that the actual formatting string is huge and difficult to understand. In this case, macros really serve a useful purpose, instead of just obfuscating code.
Much of the original Bourne shell was written in C that had been macro'd to look like Algol [1].
E.g.
/usr/src/cmd/sh/mac.h:
#define IF if(
#define THEN ){
#define ELSE } else {
#define ELIF } else if (
#define FI ;}
#define BEGIN {
#define END }
#define SWITCH switch(
#define IN ){
#define ENDSW }
#define FOR for(
#define WHILE while(
Quite some time ago I saw a piece of C code (probably linked from here) written in APL style. I have been trying to find it ever since; does anyone know what I mean?
I understand the temptation. I keep catching myself writing `unless( ... )` in C++ every couple of days, and I've considered adding a macro for it, but ultimately decided against it, as the rest of the team would not understand why I need it.
Wouldn't be surprised if that was intentional, actually. The idea of making syntax a little unwieldy for things that you should really think twice before using is not a new one.
Real Programmers write machine code in any language, thank you very much. (Which TFA rather nearly is, actually, though it's a bit macro-assemblery for a proper example.)
I am afraid that the author has disqualified himself by publishing the source code before the judging of IOCCC 2020 is over. See catch 22 in the rules: "The judges STRONGLY prefer to not know who is submitting entries to the IOCCC."[1] Even the winners are asked not to publish the code until it is available on the IOCCC web site.[2]
It is true that publishing the source code this early is risky, but submitting the code that has been already published is not prohibited by its own. My winning entry (2012/kang [1]) was in fact published independently and was even on HN [2]. I'm pretty sure that this is not the only occurrence, having seen some winning entries trending on Twitter before the public announcement.
Very interesting. Roughly: the existence of Turing-complete functions within programs creates a fundamental vulnerability that even rigid control over a program's control flow cannot avoid.
That’s a bit more robust than using strlen(s)+2, where you have to keep the magic constant 2 in sync with ": ". Moving ": " to a variable and using strlen(s)+strlen(separator) would fix that, though (at the price of speed, unless you have a compiler that optimizes it away).
This only works if you're not dealing with Unicode, where the number of bytes, the number of characters, and the width of those characters can all vary.
You don’t need Unicode for that. It also requires you use a monospaced font.
I think the feature predates that and Unicode, though. But even then, it fails if you underline text the way it was done at the time, either by using backspace and underline characters or by using termcap (https://en.wikipedia.org/wiki/Termcap)
Yeah, but the difference is that you can use %n wherever you need it in the format string. Depending on what arguments come after it, figuring out the length of the printed arguments yourself might not be trivial, whereas printf already has to keep a running count during execution in order to return it at the end, so it's easy to add support for it.
I seem to recall using it to auto-adjust column widths.
And we did something with it involving string translations. Since we didn't control the translated format string we couldn't just count the characters in the source code.
But it's been a really long time and I don't recall the details.
A law would be government overreach, but the idea is solid and the results tangible; it should be within the realm of union regulations, if we ever manage to get one going.
> I think more companies should allow for their employees to have some plain old fun with no strings attached on a regular basis.
Nicholas Carlini's resume is crazy-insane and he's highly desirable, so of course Google Brain is going to lend him far more flexibility than Google Analytics would have with a recent college-hire, or at the other end of the supply/demand distribution: the working flexibility Amazon would afford a warehouse employee (if they aren't a contractor...).
Syntactically it is. But it's not a very clear description. So how would you describe that single call location getting executed in a loop so that everybody understands what is really meant? A single line of code is worse.
In case you're wondering (like me) how you'd get input from printf():
> We ab^H^Huse [the Turing-completeness of printf()] to implement the logic of tic-tac-toe entirely within this one printf call (and a call to scanf() to read user input).
I hear you on that. But for me, the jealousy dissipates quickly once I realize that even if I had /lots/ of time, I don't think I would come up with this.
That's not too unusual for IOCCC entries. Off the top of my head, there's a flight simulator shaped like an airplane, and an addition routine shaped like a full adder.
A more subtle touch is that the #defines spell out NOUGHTs AnD CRoSSES.
I don’t want to jump on the title hate bandwagon - what the author has done is definitely creative and clever and this discussion detracts from it - but it also uses scanf.
Ah, well spotted, thanks. I didn't think it was fair to call the title click bait just because it used a while loop to drive the calls to printf, but also including a scanf call is definitely not in the same spirit.
If you’re willing to lose almost all portability, I think you could read a value from an I/O port, pass that to printf, use it as the field width for a string to print, count the character length of the resulting string using %n to move it into a variable, and then backspace over it to keep the padding from dirtying your output. You could only use the result in a subsequent call to printf, though.
Of course, just hiding an assignment in a printf argument would be much easier, but wouldn’t be fun.
Of course, that would lose lots and lots of portability; you could likely only get it to work on systems without memory protection between processes, and even then not on all of them.
I guess it would not be very hard to use this trick to have printf read joysticks on many micros from the 1980s.
Well, I got downvoted. I should have mentioned that I meant this in a good way. My account is new and I have since read the rules on the welcome page. But think about it: a whole game implemented in just a print statement? Wow, that can't be real. And it's in the very first spot of the top posts. I clicked on it expecting a cool article explaining something interesting, but no: it is the implementation itself. So, well done.