
As mpengine will unpack arbitrarily deeply nested archives...

Surely not - what happens if you feed it the zipfile quine?


If that broke it, I imagine it would've been discovered by now. A simple mitigation would be to check whether the newly-unpacked file is identical to the archive it came from.

However... suppose someone crafted an alternating version -- where A.zip contains B.zip contains A.zip etc. -- I bet there are systems out there which only do one tier of checking.
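For what it's worth, here's a minimal sketch (Python, standard library only) of what multi-tier checking could look like: hash every archive along the current unpacking path, so both the plain quine and the alternating A.zip/B.zip variant terminate. The recursive zip-walking is purely illustrative, not how any real engine is structured:

  import hashlib
  import io
  import zipfile

  def unpack_members(blob):
      # Yield the raw bytes of each member, if blob parses as a zip archive.
      buf = io.BytesIO(blob)
      if not zipfile.is_zipfile(buf):
          return
      with zipfile.ZipFile(buf) as zf:
          for name in zf.namelist():
              yield zf.read(name)

  def scan(blob, path_hashes=frozenset()):
      digest = hashlib.sha256(blob).hexdigest()
      if digest in path_hashes:
          # This exact archive already appears higher up the current chain,
          # so A.zip -> B.zip -> A.zip -> ... stops right here.
          return
      for member in unpack_members(blob):
          scan(member, path_hashes | {digest})

In practice you'd still want a depth cap on top of this, since a chain of distinct archives can be made arbitrarily deep without ever repeating a hash.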


There's a much simpler way; it's how C++ compilers deal with recursion in template instantiation, for example:

  for level = 1 to arbitrary_depth:
    do_some_work_which_may_produce_more_work()
You just have to select arbitrary_depth large enough that nobody notices, and there you have it - arbitrarily deep recursion without an infinite loop ;)

Totally wouldn't be surprised if MS did it this way too. Otherwise you are right - they would need to remember at least the hashes of all "outer" archives unpacked so far.
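
A hedged Python sketch of that fixed-budget version - the cap of 100 and the zip-only unpacking are placeholders, not anything MS is known to do:

  import io
  import zipfile

  ARBITRARY_DEPTH = 100  # "large enough that nobody notices"

  def scan_with_budget(blob, depth=0):
      if depth >= ARBITRARY_DEPTH:
          return  # quines simply exhaust the budget; no hashes to remember
      buf = io.BytesIO(blob)
      if not zipfile.is_zipfile(buf):
          return  # a leaf file - hand it to the actual scanner here
      with zipfile.ZipFile(buf) as zf:
          for name in zf.namelist():
              scan_with_budget(zf.read(name), depth + 1)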


Haha, that's how I like to write all my code now. Sometimes I'll write a function that generates a random id and checks the database to make sure it doesn't already exist. I always like to add a counter and throw an error (or just return some default value) if it gets up to 100 or so. It might never happen in this universe, but I like to imagine there's a parallel universe out there where I saved some server from going into an infinite loop.
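
Something like this, as a sketch - exists_in_db here is just a stand-in for whatever uniqueness check the database gives you:

  import secrets

  MAX_ATTEMPTS = 100  # should never be reached in this universe

  def generate_unique_id(exists_in_db):
      for _ in range(MAX_ATTEMPTS):
          candidate = secrets.token_hex(16)  # 128 random bits
          if not exists_in_db(candidate):
              return candidate
      raise RuntimeError(
          "no unique id after %d attempts - suspiciously unlucky" % MAX_ATTEMPTS)

In the happy path the loop runs exactly once; the counter only exists for the parallel universe.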

I actually did this recently for a random phone number generator, using the phony Ruby library to validate numbers. It was just for some test fixtures, but it's nice to know that it will always fall back to a default test number in case something goes wrong and it runs out of attempts. Or I'm in some universe where my random number generator suddenly starts producing an endless stream of zero bits.

This is just a little 'tic'. I also find myself using 12px and 14px a lot more frequently than 13px.


That would avoid getting stuck, but in an adversarial situation it hands the attacker an easy bypass: just use N+1 layers of packing.

... So it really needs to be semi-random :p


I wonder whether the zip-quine could be modified to insert additional garbage data with each iteration, to defeat any system that stops when it reaches a known hash. Similarly, could a zip-quine that loses data with each iteration be created? What about one where, after N iterations of junk data have been shed, the data loss mutates part of the quine mechanism, turning what was previously a block of junk into a valid file?

Could you devise a zip template that has space to insert random noise, giving each copy a unique set of hashes, plus space for a payload that is revealed after a configurable number of iterations? What about a variant where each iteration contains two copies, each with different junk-data changes?

The idea of a zip-quine and how it interacts with poorly-designed malware detection offers so many interesting hypothetical variations.



