So your problem is with the use of unbounded quantifiers that range over all natural numbers?
So for example you would consider "∀x. ∀y. x + y = y + x" a nonsense statement because we are quantifying over all natural numbers, and there are an infinite number of natural numbers, so we cannot quantify over them?
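For what it's worth, proof assistants deal with exactly this situation: a statement quantifying over infinitely many naturals is established by a single finite proof (induction), not by checking each case. A minimal sketch in Lean 4, assuming the standard library's `Nat.add_comm`:

```lean
-- Commutativity of addition, quantified over ALL naturals.
-- The proof is a finite object even though the claim covers
-- infinitely many instances.
theorem add_comm_example : ∀ x y : Nat, x + y = y + x :=
  fun x y => Nat.add_comm x y

-- Any particular instance follows by instantiating the quantifiers:
example : 3 + 5 = 5 + 3 := add_comm_example 3 5
```

So "quantifying over an infinite domain" does not require surveying infinitely many cases; it requires a finite derivation from the axioms.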
(For the record, the ' in 0' or x1' is postfix notation for the successor operation, so 0' denotes 1 and 0'' denotes 2. See http://tachyos.org/godel/proof.html for details.)
I don't know what the problem with that particular long statement is.
You might just say that a statement is nonsensical if it is not provable but not false either.
This might be a sensible 'stack overflow' exception if we really are unable to place reasonable limits on self-reference and on reasoning that relies on infinities.
Nonsensical if it's not provable with respect to what theory exactly? Elementary Function Arithmetic? Primitive Recursive Arithmetic? Peano Arithmetic? Martin-Löf type theory? ZF set theory? ZFC + "there exist infinitely many Woodin cardinals"? "The set of true statements of number theory"?
Each of these logical theories proves an increasing number of arithmetic propositions. What is or is not provable is relative to the deduction system or the selection of axioms.
For example, that big expression that I linked to is designed so that it isn't provable in Peano Arithmetic, but it is provable in Martin-Löf type theory, ZF set theory, etc.