I think we may be reaching a point where the line between mechanical and cognitive gets blurred slightly. While I feel that this falls on the mechanical end of the spectrum because I understand how the model was created, RNNs demonstrate some very fascinating emergent behaviors that will only become more complex as computational power increases.
I wouldn't say that this is cognitive, but it definitely learns to perform a skill in a fashion similar to intelligent beings, which is remarkable. Extrapolating that behavior to self-awareness isn't likely, however (though of course this could lead to a philosophical debate).
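For what it's worth, the "mechanical" part is easy to make concrete: a single step of a vanilla RNN is just matrix multiplies, an add, and a pointwise nonlinearity. A minimal sketch in Python, with toy dimensions and random weights purely for illustration (not the actual model under discussion):

    import numpy as np

    # Toy dimensions, chosen arbitrarily for illustration.
    input_size, hidden_size = 3, 5

    rng = np.random.default_rng(0)
    W_xh = 0.1 * rng.standard_normal((hidden_size, input_size))   # input-to-hidden weights
    W_hh = 0.1 * rng.standard_normal((hidden_size, hidden_size))  # recurrent weights
    b_h = np.zeros(hidden_size)                                   # hidden bias

    def rnn_step(x, h):
        # One step of a vanilla RNN: the entire "mechanism" is this one line.
        return np.tanh(W_xh @ x + W_hh @ h + b_h)

    h = np.zeros(hidden_size)                       # initial hidden state
    for x in rng.standard_normal((4, input_size)):  # a toy sequence of 4 inputs
        h = rnn_step(x, h)
    print(h)

Everything interesting a trained network does comes from repeating that step with learned weights; nothing in the mechanism itself looks cognitive.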
Couldn't this be seen as a kind of motor cognition (https://en.wikipedia.org/wiki/Motor_cognition)? Isn't this essentially what motor cognition does when we write? Perhaps this is lower-level than that, but it seems to me this neural network embodies the 'muscle memory' we develop when we practice handwriting.
What happens in a biological neural network and what happens in a semiconductor device that we can choose to interpret as simulating the computational characteristics of a neural network (ignoring EM emissions, chemistry, etc.) are mechanically very different. If they both generate cognition (including the subjective aspects implied by that word), I would be very surprised. But that doesn't preclude the artificial neural network from mapping inputs to outputs in a way that correlates well with the input-output behavior we observe in biological neural networks (again, ignoring a shit-ton of input and output that we don't consider part of the "computation").
Cognition is not "merely mechanical" by definition. If it were mechanical, there would be no special word to distinguish it from other reactions of the natural world.
So I guess you're fine with calling books "merely paper" because they are a "subset" of paper-based products? And you're fine with calling people "merely animals" because they're a "subset" of animals?
Human language is not based on set theory. There is a huge difference in connotations here.
Context is important, and the word "merely" can be interpreted in multiple ways. In the first case in this thread, I understood "merely mechanical" to mean "only mechanical," i.e. containing nothing that is not mechanical. By that definition, humans are merely animals, because there is nothing about them that is not animal. Books are a bad example, because they do contain things other than paper (ink, glue, maybe a plastic or leather cover).
You seem to be using "merely" to mean "nothing more specific than," which is a very different meaning and would not be appropriate when talking about things that are subsets of other things.
If you care about context, I shall remind you that the original question was:
"Are we sure what is being simulated here is cognitive? It seems to me merely mechanical."
This is analogous to asking "are you sure this is done by humans rather than animals?" The question makes sense and is valid. To answer it with "humans are merely animals" would not address the subject at hand and, again, would have specific connotations.
I see that this kind of semantic acrobatics is extremely common in discussions of AI on HN.