This implementation is too naive. You can't just autocomplete queries; you have to tolerate typos and errors and suggest the corrected query. That's the hard part.
The usual approach to autocomplete is to build a trie data structure. A naive implementation won't have auto-correction, and it won't address the problems that more established implementations have already solved.
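For reference, a minimal sketch of the plain-trie approach (class and method names are illustrative, not from any particular library). Note that `complete` requires an exact prefix match, which is exactly why it offers no typo tolerance on its own:

```python
# Minimal trie-based autocomplete sketch. A hash-of-children node layout
# is simple but pointer-heavy, which is the memory concern raised below.
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def complete(self, prefix):
        """Return all stored words starting with `prefix` (exact match only)."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []     # one typo in the prefix and we find nothing
            node = node.children[ch]
        results = []
        def walk(n, acc):
            if n.is_word:
                results.append(prefix + acc)
            for ch, child in n.children.items():
                walk(child, acc + ch)
        walk(node, "")
        return results

t = Trie()
for w in ("redis", "rediscover", "resolve"):
    t.insert(w)
print(sorted(t.complete("red")))   # ['redis', 'rediscover']
```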
True, but tries do not automatically solve those problems, either, and most tries use lots of memory, especially on 64-bit systems (pointers, pointers everywhere!).
I'm currently implementing my own trie (for learning) for my own autocomplete module ... and I don't see how a trie (prefix tree) can solve the issues you just described.
When traversal of the prefix tree is blocked (the prefix no longer matches any entry in the structure), you perform edit operations: insert, delete, transpose, and substitute. If one of those yields a match, you keep descending the tree until you exhaust your edit-distance budget.
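A rough sketch of that idea, assuming a simple dict-based trie (all names here are illustrative). Each of the four edit operations spends one unit of the budget; exact character matches are free. This brute-force recursion is exponential in the budget, so real implementations typically use Levenshtein automata or per-node DP rows instead:

```python
# Typo-tolerant lookup over a trie with an edit-distance budget.
class Node:
    def __init__(self):
        self.children = {}   # char -> Node
        self.is_word = False

def build(words):
    root = Node()
    for w in words:
        n = root
        for ch in w:
            n = n.children.setdefault(ch, Node())
        n.is_word = True
    return root

def fuzzy_search(root, query, budget):
    """Return {word: edits_used} for stored words within `budget` edits of `query`."""
    out = {}
    def note(word, cost):
        if cost < out.get(word, budget + 1):
            out[word] = cost
    def go(n, q, b, path):
        if not q and n.is_word:
            note(path, budget - b)
        if b > 0:
            for ch, child in n.children.items():
                go(child, q, b - 1, path + ch)            # insert a trie char
            if q:
                go(n, q[1:], b - 1, path)                 # delete a query char
                if len(q) >= 2 and q[0] != q[1]:
                    go(n, q[1] + q[0] + q[2:], b - 1, path)  # transpose
        if q:
            for ch, child in n.children.items():
                if ch == q[0]:
                    go(child, q[1:], b, path + ch)        # exact step, free
                elif b > 0:
                    go(child, q[1:], b - 1, path + ch)    # substitute
    go(root, query, budget, "")
    return out

root = build(["redis", "rediscover", "nginx"])
print(fuzzy_search(root, "regis", 1))   # {'redis': 1} via substitution
```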
I love Redis, nginx, and Lua and wrote my own autocomplete implementation using Metaphone (e.g. https://github.com/threedaymonk/text) but after a while it became clear that what I really wanted was an instance of Solr.
nginx+Lua can be exceptionally fast. It's the combination behind OpenResty, which is a consistent top performer in the TechEmpower Web Framework Benchmarks (http://www.techempower.com/benchmarks/).
Webdis is a neat project, but I'd be a bit wary of deploying a custom HTTP server written solely for Redis interactions. Nginx is battle-hardened.