Neural Symbolic Machines: Learning Semantic Parsers with Weak Supervision (arxiv.org)
94 points by fitzwatermellow on Nov 2, 2016 | 8 comments



I know that the field of deep learning / machine learning generally moves too fast for researchers to target conferences or journals, but several of the citations in the PDF of this article are missing (LaTeX rendered them as [?]).

(GRU, in case anyone is wondering, stands for "gated recurrent unit" and is a building block of standard LSTMs)

EDIT: now that I've finished the paper, I've realized that the citations are straight-up missing. That's no good, but I'm sure the authors just messed up the arxiv upload. If OP knows them, they should let them know... failing to include any citations at all is a quick way to decrease the credibility of an article.


A GRU is more like a simplification of the ideas in LSTM, rather than a building block. At a high level, it uses the hidden state as the memory of the cell (rather than a separate cell state) and it uses a single "update" gate, merging the forget and input gates. Overall it performs similarly to LSTM while being more computationally efficient (fewer matrices).
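For concreteness, here is a rough NumPy sketch of one GRU step (biases omitted; the variable names and the (1 - z) / z convention are my own, and some papers swap the two). Note there are only three weight-matrix pairs, versus the four an LSTM needs on top of its separate cell state:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
        # The hidden state itself serves as the memory; no separate cell state.
        z = sigmoid(Wz @ x + Uz @ h_prev)             # update gate (merged input/forget)
        r = sigmoid(Wr @ x + Ur @ h_prev)             # reset gate
        h_cand = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate hidden state
        return (1 - z) * h_prev + z * h_cand          # interpolate old and new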


FWIW, my guess is that a lot of the novel stuff that has been released in the last week is because of the impending ICLR deadline (Friday). The review process for that conference allows the papers to be updated until the reviewers' decisions are made. So getting the text 'finalized' isn't an essential step right now.


I've let them know already. I think it is just a LaTeX compilation issue.


If you look at the source, you see that the \cite{} statements do reference meaningful anchors, but the BibTeX file seems to be missing from the LaTeX archive:

https://arxiv.org/format/1611.00020v1
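As a rough illustration of the symptom (hypothetical citation key and bibliography file name): if the .bib file named in \bibliography{} is absent, the document still compiles, but every \cite{} comes out as [?] in the PDF:

    \documentclass{article}
    \begin{document}
    % refs.bib missing from the archive: still compiles, but prints "[?]"
    Semantic parsing with weak supervision \cite{liang2016nsm}.
    \bibliographystyle{plain}
    \bibliography{refs}
    \end{document}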


A GRU is a simplified version of the LSTM, not a building block.


Hi, I am Chen Liang, the first author of the paper. Thanks for pointing out the LaTeX problem, and sorry for the inconvenience.

We are trying to fix the LaTeX problem and will submit a replacement to arXiv soon. In the meantime, we have hosted the PDF version of the paper at another link:

https://www.researchgate.net/profile/Chen_Liang14/publicatio...

Thanks, and we look forward to your feedback and suggestions :)


Interesting. It wasn't until I reached the end of the paper and re-read the top that I noticed this "Liang" isn't actually Percy Liang (with whom Berant has collaborated a lot in the past).



