
That Markov chain model operates on 4-grams by default. The RNN featured in the article generates output character by character, which is significantly more impressive. Here's a sample from the Markov chain model operating on 4-grams:

  Ther deat is more; for in thers that undiscorns the unwortune, 
  the pangs against a life, the law's we know no trave, the hear, 
  thers thus pause. 
The only reason the model occasionally seems to spell, and to coin Anglo-sounding neologisms, is that it operates on 4-grams: short character sequences get copied verbatim from the source text.
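For the curious, an order-4 character Markov chain of the kind described above can be sketched in a few lines of Python. This is an illustrative toy, not the actual model from the post; the names and the corpus are made up:

```python
import random
from collections import defaultdict

def build_model(text, order=4):
    # Map each `order`-character context to the list of characters
    # observed to follow it in the training text.
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, length=80, order=4, seed=0):
    rng = random.Random(seed)
    out = rng.choice(sorted(model))           # start from a random context
    while len(out) < length:
        followers = model.get(out[-order:])
        if not followers:                     # dead end: restart elsewhere
            out += rng.choice(sorted(model))
            continue
        out += rng.choice(followers)
    return out[:length]

corpus = ("To be, or not to be, that is the question: "
          "whether 'tis nobler in the mind to suffer "
          "the slings and arrows of outrageous fortune.")
model = build_model(corpus)
sample = generate(model, length=60)
```

Because every transition extends a 4-character context that literally occurs in the corpus, the output is stitched together from real fragments of the source, which is why it looks like it can "spell" even though it has no notion of words.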

Here's some character-by-character output from the same Markov chain model:

  T,omotsuo ait   pw,, l f,s teo efoat t hoy tha fm nwo   
     bs rs a h enwcbr lwntikh  wqmaohaaer ah es aer 
  mkazeoltl.etnhhifcmfeifnmeeoddssmusoat irca   
  do'ltyuntos sih i etsoatbrbdl


"do'lty untos sih i"

maybe the computer was drunk?


it's a completely legit invocation for awakening cthulhu.



