Bjoern's comments

Not quite what you want, but in the same spirit as Mutt, there is Newsbeuter.

http://newsbeuter.org/


Maybe a good idea to study this then; it could be useful to drive your understanding of the problem domain.

http://sirius.clarity-lab.org


OK, thanks. As an aside, I watched the Sirius video and kept craving a demo :)


More on the topic of dancing robots: HRP-4C from AIST & Kawada.

https://www.youtube.com/watch?v=xcZJqiUrbnI



Here is a well-balanced explanation of why ECC RAM matters. That being said, if the RAM is the weakest link, it doesn't matter what underlying FS is used.

https://pthree.org/2013/12/10/zfs-administration-appendix-c-...

In my experience, 16GB is only needed if you want to run deduplication.


> "if the RAM is the weakest link it doesn't matter what underlying FS is used"

When is this ever the case, though? Even non-ECC RAM is more reliable than hard drives except when a RAM chip has died, which most ECC systems don't handle well either. Full chipkill capability is pretty rare.


As I understand the reasoning, it's that ZFS checksumming creates a risk of false positives where non-checksumming filesystems wouldn't have them. So without ECC RAM you're trading the risk of losing data to your IO chain for the risk of losing data to increased RAM activity. With ECC RAM it's an unambiguous win.


It sounds to me like there must be some situation where ZFS reads data from disk, verifies the checksum, keeps it in memory where it could get corrupted, then recomputes the checksum on the corrupted version and writes either the corrupted data or the incorrect checksum to disk but not both, so that the data on disk is now guaranteed to not match the checksum on disk.

Without something like the above sequence of events, I don't see how ECC eliminates any class of errors, rather than just reducing their probability. So what actually triggers such a chain of events?
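The chain of events described above can be sketched as a toy simulation. This is not ZFS code, just an illustration of the hypothetical sequence, using SHA-256 as a stand-in for whatever checksum the filesystem uses:

```python
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Simulated on-disk state: a block and its (correct) checksum.
disk_data = b"important block"
disk_sum = checksum(disk_data)

# 1. Read and verify: the checksum matches, so the data is trusted.
in_memory = disk_data
assert checksum(in_memory) == disk_sum

# 2. A bit flips in non-ECC RAM while the block sits in the cache.
buf = bytearray(in_memory)
buf[0] ^= 0x01
in_memory = bytes(buf)

# 3. The filesystem re-checksums the now-corrupted buffer and writes
#    both data and checksum back: the bad data validates cleanly.
disk_data = in_memory
disk_sum = checksum(in_memory)
assert checksum(disk_data) == disk_sum  # corruption is now undetectable
```

Note that step 3 silently "launders" the corruption: because data and checksum were rewritten together, the on-disk state is self-consistent and a later scrub has nothing to flag. ECC would have caught the flip at step 2.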



Panspermia has nothing to do with where the water came from.


http://www.nature.com/news/dna-has-a-521-year-half-life-1.11...

DNA has a 'half-life' of ~521 years. Panspermia may still be a viable theory, but it is very restricted in terms of stellar distance and time in transit.
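A quick back-of-the-envelope, assuming simple exponential decay with the 521-year half-life from the linked article: the fraction of intact DNA bonds after t years is 0.5 ** (t / 521).

```python
HALF_LIFE = 521.0  # years, per the Nature news piece

def fraction_remaining(years: float) -> float:
    """Fraction of DNA bonds still intact after the given time."""
    return 0.5 ** (years / HALF_LIFE)

# Even a "short" interstellar transit is crushing:
print(fraction_remaining(10_000))     # ~1.7e-6 after 10,000 years
print(fraction_remaining(1_000_000))  # underflows to 0.0 in a float
```

The transit times in question (millions of years between stars) sit so far out on this curve that essentially nothing survives, which is what restricts the theory.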


That's at (or near) STP, though.


Very true, but getting this data for a vacuum and a stellar-radiation environment is tough. Who knows what preserving effect space has on DNA. However, one data point is better than none.


What's STP?


Probably "Standard Temperature and Pressure":

http://en.wikipedia.org/wiki/Standard_conditions_for_tempera...


You should maybe tell them to remove their own "google cache" version as well...


The other way around. The link you gave is actually a duplicate of this post.


Here is the HN post from yesterday that also points to the NYTimes article: https://news.ycombinator.com/item?id=8621658


OK – this post has a higher ID than the other one; I assumed they were assigned sequentially.


How does Terraform compare to e.g. Mesos? Or am I comparing apples and oranges here?


