
I don't know... but you make it sound like a bad thing. If you're out of memory, you kind of have to evict stuff, don't you?

I get the feeling you are kind of anti-Redis, and I don't get why. Redis is a very cool project and could be useful for a lot of things. It's not Redis's fault some people misuse it.




I am not at all anti-Redis. It's not at all Redis's fault it gets misused, either; I say so in the blog post, even.

So why would you compress something that you can only decompress if it's been recently used?

How would you do Mailinator with your strings in Redis, taking O(n) calls to Redis to recover them when decompressing an email, where n is the number of lines (or runs of consecutive lines, granted) in the email?
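Roughly, the scheme looks like this (a toy Java sketch of the line-dedup idea I'm describing; the class and the key choice are illustrative, not Mailinator's actual code). Decompression does one lookup per line, and fails outright if any shared line has been evicted:

    import java.util.*;

    public class LineDedup {
        // Shared store of distinct lines, keyed by a (toy) hash of the line.
        private final Map<Integer, String> lines = new HashMap<>();

        // "Compress" an email into a list of keys into the shared store.
        public List<Integer> compress(String email) {
            List<Integer> keys = new ArrayList<>();
            for (String line : email.split("\n")) {
                int key = line.hashCode(); // toy key; a real store needs a stronger hash
                lines.putIfAbsent(key, line);
                keys.add(key);
            }
            return keys;
        }

        // One lookup per line: O(n) round trips if the store is remote.
        // If an LRU store (e.g. Redis with maxmemory eviction) has dropped
        // a shared line, every email that references it is now unreadable.
        public String decompress(List<Integer> keys) {
            StringBuilder sb = new StringBuilder();
            for (int key : keys) {
                String line = lines.get(key);
                if (line == null) throw new IllegalStateException("referenced line evicted");
                sb.append(line).append('\n');
            }
            return sb.toString();
        }
    }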


"I am not at all anti-redis. Its not at all redis's fault it gets misused either; I say so that in the blog-post, even."

Yeah, you actually do... sorry.

"So why would you compress something that you can only decompress if its recently-reused?"

Not sure I understand your question, and I've only just started looking at Redis. I guess you could do it the same way, though the added latency may make that infeasible. But the better answer is probably that you don't: you would modify the implementation to fit Redis's (or whatever's) strengths and weaknesses.


Trying to work out how to fit Mailinator into Redis, rather than questioning whether Redis fits into Mailinator, is exactly the cargo-cult cool-kids zombism I was ranting against, though ;)

You really can process Mailinator quantities of email with a simple Java server using a synchronized hash map and linked-list LRU, and still have some CPUs left over for CPU-intensive opportunistic LZMAing.
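That LRU really is just a few lines of Java. A minimal sketch using LinkedHashMap's access-order mode (which is exactly a hash map threaded with a linked list); the capacity handling and generics here are illustrative:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class LruCache<K, V> {
        private final int capacity;
        private final Map<K, V> map;

        public LruCache(int capacity) {
            this.capacity = capacity;
            // accessOrder=true keeps entries ordered least-recently-used first
            this.map = new LinkedHashMap<K, V>(capacity, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                    return size() > LruCache.this.capacity;
                }
            };
        }

        public synchronized V get(K key) {
            return map.get(key); // also bumps the entry to most-recently-used
        }

        public synchronized void put(K key, V value) {
            map.put(key, value); // may evict the least-recently-used entry
        }
    }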

Trying to do it with an IPC TCP ping-pong for each and every line, though? I'm not sure you could process Mailinator quantities of email within any reasonable hardware budget...
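Back-of-envelope, with assumed numbers: at ~0.1 ms per local TCP round trip, reassembling a 500-line email costs ~50 ms of pure network wait (500 x 0.1 ms), versus well under a microsecond per in-process map lookup. That's several orders of magnitude of overhead per line before Redis does any actual work.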

Luckily you have a chance to see the error of your ways :)


Well, I never claimed Redis should be used for this; you are the one who asked how to do it.

But you have to remember that most people can't keep important data in just one process: it's going to crash, and then your data is gone. The LMAX guys solved this in a cool way, but I wouldn't call it easy: http://martinfowler.com/articles/lmax.html#KeepingItAllInMem...



