
> it's technically rational to do just that

It's rational only in the same way that Vulcans are always logical (as Star Trek writers would have you believe).

Rationality is a tool to help you achieve your goals and realize your values better. Any particular method can only ever be rational relative to some goal one is seeking.




I think you're both talking past each other a bit. The difference is between maximizing short-term gains and maximizing long-term gains.

The thing about investing is that you can maximize short-term gains and get out before the investment craters. So exploiting a non-sustainable business model is fine if you can get out before the bill comes due. This is rational, but not sustainable.

If you're running the company and you care about it and its employees, then you want something sustainable in the long term.
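
To make the short-vs-long-term point concrete, here's a rough sketch (Python, with invented cash flows; "extract" and "sustain" are just labels, not anyone's actual numbers). The same two strategies rank differently depending on how heavily you discount the future:

    # Hypothetical per-year payoffs: "extract" pays well early, then craters;
    # "sustain" pays less up front but keeps paying.
    extract = [100, 80, 40, -200, -200]
    sustain = [30, 30, 30, 30, 30]

    def present_value(flows, discount):
        # Sum of payoffs weighted by a per-period discount factor in (0, 1].
        return sum(f * discount ** t for t, f in enumerate(flows))

    for d in (0.5, 0.95):
        print(d, present_value(extract, d), present_value(sustain, d))

    # With a steep discount (0.5) the extractive strategy comes out ahead
    # (112.5 vs 58.1); with a patient discount (0.95) it loses badly
    # (-122.3 vs 135.7). Each choice is "rational" relative to its horizon.

An investor who can sell before the crater effectively gets the steep-discount version of this; someone who stays with the company does not.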


Actually, the comment you're replying to is querying what "gains" means - i.e. what you're gaining.


Yeah, but for most people $$$ is a very important goal and, when sufficiently large, overrides most others, especially other goals in the professional space.


> It's rational only in the same way that Vulcans are always logical (as Star Trek writers would have you believe).

But Star Trek writers wouldn't generally have you believe that (they'd have you believe that some Vulcans believe it themselves, want others to believe it, or aspire to it).

> Rationality is a tool to help you achieve your goals and realize your values better.

Rationality isn't a tool, it is perfectly optimal goal seeking.

> Any particular method can only ever be rational relative to some goal one is seeking.

Well, no, it's relative to the utility-weighted combination of all of your goals; that is, rational behavior is whatever behavior optimizes your complete utility function, not just addresses one of your goals in isolation.
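
A tiny sketch of that "utility-weighted combination" (the goals, weights, and scores below are all invented for illustration): an option can win on one goal in isolation and still lose once everything is weighed together.

    # Hypothetical goals with weights that sum to 1; the rational choice is
    # the option maximizing the weighted sum, not the one best on any single goal.
    weights = {"money": 0.5, "reputation": 0.3, "free_time": 0.2}

    options = {
        "chase_short_term_profit": {"money": 9, "reputation": 2, "free_time": 3},
        "build_sustainably":       {"money": 6, "reputation": 8, "free_time": 6},
    }

    def utility(scores):
        return sum(weights[g] * scores[g] for g in weights)

    best = max(options, key=lambda name: utility(options[name]))
    print(best)  # build_sustainably: 6.6 vs 5.7, despite scoring lower on money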


> Rationality isn't a tool, it is perfectly optimal goal seeking.

I think you just agreed with the comment you're trying to disagree with. To paraphrase your comment: "rationality is a tool for achieving one or more goals".

If not a tool, what would you describe it as? Animal instincts - the kind we're hard-wired with - aren't "rational" in the context of human society so being rational is in some sense a choice. The only way to define this away is to extend the definition such that all human behaviour is "rational" because it's meant to fulfill some conscious or unconscious goal. But this definition is useless!


> I think you just agreed with the comment you're trying to disagree with.

You are wrong.

> To paraphrase your comment: "rationality is a tool for achieving one or more goals".

No, “rationality is a tool...” is not a correct paraphrase of “rationality is not a tool...”.

Rationality isn't a tool for optimization, it is simply a label for perfect optimization of the utility function.


Utility weighting is fine and good, but rationality in the long term is definitely constrained by uncertainty. Rationality cannot be perfectly optimal, because perfection would require closed-ended choices. If you don't know the size of a factor like uncertainty, you cannot rationally assign it a utility weight.
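
One way to see the uncertainty problem (a sketch with invented probabilities and payoffs): expected-utility maximization needs a distribution over outcomes, and the "optimal" choice flips with numbers you may simply not have.

    # Expected utility of each action under an assumed outcome distribution.
    # If these probabilities are unknowable, the argmax below is undefined.
    outcome_probs = {"boom": 0.6, "bust": 0.4}

    payoffs = {
        "aggressive": {"boom": 10, "bust": -8},
        "cautious":   {"boom": 3,  "bust": 1},
    }

    def expected_utility(action):
        return sum(outcome_probs[o] * payoffs[action][o] for o in outcome_probs)

    print(max(payoffs, key=expected_utility))
    # p(boom)=0.6 -> aggressive (2.8 vs 2.2); p(boom)=0.4 -> cautious (1.8 vs -0.8).
    # The "perfectly optimal" action depends entirely on a number you had to guess.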


> Rationality cannot be perfectly optimal, because perfection would require closed ended choices.

Rationality is, by definition, perfectly optimal, though you describe succinctly why it cannot actually exist and can only, at best, be approximated.



