With all that it borrowed from Perl, I'm surprised Ruby didn't also borrow the defined-or operator `//=` (https://perldoc.perl.org/perlop#Logical-Defined-Or). (Or, if it did, I guess the author hadn't encountered it.)
That's an anti-pattern. In a properly lexically scoped language, an undefined identifier is erroneous, diagnosable before you even run the code.
If optional dynamic scoping is supported, it can make sense to deal with an undefined variable, but it is still something you would want to avoid. E.g. Common Lisp programs almost always assiduously define dynamic variables via a top-level defvar.
Testing for definedness is usually part of some hack that doesn't deserve its own syntax as a predefined language feature.
The basic idea is that programs should usually know which of their variables are defined, at what point in the program. It should be statically obvious, or failing that, at least dynamically obvious. The remaining situations can be valid, but are low-priority hacks.
(You can always have your macro for it, if you want it badly enough.)
For the environment variable example, we can use the regular or (which is much like ||), since a failed lookup yields nil, which is false:
1> (or (getenv "HOME") (getenv "LOGDIR") (error "You're homeless!"))
"/Users/kazinator"
2> (or (getenv "XXXHOME") (getenv "LOGDIR") (error "You're homeless!"))
** You're homeless!
** during evaluation at expr-2:1 of form (error "You're homeless!")
Looks like Perl made a mess of this simple thing, too.
I understand some languages have an "undefined" value for a variable that is actually defined, but that's pretty lame too; it gives rise to multiple ways of being false. Awk is like this. Awk knows every variable in the program before executing it. GNU Awk builds a global symbol table of all variables at compile time. So every variable is considered defined in the sense of being known to the implementation and existing. It just has a kind of undefined value initially that serves as an empty string or zero, and tests false, but is neither a number nor string. It's a pretty cockamamie design from the POV of larger programs, but does lead to very short one-liners.
These thoughts on proper typing discipline are a valuable perspective on good programming-language design, but I think that they probably don't help much for writing code in Ruby, which, I believe, doesn't behave this way. (I'm not a Ruby-ist, so can't speak authoritatively to it.)
Nonetheless, supposing that we're working in a language that behaves as you suggest—if memoization isn't built in, how would a user build it in a way that allows caching of the return value `false`? (For example, for testing whether a positive integer is prime, where getting that `false` value could be very expensive.)
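To make the pitfall concrete, here is a sketch in Ruby (the `expensive_prime_check` name is mine, just illustrative): with `||=`, a cached `false` is indistinguishable from a cache miss, so the expensive check reruns every time.

```ruby
def expensive_prime_check(n)
  # Stand-in for a costly primality test; 25 = 5 * 5 is composite.
  n > 1 && (2..Integer.sqrt(n)).none? { |d| (n % d).zero? }
end

calls = 0
cache = {}

2.times do
  # ||= treats a cached false the same as an absent entry,
  # so the check runs again on the second pass.
  cache[25] ||= begin
    calls += 1
    expensive_prime_check(25)
  end
end

calls  # => 2, not 1: the false result was never effectively cached
```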
The problem is identical to that of a hash table (or other associative array or dictionary) in which we must distinguish the presence of a false value from the absence of a value.
There are solutions for that. For instance, a lookup operation that returns the underlying dictionary entry, or nil if there is no entry. If there is an entry, we can dereference it: entry.value has the value. That's a solution not requiring any special type system constructs like Optional or sum types.
We can have a double lookup: if the lookup yields the ambiguous false value, then we can do another lookup like contains(hash, key) that has a boolean result.
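In Ruby, for instance, that second lookup can be `Hash#key?` (the `compute` helper here is hypothetical):

```ruby
def compute(n)
  # Hypothetical expensive predicate; returns false for composite n.
  n > 1 && (2..Integer.sqrt(n)).none? { |d| (n % d).zero? }
end

cache = { 25 => false }  # false was expensively computed and stored earlier

# The plain lookup is ambiguous: cache[25] and cache[99] are both falsy.
# key? supplies the missing boolean "is the key present?" answer:
value = cache.key?(25) ? cache[25] : (cache[25] = compute(25))
# value is false, and compute is not called again
```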
Some languages like Common Lisp have multiple return values. The gethash function in Common Lisp returns a second value which indicates whether the key is present. If the first value is nil, but the second value is t, then it indicates that the nil was actually pulled from the hash table.
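Ruby lacks multiple return values, but `Hash#fetch` with a block gives an equivalent disambiguation: the block runs only on a true miss, so a stored false (or nil) comes back untouched. A small sketch:

```ruby
cache = { 25 => false }

# Analogous to gethash's second value: fetch's block fires only
# when the key is genuinely absent.
hit  = cache.fetch(25) { :absent }   # => false   (key present, value false)
miss = cache.fetch(99) { :absent }   # => :absent (key not present)
```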
> Some languages like Common Lisp have multiple return values. The gethash function in Common Lisp returns a second value which indicates whether the key is present. If the first value is nil, but the second value is t, then it indicates that the nil was actually pulled from the hash table.
But this seems to be what you said was an anti-pattern:
> That's an anti-pattern. In a properly lexically scoped language, an undefined identifier is erroneous, diagnosable before you even run the code.
You were speaking there, presumably, of bare variables, not of hash entries, but Perl's defined-or works just as well on lookups, like `$a{ key } //= "Hi there!"`, in which case it is just a less-verbose version of the two-step lookup you propose.
ActiveSupport adds tons of great convenience methods to Ruby, and you can require it even outside of Rails! `blank?`, `present?`; date and time conversions like `1.month.from_now`; `except`; enumerable methods like `pluck`... It's just lovely.
That is not the same, because Object#present? is a test for nonblankness, not definedness. So, for example, the empty string `""` returns nil from #presence, which is exactly what we don't want when caching the result of a potentially expensive templating operation, external service query, etc.
The proper equivalent uses the defined? keyword, as in
defined?(@result) ? @result : @result = ""
which yields @result, assigning it only when it is not already defined. Note that defined? is a keyword, not an operator.
There is no shorthand for this expression, and there's less motivation for one: in Ruby, instance variables are better accessed through methods. If you're not the accessor code, the variable is none of your business, and if you are a memoizing accessor, that specialist role doesn't really warrant an operator just for golf. This isn't autovivification, but Perl already has that as an adjacent idea, and it all adds up to the //= operator making much more idiomatic sense and serving more purpose in Perl than it would in Ruby.
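Putting that together, a defined?-based memoizing accessor that correctly caches even a false result might look like this (class and method names are mine, not from the thread):

```ruby
class PrimeCheck
  def initialize(n)
    @n = n
    @runs = 0
  end

  attr_reader :runs

  # defined?-based memoization: unlike `@prime ||= ...`, this caches
  # a false result, so the slow test runs at most once.
  def prime?
    return @prime if defined?(@prime)
    @prime = slow_test
  end

  private

  def slow_test
    @runs += 1
    @n > 1 && (2..Integer.sqrt(@n)).none? { |d| (@n % d).zero? }
  end
end

check = PrimeCheck.new(25)
check.prime?  # => false, computed
check.prime?  # => false again, served from @prime; @runs is still 1
```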
It's not immediately obvious to me how one could use that to write a Ruby analogue of `a //= b`. If I understand correctly, it would still need to be the relatively verbose `a = a.present? ? a : b`. (I think even `a = a.presence || b` wouldn't work in case `a` was defined but nil, which is exactly the edge case the defined-or operator is meant to handle.)
It's a natural pattern, and it translates with obvious changes to Perl, but `x = defined?(x) ? x : y` is still longer than `x //= y`, so all I meant to say in my original comment was that I was surprised Ruby didn't borrow the latter from Perl.