2000 square dollars? If I'm choosing between ways to spend capital so as to improve the efficiency of a process, and that process currently produces five widgets per dollar, then the quantity I'm comparing to choose between my courses of action can be measured in widgets per square dollar.
Hiring a better engineer for more money may create an efficiency improvement of 1 widget per dollar, with an outlay of $1k extra for the better engineer, giving a total gain of 0.001 widgets per square dollar; hiring a worse engineer for much cheaper may represent an improvement of 0.1 widgets per dollar, at an outlay of $1, giving 0.1 widgets per square dollar.
Perhaps not the most intuitive unit, but it's not impossible. (Though since you can even measure it in dollar-sterling if you like, I suppose that does make it a counterexample to "stop multiplying dollars by sterling".)
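To make the comparison above concrete, here's a minimal sketch in Python (the numbers are the ones from the example; the variable names are mine):

    # Comparing two ways to spend capital by "efficiency gain per dollar spent",
    # whose natural unit is widgets / dollar^2.
    better_engineer_gain = 1.0     # widgets per dollar of process efficiency gained
    better_engineer_cost = 1000.0  # dollars of extra salary

    worse_engineer_gain = 0.1      # widgets per dollar
    worse_engineer_cost = 1.0      # dollars

    # (widgets / dollar) divided by dollars = widgets per square dollar
    better = better_engineer_gain / better_engineer_cost  # 0.001 widgets/$^2
    worse = worse_engineer_gain / worse_engineer_cost     # 0.1 widgets/$^2

    print(f"better engineer: {better} widgets per square dollar")
    print(f"worse engineer:  {worse} widgets per square dollar")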
You have provided a real example of why one might need to express a square dollar, which is what I was looking for; thanks.
I wonder if the people who want to argue "types save you from bugs" see your example as very unwelcome, since they'd want to use "squared dollars" as an example of something nonsensical that should be flagged as a type error. I hope those people can reflect rationally on the limits of type systems in the real world.
A type system is merely a tool to encode information to help better model things. If you want to prevent multiplying dollars together, types can help you do that. If you want to enable multiplying dollars together, types can help you do that, too.
# oops my scalar has a unit
x unit * x unit = x unit^2
The value isn't in catching this line of code, since it's potentially valid; it's in catching the line of code where you pass the result to a function that expects a plain unit.
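To illustrate that, here's a minimal sketch using Python type hints (the class and function names are mine, and it assumes you run a static checker such as mypy or pyright): the multiplication itself type-checks fine, and the error only shows up where the squared value reaches something that expects plain dollars.

    # Encode the unit in the type: dollars * dollars is allowed and yields a new type.
    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Dollars:
        amount: float

        def __mul__(self, other: Dollars) -> SquareDollars:
            # Multiplying dollars by dollars is valid; the unit just changes.
            return SquareDollars(self.amount * other.amount)

    @dataclass(frozen=True)
    class SquareDollars:
        amount: float

    def pay_invoice(total: Dollars) -> None:
        print(f"paying ${total.amount}")

    price = Dollars(100) * Dollars(20)  # fine: 2000 square dollars
    pay_invoice(Dollars(100))           # fine
    pay_invoice(price)                  # runs, but mypy/pyright flag it: SquareDollars is not Dollars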
I agree, but the original article at dusted.codes hopes types will "prevent silly mistakes like multiplying $100 with £20".
I don't know what that author would think of multiplying $100 with $20, but my point is that this embrace of type systems is apparently not just about function interfaces; it also includes the operands of things like multiplication, and preventing that operation if the types are fishy.
For better or worse, hopefully they'd treat the two examples the same.
Using the example above, "multiplying $100 with £20" can be achieved just by making the better engineer a remote employee paid in pounds. It adds the exchange rate into the mix, so the math will change over time, but conceptually the math is the same as the "dollar * dollar" example.
There isn't a reason why you shouldn't write something like $100 * (£20 / £47) as "int dollars = 100 * 20 / 47;". Note that this expression associates to the left instead of the right, which can be the right thing to do when doing integer arithmetic. But it would not work with a strongly typed setup as in your example.
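To make the associativity point concrete, here's the same arithmetic in Python (using // for integer division, since the snippet above is C-style):

    # Left association: multiply first, then divide; truncation happens at the end.
    dollars_left = 100 * 20 // 47     # (100 * 20) // 47 == 2000 // 47 == 42

    # Right association: the integer division happens first and truncates to zero.
    dollars_right = 100 * (20 // 47)  # 100 * 0 == 0

    print(dollars_left, dollars_right)  # 42 0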
In my experience, trying to prevent accidental mistakes like this is a waste of time and often makes our lives miserable. Catching the rare bug by doing complicated work in the type system is not worth it when the bug would have been easy to find in normal code anyway.
It was just the first example off the top of my head. The expression above calculates the right thing, cast to int. In general, prescribing which units we can multiply and which we can't is extremely silly if you consider how we learn it in school. You can multiply anything and everything; simply take care of the units. There isn't an obvious reason why we couldn't have 2000 dollar-pounds as a transient value in a longer computation.
The real problem is that most type systems aren't fit to track the units automatically. Solution: don't beat yourself up; track the units in your mind / in comments / in variable names instead of the type system, and just get it right. It's not that hard: if you mix something up, that's usually the kind of bug that is noticed and fixed immediately.
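For instance, a tiny sketch of the units-in-names convention (the names and the exchange rate are made up):

    # Units tracked by naming convention and comments, not by the type system.
    price_usd = 100            # dollars
    rate_gbp_per_usd = 0.79    # pounds per dollar (made-up rate)

    price_gbp = price_usd * rate_gbp_per_usd  # dollars * (pounds/dollar) = pounds
    print(f"£{price_gbp:.2f}")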
Programmers will have immediate answers for you -- stated confidently as if to imply there is a spec somewhere, when in fact no spec exists and the programmer you're talking to is peddling their own bullshit as gold.
A dollar times a dollar is a dollar squared. You don't need a spec for that!
For example, if you have a random variable that's in dollars, its variance would have units of dollars squared. People consider the variance of dollar estimates all the time.
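As a toy calculation (the numbers are made up): the variance of a set of dollar amounts comes out in square dollars, and you take a square root to get back to plain dollars.

    import statistics

    # Revenue estimates, in dollars.
    revenue_usd = [1200.0, 1350.0, 980.0, 1500.0]

    variance = statistics.pvariance(revenue_usd)  # units: dollars^2
    std_dev = statistics.pstdev(revenue_usd)      # square root brings it back to dollars

    print(f"variance: {variance:.0f} square dollars, std dev: {std_dev:.2f} dollars")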
Honest question, what should multiplying $100 with $20 give?