That seems very logical to me, but then, I’m not a functional programmer, I just like map. It’s elegant, compact, and isn’t hard to understand. Not that list comps are hard to understand either, but they can sometimes get overly verbose.
filter has also lost ground in favor of list comps, partially because Guido hates FP [0], and probably due to that, there has been a lot of effort towards optimizing list comps over the years, and they’re now generally faster than filter (or map, sometimes).
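For concreteness, the two spellings being compared here are interchangeable; a minimal, made-up illustration (not from the thread):

```python
# Same filtering written both ways: builtin filter vs. a list comprehension.
xs = [1, -2, 3, -4, 5]

with_filter = list(filter(lambda x: x > 0, xs))
with_comp = [x for x in xs if x > 0]

assert with_filter == with_comp == [1, 3, 5]
```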
First off, writing f3(f2(f1(x))) is painful: you have to keep track of the parentheses, and if you want to insert a function in the middle of the chain you have some bookkeeping to do.
Second, that's all good and well if all you want to do is map. But what if you need combinations of map and filter as well? You're suddenly dealing with nested comprehensions, which few people like.
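To make that concrete, here's a hedged sketch (the names f1/keep are made up) of what happens once a filter depends on a mapped value: you either compute the map twice or start nesting.

```python
f1 = lambda x: x * 2
keep = lambda x: x > 4

items = [1, 2, 3, 4]

# A single map + filter fits in one comprehension, but f1 runs twice per item:
flat = [f1(x) for x in items if keep(f1(x))]

# Avoiding the double call (or adding another stage) means nesting:
nested = [x + 4 for x in (f1(v) for v in items) if x > 4]
```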
An alternative in Python is to flip what you're iterating over at the outermost level. It's certainly not as clean as F#, but neither is it as bad as the original example if there are a lot of functions:
iter = [1,2,3,4]
f1 = lambda x: x*2
f2 = lambda x: x+4
f3 = lambda x: x*1.25
result = iter
for f in [f1, f2, f3]:
    result = [f(v) for v in result]
Then the list comprehension can be moved up to mimic more closely what you're doing with F#, allowing for operations other than "map":
result = iter
for f in [
    lambda a: [f1(v) for v in a],
    lambda a: [f2(v) for v in a],
    lambda a: [f3(v) for v in a],
]:
    result = f(result)
And a step further if you don't like the "result" reassignment:
from functools import reduce
result = reduce(lambda step, f: f(step), [
    lambda a: [f1(v) for v in a],
    lambda a: [f2(v) for v in a],
    lambda a: [f3(v) for v in a],
], iter)
Fair, but how would it look if you had some filters and reduces thrown in the middle?
In my F# file of 300 lines[1], I do this chaining over 20 times in various functions. Would you really want to write the Python code your way every time, or wouldn't you prefer a simpler syntax? People generally don't do it your way because it carries a higher mental burden than the simple syntax in F# and other languages does. I don't do it 20 times because of an obsession, but because it's natural.
[1] Line count is seriously inflated due to my habit of chaining across multiple lines as in my example above.
My example was just a way to do it with plain python and nothing special. There are libraries that use operator overloading to get more F#-style syntax.
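For the curious, the overloading trick those libraries use boils down to something like this (a hypothetical Pipe class, not any particular library's API):

```python
# Minimal sketch: wrap the value and overload | so stages chain left to right.
class Pipe:
    def __init__(self, value):
        self.value = value

    def __or__(self, fn):
        # pipe | fn applies fn and stays wrapped for further chaining
        return Pipe(fn(self.value))

result = (Pipe([1, 2, 3, 4])
          | (lambda a: [v * 2 for v in a])
          | (lambda a: [v + 4 for v in a])
          | (lambda a: [v * 1.25 for v in a])).value
```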
I think we can just let this rest. These kinds of operations are not as ergonomic in python. That's pretty clear. No example provided is even remotely close to the simplicity of the F# example. Acquiesce.
The fact is the language just works against you in this area if you have to jump through hoops to approximate a feature other languages just have. And I don't even mean extra syntax like F#'s pipe operators (although I do love them). Just swapping the arguments so you could chain the calls would look a lot better, if a little LISPy. It really is that bad.
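One reading of that "a little LISPy" alternative, sketched with the thread's f1/f2/f3: nesting builtin map calls does chain, but it reads inside-out rather than left to right.

```python
f1 = lambda x: x * 2
f2 = lambda x: x + 4
f3 = lambda x: x * 1.25
items = [1, 2, 3, 4]

# Applies f1 first, then f2, then f3 -- but you read it right to left.
result = list(map(f3, map(f2, map(f1, items))))
```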
Generators require a __next__ method, a yield statement, or a generator expression. What you've got is lambdas and a list comprehension. Rewriting using generators would look something like:
items = [1,2,3,4]
gen1 = (x*2 for x in items)
gen2 = (x+4 for x in gen1)
gen3 = (x*1.25 for x in gen2)
result = list(gen3)
It's nicer in a way, certainly closer to the pipe syntax the commenter you're replying to is looking for, but kind of janky to have to name all the intermediate steps.
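The naming can be avoided with a tiny helper; a sketch (the pipe function here is hypothetical, just functools.reduce threading a value through stages):

```python
from functools import reduce

def pipe(value, *stages):
    """Thread value through each stage function, left to right."""
    return reduce(lambda acc, stage: stage(acc), stages, value)

result = pipe(
    [1, 2, 3, 4],
    lambda it: (x * 2 for x in it),
    lambda it: (x + 4 for x in it),
    lambda it: (x * 1.25 for x in it),
    list,  # materialize the final generator
)
```

The intermediate generators stay anonymous and lazy; only the final list stage forces evaluation.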
[0]: https://www.artima.com/weblogs/viewpost.jsp?thread=98196