If P is a built-in function taking a single argument, or one that's already been defined, then no. That's often not the case, though, and writing a trivial lambda is annoying, and usually less efficient, the rest of the time.
I think this is a case where the choice of syntax (and a bit of semantics) did affect the choice of API. I'm not a Python programmer, but I can see from your second example why they would prefer comprehensions to filter.
If there were currying or some convenient partial application syntax in the language, using 'filter' could be a more natural choice. In Gauche Scheme, pa$ is a built-in partial application, and I'd write:
In general I'd rather write this in Gauche, for a regexp object is applicable:
(filter #/\.py$/ (sys-readdir some-dir))
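A rough Python analogue of the pa$ idea (a sketch, not from the original comment), using functools.partial so the predicate can be handed straight to filter:

```python
import re
from functools import partial

# Partially apply re.search to get a one-argument predicate,
# roughly what pa$ does in Gauche.
is_py = partial(re.search, r'\.py$')

names = ['setup.py', 'README.md', 'main.py']
print(list(filter(is_py, names)))  # ['setup.py', 'main.py']
```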
So, the more ways you have beyond the bare lambda to create a higher-order function (currying, partial application, promoting objects to something applicable, etc.), the more useful functions such as filter, fold, etc. become. If you need to fall back to lambda most of the time, it's natural that using those functions becomes annoying.
I've found that one of the most important things to eliminate from a program is unnecessary names. Introducing a variable is more than 1 token's worth of conceptual load.
I wish there were an easy syntax (in Python) not only for map and filter but for something like reduce/foldr/foldl as well. You still have to write loops to do this (or use reduce).
Me too. I think the introduction of the functional module will actually be good, because there's less resistance to adding clever features for higher-order functions in the standard library than there is in the core language (in Python, at least, since it aims to be small and beginner-friendly). Look at the itertools module to see how that turned out -- lots of useful tricks borrowed from the ML family, and you can "buy it by the yard".
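For the fold case specifically, a small sketch of what the library already offers: functools.reduce is a left fold, and itertools.accumulate yields the running folds.

```python
from functools import reduce
from itertools import accumulate

nums = [1, 2, 3, 4]
# Left fold: ((1 + 2) + 3) + 4
print(reduce(lambda acc, x: acc + x, nums))  # 10
# Running folds of the same operation
print(list(accumulate(nums)))                # [1, 3, 6, 10]
```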
It's a question of habit. If you're experienced with Python list comprehensions, they become clearer than anything else.
"The main problem I see with Ruby is that the Principle of Least Surprise can lead you astray, as it did with implicit lexical scoping. The question is, whose surprise are you pessimizing? Experts are surprised by different things than beginners. People who are trying to grow small programs into large programs are surprised by different things than people who design their programs large to begin with.
For instance, I think it's a violation of the Beginner's Principle of Least Surprise to make everything an object. To a beginner, a number is just a number. A string is a string. They may well be objects as far as the computer is concerned, and it's even fine for experts to treat them as objects. But premature OO is a speed bump in the novice's onramp." Larry Wall, http://interviews.slashdot.org/article.pl?sid=02/09/06/13432...
You're quoting Larry "I've done more than anyone to fuck up programming languages" Wall as an authority on programmer clarity?!? Um... no?
Here are some examples from real Python (Twisted)... let's see how much clearer they are than anything else:
''.join([''.join(['\x1b[' + str(n) + ch for n in ('', 2, 20, 200)]) for ch in 'BACD'])
''.join([''.join(['\x1b' + g + n for n in 'AB012']) for g in '()'])
msgs = [os.path.join(b, mail.maildir._generateMaildirName()) for b in ('cur', 'new') for x in range(5)]
It is code like this that makes me run for the hills... Even ignoring the list comprehensions, I use join as an example of how Python gets everything backwards...
ary.join(', ') # ruby - oop: tell the array to join its elements with a str
join(', ', ary) # perl - fun: join ary with a str
', '.join(ary) # python - wtf: tell the str to join the elements of some other ary?!?
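For what it's worth (this rationale isn't in the comment above, and is only one commonly cited justification): join hangs off str partly so that it works with any iterable of strings, not just lists:

```python
# sep.join accepts any iterable of strings, which is one
# commonly cited reason the method lives on str, not list.
print(', '.join(['a', 'b']))                # a, b
print(', '.join(('a', 'b')))                # a, b
print(', '.join(str(n) for n in range(3)))  # 0, 1, 2
```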
I'm quoting Larry Wall because he said something interesting.
The comprehensions from twisted were not written for clarity so they aren't good examples. The programmer wanted a one-liner to define these constants.
Asking Larry Wall about good OO/language/library design, or PoLS, is like asking George Bush (either, really) about effective foreign policy or human rights...
Even if they say something interesting, is it valuable information coming from that source? Should it be trusted?
Yes, they did add that form after a while... (so much for one right way to do things, eh?). It seems that people still prefer the `sep.join(ary)` form, based on the code reading/debugging I've done.
"better order" is subjective and I have to disagree in this case. perl's form isn't limited like pythons as it takes any number of values after the separator. Much like ruby's "* arg" (splat arg--HN's formatting is besting me) or lisp's &rest. Lisp's (+ ...), (< ...), etc are lovely because of this property.
Twisted is an example of some of the most popular Python code out there... It is representative of real-world Python and is what drives me away from the language (and has me running screaming away from Twisted).
With filter you have to remember that the predicate is the first argument and the list is the second. With the list comprehension the syntactic cues make it obvious.
In some cases, it might be, such as when you're combining map and filter. Then the list comprehension just looks like pseudocode: include this in the result if these conditions are met. In that example, though, it isn't much clearer. And the 'x for x' seems sort of repetitive.
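A small sketch of the combined map-and-filter case, with made-up data: the higher-order version needs nested calls and the argument order, while the comprehension reads in one line.

```python
words = ['foo', '', 'bar']
# map + filter: nested calls, predicate-first argument order to remember
a = list(map(str.upper, filter(None, words)))
# The same thing as a comprehension
b = [w.upper() for w in words if w]
print(a)  # ['FOO', 'BAR']
print(b)  # ['FOO', 'BAR']
```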
I think this, and the P/S example, are both symptomatic of how Python (anthropomorphized) wants you to think about functions: they're things you stick parens after and call, not things you pass around on their own.
I find comprehensions to be especially useful in two situations:
1. Transforming a list using some non-standard function, often with some basic input checking:
[int(x)*2+1 for x in seq if x.strip()]
(Here, we're assuming each x is a string which can possibly be empty or only blanks; x.strip() filters out both before int() sees them)
2. Transforming elements from multiple lists together:
['Iter %d took %0.2f secs' % (i, t) for i, t in zip(iters, times) if t > 1.5]
(Where we want output strings only if some iteration took a long time)
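Both patterns above, runnable with hypothetical data (using strip() in the predicate so blank-only strings are also dropped):

```python
seq = ['3', '', '  ', '7']
print([int(x) * 2 + 1 for x in seq if x.strip()])  # [7, 15]

iters, times = [1, 2, 3], [0.4, 2.0, 1.6]
print(['Iter %d took %0.2f secs' % (i, t)
       for i, t in zip(iters, times) if t > 1.5])
# ['Iter 2 took 2.00 secs', 'Iter 3 took 1.60 secs']
```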
Although I'm not too fluent in Lisp, I think the equivalent formulations might be longer (since you'd have to write lambda, map, filter, and many parens), and perhaps harder to decipher.
Agreed. This example highlights the great savings of the [... _ ...] syntax in Arc (which I'd imagine is a fairly common use-case, especially for map, filter, etc.).
Although ironically in this case, I'd probably use the following:
I think the idea is that [x for x in S if P(x)] is more general; it can do more than just filter. Knowing about filter and how it works will take up a small part of your mental capacity (that could be used for something else, perhaps), and all it can do is filter. Why learn two ways to do the same thing, when one way is more general?
I think that's what Python hackers think, anyway. I use Perl :P
Is this widely believed by Python hackers?