
I felt it was a grave mistake for Python to direct the community effort into the backwards-incompatible Python 3, rather than focusing on the speed (and better profiling instrumentation) of Python 2 (perhaps adding a standard JIT compiler). As a result, had our company started now, we would have written our Twisted server in C or Go instead of Python, as we did.



Sadly I agree.

Looking back, Python 3 came too late, didn't offer much of an incentive to justify switching to it, divided the community, and in the process left the other things people worry about by the wayside -- performance, packaging, better concurrency handling (no, not via Tulip or Twisted).

I think, like you point out, that many people, before deciding to sink the time into upgrading to Python 3 to get better Unicode support for example, would also take the time to evaluate the alternatives and go with something else altogether -- Go, for example.


I still don't get the point of using Go instead of PyPy or Cython.


The point was that once people are looking at restructuring the code base and re-writing parts of it, it also becomes a point of evaluating other technologies and frameworks. So there is a high chance many will not switch Python 2 -> Python 3 but Python 2 -> Go or something else.

Then, well, ok, the answer to "What did Python 3 bring to the Python community?" becomes "Yeah, it ended up driving a lot of people away", which I am sure wasn't the intended purpose of Python 3.0.


So switching Python 2 -> Python 3 is a higher cost than Python 2 -> Go?!?


Yes, if you add in the amortized benefit of extra performance and static type checking at compile time.

benefit(2->3 + performance benefit + maintainability) < benefit(2->Go + performance benefit + maintainability)

The time cost of 2->3 will be less, but if I know I will get a better memory footprint and faster response times, I might be willing to invest some more time into it.

Or, another way to look at it: once someone sits down and rolls up their sleeves looking to re-factor/renew/refresh the code base, there is a good chance they will look around and see what else is out there.


Or something else, yes. If the cost of converting a large code base is potentially going to be big enough (keeping in mind testing might be a huge part of it - having to potentially re-test everything), and for very little benefit (well, Unicode possibly, but it's not like Python 3 is really any faster, and the GIL is still there), it makes you evaluate whether it's worth enduring a bit more transition pain to convert to something else completely, which would bring the possibility of new features / better speed.

It's also quite an issue for COTS software - sometimes, to jump forward and clean up APIs, they have to do a huge refactor, sometimes even a re-write of the API. If this is troublesome enough, it's worth taking the time to look into the competition.


I wouldn't consider rewriting in a completely different language, without any support for idiomatic Python code, a "bit more transition pain".

The GIL only exists in CPython; there are other implementations.


The GIL also exists in PyPy.


Because Go is at least as fast as PyPy or Cython, has a similar coding feel to Python, has language-level concurrency support, and has a clear/consistent development roadmap going forward.

PyPy is not really a good example here. It is still highly experimental, takes a long time to compile, isn't compatible with quite a few libraries that work in normal Python, and still has all the relative downsides of dynamic typing. To be frank, I don't get the point of using PyPy instead of Go.

Most Python libraries already use C to optimize critical components. If you can use just one language to get similar performance, why bother using two?


The issue is not about writing clean new programs in Go, but rather re-writing whole code bases from Python into Go, instead of trying out Python implementations that offer better performance than CPython.


Clean and efficient concurrency primitives, greater memory efficiency, and the ability to finely control memory layout, combined with just enough "high level" in the language to write complex systems. For that, you give up some (but not too much) of the rapidity of dynamic typing, and you don't have the marvelous ecosystem of libraries Python has built up. (yet)


You can get that with PyPy and Cython.


Clean and efficient concurrency primitives? I think not? I would be very interested if that is indeed the case.


PyPy supports greenlets and a Stackless Python-style CSP with tasklets and channels. See http://doc.pypy.org/en/latest/stackless.html . And it has for years. The original Stackless author is also one of the PyPy developers.
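For context, a minimal sketch of what that Stackless-style API looks like: two tasklets communicating over a blocking channel. It assumes the `stackless` module is importable, which is the case on Stackless Python and on PyPy builds that ship the stackless.py compatibility layer.

    # Minimal sketch of Stackless-style CSP: two cooperative tasklets
    # exchanging values over a blocking channel. Assumes the `stackless`
    # module is available (Stackless Python, or PyPy's stackless.py layer).
    import stackless

    def producer(ch):
        for i in range(3):
            ch.send(i)            # blocks until the consumer is ready to receive
        ch.send(None)             # sentinel: tell the consumer to stop

    def consumer(ch):
        while True:
            item = ch.receive()   # blocks until the producer sends
            if item is None:
                break
            print("received %d" % item)

    ch = stackless.channel()
    stackless.tasklet(producer)(ch)   # create and schedule both tasklets
    stackless.tasklet(consumer)(ch)
    stackless.run()                   # run the cooperative scheduler to completion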

Or do you mean some other style of concurrency primitives?


Stackless Python has very clean and efficient concurrency primitives, something a lot of people don't seem to know.


> Stackless Python has very clean and efficient concurrency primitives, ...

Sounds like a good topic for a post (or at least a link - hint, hint).


There are plenty of examples on http://www.stackless.com/ :)


I like Cython, but how do you do CSP (Communicating Sequential Processes) with it?


With PyCSP, as one possible example

https://code.google.com/p/pycsp/
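For anyone who hasn't seen the pattern, here is a rough CSP-style sketch in plain Python (which Cython can compile unchanged), using threads and a bounded queue standing in for a channel. This only illustrates the style; it is not PyCSP's actual API.

    # Rough CSP-style sketch: independent workers that share nothing and
    # communicate only through a channel (a size-1 queue approximates the
    # synchronous rendezvous of a CSP channel). Not PyCSP's API.
    import threading
    import queue

    def producer(channel):
        for i in range(5):
            channel.put(i)        # blocks while the channel is full
        channel.put(None)         # sentinel: end of stream

    def consumer(channel):
        while True:
            item = channel.get()  # blocks until something is sent
            if item is None:
                break
            print("got %d" % item)

    channel = queue.Queue(maxsize=1)
    workers = [threading.Thread(target=producer, args=(channel,)),
               threading.Thread(target=consumer, args=(channel,))]
    for w in workers:
        w.start()
    for w in workers:
        w.join()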


Cython is really impressive, but it produces incredibly huge C files, which then compile into enormous binaries. So it's not terribly suitable when code size matters.


Sure, because the binaries produced by Go's static linking are so small...

https://code.google.com/p/go/issues/detail?id=6853


Meanwhile, refactored PHP makes WordPress 20% faster. http://news.php.net/php.internals/73888 , https://wiki.php.net/phpng and https://news.ycombinator.com/item?id=7699322

HHVM and now PHP 5.7-dev, both with JIT - the future of PHP looks bright.

Starting a new project, we would choose PHP, Node.js or Go according to the requirements.


[deleted]


Please read the page again (search for "JIT") and follow the link for more info.


Yeah, so refactored PHP makes a PHP app 20% faster.

But you might not like to hear that Node's V8 engine is between 50 and 150 times faster than PHP in synthetic benchmarks, and about 20 times faster in more diversified applications (as long as they don't sit blocking for an eternity waiting on a DB call, which is the way people did web apps 10 years ago). That's roughly 2000% faster than PHP, just so we use the same units here.

I'm very happy PHP is getting faster and all, but just don't juxtapose it next to JS engines (which the article talks about), let alone something more serious like the JVM, because the "bright future" of PHP starts looking quite dim in comparison.


It depends; for things like WebSockets, Node.js is really good. But for other things, HHVM (Facebook's HipHop JIT) and the new PHP 5.7-dev with JIT are really fast.

Check out an independent benchmark for comparison: http://www.techempower.com/benchmarks/#section=data-r9&hw=i7...


Did you consider PyPy? It kinda works and it kinda focuses effort on being a faster replacement.


PyPy is good. I wish there was more momentum and more _pull_ for it from the main Python team. I feel the big push of the last 3-4 years should have been for PyPy (or for performance and tooling in general).


EDIT:

To expand, I remember feeling so excited during the PyCons in Chicago and then Atlanta. Unladen Swallow was making some progress, but PyPy was fantastic, with great speed improvements. Everyone looked in awe at you guys and your presentations. We talked about STM and all the great things.

And then... Guido announced, in a rather annoyed tone, that PyPy would never be a part of the main Python distro and that was that. "Stop dreaming, folks." And that made me pretty disappointed.

Then the drumbeat for Python 3.0 started. Unicode, unicode, unicode... it got louder and louder.

The wall of shame for "incompatible" libraries that were not being ported to Python 3.0. Lots of how-to-migrate-to-Python-3 talks. Lots of "oh, look at the great view iterators".

Things that would make me get off my seat and start porting to Python 3 would be: greenlet integration for low-memory concurrency with green threads (gevent and eventlet), or a great CPU speed improvement with PyPy included. Notice what is not on the list -- Unicode, view iterators, and renaming the Queue module to queue. Sorry if I am being too sarcastic here; it is because I am bitter about it.
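To illustrate the green-thread style meant here, a minimal gevent sketch (the eventlet equivalent looks nearly identical); spawning greenlets is cheap in memory compared to OS threads.

    # Minimal gevent sketch: lightweight green threads (greenlets) that
    # yield cooperatively on blocking operations like sleep or network I/O.
    import gevent

    def worker(n):
        for step in range(3):
            gevent.sleep(0.1)                 # cooperative yield point
            print("worker %d step %d" % (n, step))

    # Thousands of greenlets fit in a small amount of memory.
    jobs = [gevent.spawn(worker, n) for n in range(5)]
    gevent.joinall(jobs)                      # wait for all greenlets to finish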

It is also important to realize that Python 2.0 is just so good that it becomes hard for Python 3.0 to beat it. There were no big hairy warts that really bothered me about it. So it isn't that Python 3.0 is bad, it is not; it is just not enough better than Python 2.0.


I think you're a little hard on Guido. CPython had amassed quite a few C extensions by that point. PyPy changes the low-level API in numerous significant ways. If Guido were to merge CPython with PyPy it would break all of those extensions. These extensions are in no small part the reason why Python is so popular. If he were to break all of this work, he'd piss off quite a few people. Many of the same people that helped him make Python what it is today.

Now, I agree with you that having PyPy's performance improvements in the main implementation would be awesome, and PyPy itself is quite impressive. If they could get both NumPy running on PyPy and support for Python 3's changes, it would keep Python a strong competitor for the foreseeable future.

And to your last point, I entirely agree that Python 2.0 had so many strengths that it's really hard to improve upon it. Though I think that its Unicode support was the big hairy wart.


> I think you're a little hard on Guido

Perhaps you are right. My bitterness was coming through. It's not just what he said but how he said it.

In a way, Python 3.0 also broke compatibility for what, I think, are not very good reasons. Some of that code was from people/companies who had adopted and used Python and made it popular. They are now living on a dead-end, maintenance-only branch.


Read http://python-notes.curiousefficiency.org/en/latest/python3/... and tell me if you're still not convinced.


Read it, still not convinced. Have you read what I wrote, though? I think I was pretty clear that I knew about the changes and knew about the other things (iterator views), etc.

None of those things make me want to get up, roll up my sleeves, and say "I can't wait to dig in and make my code compatible with 3.4". If I woke up tomorrow and someone had magically done that for me and tested it, yeah, great, I would buy them a beer. But that is about it. I have zero incentive today to upgrade.

The code works very well now, just using the Unicode support from Python 2 and print as a statement + other warts.


Yes we have: May 2 16:58:35 db2 kernel: [11106.666665] pypy[5672]: segfault at 18 ip 00000000010d9f39 sp 00007fff73204fa8 error 4 in pypy[400000+11dd000]


Did you post a bug report? Do you have a way to reproduce it? Did you check the newest PyPy?


I feel quite the other way round. They should have cleaned up their language before so many tools were developed on it.



