cpython is faster than c++ under the following circumstances:
1) when you're using native libraries via python that were written by better c/c++ programmers than you are and you're spending most of your time within them
2) when you're using native libraries in python that are better implementations than the c/c++ libraries you're comparing against
3) when you don't know the libraries you're using in c/c++ (which is what they're talking about here)
...otherwise, if you're just doing basic control flow, c/c++ with an optimizing compiler will almost always be faster. (unless you're using numba or pypy or something).
point stands about the constants though. yes, asymptotic analysis will tell you how an algorithm's performance scales, but if you're just looking for raw performance, especially if you have "little data", you really have to measure, because implementation details of the software and hardware start to have a greater effect. (oftentimes the growth of execution time isn't even smooth in n for small n)
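to make the measure-it point concrete, here's a minimal micro-benchmark sketch (my own illustration, not from the comment above; the sizes and the linear-vs-binary comparison are assumptions): for small n an O(n) scan often beats an O(log n) binary search outright, and where the crossover happens is something you can only find by timing it.

    #include <algorithm>
    #include <chrono>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    long sink = 0;  // global accumulator so the compiler can't drop the searches

    int main() {
        for (std::size_t n : {8, 64, 512, 4096}) {
            std::vector<int> v(n);
            std::iota(v.begin(), v.end(), 0);  // sorted: 0, 1, ..., n-1

            // time one lookup strategy, averaged over many repetitions
            auto bench = [&](auto&& find) {
                auto t0 = std::chrono::steady_clock::now();
                for (int rep = 0; rep < 100000; ++rep)
                    sink += find(static_cast<int>(rep % n));
                auto t1 = std::chrono::steady_clock::now();
                return std::chrono::duration<double, std::nano>(t1 - t0).count() / 100000;
            };

            // O(n) scan vs O(log n) binary search: for small n the scan's tiny
            // constant (predictable branches, linear prefetch) can win outright
            double lin = bench([&](int x) { return std::find(v.begin(), v.end(), x) != v.end(); });
            double bin = bench([&](int x) { return std::binary_search(v.begin(), v.end(), x); });
            std::printf("n=%5zu  linear: %6.1f ns  binary: %6.1f ns\n", n, lin, bin);
        }
    }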
I think a key part here is also being realistic about the time available to write and optimize your program. I’ve seen Python completely crush C++ a fair number of times (order of magnitude or better) and it basically came down to the C++ programmer having bitten off more than they could chew, spending so much time on the basics that they never got around to optimizing algorithms or the data layout. (One variation: Python hashlib > whatever C textbook example you copy-and-pasted because you thought calling OpenSSL was too much work)
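For the parenthetical: "calling OpenSSL" really is only a few lines. A minimal sketch, assuming OpenSSL 1.1+ and linking with -lcrypto (my illustration, not code from the story above); a single EVP_Digest() call gets you the same tuned SHA-256 that CPython's hashlib typically wraps:

    #include <openssl/evp.h>
    #include <cstdio>
    #include <string>

    int main() {
        std::string data = "hello world";
        unsigned char md[EVP_MAX_MD_SIZE];
        unsigned int len = 0;

        // One-shot digest; returns 1 on success.
        if (!EVP_Digest(data.data(), data.size(), md, &len, EVP_sha256(), nullptr))
            return 1;

        for (unsigned int i = 0; i < len; ++i)
            std::printf("%02x", md[i]);
        std::printf("\n");
    }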
This is frustrating for programmers because everyone wants to focus on the cool part of a program and forgets how much the rest takes to write, debug, etc. There are many reasons why I prefer Rust, but one of the biggest is simply that having a standard package manager means you can get high-quality code as easily as in languages like Python or JavaScript, and you're more likely to avoid that pitfall of reinventing wheels because it initially seems easier than finding a library.
yeah, the c/c++ ecosystem never really had the benefit of an internet-connected, curated library community. afaik the first big example of that was perl in the late '90s. CPAN was awesome: here's this big library of awesome libraries, curated with full tests and documentation, that you can search and add to your system with a few easy cli invocations. (for the uninitiated, this was npm or pip for perl in the '90s, complete with dedicated wikipedian-level pedants gatekeeping/curating)
moreover, batteries-included scripting languages like perl, python, matlab, etc all tend to have the benefit of their core bits being best of breed. perl has/had one of the best regex engines out there, matlab ships a great blas, all compiled with the best optimizing compiler and ready to go. python was more generic i suppose, with fairly strong libraries for most things (strings, io, network io, etc).
aside from the microsoft nuget stuff, the c/c++ ecosystem never really had anything like that, other than boost, which was pretty tough to pull into a given project and didn't really have the community of people writing high-level libraries like the scripting languages did. that said, i often used to think it would have been interesting to build a language-agnostic platform for language-centered library communities. (a sort of generic cpan/pip/npm in a box for pulling down libraries and running tests for any language: a combination of build system, online community platform, and search engine)
but the real moral of the story: use the libraries, luke/lucia! also, know them!
CPAN definitely deserves more attention, especially for the emphasis on testing, which many successors still don't have. That was even more necessary back when OS consistency was worse, but it really should have been seen as a first-class feature.
I think C/C++ also had this problem with the whole cultural evolution around shared libraries. Because installs were non-trivial, I think there was an almost subconscious blinder effect where people restricted themselves to what shipped with the OS/distribution, even if that meant keeping a workaround for a bug which had been fixed upstream years before, because that was seen as better than static linking or bundling a newer version.
the pedantry around design and testing in CPAN is where i learned how to be a serious software engineer. perl was special in that it bridged the divide between software engineering and system administration: it was software engineering with the pedantic nature of high-strung sysadmins who insisted the garden be kept in perfect shape at all times.
C++ code usually uses std::string, std::string_view, or other string classes that store the string size and have some capacity pre-allocated to avoid such issues.
Profiling is essential. I found a performance bug a while ago in calling some C++ functions: they accepted a const std::string& and were being called in loops with a C const char*. Every single call had to construct a std::string, involving a strlen, an allocation, and a copy.
std::string_view is a nice fix for this, but few programmers seem to use it yet.
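To make that concrete, here's a minimal sketch of the bug and the fix (illustrative names, not the original code): a const char* argument silently constructs a temporary std::string on every call, while a std::string_view parameter avoids the allocation and copy.

    #include <string>
    #include <string_view>

    // Every call with a C string builds a temporary std::string
    // (strlen + allocation + copy, each time through the loop).
    bool contains_slow(const std::string& haystack, const std::string& needle) {
        return haystack.find(needle) != std::string::npos;
    }

    // string_view (C++17) accepts both std::string and const char* callers;
    // constructing it from a const char* still does a strlen, but there is
    // no allocation and no copy.
    bool contains_fast(std::string_view haystack, std::string_view needle) {
        return haystack.find(needle) != std::string_view::npos;
    }

    int main() {
        const char* line = "GET /index.html HTTP/1.1";
        long hits = 0;
        for (int i = 0; i < 1000000; ++i) {
            hits += contains_slow(line, "HTTP");  // two temporaries per iteration
            hits += contains_fast(line, "HTTP");  // zero allocations
        }
        return hits == 2000000 ? 0 : 1;
    }

One caveat: a std::string_view doesn't own its data, so it must not outlive the buffer it points into.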