Angry Jesus writes:
We all know that Python is slower than compiled languages like C. But what can you do about it? Jake VanderPlas, director of research in the physical sciences at the University of Washington's eScience Institute, digs into Python's internals to explain how it works and what program design choices you can make to use Python efficiently.
Python gets a lot of serious use, though. As I recall, Dropbox is almost 100% Python (so much so that they are developing their own JIT [dropbox.com]). YouTube is also mostly Python (with ffmpeg for the video processing).
For good reason.
The same reasons, in fact, why you don't waste time optimizing an inefficient string-processing routine which, when you profile, accounts for 0.01% of your runtime. To turn that around: if Python lets your team hack out 80% of your application in 5% of the time, thanks to these features, that is an obvious, massive win.
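Before writing off a routine as "slow," you can just measure. A minimal sketch with the standard library's cProfile, using made-up function names to stand in for the string routine and the real hotspot:

```python
import cProfile
import io
import pstats

def cheap_string_cleanup(s):
    # Hypothetical rarely-called string routine -- not worth optimizing.
    return s.strip().lower()

def hot_loop(n):
    # Stand-in for where the runtime actually goes.
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
cheap_string_cleanup("  Hello  ")
result = hot_loop(200_000)
profiler.disable()

# Print the top entries by cumulative time; hot_loop dominates,
# cheap_string_cleanup barely registers.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

If the profile says the string routine is 0.01% of your runtime, you leave it alone and move on.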
In my experience (warning: Python dev, I'm biased, but also not sugarcoating) this is true. Python results in clearer, easier-to-maintain code. When I need serious performance there's always Cython or the various ways to hook into lower-level libraries, but the remaining >= 80% of the time, using Python was a clear and objective win for efficiency and maintainability.
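One common version of "hooking into lower-level libraries" is pushing the hot loop down into NumPy's compiled routines (an illustrative sketch, not the parent poster's actual workload; the numbers and timings here are just a demonstration):

```python
import time
import numpy as np

n = 1_000_000

# Pure-Python loop: interpreter overhead on every iteration.
start = time.perf_counter()
py_sum = sum(i * i for i in range(n))
py_time = time.perf_counter() - start

# Same computation done in NumPy's C loops.
start = time.perf_counter()
a = np.arange(n, dtype=np.int64)
np_sum = int((a * a).sum())
np_time = time.perf_counter() - start

print(f"pure Python: {py_time:.4f}s  NumPy: {np_time:.4f}s")
```

The point is that you keep the 80% of the code that benefits from Python's clarity and only drop down a level for the parts a profiler says actually matter.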
Ditto for Perl. What I really like is the powerful and easy pattern-matching capability. One can mix huge binaries with bit operations and laissez-faire operands that stretch arithmetic to string matching without boundaries.
As always, if you code to shoot yourself in the foot, the code will deliver your misery.
Clear, easy to maintain Python code? So, you're the guy.
Damn, meant Perl code. Doh.
The raw number crunching is slow, but the things I write in Python are network or disk access bound, not CPU bound. I can imagine a sufficiently large data set where that would shift, but in practice it's never happened.