Angry Jesus writes:
We all know that Python is slower than compiled languages like C. But what can you do about it? Jake VanderPlas, director of research in the physical sciences at the University of Washington's eScience Institute, digs into Python's internals to explain how it works and what program design choices you can make to use Python efficiently.
Remember the days when you could still get FROTHY PISS on (SoylentNews||newsitename) long after the story had been posted? Oh, what heady, foamy times those were.
I like programming in Python for its simplicity, but I know I have to pay something for that. Fortunately, for almost everything I need, that cost isn't high, or there are modules that help, so Python is good enough for the job. It is also quite a nice tool for making quick programs and testing ideas before a serious implementation.
Python gets a lot of serious use, though. As I recall, Dropbox is almost 100% Python (so much so that they are developing their own JIT [dropbox.com]). YouTube is also mostly Python (with ffmpeg for the video processing).
For good reason.
The same reasons, in fact, why you don't waste time optimizing an inefficient string processing routine which, when you profile, accounts for 0.01% of your runtime. To turn that around: if Python lets your team hack out 80% of your application in 5% of the time, thanks to these features, that is an obvious massive win.
In my experience (warning: Python dev, I'm biased but also not sugarcoating) this is true. Python results in clearer, easier to maintain code. When I need serious performance there's always Cython or the various methods to hook into lower level libraries, but the remaining >= 80% of the time using Python was a clear and objective win for efficiency and maintainability.
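For what it's worth, "hooking into lower level libraries" doesn't have to mean Cython; here's a minimal sketch of the same idea using the standard-library ctypes module to call into C directly (assuming a Unix-like system where libm can be found):

```python
# Sketch: calling a C library function from Python via ctypes,
# no Cython toolchain required. Assumes a Unix-like system with libm.
import ctypes
import ctypes.util
import math

libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes converts arguments correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(9.0))  # same result as math.sqrt(9.0)
```

Note the per-call overhead of crossing the boundary is not free, so this only pays off when the C side does a meaningful chunk of work per call (or you pass it whole arrays at once).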
Ditto for Perl. What I really like is the powerful and easy pattern matching capability. One can mix huge binaries with bit operations and laissez-faire operands that stretch arithmetic into string matching without boundaries.
As always, if you code to shoot yourself in the foot, the code will deliver your misery.
Clear, easy to maintain Python code? So, you're the guy.
Damn, meant perl code. Doh.
The raw number crunching is slow, but the things I write in Python are network or disk access bound, not CPU bound. I can imagine a sufficiently large data set where that would shift, but in practice it's never happened.
I have Pythonista friends who tell me that Python is a transformable dialect of LISP. I ask them why they don't transform it to LISP and run it through the well-proven LISP to C compilers (to juice their runtime performance).
They don't answer the question and we eat pizza instead; wait about three months and they tell me again that Python is a dialect of LISP.
I have no idea whether your friends are right, but even if they are, actually writing a Python-to-Lisp translator would still be a lot of work. Probably more work than justified by the gain it would give them. Especially if their programs just don't need that runtime optimization. For example, speeding up code that spends most of its time waiting for I/O is generally useless.
I have Pythonista friends who tell me that Python is a transformable dialect of LISP.
I've never heard this claim (but I don't know many proper Python fans).
There's a lot of good stuff in Python, but transformability? No. LISP-dialect? No, of course not.
Next time they try it, point them at the failed Unladen Swallow project.
I don't get why they characterize "forgiving" as a virtue of dynamic typing. I don't want my language to be forgiving. I prefer to be told right from the start that I'm trying to do nonsense, not some time later when my code fails at run time.
Dynamic typing has advantages, but "forgiving" is not one of them. Forgivingness is a disadvantage. It's a price you pay for the flexibility dynamic typing gives you. It's often a price worth paying, but it's still a cost, not a benefit.
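To make that cost concrete, a small sketch (the greet function is just an illustration): a static type checker would flag this before the program ever ran, but Python only complains when the offending line actually executes.

```python
# Sketch: a type error that surfaces only at run time in Python,
# and only if the buggy branch is actually reached.
def greet(name):
    return "Hello, " + name

print(greet("world"))   # works fine

try:
    greet(42)           # str + int: TypeError, but only now
except TypeError as e:
    print("caught at run time:", e)
```

If that second call sits on a rarely-taken branch, it can ship and sit in production until the day the branch finally runs.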
Exactly. Add to that the fact that if you write 'forgiving' code, you might get to meet your unforgiving boss, or his boss!
But ... I'm so proud of my forgiving password check. It not only tolerates typos, it even tells you how to correct them! ;-)
I don't get why they characterize "forgiving" as a virtue of dynamic typing.
On that specific point, they don't. The comment about "forgiving" is a separate sentence from the one regarding dynamic typing. They are saying that Python in general is "forgiving".
Whether "forgiving" is a good thing or not I couldn't really say in this case, since I'm not sure in what ways the author means that Python is forgiving (and can't really think of any myself).
Well, it certainly isn't forgiving about changes of indentation. ;-)
It's pining. For the fjords.
One problem of Python that I didn't see mentioned is that of parallelization to make use of multiple cores. See Jeff Knupp, Python's Hardest Problem [jeffknupp.com] for a thorough explanation.
TL;DR: a fundamental issue with the Python interpreter is that it is not suitable for parallel execution. Only data crunching that does not involve running the interpreter (e.g., data crunching by compiled code) can be executed effectively in parallel.
This really is the largest problem with the CPython implementation. The GIL (Global Interpreter Lock) doesn't exist in Jython, and I don't think in IronPython (.NET) either, but it lingers on in the C interpreter.
The multiprocessing package gets around this by spawning real, separate Python processes and then letting the OS handle the parallelism, and that works very well. Python doesn't yet have a proper threading package without the lock.
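A minimal sketch of that multiprocessing route (the burn function and the worker count are just placeholders for real CPU-bound work): each worker is a separate OS process with its own interpreter and its own GIL, so CPU-bound tasks can genuinely run on multiple cores at once.

```python
# Sketch: sidestepping the GIL with multiprocessing.Pool.
# Each pool worker is a separate process, so CPU-bound work
# runs in parallel across cores (unlike with threading).
from multiprocessing import Pool

def burn(n):
    # Purely CPU-bound placeholder: sum of squares below n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(4) as pool:
        results = pool.map(burn, [10**5] * 4)
    print(results)
```

The trade-off versus threads is that arguments and results are pickled across process boundaries, so this pattern works best when each task does a lot of computation relative to the data shipped back and forth.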
http://benchmarksgame.alioth.debian.org/u64/benchmark.php?test=all&lang=v8&lang2=python3&data=u64 [debian.org]
That said, it's strange that it's slower than normal PHP: http://benchmarksgame.alioth.debian.org/u64/benchmark.php?test=all&lang=php&lang2=python3&data=u64 [debian.org]
The benchmarks are old but I think my point still remains true :).