canopic jug writes:
On Python Committers, a mailing list for the Python core developers, Benevolent Dictator for Life Guido van Rossum has announced that he is stepping down, effective immediately and without appointing a successor.
Now that PEP 572 is done, I don't ever want to have to fight so hard for a PEP and find that so many people despise my decisions.
I would like to remove myself entirely from the decision process. I'll still be there for a while as an ordinary core dev, and I'll still be available to mentor people -- possibly more available. But I'm basically giving myself a permanent vacation from being BDFL, and you all will be on your own.
After all that's eventually going to happen regardless -- there's still that bus lurking around the corner, and I'm not getting younger... (I'll spare you the list of medical issues.)
I am not going to appoint a successor.
[...] I'll still be here, but I'm trying to let you all figure something out for yourselves. I'm tired, and need a very long break.
Best of luck to him. It is good to change now and then.
Whether you are a regular Python user or not, he has contributed his time and effort towards something that makes the world a better place and keeps the wheels and gears of the machinery running smoothly.
Python has not made the world a better place. The language's design makes it fundamentally difficult to compile or even JIT, which imposes a huge performance cost. The resulting slowness, besides being frustrating and an impediment to getting business done, has an environmental cost: we use more power, need larger batteries, and need faster, more complicated cores. Many errors that should be caught at compile time are instead left as lurking surprises, hiding in wait to crash a business-critical production system.
What you say is true. I could similarly argue that Lisp is difficult to compile, but not impossible. It has a huge performance cost. The slowness is an impediment . . . battery cost . . . environment . . . etc. Many errors that could be caught at compile time are instead runtime surprises.
Yet I also consider Lisp (various dialects and implementations) to be a great tool. Why? Because it offers much higher-level abstractions than C. It's unlikely that one would build a logic programming system, a computer algebra system, a theorem prover, etc. in C. But they might in Lisp or Python.
While I haven't used them myself, Python has tools like NumPy, and good wrappers for TensorFlow.
For low-skilled programmers, Python is a great adhesive to glue systems together. It is a much better first language than grandpa's BASIC. We're talking about a language (Python 3) with generators.
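To illustrate what generators buy you, here is a minimal sketch (the `fib` name and the use of `itertools.islice` are just one way to demonstrate the idea): a function that lazily produces an infinite sequence, from which a caller takes only what it needs.

```python
import itertools

# A generator: computes Fibonacci numbers lazily, one per request,
# instead of building the whole (infinite) sequence up front.
def fib():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take just the first eight values from the infinite stream.
print(list(itertools.islice(fib(), 8)))  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Nothing here ever materializes the full sequence; each `yield` hands back one value and suspends the function until the next request.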
Python seems to work as an embedded language (LibreOffice, Blender).
On RasPi, it has good GPIO support and libraries for various chips.
I'm a Java developer, not a heavy Python user, but I'm not so blind as to be unable to see the immense value in Python. I'm also not taking a side in the static/dynamic flamewar, but I find that strongly, statically typed languages (e.g., Java, Pascal-type languages) are very useful. I agree with your point that catching errors at compile time is much better than at runtime; I have sometimes argued that the compiler is your zeroth level of unit tests. Yet I find dynamic languages (usually Lisp or Clojure) have great usefulness for certain things.
In the Java world, on the horizon, GraalVM will make dynamic languages easier to compile.
Blessings upon all the people who contribute their effort to give us such tools, tools we should be grateful for. Tools people would have drooled and gone bonkers over three decades ago.
Most people should be able to write in a very high-level language (or languages) and have the compiler handle all the tedious and/or technical, security-related, low-level stuff, while still producing output as efficient as if someone had written the whole thing in C. A new HipHop-type project with zero gotchas from the PHP core project would be nice, for instance. D (dlang) may do some of this, but I don't know just how far it goes. Rust tries to help with memory safety, but I don't know how high-level the language actually is. I think the real/more experienced programmers should be working on the languages, the compilers, and the IDEs (I know they already do, but I mean in this exact context), building them all for less experienced application devs/designers with only a higher-level understanding/skill level. I think it's unreasonable to expect most people to learn the low-level stuff. Really, we have different levels of nerd: super nerds want to work on super-nerdy things, while your average nerd just wants to scratch an itch without it causing a bunch of problems/being a turd. I think this is the future, if AI doesn't completely replace us before then.
For low-skilled programmers, Python is a great adhesive to glue systems together...
Noooo, that's like giving a three year old a tube of superglue and a box containing multiple tubs of glitter....sure, the surface results are ever so shiny, but underneath? and the resultant clean up operations?....yech!
It is a much better first language than grandpa's BASIC.
Well, BASIC was never really my cup of tea; I cut my programming teeth on assembler (though ISTR there was a version available on the DECSYSTEM-20, which was fun in a let's-crash-the-system-with-this kind of way, so I'm old enough for a great-grandpa appellation), and there's probably a bunch of BBC BASIC programmers out there in their bath chairs, wheeling their way towards thee at this very minute, all of a mind to strongly disagree with you on your point above.
For what it's worth, in my view, compared to BASIC it's a hell of a lot easier for programmers, beginners or otherwise, to make abstruse and pernicious coding errors in Python which can then lie hidden for years, just waiting for their 5 minutes of fame, and, despite the code being installed and in use on tens of thousands of computers globally, never be spotted. (Ah, the smell of randomly corrupted Berkeley DB files in the morning... how I miss them! We never did find the borked section of Python code responsible, so as a matter of expediency we wrote a bit of Perl to watchdog the DB files and fix them when they went titsup, thereby making the fixing of the Python code SEFP.)
Easiest (and most profitable) consultancy job I've ever had? Back in the '90s, converting 20 BASIC programs (which started off life running on a mainframe back in the '60s) into a form which would compile properly in Turbo Basic and pass the test dataset checks. It was a bit of a revelation that not all 'business-critical' financial code was written in COBOL.
It frightens me some mornings to think that they might still be out there, being used in anger, running in a DOS VM somewhere...
Noooo, that's like giving a three year old a tube of superglue and a box containing multiple tubs of glitter.
For their own three-year-old project, why is that so bad?
I'm not saying give a low skilled programmer a tool so that they can do something important and mission critical.
I mean, for example, how many Excel Power Users could benefit from, say, Python, or Julia?
Uh, Mathematica is built in C. So is Maple.
All y'all punks like to pick on BASIC. Show me a better language you can stuff into an 8K ROM. Anyone? That was the design limit they were dealing with. Even then, it is an OK language to learn the very basics with. And back in the day, because it was so bloody slow, it usually led to assembly language, and EVERY programmer should be required to know at least one of those, to demonstrate they actually know how a computer works, before being allowed near important code.
And it doesn't require much to uplift BASIC into "real language" status. Motorola and Microware released BASIC09 in 1980 and it easily qualifies: a full complement of loop structures, named procedures with parameters and return values, etc. In fact I'd love to see it ported to Linux and given a decent complement of libraries to expose the POSIX APIs, do regexes, etc. I bet one could accomplish at least as much with that as with any of the current scripting languages, and since it's semi-compiled it would likely be as fast or faster, especially if advances in compiler tech were applied.
The "state of the art" has not advanced nearly as much as some people think. Lots of the speedup from better lithography, wasted on ever more bloated executables, is not advancement. Debian and Red Hat once easily fit on a CD along with lots of extras. I have an InfoMagic box set on the shelf with Red Hat 4.2, Contrib and Errata on one CD with other material. Debian 9.4 is now available (only via jigdo) on three Blu-rays. Fedora (the successor to Red Hat Linux) apparently doesn't even produce a complete set of images anymore. There are now individual packages larger than a single CD, and many more approaching that milestone.
I give you the champion:
-r--r--r-- 1 root root 823M Jan 21 2017 redeclipse-data_1.5.8-1_all.deb
But there are many contenders for the trophy that aren't games:
-r--r--r-- 1 root root 552M Mar 2 05:31 linux-image-4.9.0-6-amd64-dbg_4.9.82-1+deb9u3_amd64.deb
-r--r--r-- 1 root root 348M Mar 13 2017 texlive-latex-extra-doc_2016.20170123-5_all.deb
Had they not spit out so many smaller packages from the texlive source package, it would probably have been the winner.
Just the installer...
-r--r--r-- 1 root root 153M Mar 6 03:27 debian-installer-9-netboot-armel_20170615+deb9u3_all.deb
Some light reading...
-r--r--r-- 1 root root 155M Jun 26 2017 qgis-api-doc_2.14.11+dfsg-3+deb9u1_all.deb
Pro Tip: If your API's docs compress to 155MB, you might have a problem getting folks to RTFM.
Notice, other than calling out the champ, I'm not picking too hard on the huge packages full of images, textures and other art assets for the games. Yeah, those have to be big, because we have HD monitors and want fancy 3D, good sound, etc. Pretty is good. Nor am I picking on the huge collections of fonts, clipart, wallpaper, icons, etc.
This is with debugging symbols (see the dbg part of the package's name):
-r--r--r-- 1 root root 552M Mar 2 05:31 linux-image-4.9.0-6-amd64-dbg_4.9.82-1+deb9u3_amd64.deb
Not really fair. Debugging symbols can be huge.
Totally fair. The kernel has bloated up too. For a point of comparison, let's look at Red Hat Linux 4.2 on the InfoMagic CD set I mentioned.
Red Hat Linux 4.2: 196.4MB
Contrib Packages: 215.8MB
Errata: 134.9MB
Total: 547.1MB
So yes, the debug symbols for the fricking kernel being bigger than the entirety of Red Hat Linux 4.2 is a bad thing. For another point of comparison, the entire source tree for RHL 4.2 is only 320MB. You could develop on the kernel with a hard drive of less than a GB, because that is all people generally had in that time period. Now you have problems getting a bare-bones install running in a GB of space without going with an embedded distribution, and will probably end up with busybox or some bogosity. I had Linux with X and Netscape running on a laptop [beau.org] with a 320MB hard drive and 20MB of RAM. I actually did WORK on that machine in Applixware. On a dual P-133 with an 850MB drive and 64MB of RAM I could build a kernel in a reasonable time. Now you think a half GIG of -zipped up- debug symbols is reasonable and unremarkable?
Computers get better. Corporations sponsor ever crappier programmers. ;)
I just want to point out that I was a HUGE BASIC fanboy back in the day. While I was in college I had started a 3-ring binder, sort of a BASIC reference manual, as it were, for my ideal dream BASIC language, since in my inexperience I thought BASIC was the cat's meow compared to other minicomputer languages.
I also gave significant thought on how to build an interactive BASIC compiler. That is, it would act like a BASIC interactive interpreter. You type in program lines. Type RUN, etc. My first thought was about how to build a much faster BASIC interpreter. My idea was that the interpreter stores the program text in an internal tokenized form, which was common. But this tokenized form should allow for a two-byte (woo a whole 64K) pointer into another part of the program text, whenever there was any kind of line number, or other reference.
For example, if line 180 said:
180 IF A = B THEN 430 ELSE GOTO 670
then the internal tokenized representation would represent it like:
180 IF A = B THEN IGOTO 430 ELSE GOTO 670
The IGOTO is an "invisible" GOTO. When LISTing the program text, you never see the text IGOTO; it would LIST like the original line 180. The internal representation of both GOTO and IGOTO is of course the token, plus a two-byte slot which is the pointer to the respective line.
Similarly, when you execute a FOR loop, the stack records a pointer to the FOR token back in the tokenized program text. That way when you hit a NEXT, the top stack entry can directly and immediately proceed back to the FOR statement. No searching the text. No table lookup.
When any kind of line number reference occurs, even say, PRINT USING, there is a two byte pointer in the stored program text.
When you type RUN, the interpreter makes a single pass over the tokenized text and fills in all the line number pointers. Program execution could be significantly faster. This would also give you an immediate fatal error if your program had a line number reference to a non-existent line. Most BASIC interpreters would happily begin execution, and you wouldn't get an invalid line number reference error until much later.
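The single RUN-time resolution pass described above can be sketched in a few lines of Python. This is a hypothetical toy, not any real BASIC implementation; `resolve_jumps` and the token shapes are invented purely for illustration.

```python
# Toy model of the idea: one pass turns every GOTO/IGOTO line-number
# reference into a direct index into the program, and fails immediately
# on a reference to a line that doesn't exist.
def resolve_jumps(program):
    # program maps line numbers to token lists; a jump is ("GOTO", target).
    index = {n: i for i, n in enumerate(sorted(program))}
    resolved = []
    for n in sorted(program):
        line = []
        for tok in program[n]:
            if isinstance(tok, tuple) and tok[0] in ("GOTO", "IGOTO"):
                if tok[1] not in index:
                    raise ValueError(f"line {n}: jump to non-existent line {tok[1]}")
                line.append((tok[0], index[tok[1]]))  # direct pointer, no search at run time
            else:
                line.append(tok)
        resolved.append(line)
    return resolved

prog = {180: ["IF", "A=B", ("IGOTO", 430), "ELSE", ("GOTO", 670)],
        430: ["PRINT"], 670: ["END"]}
print(resolve_jumps(prog)[0])  # jumps now carry indexes: ('IGOTO', 1) and ('GOTO', 2)
```

After the pass, executing a jump is a direct array access instead of a scan for the target line number, and a dangling reference is caught before the program starts, which is exactly the early-fatal-error behavior described.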
Next, I turned my thoughts to an interactive compiler. If memory were less of a problem, as I knew would soon be the case due to Moore's Law (I mean, geez, there was now 64K on a single board!), I thought the "interpreter" could also store a compiled representation. Each BASIC program line would be a single heap object. (Yes, even then I was thinking about heaps, allocators, etc.) There would be a table of line numbers, with pointers to the heap object representing that compiled line. The compiled code might consist of many JSR (jump to subroutine) instructions back into the runtime library for PRINT, INPUT, etc. But it's still a lot faster than an interpreter.
Next a batch compiler could do basically the same compilation, but in batch mode. Now you have both a fast interactive compiler along with a batch compiler for the SAME dialect of BASIC. And I wanted a rather expansive dialect. More advanced string and array capabilities. Date / Time values. BCD as well as floating point. (The idea of using a large integer for business currency values hadn't occurred to me yet.)
But as I became more advanced, I soon met Pascal. Fell in love. Never looked back at BASIC, except with fond memories -- even to this day, fond memories of BASIC.
On the subject of "bloat".
One man's "bloat" is another man's "features".
Word is a much larger and slower program than Notepad.
Linux is a much larger and more complex system than CP/M.
People complain about bloat but are blind to the features. The most simplistic text editors today do on the fly spelling error detection.
Comparing Word to Notepad is silly. I'm not being a grumpus demanding people use LaTeX and get off of my lawn.
Instead of Notepad, compare a modern Microsoft Word (or LibreOffice Writer, which opens to a blank document with 209MB of resident set consumed) to Lotus SmartSuite of the day, or even Microsoft Office 95. Those were full-featured productivity packages with the full GUI experience, OLE/DDE, tons of typefaces, both bitmap and vector graphics, embedded media, programming languages embedded in text documents -- the good, the bad and the ugly of the "Integrated Office Suite" -- and they ran on machines with far less than a single GB of RAM and shipped on floppies or a single CD. We aren't talking about a "little" bloat here; we are witnessing a factor-of-100 increase in memory usage for little obvious benefit to the user. Nobody really cares because Moore's Law provided, but somebody should be asking "WTF is going on?" If you want to know why your phone can't make it through a day, answering the bloat question is a lot more productive than demanding that engineers get 8 cores and 8GB of RAM to stay lit all day on a small battery. If we could get 1990s-era software to run on a mobile platform, you could run a week on a charge. Imagine applying modern tech to scale down instead of up: instead of extracting more MHz, we optimize the software to run well on sub-GHz cores and half a GB of RAM, and use modern fabrication to make it consume minimal power. Then there is the GPU; not sure what can be done about that problem...
You make a good point. There REALLY IS bloat. But my point is that not all increased demand for cycles and bytes is due to bloat. I had been re-reading BYTE magazine from its first issue.
I am struck by:
* how shockingly primitive the technology was
* how much they could get done with so little
* how limited the usefulness of systems actually was
* how poor the programming productivity was compared to modern tools / languages
That leads me to another point. Sometimes the "bloat" or inefficiency you describe is due to efforts to save human programmer time. The most expensive resource these days is no longer the computer, but the people who write software. Sure I could write in assembly language and optimize to the hilt. But the gains would be VASTLY outweighed by the cost. Instead most software is written in higher level languages, more abstract frameworks, etc. The inefficiency is outweighed by the cost savings of development.
Hypothetical example: If I can write a web based business application in Java and it only needs twice the CPU and six times the memory of a program in C++, but I can beat my competitor to market by a year, my bosses will say that is cheap at the price! You need an extra 64 GB of ram on this fire breathing 8 socket server? No problem! I'm optimizing for dollars, not for cycles and bytes. That is a legitimate tradeoff which people can choose to make.
Another example: if you asked most people "Would you prefer to have your next software upgrade six months sooner if it used 25% more memory?", I wonder what the answer would be.
I've never liked Python's choice to use indentation. But it seems to work.
I've had limited exposure to Lua, but that exposure was positive and I liked it. (Lua embedded in an installer product "Setup Factory" which generated SETUP.EXE installers back in late 1990s.)
All he did was create an easy to use programming language and give it away to the world completely for free.
Fuck him, right!
Here, I'll give you my refrigerator for free. Fuck me, right?
Trying to offload that death trap? Yeah, fuck you!
I think you're getting it confused with C++.
As opposed to compiled languages that never have runtime issues?
Python is a perfectly good tool that has its place. Because it is a high-level language, you can focus on the real problem rather than dealing with fiddly (and bug-prone) low-level details for the bazillionth time.
Certainly, Python (like any other language) shouldn't be the only tool in the box, but that hardly makes it a bad thing.
> The language design is fundamentally difficult to compile or even JIT.
I think you misunderstand. Python is a language for scripting/rapid prototyping/hacking. The beautiful magic is that, if you use functionality a lot, you can rewrite it in C and it runs 10x faster.
For example, say I want to hack a script together to check some physics theory calculation. I knock out some naive calculation in Python in half an hour. If it works, I can put together something more sophisticated in C, but that takes a few days.
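A half-hour "check it numerically before investing in C" script of the kind described might be nothing more than this (a hypothetical sketch; the Basel sum standing in for whatever calculation you actually care about):

```python
import math

# Naive numerical check: does sum(1/n^2) actually approach pi^2/6?
# Slow, obvious, and written in minutes -- exactly the prototyping niche.
total = sum(1.0 / (n * n) for n in range(1, 200_000))
print(abs(total - math.pi ** 2 / 6))  # tiny residual, roughly 1/200000
```

If the numbers come out right, you have learned what you needed; only then is it worth spending days on a careful C version of the hot loop.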
The beautiful magic is that, if you use functionality a lot, you can rewrite it in C and it runs 10x faster.
And then someone like me will come along and rewrite your C code in Perl, and it'll run either just slightly slower or at the same speed, but now with the added advantage that mere mortals will look at it and go "WTF? How does that block of code do that?" So they'll then try implementing that Perl code in Python, then you'll come along and rewrite the resultant Python in C, and then I'll rewrite your resultant code again in Perl... If we keep it up for a goodly number of program generations, what with all the weird transcription errors that are bound to creep in, and with programmers being the idiosyncratic characters we are, after all our trademark foibles start getting added to the rewritten code, we might accidentally come up with a spontaneously emergent 'real' AI. Though I suspect that we might have to get a Perl->COBOL->Lisp->C->Haskell->Python->APL chain going for a couple of generations for some really fun 'misunderstandings' to creep into the code.
But yes, Python is a reasonable code hacking language, I still find it easier to think in Perl though and then translate to Python, 'tis probably a generational thing...
And then someone like me will come along, rewrite your C code in Perl and it'll run either just slightly slower or at the same speed...
And then someone comes along and reads your Perl code out loud in order to hammer some meaning out of it only to instead have to spend the next few years banishing all the summoned demons by rewriting the code Pythonically in Python which means it's so clean, readable, documented, understandable, thoroughly polished, and elegant that it achieves retroactive causality and slays all the monsters.
Then someone comes along who has a kinky thing for C++ and around we go again.
People who are convinced Python is too slow can use Cython. If they still say it's too slow, it smells like PEBCAK code, the main source of all things slow.
More and more I'm hearing that from so many people -- about myself. Beautiful letter from Scott Pruitt when he resigned.......... foxnews.com/politics/2018/07/05/scott-pruitts-full-resignation-letter-to-president-trump.html [foxnews.com]
Plus, all we need is a different object with the relevant public methods and properties, and it will be a perfect drop-in replacement for Guido. After all, if it walks like a BDFL, and talks like a BDFL, then it's a BDFL for all intents and purposes.