On Python Committers, the mailing list for Python's core developers, Benevolent Dictator for Life Guido van Rossum has announced that he is stepping down, effective immediately and without appointing a successor.
Now that PEP 572 is done, I don't ever want to have to fight so hard for a PEP and find that so many people despise my decisions.

I would like to remove myself entirely from the decision process. I'll still be there for a while as an ordinary core dev, and I'll still be available to mentor people -- possibly more available. But I'm basically giving myself a permanent vacation from being BDFL, and you all will be on your own.

After all that's eventually going to happen regardless -- there's still that bus lurking around the corner, and I'm not getting younger... (I'll spare you the list of medical issues.)

I am not going to appoint a successor.
[...] I'll still be here, but I'm trying to let you all figure something out for
yourselves. I'm tired, and need a very long break.
(Score: 3, Interesting) by jmorris on Friday July 13 2018, @08:56PM (8 children)
All y'all punks like to pick on BASIC. Show me a better language you can stuff into an 8K ROM. Anyone? That was the design limit they were dealing with. Even then, it is an OK language to learn the very basics with. And back in the day, because it was so bloody slow, it usually led to assembly language, and EVERY programmer should be required to know at least one of those, to demonstrate they actually know how a computer works, before being allowed near important code.
And it doesn't require much to uplift BASIC into "real language" status. Motorola and Microware released BASIC09 in 1980 and it easily qualifies: full complement of loop structures, named procedures with parameters and return values, etc. In fact I'd love to see it ported to Linux and given a decent complement of libraries to expose the POSIX APIs, do regexes, etc. Bet one could accomplish at least as much with that as with any of the current scripting languages, and being semi-compiled it would likely be as fast or faster, especially if advances in compiler tech were applied.
The "state of the art" has not advanced nearly as much as some people think. Lots of speed bumps from better lithography wasted in ever more bloated executables is not advance. Debian and Red Hat once easily fit on a CD along with lots of extras. I have an InfoMagic box set on the shelf with RedHat 4.2, Contrib and Errata on one CD with other material. Debian 9.4 is now available (only via jigdo) on three BluRays. Fedora (the successor to Red Hat Linux) apparently doesn't even produce a complete set of images anymore. There is now an individual packages larger than a single CD and many more approaching that milestone.
Behold:
I give you the champion:
-r--r--r-- 1 root root 823M Jan 21 2017 redeclipse-data_1.5.8-1_all.deb
But there are many contenders for the trophy that aren't games:
-r--r--r-- 1 root root 552M Mar 2 05:31 linux-image-4.9.0-6-amd64-dbg_4.9.82-1+deb9u3_amd64.deb
-r--r--r-- 1 root root 348M Mar 13 2017 texlive-latex-extra-doc_2016.20170123-5_all.deb
Had they not split out so many smaller packages from the texlive source package, it would probably have been the winner.
Just the installer.....
-r--r--r-- 1 root root 153M Mar 6 03:27 debian-installer-9-netboot-armel_20170615+deb9u3_all.deb
Some light reading....
-r--r--r-- 1 root root 155M Jun 26 2017 qgis-api-doc_2.14.11+dfsg-3+deb9u1_all.deb
Pro Tip: If your API's docs compress to 155MB you might have a problem getting folks to RTFM.
Notice that, other than calling out the champ, I'm not picking too hard on the huge packages full of images, textures, and other art assets for the games. Yeah, those have to be big because we have HD monitors and want fancy 3D, good sound, etc. Pretty is good. Nor am I picking on the huge collections of fonts, clipart, wallpaper, icons, etc.
(Score: 3, Informative) by pvanhoof on Saturday July 14 2018, @04:57AM (2 children)
This is with debugging symbols (see the dbg part of the package's name):
-r--r--r-- 1 root root 552M Mar 2 05:31 linux-image-4.9.0-6-amd64-dbg_4.9.82-1+deb9u3_amd64.deb
Not really fair. Debugging symbols can be huge.
(Score: 2) by jmorris on Saturday July 14 2018, @06:25AM (1 child)
Totally fair. The kernel has bloated up too. For a point of comparison, let's look at Red Hat Linux 4.2 on the InfoMagic CD set I mentioned.
RedHat Linux 4.2: 196.4MB
Contrib Packages: 215.8MB
Errata: 134.9MB
Total: 547.1MB
So yea, the debug symbols for the fricking kernel being bigger than the entirety of Red Hat Linux 4.2 is a bad thing. For another point of comparison, the entire source tree for RHL 4.2 is only 320MB. You could develop on the kernel with a hard drive of less than a GB, because that is all people generally had in that time period. Now you have problems getting a bare-bones install running in a GB of space without going with an embedded distribution, and you will probably end up with busybox or some bogosity. I had Linux with X and Netscape running on a laptop [beau.org] with a 320MB hard drive and 20MB of RAM. I actually did WORK on that machine in Applixware. On a dual P-133 with an 850MB drive and 64MB of RAM I could build a kernel in a reasonable time. Now you think a half GIG of -zipped up- debug symbols is reasonable and unremarkable?
(Score: 3, Informative) by bitstream on Saturday July 14 2018, @07:01PM
Computers get better. Corporations sponsor ever crappier programmers. ;)
(Score: 2) by DannyB on Monday July 16 2018, @02:30PM
I just want to point out that I was a HUGE BASIC fanboy back in the day. While I was in college I started a 3-ring binder, sort of a BASIC reference manual, as it were, for my ideal dream BASIC language, since in my inexperience I thought BASIC was the cat's meow compared to other minicomputer languages.
I also gave significant thought to how to build an interactive BASIC compiler. That is, it would act like an interactive BASIC interpreter: you type in program lines, type RUN, etc. My first thought was about how to build a much faster BASIC interpreter. My idea was that the interpreter stores the program text in an internal tokenized form, which was common. But this tokenized form should allow for a two-byte (woo, a whole 64K) pointer into another part of the program text whenever there was any kind of line number or other reference.
For example, if line 180 said:
180 IF A = B THEN 430 ELSE GOTO 670
then the internal tokenized form would represent it as:
180 IF A = B THEN IGOTO 430 ELSE GOTO 670
The IGOTO is an "invisible" GOTO. When LISTing the program text, you never see the text IGOTO; it would LIST like the original line 180. Now the internal representation of both GOTO and IGOTO is the token, of course, plus a two-byte slot which is the pointer to the respective line.
Similarly, when you execute a FOR loop, the stack records a pointer to the FOR token back in the tokenized program text. That way when you hit a NEXT, the top stack entry can directly and immediately proceed back to the FOR statement. No searching the text. No table lookup.
When any kind of line number reference occurs, even in, say, PRINT USING, there is a two-byte pointer in the stored program text.
When you type RUN, the interpreter makes a single pass over the tokenized text and fills in all the line number pointers. Program execution could be significantly faster. This would also give you an immediate fatal error if your program had a line number reference to a non-existent line. Most BASIC interpreters would happily begin execution, and you wouldn't get an invalid line number error until much later.
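To make that concrete, here is a minimal sketch in Python (the token layout, the LineRef name, and the tiny three-line program are all invented for illustration; no real interpreter is being quoted) of how the single RUN-time pass could patch every line-number reference with a direct pointer and fail fast on a missing line:

# Hypothetical sketch of "fill in the line-number pointers at RUN".
# Each stored line is a list of tokens; a line-number reference is kept as a
# LineRef object whose 'ptr' field plays the role of the two-byte slot.

class LineRef:
    def __init__(self, target):
        self.target = target   # the BASIC line number as typed
        self.ptr = None        # patched by the single pass at RUN time

program = {
    180: ["IF", "A", "=", "B", "THEN", LineRef(430), "ELSE", "GOTO", LineRef(670)],
    430: ["PRINT", '"EQUAL"'],
    670: ["PRINT", '"NOT EQUAL"'],
}

def resolve_line_refs(program):
    # Single pass over the tokenized text: patch every reference or fail fast.
    for lineno, tokens in program.items():
        for tok in tokens:
            if isinstance(tok, LineRef):
                if tok.target not in program:
                    raise RuntimeError("line %d: reference to non-existent line %d"
                                       % (lineno, tok.target))
                tok.ptr = program[tok.target]   # direct pointer, no later search

resolve_line_refs(program)   # done once when the user types RUN

The FOR/NEXT trick described above is the same idea: the run-time stack holds one of these direct pointers back to the FOR token rather than a line number that has to be searched for.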
Next, I turned my thoughts to an interactive compiler. If memory were less of a problem, as I knew would soon be the case due to Moore's Law (I mean geez, there was now 64 K of memory on a single board!), I thought the "interpreter" could also store a compiled representation. Each BASIC program line would be a single heap object. (Yes, even then I was thinking about heaps, allocators, etc.) There would be a table of line numbers, with pointers to the heap object representing that compiled line. The compiled code might consist of many JSR (jump to subroutine) instructions back into the runtime library for PRINT, INPUT, etc. But it's still a lot faster than an interpreter.
Next, a batch compiler could do basically the same compilation, but in batch mode. Now you have both a fast interactive compiler and a batch compiler for the SAME dialect of BASIC. And I wanted a rather expansive dialect: more advanced string and array capabilities, date/time values, BCD as well as floating point. (The idea of using a large integer for business currency values hadn't occurred to me yet.)
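A rough Python sketch of that per-line compilation idea (the two-statement dialect and the rt_* runtime routines are made up for illustration): each line "compiles" into its own small callable object, standing in for the run of JSR-into-the-runtime instructions, and a table maps line numbers to those objects so a single line can be recompiled whenever the user edits it.

# Hypothetical sketch of "each BASIC line compiles to its own heap object".

def rt_print(env, text):          # runtime library routine behind PRINT
    print(text)

def rt_let(env, name, value):     # runtime library routine behind LET
    env[name] = value

def compile_line(tokens):
    # "Compile" one tokenized line into a callable; this is the heap object.
    if tokens[0] == "PRINT":
        text = tokens[1]
        return lambda env: rt_print(env, text)
    if tokens[0] == "LET":
        name, value = tokens[1], tokens[2]
        return lambda env: rt_let(env, name, value)
    raise SyntaxError("unsupported statement: %r" % (tokens,))

# Table of line numbers -> compiled line objects, rebuilt one entry at a time
# as the user types, which is what makes the compiler interactive.
line_table = {
    10: compile_line(["LET", "A", 42]),
    20: compile_line(["PRINT", "HELLO FROM LINE 20"]),
}

env = {}
for lineno in sorted(line_table):   # a crude RUN: walk the table in line order
    line_table[lineno](env)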
But as I became more advanced, I soon met Pascal. Fell in love. Never looked back at BASIC, except with fond memories -- even to this day, fond memories of BASIC.
(Score: 2) by DannyB on Monday July 16 2018, @02:32PM (3 children)
On the subject of "bloat".
One man's "bloat" is another man's "features".
Word is a much larger and slower program than Notepad.
Linux is a much larger and more complex system than CP/M.
People complain about bloat but are blind to the features. Even the most simplistic text editors today do on-the-fly spelling error detection.
(Score: 2) by jmorris on Tuesday July 17 2018, @03:21AM (2 children)
Comparing Word to Notepad is silly. I'm not being a grumpus demanding people use LaTeX and get off of my lawn.
Instead of Notepad, compare a modern Microsoft Word (or LibreOffice Writer, which opens to a blank document with 209MB of resident set consumed) to Lotus SmartSuite of the day, or even Microsoft Office 95. Both were full-featured productivity packages with the full GUI experience, OLE/DDE, tons of typefaces, both bitmap and vector graphics, embedded media, programming languages embedded in text documents, the good, the bad and the ugly of the "Integrated Office Suite", and they ran on machines with far less than a single GB of RAM and shipped on floppies or a single CD.

We aren't talking about a "little" bloat here; we are witnessing a factor-of-100 increase in memory usage for little obvious benefit to the user. Nobody really cares because Moore's Law provided, but somebody should be asking "WTF is going on?" If you want to know why your phone can't make it through a day, answering the bloat question is a lot more productive than demanding that engineers figure out how to keep 8 cores and 8GB of RAM lit all day on a small battery. If we could get 1990s-era software to run on a mobile platform, you could run a week on a charge. Imagine applying modern tech to scale down instead of up: instead of extracting more MHz, we could optimize the software to run well on sub-GHz cores and half a GB of RAM, and use modern fabrication to make it consume minimal power. Then there is the GPU; not sure what can be done about that problem...
(Score: 2) by DannyB on Tuesday July 17 2018, @01:34PM (1 child)
You make a good point. There REALLY IS bloat. But my point is that not all increased demand for cycles and bytes is due to bloat. I had been re-reading BYTE magazine from its first issue.
https://www.americanradiohistory.com/Byte_Magazine.htm [americanradiohistory.com]
https://archive.org/details/byte-magazine [archive.org]
I am struck by:
* how shockingly primitive the technology was
* how much they could get done with so little
* how limited the usefulness of those systems actually was
* how poor the programming productivity was compared to modern tools / languages
That leads me to another point. Sometimes the "bloat" or inefficiency you describe is due to efforts to save human programmer time. The most expensive resource these days is no longer the computer, but the people who write software. Sure I could write in assembly language and optimize to the hilt. But the gains would be VASTLY outweighed by the cost. Instead most software is written in higher level languages, more abstract frameworks, etc. The inefficiency is outweighed by the cost savings of development.
Hypothetical example: If I can write a web-based business application in Java and it only needs twice the CPU and six times the memory of a program in C++, but I can beat my competitor to market by a year, my bosses will say that is cheap at the price! You need an extra 64 GB of RAM on this fire-breathing 8-socket server? No problem! I'm optimizing for dollars, not for cycles and bytes. That is a legitimate tradeoff which people can choose to make.
(Score: 2) by DannyB on Tuesday July 17 2018, @01:36PM
Another example: ask most people this question: would you prefer to have your next software upgrade six months sooner if it used 25% more memory? I wonder what the answer would be.