
posted by janrinok on Tuesday September 10 2019, @04:26AM   Printer-friendly

Arthur T Knackerbracket has found the following story:

We are volunteers who make and take care of the Python programming language. We have decided that January 1, 2020, will be the day that we sunset Python 2. That means that we will not improve it anymore after that day, even if someone finds a security problem in it. You should upgrade to Python 3 as soon as you can.

We need to sunset Python 2 so we can help Python users.

We released Python 2.0 in 2000. We realized a few years later that we needed to make big changes to improve Python. So in 2006, we started Python 3.0. Many people did not upgrade, and we did not want to hurt them. So, for many years, we have kept improving and publishing both Python 2 and Python 3.

But this makes it hard to improve Python. There are improvements Python 2 can't handle. And we have less time to work on making Python 3 better and faster.

And if many people keep using Python 2, then that makes it hard for the volunteers who use Python to make software. They can't use the good new things in Python 3 to improve the tools they make.

We did not want to hurt the people using Python 2. So, in 2008, we announced that we would sunset Python 2 in 2015, and asked people to upgrade before then. Some did, but many did not. So, in 2014, we extended that sunset till 2020.

If people find catastrophic security problems in Python 2, or in software written in Python 2, then volunteers will not help you. If you need help with Python 2 software, then volunteers will not help you. You will lose chances to use good tools because they will only run on Python 3, and you will slow down people who depend on you and work with you.


Original Submission

 
  • (Score: 1, Informative) by Anonymous Coward on Tuesday September 10 2019, @05:16AM (19 children)

    by Anonymous Coward on Tuesday September 10 2019, @05:16AM (#892080)

    Python2 doesn't have native unicode support and handles it badly in many existing cases, as well as having a variety of math function issues due to the original implementation being overly ambiguous. They have the Compat23 module or whatever it is called for working with python2 code on python3 or python3 code on python2, but really that is just barely enough to check for deviant behavior between the two.
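For illustration, a minimal Python 3 sketch (not from the comment above) of the str/bytes split that replaced Python 2's single ambiguous string type, which is the root of most of the unicode complaints:

```python
# Python 3 separates text (str) from raw bytes (bytes); Python 2's single
# ambiguous str type is what caused the problems described above.
text = "héllo"                   # str: a sequence of Unicode code points
data = text.encode("utf-8")      # bytes: an explicit encoding step

assert isinstance(data, bytes)
assert data.decode("utf-8") == text

# In Python 2, mixing byte strings and unicode strings could raise
# UnicodeDecodeError at runtime depending on the data; in Python 3,
# mixing str and bytes fails immediately and loudly:
try:
    _ = text + data              # TypeError: can't concat str to bytes
    mixing_fails = False
except TypeError:
    mixing_fails = True
assert mixing_fails
```

The explicit encode/decode boundary is exactly what made automatic 2-to-3 porting of string-heavy code hard.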

    That said, I firmly believe that, as much use as Python gets, it's not ready for primetime.

    Unfortunately, with the way C, C++, Go, Swift, etc. are being mismanaged by their steering committees (particularly to preserve their own jobs rather than to benefit the language, its implementers, and developers compiling on limited systems), none of them has done a good job of it, either since their creation or for a decade or more. C++11 got pushed before C++98 support was complete on major compilers, for chrissakes. And the same is happening with newer standards, where half-baked C++14/17/20 features are enabled by default so dumbass developers can use them in code that should be standards-compliant by default.

    So really the python2/3 mess is no different than any other language right now. Hell, go look at Perl and the 5.24/5.26+ debacle. All kinds of old stuff broken, but they claim the language/API/ABI is the same.

  • (Score: 2) by c0lo on Tuesday September 10 2019, @05:46AM

    by c0lo (156) Subscriber Badge on Tuesday September 10 2019, @05:46AM (#892090) Journal

    C++11 got pushed before C++98 support was complete on major compilers for chrissakes

    I argue that wasn't a bug, that was a feature.

    Because the major compilers had from 1989 to 1998 to implement C++ 2.0, and all of them had mostly incomplete or weird support for those 9 years.
    And then from 1998 to 2011, another 10+ years of the same garbage for C++98 standard support (including the MS "managed C++" and "C++/CLI" extensions).
    Yes... "for chrissakes".

    I still remember the effort of writing C++ code that could be compiled cross-platform/-compiler (heaps of checks of the "#ifdef BORLAND" and "#ifdef MSVC" and "#ifdef WATCOM" and... and... kind).

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 0) by Anonymous Coward on Tuesday September 10 2019, @05:59AM (10 children)

    by Anonymous Coward on Tuesday September 10 2019, @05:59AM (#892093)

    So really the python2/3 mess is no different than any other language right now. Hell go look at perl and the 5.24/5.26+ debacle. All kinds of old stuff broken but they claim the language/API/ABI is the same.

    So ALL modern languages are CRAP, good to know.

    I guess that makes IBM the winner every day of the week, since they planned decades ahead, building tools that still work correctly across multiple generations of hardware.

    • (Score: 5, Interesting) by lentilla on Tuesday September 10 2019, @06:38AM (1 child)

      by lentilla (1770) on Tuesday September 10 2019, @06:38AM (#892104)

      since they planned decades ahead

      No, they did not plan decades ahead - they simply refused to make changes to the language.

      Python, et al., could have all stayed immutable. The downside being that the advantages stemming from experience and advances in the state of the art would be unavailable to users of that language.

      From a user's perspective I like the way LilyPond [lilypond.org] handles the language versioning: you put a directive like \version "2.18.2" at the top of your source (matching the version you are currently using), and it is expected that future versions will compile to identical output. (This is accomplished by using a conversion program to upgrade the source to each new version.)

      Unfortunately, in Python's case, upgrading from version two to three cannot be fully automated.

      The nastiest gotcha with Python versioning is that if your software depends on that one critical library written in Python2 your entire project is stuck on that version (without messy work-arounds). That's why they say sticking with Python2 is hurting the community. Python3 is what Python2 should have been - it is a better language - although to a neophyte the only obvious difference is that print "hello world" needs to be print("hello world"). It is just a pity the community has to go through the adjustment phase.
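To make the one "obvious difference" above concrete, here is a small Python 3 sketch (illustrative, not from the comment) of print as an ordinary function rather than Python 2's statement:

```python
# Python 2: print "hello world"      (a statement with special syntax)
# Python 3: print("hello world")     (an ordinary built-in function)
print("hello", "world", sep=", ", end="!\n")

# Because print is now a function, its output can be redirected and it
# composes like any other value:
import io
buf = io.StringIO()
print("hello", "world", sep=", ", end="!\n", file=buf)
assert buf.getvalue() == "hello, world!\n"
```

The function form is what lets print take keyword arguments like sep, end, and file, which the Python 2 statement could not express.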

      • (Score: 2) by Freeman on Tuesday September 10 2019, @03:20PM

        by Freeman (732) on Tuesday September 10 2019, @03:20PM (#892235) Journal

        When I got into tinkering with Python a year or two ago, I read up on which version of Python I should be using. I went with Python3, as everyone was saying: use Python3 unless you're stuck on Python2 for some big project.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 4, Interesting) by DannyB on Tuesday September 10 2019, @03:01PM (7 children)

      by DannyB (5839) Subscriber Badge on Tuesday September 10 2019, @03:01PM (#892225) Journal

      So ALL modern languages are CRAP, good to know.

      Java user here. While Java has its warts, due to being two decades old, I don't think of it as crap. It does what I need. It continues to be in the very top rated languages year after year. It is heavily used in the biggest businesses, especially financial or 'cloud'.

      The JVM ecosystem is enormous. Other language compilers. An embarrassment of well developed third party libraries.

      Like Python, and unlike C++, all of the third party libraries can interoperate together, because . . . GC. Everything uses the same memory management discipline. Unlike C / C++ where major projects introduce their own memory management discipline not necessarily like other major projects.

      One thing Java has done right is backward compatibility. It's always been easy. Old compiled code continues to run. They've strived hard to do this because, as Python 2 / 3 shows, you can't just expect everything to get rewritten.

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 4, Interesting) by ilsa on Tuesday September 10 2019, @06:16PM (6 children)

        by ilsa (6082) Subscriber Badge on Tuesday September 10 2019, @06:16PM (#892288)

        It's funny because I have the exact opposite interpretation.

        C (and to a lesser extent C++) is still the language that all other languages strive to be because it is unquestionably stable.

        Java tried to do that, but failed in a variety of ways.

        And there's a very simple explanation, and it all started with what was, IMO, a breathtakingly idiotic decision made by Sun that everyone around the world has misguidedly copied.

        In the days of C, and basically most languages of that generation, there was a very clear delineation between the language grammar, and the libraries used in that language. There may have been improvements in libraries, or entirely new libraries, but the language itself was very slow moving, almost to the point of being static. And that paradigm worked very well. The language was stable. There were a set of core libraries that were also very stable, and then things branched out depending on need.

        Then Java came around and decided libraries were an integral part of the language itself. That meant a new version of the language needed to be released to handle new libraries. And since Sun had a severe case of ADD, they kept swapping out wildly different library sets between versions. Should you use AWT, Swing, or god knows what else came out that week? And since *everything* was changing anyway, why not make changes to the core language too? People were going to have to upgrade anyway if they wanted to use the new library versions. The lack of stability became a self-perpetuating excuse to not even try to be stable.

        But Java entered "worse than failure" territory where it was just good enough to do what people needed, and so it proliferated. And as it did so, people's attitudes toward how languages should even work also changed, using Java as a template despite the terrible precedent it set, and now we have an entire generation of new languages which are just as idiotic, if not more so.

        We're now going through languages like used tissues because people constantly come up with a new flavour of the week, everyone goes, "OMG SHINY" and starts using it, only to have to throw all that effort out the window a month or two later because everything has changed again. "Frameworks" are even worse because they're practically a language unto themselves, but without the mental discipline because "it's only a framework."

        This is especially evident in the javascript world, and is one of the reasons why I loathe javascript with an absolute passion. I made the mistake of learning Angular. Then the abject moron that invented Angular decided, "Nope! Not shiny anymore! Time for a rewrite!" and boom, all that effort I spent learning Angular was a complete waste of time. We're hopefully going to look back at this time period, decades from now, with the same disdain people have for the 60s and its garish colour schemes.

        The only reason Java could be considered relatively stable was because when Oracle bought Sun, they had absolutely no idea what they were doing and so Java was stuck at v1.6 for a long time. Then they eventually squeezed out 1.8 which wasn't a huge departure from 6. But now they've moved to an accelerated cadence so we can look forward to Java becoming an unstable pile of crap again.

        • (Score: 2) by DannyB on Tuesday September 10 2019, @07:39PM (5 children)

          by DannyB (5839) Subscriber Badge on Tuesday September 10 2019, @07:39PM (#892317) Journal

          very clear delineation between the language grammar, and the libraries used in that language

          Not quite sure what you're saying. Java grammar has been pretty stable, with few additions over time. In fact, most new language features seem to begin at Java 8 and beyond. I think of libraries as JARs -- binaries. They continue to work even with newer compilers used to compile code calling old libraries.

          Java came around and decided libraries were an integral part of the language itself

          I'm not sure what you mean. With C I can include a header, and at runtime be linked with a binary -- possibly written in a different language. With Java, I can include a class or classes, and at runtime be dynamically linked with those classes on the classpath -- possibly written in a different language.

          using Java as a template despite the terrible precedent it set, and now we have an entire generation of new languages which are just as, if not more so, idiotic.

          I don't understand what you're getting at. There have been a few new languages, over time. Groovy. Scala. Clojure. Kotlin, come to mind. I know there have been a few others. Other than that these run on the JVM, I'm not sure why you blame Java. I could blame C that there are new languages that compile to code that can be called from C and vice versa. If you don't like a new language, don't use it.

          Sun had a severe case of ADD, they kept swapping out wildly different library sets between versions.

          This might be before I got into Java. I would appreciate an example.

          Should you use AWT, swing, or god know what else came out that week?

          I think for a very long time it was only AWT or Swing. A few years back Java FX was introduced, and then significantly improved. I could argue with C, should I use GTK, QT, wxWidgets, etc. I don't blame C that there are multiple GUI frameworks to choose from. In fact, I would consider that choice to be an advantage, as they probably offer various pros and cons. I have occasionally even seen projects that let you use QT or something from the C/C++ world in Java. I wouldn't blame any language for this -- even if I thought that was a bad thing that deserved some kind of blame.

          The lack of stability became a self-perpetuating excuse to not even try to be stable.

          I would really be interested in an example of what you mean. The biggest change I can think of in Java, before Java 8 / 9, was the introduction of Generics, which was completely backward and forward compatible at the source and binary level.

          We're now going through languages like used tissues because people constantly come up with a new flavour of the week

          That has been going on in every area of the industry since at least the 1980's. There have constantly been new languages, compilers, libraries, frameworks, tools, IDEs, etc. It never has and never will stop. In the last decade, JavaScript has been especially bad for new web framework of the week -- on the front end. Java has had a number of server backend frameworks over time, but not so many as JavaScript. But again, I see this as a feature rather than a problem.

          I can clearly see that you hate Java. And that's fine. Don't use it. It hasn't been the top or in the top few languages for years now for no reason.

          I came from a Pascal background in the 80's. In the mid 90's I tried C and C++ quite seriously. (not just dabbling) For example, I wrote a BASIC interpreter just for fun that ran in the GUI. (Using Metrowerks CodeWarrior on both Classic Mac and Windows 95/98 ish) I wrote a tool to convert an old proprietary database format into DBF usable by dBase / FoxPro / etc. (Back in the early 1980's, in Pascal, on microcomputers, there weren't off the shelf databases we needed -- so we had to write our own, and it was uphill both ways!) My point is I didn't just dabble a little bit with C and C++. But I didn't consider them the future for our next generation of GUI software, and later Web software. And the rest of the industry seemed to agree. IMO, C, and also C++ are just too low level for application programming. Great for boot loaders, OSes, microcontrollers. But you shouldn't be writing applications in C.

          A language is too low level when it forces you to think about things that are irrelevant to your problem.

          --
          To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
          • (Score: 2) by FatPhil on Tuesday September 10 2019, @11:20PM (1 child)

            by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Tuesday September 10 2019, @11:20PM (#892446) Homepage
            > A language is too low level when it forces you to think about things that are irrelevant to your problem.

            So one that makes you wonder whether the answer you want is 42 or Integer.valueOf(42) would be too low level.

            Which reminds me of the "fix" that was the cache of Integers, that's just freaking hilarious; more language decisions like that please, they're laughs for the ages!
            --
            Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
            • (Score: 2) by DannyB on Wednesday September 11 2019, @03:59PM

              by DannyB (5839) Subscriber Badge on Wednesday September 11 2019, @03:59PM (#892757) Journal

              I never seem to have problems with boxed vs unboxed integers or other primitive types. I suppose back in the 90's the decision could have been made that every primitive value must always be an object in the heap -- every boolean, byte, character, integer, etc. But I suppose nobody would have liked that.

              Many languages make practical design decisions for efficiency.

              The cache of integers is something that happens, again for efficiency, in the runtime engine, invisible to the programmer. That's a runtime efficiency implementation decision, not a language decision.

              That would be like blaming C language for some highly optimized malloc library implementation that someone might find amusing for some reason -- despite its efficiency gains.

              --
              To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
          • (Score: 2) by ilsa on Wednesday September 11 2019, @08:07PM (2 children)

            by ilsa (6082) Subscriber Badge on Wednesday September 11 2019, @08:07PM (#892876)

            I got introduced to Java when it was still 1.0, and was doing more than just dabbling by the 1.2 mark. Most of the issues I'm talking about are pre-1.6. Another Java version, another major toolkit change, and you were more or less expected to just... stop using the old versions. The AWT vs Swing (and then JavaFX) thing was one of the most egregious examples. But more fundamental libraries had the same treatment. Luckily, the more fundamental libraries stabilized pretty quickly, because Sun was starting to seriously piss people off.

            And while you're right that C/C++ also had a very diverse ecosystem, the difference is that all these Java packages were sanctioned by Sun. These were official libraries distributed by Sun/Oracle. In a matter of years, Java wasn't just Java anymore. It was now a massive ecosystem of conflicting libraries. It looked like Sun really had no vision, no idea what they were doing, so they were just going to shotgun every conceivable technology/paradigm/whatever and let God sort it all out. And I'm specifically speaking about just what Sun was providing. I am not even considering 3rd party libraries. It was total confusion and a lot of people got turned off from Java by the sheer immensity of it all. It became hard to tell what you *should* be using. They single-handedly proved the XKCD comic about standards.

            C, by comparison, does have a massive ecosystem of libraries and whatnot. However, most of those libraries were 3rd party. You want to use Boost? Sure, no problem! But there is no question that Boost is an external library. Different vendors put out all sorts of different "standard" libraries, but that was typical vendor nonsense where they were trying to lock you into their platform. Borland, Microsoft, etc. They were all pulling these shenanigans, but at no point did anyone ever stand up and say that one of their versions was the "true" C. Of course, that's a mess unto itself, but not one you can blame the language for.

            Unfortunately, at this point it's been years since I've seriously touched Java, so I'm having to work from now-degraded memories. I haven't really touched Java as a language since 1.6, so I don't know how things have been going apart from news releases, which inevitably talk about people's complaints about Java "not moving fast enough", etc. All that talk just makes me think that Java is again entering those bad old days where stuff will get introduced and deprecated ad nauseam, and I've basically tuned out completely now.

            WRT Javascript, we'll have to agree to disagree, because I cannot possibly fathom how you could see the sheer instability of that ecosystem as a good thing. The single biggest cost of all this churn is the complete lack of stability. That lack of stability also means a complete lack of security. But this doesn't affect the developer. The developer just cranks out code, and then drops the final product somewhere like a dog drops its poop in the middle of the park, so someone else has to clean it up. And who are the ones that suffer? The poor suckers that bought the software, who now have to spend even more money not just replacing the software but migrating their old data. Now take Javascript and its "new framework a week" mentality... the tech stack will be not just outdated but obsolete before the project is even finished. We are amassing technical debt at a rate so staggering that keeping up is impossible. So we end up with software that gets written once and is almost completely unmaintainable a week later, because everyone has moved on to the new shiny and there are few people around familiar with the old tech. It's like the Y2K nightmare but at light speed.

            I'm kinda rushing through this, but your final point is regarding C being too low level for things like GUI apps. I would say I sort of agree. Dealing with GUIs, especially in Windows, is SUCH a hassle. Microsoft has done a fantastic job of putting the AWT/Swing issue to shame with their 5 bajillion different libraries and paradigms and solutions. But at the same time, the argument is a cop-out to allow for crap code. Languages that divorce people from the hardware allow for worse-than-failure levels of poor code quality and performance. But it compiles, so who cares? The user can just buy more RAM and storage, right? Look at Facebook, for example. The last time I had their apps installed on my phone, they took up a solid GIGABYTE. For a freaking website portal and a messaging app. At that time, the entire Microsoft Office suite took up less space. When my phone "only" had 16GB of storage, that was devastating. As a user, I shouldn't need to choose between dedicating a phone to just using Facebook vs everything else. And people look to Facebook as a major player in the tech industry. It's depressing.

            Personally, I have my eyes on Rust. It's been relatively stable and they are trying very hard to keep it that way (apparently the 2018 edition introduced some backward-compatibility-breaking changes, but based on their history that's an anomaly). Of all the languages I've seen lately, Rust is the first systems-level language I've seen that is less risky than C(.*) while having strong interoperability with the C ecosystem. The single biggest advantage of Rust is that the model the language enforces sets a very high bar for software quality, so even if we do hit library proliferation like we see with Javascript, even abandoned libraries will be of much higher quality in all ways and won't be nearly as debt-ridden as what we see with current stuff. The one missing piece is decent GUI support. If they can crack that nut, then I see Rust exploding to the top of the charts.

            • (Score: 2) by DannyB on Wednesday September 11 2019, @08:17PM (1 child)

              by DannyB (5839) Subscriber Badge on Wednesday September 11 2019, @08:17PM (#892880) Journal

              That is interesting. Thanks.

              It sounds like you and I have experienced different "halfs" of the Java history. I got started at about 1.5 just as 1.6 was coming in.

              As for JavaScript, my biggest gripe is not about the frameworks, but the language itself. I wish a lot more formal and less slapdash attention had been paid to it in its early days. JavaScript, as a language, has been massively improving in the last few years. But when I read of each great improvement, it is years before I can safely assume I can use those improvements, when they are commonly deployed in everyone's browser. And yes, I know about polyfills.

              I don't mind the number of frameworks in JavaScript -- because nobody is holding a gun to my head to force me to use any of them.

              While I don't use C nor Rust, I also have the opinion that Rust is probably better for what C is mostly used for. I'm not criticizing anyone for using C just because I think it is wholly inappropriate for the problem domain I work in. Java seems to fit the bill rather nicely. And almost the very first thing I said is that Java does have some warts. It is not without its criticisms.

              --
              To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
              • (Score: 2) by ilsa on Wednesday September 11 2019, @08:38PM

                by ilsa (6082) Subscriber Badge on Wednesday September 11 2019, @08:38PM (#892890)

                Ugh... Don't get me started on the javascript language itself. It was originally designed as a slapped-together language to do very basic DOM manipulation, and everything done to improve the language will never be able to fix some of the outrageous flaws introduced at its very inception. If you haven't seen it already, I recommend watching this: https://archive.org/details/wat_destroyallsoftware [archive.org]

                Javascript is the only language I am aware of that has had other languages (TypeScript) written in order to remediate its most offensive aspects.

                I'm really hoping that WebAssembly hurries up and becomes popular. I believe most languages already have wasm compilers written (I know Rust does for sure...), and the sooner we kick Javascript out the door the happier I'll be.

                I strongly recommend, if not actually trying Rust, at least doing some reading about it. It has the most unique and fascinating memory model I've ever seen (which is what gives it such robustness), and it is a great thought exercise in how one designs their code. I haven't actually done anything with it yet (I'm primarily an IT Manager and Sysadmin, so there's not much call for heavy coding there...), but Rust is the first language I've seen in a while that really makes my coding fingers itch to start making something.

  • (Score: 3, Interesting) by Arik on Tuesday September 10 2019, @08:13AM (1 child)

    by Arik (4543) on Tuesday September 10 2019, @08:13AM (#892129) Journal
    "Python2 doesn't have native unicode support"

    Upsets me much less now than a few years ago. Modern unicode is nothing but emoji bullshit anyway. I wouldn't want to support it. I'd probably break it intentionally, in such a way as to make sure it stays broke.

    "a variety of math function issues"

    Can't you just reference a MathML interpreter?

    I won't pretend I grasp all the intricacies here, but in my mind, if you've been maintaining your language for a decade or two and think it has design flaws, you write a new language and avoid those flaws.

    But you don't name the new improved language the same name as the old one. That's just begging for confusion.

    Python 3? How about Boa 1 instead?

    Nah, that would make too much sense. Can't have that.

    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 0) by Anonymous Coward on Tuesday September 10 2019, @07:03PM

      by Anonymous Coward on Tuesday September 10 2019, @07:03PM (#892306)

      > "Python2 doesn't have native unicode support"
      Upsets me much less now than a few years ago

      Of course, what use would you have for unicode? Just because you're stuck with a typewriter, doesn't mean the rest of the world hasn't moved on.

  • (Score: 4, Interesting) by PiMuNu on Tuesday September 10 2019, @08:17AM (4 children)

    by PiMuNu (3823) on Tuesday September 10 2019, @08:17AM (#892130)

    > Python2 doesn't have native unicode support, handles it badly in many existing cases,

    Python3 deprecates the simple string manipulation functions replace, rjust etc and replaces them with a ghastly mini-language. That is a deal breaker. I don't give a rats arse about unicode so this is a non-feature and python3 is a downgrade.

    > as well as a variety of math function issues due to being overly ambiguous in the original implementation.

    I use python2 for serious maths virtually every day and have never encountered these issues.

    > That said, i firmly believe as much use as python gets, it's not ready for primetime.

    Why not? The world has voted against you.

    • (Score: 0) by Anonymous Coward on Tuesday September 10 2019, @09:05AM

      by Anonymous Coward on Tuesday September 10 2019, @09:05AM (#892142)

      You can rjust, replace, etc. in python3. It appears there was some reorganization -- from the string lib to the str builtin -- but I don't recall having noticed the difference in practice. https://docs.python.org/3.7/library/stdtypes.html#text-sequence-type-str [python.org] for details, but I expect you'll find the methods you know and love.
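A runnable Python 3 sketch (illustrative) confirming those familiar methods are still there as methods on str:

```python
# The familiar string-manipulation methods live on the str built-in in
# Python 3; no `import string` is needed for these.
s = "2 spam"

assert s.rjust(10) == "    2 spam"          # right-justify to width 10
assert s.replace("spam", "eggs") == "2 eggs"
assert s.upper() == "2 SPAM"
assert "  padded  ".strip() == "padded"
```

The format mini-language (str.format / f-strings) is an addition on top of these, not a replacement for them.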

    • (Score: 1, Informative) by Anonymous Coward on Wednesday September 11 2019, @05:57AM (2 children)

      by Anonymous Coward on Wednesday September 11 2019, @05:57AM (#892559)

      python 2: 5 / 2 == 2 # int / int returns int
      python 3: 5 / 2 == 2.5 # int / int returns float

      python 2: round(0.5) == 1.0 # round(float) returns float Half-away-from-zero
      python 3: round(0.5) == 0 # round(float) returns int Half-towards-even

      python 2: 010 + 1 == 9
      python 3: 010 + 1 raises SyntaxError

      And those are just the ones I could think of off-hand.
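For reference, the Python 3 side of those examples is runnable as-is, and the old Python 2 behaviours have explicit spellings (`//` for floor division, `0o` for octal literals):

```python
# Python 3 behaviour of the differences listed above:
assert 5 / 2 == 2.5       # true division: int / int -> float
assert 5 // 2 == 2        # floor division keeps the old int semantics
assert round(0.5) == 0    # half-to-even ("banker's") rounding...
assert isinstance(round(0.5), int)   # ...and round(float) now returns int
assert 0o10 + 1 == 9      # octal literals now require the 0o prefix;
                          # a bare leading zero (010) is a SyntaxError
```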

      • (Score: 2) by PiMuNu on Thursday September 12 2019, @08:35AM (1 child)

        by PiMuNu (3823) on Thursday September 12 2019, @08:35AM (#893090)

        > python 2: 5 / 2 == 2 # int / int returns int
        > python 3: 5 / 2 == 2.5 # int / int returns float

        Okay, I started on C programming so I am used to int/int returns int.

        > python 2: round(0.5) == 1.0 # round(float) returns float Half-away-from-zero
        > python 3: round(0.5) == 0 # round(float) returns int Half-towards-even

        Did you type that wrong? Or does 0.5 round down in python 3? This is surely wrong??

        > python 2: 010 + 1 == 9
        > python 3: 010 + 1 raises SyntaxError

        That is an improvement.

        • (Score: 0) by Anonymous Coward on Friday September 13 2019, @05:33AM

          by Anonymous Coward on Friday September 13 2019, @05:33AM (#893539)

          Nope, the behavior in Python 3 is correct, and the bug tracker sees a report about it every few months and the mailing list every few weeks. Technically, the previous way was a bug from back when GvR and other core team members didn't know better.

          Regardless, they changed it in order to conform with IEEE 754, which defaults to half-to-even rounding (also known as banker's rounding) for floating point. The reason for the rule is that it reduces the propagation of rounding errors in one direction, which makes the math more accurate overall. It was done along with a couple more changes they made to the way Python handles floating point numbers and the various operations on them (which are far more technical and less easily noticed than that one). This had the benefit of allowing them to use hardware implementations of IEEE 754 and related standards directly, without having to code around where they contradict the software implementation of floating point in the language. This increased performance, accuracy, code maintainability, and interoperation with other numerical standards (like the General Decimal Arithmetic Specification), aligned behavior with most other languages and implementations, and brought a few other benefits as well.
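A short Python 3 sketch (illustrative) of why half-to-even reduces one-directional drift: ties alternate between rounding down and up, so over many ties the bias cancels.

```python
# Round-half-to-even: exact .5 ties go to the nearest EVEN integer.
assert round(0.5) == 0
assert round(1.5) == 2
assert round(2.5) == 2
assert round(3.5) == 4

# Averaged over many ties, the rounding errors cancel instead of
# drifting consistently upward (as half-away-from-zero would):
ties = [x + 0.5 for x in range(1000)]
assert sum(round(t) for t in ties) == sum(ties)  # no systematic bias
```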

          For all of them, I agree they are all improvements, but I was responding to the implication of your statement that they didn't change anything regarding how they handle math. Like I said, there are more beyond those, but these are probably the ones most people will run into.