
posted by martyb on Sunday July 29 2018, @09:09PM   Printer-friendly
from the Unicode-12.1 dept.

Submitted via IRC for TheRealLuciusSulla

Emperor's 2019 exit will be first era change of information age, and switchover could be as big as Y2K say industry figures

[...] On 30 April 2019, Emperor Akihito of Japan is expected to abdicate the chrysanthemum throne. The decision was announced in December 2017 so as to ensure an orderly transition to Akihito's son, Naruhito, but the coronation could cause concerns in an unlikely place: the technology sector.

The Japanese calendar counts up from the coronation of a new emperor, using not the name of the emperor, but the name of the era they herald. Akihito's coronation in January 1989 marked the beginning of the Heisei era, and the end of the Shōwa era that preceded him; and Naruhito's coronation will itself mark another new era.

But that brings problems. For one, Akihito has been on the throne for almost the entirety of the information age, meaning that many systems have never had to deal with a switchover in era. For another, the official name of Naruhito's era has yet to be announced, causing concern for diary publishers, calendar printers and international standards bodies.

It's why some are calling it "Japan's Y2K problem".

"The magnitude of this event on computing systems using the Japanese Calendar may be similar to the Y2K event with the Gregorian Calendar," said Microsoft's Shawn Steele. "For the Y2K event, there was world-wide recognition of the upcoming change, resulting in governments and software vendors beginning to work on solutions for that problem several years before 1 Jan 2000. Even with that preparation many organisations encountered problems due to the millennial transition."

[...] A much harder problem faces Unicode, the international standards organisation which most famously controls the introduction of new emojis to the world. Since Japanese computers use one character to represent the entire era name (compressing Heisei into ㍻ rather than 平成, for instance), Unicode needs to set the standard for that new character. But it can't do that until it knows what it's called, and it won't know that until late February at best. Unfortunately, version 12 of Unicode is due to come out in early March, which means it needs to be finished before then, and can't be delayed.

Source: https://www.theguardian.com/technology/2018/jul/25/big-tech-warns-japan-millennium-bug-y2k-emperor-akihito-abdication


Original Submission

 
  • (Score: 3, Informative) by canopic jug on Monday July 30 2018, @09:10AM (7 children)

    by canopic jug (3949) Subscriber Badge on Monday July 30 2018, @09:10AM (#714659) Journal

    Computers don't keep time that way, and there was nothing special about the decimal year 1999->2000 other than the formatting on some reports, which may have made things messy for humans to read. Under the hood, time_t is a C integer type, traditionally a signed 32-bit value, and it won't do anything special until January 19, 2038, when it rolls over.

    That will hit embedded systems the hardest. They are the cheapest, meaning usually 32-bit, they have long life cycles, often measured in decades, and they are often forgotten about. So the sooner 32-bit time is deprecated the better. We're already at high risk from having put off fixing time_t for so long. Systems still in use will have to be found, tracked, and audited by 2037, and a real panic will occur, because swapping 32-bit time for 64-bit time is not a simple formatting change like Y2K was. Many embedded systems are simply forgotten about, and those using 32-bit time will make their presence known in 2038 if they are still running.

    I'm not sure the compiler is relevant, whether GCC or Clang/LLVM. The problem lies in the code being compiled. OpenBSD went through their own code and removed 32-bit time four years ago [undeadly.org] already. Theo de Raadt gave several presentations on the time change [openbsd.org]. If I recall correctly, their efforts got everything in place by OpenBSD 5.5. I'm sure they would encourage learning from their Y2038 work.
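    The rollover itself is easy to demonstrate on a desktop (a sketch; the constant is the Unix-epoch second count for 2038-01-19 03:14:07 UTC, and the signed reinterpretation assumes a two's-complement machine, which is every machine you'll meet):

    ```c
    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Simulate what a signed 32-bit seconds counter does at the 2038
       boundary. The increment is done in unsigned arithmetic (wrap is
       well-defined there), then reinterpreted the way a signed time_t
       would read the same bits. */
    int main(void) {
        uint32_t raw = 0x7FFFFFFFu;        /* 2038-01-19 03:14:07 UTC */
        raw += 1;                          /* one more second: wraps to 0x80000000 */
        int32_t as_signed = (int32_t)raw;  /* two's-complement view */
        printf("after rollover: %" PRId32 "\n", as_signed);
        return 0;
    }
    ```

    On a two's-complement machine that prints -2147483648, i.e. the counter jumps back to December 1901.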

    --
    Money is not free speech. Elections should not be auctions.
  • (Score: 1) by anubi on Monday July 30 2018, @10:10AM (4 children)

    by anubi (2828) on Monday July 30 2018, @10:10AM (#714665) Journal

    My concern is that I do things with Arduino-compatibles, with the intent to embed. The toolchain uses a C time_t. How it will handle 2038 is of much concern to me.

    Before I go off on a wild fling trying to patch, I want to see what others are doing.

    Like you say, it's not a trivial changeout. Hence the query. If GCC will handle it, that will be great for me. Recompile and I'm good to go!

    I am trying to make this stuff to last hundreds of years. To be embedded in something else. And just work. Without requiring anyone else's permission. Ever.
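    One quick sanity check before trusting a recompile (a sketch; what it reports depends entirely on the toolchain and C library, and many small-microcontroller libcs still ship a 32-bit time_t):

    ```c
    #include <stdio.h>
    #include <time.h>

    /* Probe the width of time_t on this toolchain. If it is only
       4 bytes, a plain recompile will not fix the 2038 rollover. */
    int main(void) {
        if (sizeof(time_t) >= 8)
            printf("time_t is %zu bytes: good past 2038\n", sizeof(time_t));
        else
            printf("time_t is %zu bytes: rolls over in 2038\n", sizeof(time_t));
        return 0;
    }
    ```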

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 2) by Alfred on Monday July 30 2018, @01:53PM (3 children)

      by Alfred (4006) on Monday July 30 2018, @01:53PM (#714716) Journal
      Arduino does not have a real-time clock unless you add one. If you want stuff to last 100 years, you messed up at the design stage by using an Arduino, especially if you are using dates somehow. However, you are not sunk, maybe. What happens when the "clock" rolls over? Does the device crash, or does it just roll over and keep going? If it crashes you are sunk, but if it rolls over gracefully then you can detect and work with that. Good luck testing it: you will have to set the value near the rollover point and see what happens. You may not be able to set that register from the Arduino IDE; you may have to go to AVR Studio or something.
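      That rollover test can also be staged on a desktop before touching the hardware (a sketch; it assumes a host whose time_t is 64-bit, so the dates format correctly on both sides of the boundary):

      ```c
      #include <stdio.h>
      #include <time.h>

      /* Walk one second either side of the 2038 boundary and let the
         host's gmtime()/strftime() format each value, to see what the
         display code does as the counter crosses over. */
      int main(void) {
          time_t boundary = (time_t)2147483647;  /* 2038-01-19 03:14:07 UTC */
          char buf[32];
          for (int i = -1; i <= 1; i++) {
              time_t t = boundary + i;
              struct tm *tm = gmtime(&t);
              strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", tm);
              printf("%s\n", buf);
          }
          return 0;
      }
      ```

      On a 64-bit host the last line reads 2038-01-19 03:14:08; on a 32-bit time_t that last value is exactly where things go wrong.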
      • (Score: 1) by anubi on Tuesday July 31 2018, @04:53AM (2 children)

        by anubi (2828) on Tuesday July 31 2018, @04:53AM (#715078) Journal

        You are right about not having a "real" clock. I use a DS1307, supposedly good until the year 2107, but I also have some legacy POSIX time_t variables which I am coupling to an "epoch" word to extend the time, and had to re-write some time display and compare routines.

        32 bits was not enough, 64 seems way overkill. I wanted to use the sign bit, but that would screw everything up for me.

        My implementation is messier than I'd really like, and I was curious whether anyone here was doing similar stuff - just so I don't waste a lot of my time coming up with something other than what everyone else is going to do.

        I am trying to put these things into places where people expect to program them, then leave 'em be until they want to change something.
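        The epoch-word scheme described above might look something like this (a sketch with names of my own invention; anubi's actual layout and routines are not shown in the thread):

        ```c
        #include <stdint.h>
        #include <stdio.h>

        /* Pair a 32-bit POSIX-style seconds counter with a small "epoch"
           word that increments each time the low word wraps, giving a
           64-bit effective range while the low word stays compatible. */
        typedef struct {
            uint32_t seconds;  /* low 32 bits, POSIX-compatible */
            uint16_t epoch;    /* number of 2^32-second wraps since 1970 */
        } ext_time_t;

        int64_t ext_to_64(ext_time_t t) {
            return ((int64_t)t.epoch << 32) | t.seconds;
        }

        void ext_tick(ext_time_t *t) {
            if (++t->seconds == 0)  /* low word wrapped: carry */
                t->epoch++;
        }

        int main(void) {
            ext_time_t t = { 0xFFFFFFFFu, 0 };  /* one tick before a wrap */
            ext_tick(&t);
            printf("epoch=%u seconds=%u total=%lld\n",
                   t.epoch, t.seconds, (long long)ext_to_64(&t));
            return 0;
        }
        ```

        The wrap carries cleanly: the example prints epoch=1 seconds=0 total=4294967296.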

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
        • (Score: 2) by Alfred on Tuesday July 31 2018, @09:12PM (1 child)

          by Alfred (4006) on Tuesday July 31 2018, @09:12PM (#715399) Journal
          I think you would use an unsigned int for time.

          In microcontrollers class we had a similar problem, because we were using 8-bit chips. We had to use two bytes, where one was the overflow from the other. We were making a 16-bit int the hard way. I don't think we were counting time, so we used an interrupt for whatever we were counting. Using an interrupt for overflowing time could be bad.

          A lot of it comes down to the chip's behavior and feature set. If you don't need milli- or nanosecond resolution, maybe your chip, or an external RTC, has a plain whole-seconds counter you can use.

          Just thoughts, use or discard at your discretion. I'm sure someone else has already done this in a way better than I can imagine.
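          The two-byte trick from that class might be sketched like this (my own names; the idea is just to carry into the high byte whenever the low byte wraps):

          ```c
          #include <stdint.h>
          #include <stdio.h>

          /* A 16-bit counter built "the hard way" from two 8-bit bytes,
             as on an 8-bit chip with no wider registers. */
          typedef struct { uint8_t lo, hi; } counter16;

          void count_tick(counter16 *c) {
              if (++c->lo == 0)  /* low byte wrapped: carry into high byte */
                  c->hi++;
          }

          uint16_t count_value(const counter16 *c) {
              return (uint16_t)(((uint16_t)c->hi << 8) | c->lo);
          }

          int main(void) {
              counter16 c = {0, 0};
              for (int i = 0; i < 300; i++)  /* past one low-byte wrap */
                  count_tick(&c);
              printf("%u\n", count_value(&c));
              return 0;
          }
          ```

          Ticking 300 times crosses one low-byte wrap and yields 300, i.e. hi=1, lo=44.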
          • (Score: 1) by anubi on Wednesday August 01 2018, @02:59AM

            by anubi (2828) on Wednesday August 01 2018, @02:59AM (#715503) Journal

            About unsigned ints... sometimes I need to reference something that happened, which requires me to add a negative time to get a pointer into the past. POSIX time_t is traditionally a signed 32-bit integer, which is exactly what makes that work. I wish to remain compatible.

            Nearly all my stuff is POSIX compatible, even with the rollover - it won't lock up, but it will display the time and date referenced back to the POSIX zero of Thursday, January 1, 1970, minus about 68 years... somewhere in the December 1901 timeframe.

            Just rattling the cage to see if anyone is using some sort of time_t extensions to get around the 2038 rollover.

            Otherwise I still use UNIX time, but add offsets and fix the leap year and day of the week so it will display correctly.
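            One common workaround along those lines (a sketch, not anubi's actual routines): reinterpret the wrapped signed value as unsigned, which pushes the overflow out to early 2106 at the cost of losing genuinely pre-1970 dates.

            ```c
            #include <stdint.h>
            #include <stdio.h>

            /* Map a wrapped signed 32-bit timestamp onto a 64-bit count
               by treating the raw bits as unsigned: "negative" values
               land just past 2038 instead of back in 1901. */
            int64_t unwrap32(int32_t raw) {
                return (int64_t)(uint32_t)raw;
            }

            int main(void) {
                /* What a 32-bit counter holds one second after the
                   2038-01-19 03:14:07 boundary. */
                int32_t wrapped = INT32_MIN;
                printf("%lld\n", (long long)unwrap32(wrapped));
                return 0;
            }
            ```

            The example prints 2147483648, the correct second count for 2038-01-19 03:14:08.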

            What I came up with is sorely lacking in elegance.

            That means my code only works on my stuff, and other people have to figure out what I did to code for it. Which goes against why I am basing my stuff on Arduino in the first place. It is my intention that your average high-school kid can program my stuff, with freely available tools.

            Being I was brought up in the oil fields, where it was common to find stuff that had been in place for over a hundred years, still doing what it was supposed to do, I felt a need to build a technology that would do the same... simply do what it was told to do, until someone pulls it loose and tells it to do something else. The stuff I am seeing today is so full of rights enforcement and "customer lock-in" technologies that I would have a hard time designing this kind of stuff into any sort of industrial plant that has a design life of over a century. The kind of crap I am seeing today looks far more suited for something that's not designed to last more than five years. Like building temporary bleachers for a town event... why use concrete, steel, or treated wood for that kind of thing? Just use the off-the-shelf business-grade stuff of the day, and toss it when you are done.

            --
            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 0) by Anonymous Coward on Tuesday July 31 2018, @06:30PM (1 child)

    by Anonymous Coward on Tuesday July 31 2018, @06:30PM (#715328)

    Lots of embedded systems might still be in the 1900s as far as they know.

    Plenty of such systems don't need to know the current 4 digit year correctly.

    It's just like those old VCRs or microwave ovens with the wrong time settings.

    • (Score: 0) by Anonymous Coward on Wednesday August 01 2018, @03:08AM

      by Anonymous Coward on Wednesday August 01 2018, @03:08AM (#715507)

      Quite right... I have a lot of stuff that has embedded microprocessors, and they are not going to die just because time_t overflows. I don't believe a single one of them even knows what time_t is. An integral part of my design involves timestamping events. Overflowing time_t does not crash anything... but it makes the time and date display as a little over 136 years ago.

      Maybe I'll abandon POSIX for a 64-bit time word. Considering how insignificant eight bytes per timestamp is, I am sorely tempted to recode it. I'm open to suggestions, before I coin yet another incompatible "standard".
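      A 64-bit time word that keeps the POSIX zero might look like this (a sketch under the assumptions above; plain sign extension of the legacy 32-bit values preserves pre-1970 dates, and a 64-bit seconds count won't overflow for about 292 billion years):

      ```c
      #include <stdint.h>
      #include <stdio.h>

      typedef int64_t time64_t;  /* seconds since 1970-01-01 00:00:00 UTC */

      /* Upgrade a legacy signed 32-bit timestamp: sign extension keeps
         both pre-1970 dates and the 1970 zero point intact. */
      time64_t time64_from32(int32_t t32) {
          return (time64_t)t32;
      }

      int main(void) {
          /* One second past the old 2038 limit: no wrap in 64 bits. */
          time64_t after_2038 = time64_from32(2147483647) + 1;
          printf("%lld\n", (long long)after_2038);
          return 0;
      }
      ```

      The example prints 2147483648, and negative (pre-1970) timestamps round-trip unchanged.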