Submitted via IRC for TheRealLuciusSulla
Emperor's 2019 exit will be first era change of information age, and switchover could be as big as Y2K say industry figures
[...] On 30 April 2019, Emperor Akihito of Japan is expected to abdicate the chrysanthemum throne. The decision was announced in December 2017 so as to ensure an orderly transition to Akihito's son, Naruhito, but the coronation could cause concerns in an unlikely place: the technology sector.
The Japanese calendar counts up from the coronation of a new emperor, using not the name of the emperor, but the name of the era they herald. Akihito's coronation in January 1989 marked the beginning of the Heisei era, and the end of the Shōwa era that preceded him; and Naruhito's coronation will itself mark another new era.
But that brings problems. For one, Akihito has been on the throne for almost the entirety of the information age, meaning that many systems have never had to deal with a switchover in era. For another, the official name of Naruhito's era has yet to be announced, causing concern for diary publishers, calendar printers and international standards bodies.
It's why some are calling it "Japan's Y2K problem".
"The magnitude of this event on computing systems using the Japanese Calendar may be similar to the Y2K event with the Gregorian Calendar," said Microsoft's Shawn Steele. "For the Y2K event, there was world-wide recognition of the upcoming change, resulting in governments and software vendors beginning to work on solutions for that problem several years before 1 Jan 2000. Even with that preparation many organisations encountered problems due to the millennial transition.
[...] A much harder problem faces Unicode, the international standards organisation which most famously controls the introduction of new emojis to the world. Since Japanese computers use one character to represent the entire era name (compressing Heisei into ㍻ rather than 平成, for instance), Unicode needs to set the standard for that new character. But it can't do that until it knows what it's called, and it won't know that until late February at best. Unfortunately, version 12 of Unicode is due to come out in early March, which means it needs to be finished before then, and can't be delayed.
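For reference, the single-character era names the article alludes to already sit in Unicode's CJK Compatibility block; a short sketch listing them (the codepoint for the new era did not exist at the time of this story — U+32FF SQUARE ERA NAME REIWA was only added later, in Unicode 12.1):

```python
# The existing "square era name" ligature characters in Unicode's
# CJK Compatibility block. Each compresses a two-kanji era name into
# one character, e.g. 平成 (Heisei) -> ㍻.
ERA_SQUARES = {
    "Meiji":  "\u337E",  # ㍾
    "Taisho": "\u337D",  # ㍽
    "Showa":  "\u337C",  # ㍼
    "Heisei": "\u337B",  # ㍻
}

for era, ch in ERA_SQUARES.items():
    print(era, hex(ord(ch)), ch)
```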
Related Stories
New Era Name 'Reiwa' Defines Japan As Emperor Akihito Prepares To Abdicate
Japan has revealed the name of its next imperial era to be "Reiwa," set to begin May 1 as Crown Prince Naruhito is expected to take the throne.
Yoshihide Suga, Japan's chief cabinet secretary, announced the name at a press conference Monday morning local time, unveiling a board with the two kanji characters written on it. While there was some deliberation over the exact meaning, the two characters that make up the new name, or the "gengo," translate roughly to "good fortune" and "peace" or "harmony," according to The Japan Times.
"We hope [the era name] will be widely accepted by the people and deeply rooted as part of their daily lives," Suga told reporters.
The announcement comes as the current "Heisei" era draws to a close after three decades, with Emperor Akihito set to step down on April 30 in the first abdication of the throne in over 200 years.
[...]Announcing the name one month in advance gives companies and government entities time to incorporate the name into paperwork and computer systems, The Guardian reports. Even as the Western calendar has become more widespread in Japan, the era name is still used frequently, including on newspapers, coins and official documents like driving licenses. Under the system, 2019 is known as Heisei 31, or the 31st year of Akihito's reign.
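The "Heisei 31" arithmetic described above is simple to sketch. This is a minimal, whole-year approximation — a real converter must use the exact accession dates, since 2019 is Heisei 31 through 30 April and Reiwa 1 from 1 May:

```python
def to_era_year(year):
    """Map a Gregorian year to (era, year-of-era) under the gengo system.

    Whole-year sketch only: era boundaries actually fall mid-year
    (Heisei began 8 January 1989, Reiwa begins 1 May 2019), so the
    full date is needed for years in which an era changes.
    """
    eras = [("Reiwa", 2019), ("Heisei", 1989), ("Showa", 1926)]
    for name, start in eras:
        if year >= start:
            return (name, year - start + 1)
    raise ValueError("year predates the Showa era")

print(to_era_year(2018))  # ('Heisei', 30)
print(to_era_year(2019))  # ('Reiwa', 1) -- or Heisei 31 before 1 May
```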
Also at BBC.
See also: Japan's New Era Gets a Name, but No One Can Agree What It Means
Previously: MonarchyNews: The King is My Co-Pilot and Japanese Succession "Crisis"
Japan Clears Way for Emperor to Step Down in 1st Abdication in 200 Years
Big Tech Warns of 'Japan's Millennium Bug' Ahead of Akihito's Abdication
Japan grants half a million pardons to mark enthronement of emperor Naruhito
Japan has pardoned more than half a million people found guilty of petty crimes such as traffic violations to mark the formal ascension of Naruhito to the Chrysanthemum throne.
Naruhito proclaimed himself Japan's new emperor and vowed to "stand with the people" after performing a series of ancient rituals on Tuesday that culminated in his appearance on the imperial throne alongside his wife, Empress Masako.
The 59-year-old, who ascended the throne in May following the abdication of his father, Akihito, marked his official enthronement in front of around 2,000 guests, including heads of state and other royals from more than 180 countries.
[...] To mark the occasion on Tuesday, Abe's ultra-conservative government granted pardons to about 550,000 eligible applicants. The decision was not publicly debated.
The pre-war custom of clemency by the emperor, who was revered as a god in those days, has triggered criticism as being undemocratic and politically motivated. At the time of Akihito's own enthronement, 2.5 million people were given amnesty.
Also at CNN, Asahi Shimbun, and Japan Times.
Previously: MonarchyNews: The King is My Co-Pilot and Japanese Succession "Crisis"
Japan Clears Way for Emperor to Step Down in 1st Abdication in 200 Years
Big Tech Warns of 'Japan's Millennium Bug' Ahead of Akihito's Abdication
Japan's Next Era to be Called "Reiwa"
(Score: 5, Funny) by Anonymous Coward on Sunday July 29 2018, @09:18PM (5 children)
Thirty years.
Here in the tech industry, we don't have coders older than 30, and all of our software is recoded in the latest fad language every six months.
Come to think of it, that Linus guy is looking kinda old. Google bros will code a new kernel for us any day now.
(Score: 1, Touché) by Anonymous Coward on Sunday July 29 2018, @10:22PM (1 child)
https://fuchsia.googlesource.com/zircon/ [googlesource.com]
(Score: 0) by Anonymous Coward on Sunday July 29 2018, @10:43PM
Thank you, Captain Obvious. We have until 2021 to adopt Zircon as the trendy hobbyist preferred kernel before the 30th anniversary of Linux. Expect Fuchsia to replace Android everywhere at the same time. Linux will soon become obsolete legacy garbage for unhip losers just like BSD is now.
(Score: 0) by Anonymous Coward on Sunday July 29 2018, @11:47PM
Didn't systemD already replace the kernel? Should be in the next release then.
(Score: 0) by Anonymous Coward on Monday July 30 2018, @08:31AM (1 child)
Been on the throne for 30 years?
That laxative ain't working, man!
(Score: 2) by c0lo on Monday July 30 2018, @08:44AM
On the plus side, can't get more stuffed than that.
No extra shit to fit in, in such a case one is already full of it.
https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
(Score: 3, Funny) by ilPapa on Sunday July 29 2018, @09:25PM (8 children)
A day will come when all Americans count the date starting with Donald Trump's inauguration, which had the greatest attendance of any inauguration ever.
Suck it, libs.
You are still welcome on my lawn.
(Score: 2, Touché) by Anonymous Coward on Sunday July 29 2018, @09:32PM (3 children)
We already do. This is the Trump Era, First Term, Year Two.
(Score: 2) by Gaaark on Monday July 30 2018, @12:22AM
Year two, Trump nothing.
--- Please remind me if I haven't been civil to you: I'm channeling MDC. I have always been here. ---Gaaark 2.0 --
(Score: 1, Troll) by ilPapa on Monday July 30 2018, @02:11AM
"We"? Are you speaking for the entire FSB?
You are still welcome on my lawn.
(Score: 2) by krishnoid on Monday July 30 2018, @09:58PM
Day 556, hour 6 ... dear God ...
(Score: 1, Touché) by Anonymous Coward on Sunday July 29 2018, @09:46PM (2 children)
I think June 18, 2018 is a good mark for the beginning of a new era, when Donald J. Trump announced the formation of the United States Space Force. Space Era, Universal Era, take your pick.
(Score: 3, Interesting) by Anonymous Coward on Sunday July 29 2018, @10:25PM (1 child)
You couldn't be more wrong. The Space Age ended in 1972 with the cessation of manned Moon landings. The Information Age which followed it ended in 2006 with the rise of Facebook. We live in the Social Media Age now.
(Score: 2) by c0lo on Monday July 30 2018, @08:51AM
Maybe it just peaked [reviewjournal.com], one can only hope (both FB and Twitter stocks took a deep plunge in the last couple of days).
https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by archfeld on Sunday July 29 2018, @10:48PM
It is Trumplstiltskin and the Orange era.
For the NSA : Explosives, guns, assassination, conspiracy, primers, detonators, initiators, main charge, nuclear charge
(Score: 4, Funny) by Anonymous Coward on Sunday July 29 2018, @10:20PM (2 children)
"Of course, if you upgrade to Windows 10, you will have no problems. We guarantee that Windows 10 will reboot every time a throne changes hands. In fact, Elizabeth's looking a bit creaky so there's another reason to upgrade to Windows 10. Did I mention that Windows 10 handles emojis... in fact, we took quite a few programmers off the security team and reallocated them to Millennial Support Services to ensure that we have the best emojis evah."
(Score: 4, Insightful) by SomeGuy on Monday July 30 2018, @12:54AM (1 child)
That actually puts the problem in a bit of perspective. These unicode shitheads (and I'm not inserting the damn poop unicode here) will happily pull whatever random symbols out of their asses and ram them into the unicode standards at the whim of Apple just to keep their consumertard masses entertained. But when an actual genuine need for a new one comes up... well, fuck that.
(Score: 2, Interesting) by Anonymous Coward on Monday July 30 2018, @05:54PM
ISO/IEC 10646 is the trademark equivalent of patent pools. They catalog and list all the symbols, emojis and characters ISO associated industries agreed upon for use in their documents and products. They don't tell those companies what to do. They're descriptive rather than prescriptive. Unicode was originally a reduced selection of ISO/IEC 10646 maintained solely to contend with the memory constraints of the early 90s. However, they've grown to become far more politically influential over the process. Now, the ISO guys are afraid they'll lose their jobs to Unicode. The Unicode guys are afraid they'll lose their jobs to obsolescence. And there's no one left feeling safe enough in their positions to resist when some Google Japan guy wants a poop emoji since they had those on their feature phones and want to shoehorn it into the telecom standards through the Unicode standard.
Welcome to the corporate-led world government: Where poop is ISO standardized to prevent the competition from trademarking it. May the floods take us all.
(Score: 1, Offtopic) by realDonaldTrump on Sunday July 29 2018, @10:23PM
They're talking about, maybe they'll do a flamingo. Hopefully they will! People don't know this, it's NOT the State Bird of Florida. How did that one happen?! dailymail.co.uk/sciencetech/article-5796875/Emoji-hopefuls-2019-include-emoticons-people-disabilities-flamingo.html [dailymail.co.uk]
(Score: 3, Funny) by arslan on Sunday July 29 2018, @10:35PM (1 child)
The new era under Emperor Naruhito will be called the Naruto era...
(Score: 2) by Gaaark on Monday July 30 2018, @12:31AM
The salute will be the leaf secret finger jutsu: One Thousand Years of Death!
--- Please remind me if I haven't been civil to you: I'm channeling MDC. I have always been here. ---Gaaark 2.0 --
(Score: 4, Insightful) by Anonymous Coward on Sunday July 29 2018, @11:32PM
Why can't it? Just...change the release date...
(Score: 4, Interesting) by isj on Sunday July 29 2018, @11:46PM (3 children)
My first reaction was that this was a good example of the difference between amusing/quirky and outright stupid. Up there with the Prince symbol.
But then I remembered the goal of Unicode: to encode all symbols that are used in writing. So if the Japanese use a new symbol in writing and it isn't simply a ligature then so be it. I still think it is stupid but Unicode should include that new codepoint anyway. I'm still sceptical about including codepoints for family names.
(Score: 0) by Anonymous Coward on Monday July 30 2018, @01:38AM
I'm pretty sure that Prince (still) has more fans than Japan has population.
(Score: 2, Interesting) by Anonymous Coward on Monday July 30 2018, @05:31AM
It is also a brilliant demonstration of how much forward thinking they actually do, especially given that the Emperor of Japan at the time they standardized was not going to live forever. Rather than reserving a block, or at least a codepoint or two, they let both sides of the era-name encodings be taken up by other characters. I am interested to see where this one ends up, as huge chunks of the space are already taken up now.
Oh well, at least they will be tackling real problems again, rather than just coming up with more emojis.
(Score: 2) by SanityCheck on Monday July 30 2018, @08:21PM
Yes, this is exactly why we got the Burrito emoji.
(Score: 0, Funny) by Anonymous Coward on Monday July 30 2018, @12:57AM (1 child)
Or they could switch to a sane calendar system.
Like one based on long dead magic man Jebus' child prodding dick. Yea, sane.
(Score: 2) by HiThere on Monday July 30 2018, @01:20AM
Well, there is no sane way for picking the zero point, except the one that other people are using. Network effects matter.
FWIW, I normally use CE and I don't believe that there ever was a historical Jesus, at least not within a century of the "official date", but since CE and AD use the same zero point and year length, that doesn't bother me. I do, however, normally write dates in the format yyyy-mm-dd, because it makes sorting easier. (ISO 8601)
Now if they want to source their calendar with a different zero point, that's no real problem as long as they define the correspondence to CE. But if they want to change the starting day of the year, they're asking for problems.
According to what I've read, most Japanese applications appear to also use CE dates internally, so the problem will be limited to display and entry. Many programs seem to be planning on handling this by allowing dates to be entered in CE form as well as in nation-specific form. That should be an easy enough approach. The problem will be those programs that aren't, possibly because they can't be, updated.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
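The parent's point about yyyy-mm-dd making sorting easier can be checked in a couple of lines: ISO 8601 dates sort chronologically under plain string comparison, because the fields run most-significant-first and are zero-padded.

```python
# Lexicographic sort == chronological sort for ISO 8601 date strings.
dates = ["2019-05-01", "1989-01-08", "2019-04-30", "2038-01-19"]
print(sorted(dates))

# A day-first format loses the property: 30 April compares "after" 1 May.
assert "30/04/2019" > "01/05/2019"
```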
(Score: 1) by anubi on Monday July 30 2018, @08:13AM (9 children)
Now, Y2K did not scare me one iota. I do not think it scared anyone here, either.
Computers don't keep time that way, and there was nothing special about the decimal year 1999->2000 other than formatting on some reports, which may have made them messy for humans to read. Other than that, time_t is typically a signed 32-bit integer, and won't do anything special until January 19, 2038, when it will roll over. [wikipedia.org]
Any thoughts on that one? Is GCC prepared, so that those of us who have coded for Arduinos will roll over nicely?
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
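The rollover asked about here is easy to simulate. Assuming a signed 32-bit time_t (the case the Wikipedia link describes), the last representable second and the wrapped value work out to:

```python
import datetime

# The Unix epoch: seconds are counted from here.
EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)

# Last moment a signed 32-bit time_t can represent (2**31 - 1 seconds):
last = EPOCH + datetime.timedelta(seconds=2**31 - 1)
print(last)     # 2038-01-19 03:14:07+00:00

# One tick later the counter wraps to -2**31, landing in December 1901:
wrapped = EPOCH + datetime.timedelta(seconds=-2**31)
print(wrapped)  # 1901-12-13 20:45:52+00:00
```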
(Score: 1) by shrewdsheep on Monday July 30 2018, @08:48AM
You were probably not involved in any user-facing project at the time. The internal coding was pretty much irrelevant. It was the user-facing forms and the processing of the input (a.k.a. date += 1900). Also, at the time you did not have C++ all over the place. A lot of custom code/languages were around which may or may not have had a seconds-based date measurement - it was before open source really took off.
(Score: 3, Informative) by canopic jug on Monday July 30 2018, @09:10AM (7 children)
Computers don't keep time that way, and there was nothing special about the decimal year 1999->2000 other than formatting on some reports, which may have made them messy for humans to read. Other than that, time_t is typically a signed 32-bit integer, and won't do anything special until January 19, 2038, when it will roll over.
That will hit embedded systems the hardest. They are the cheapest, meaning usually 32-bit, they have long life cycles, often measured in decades, and they are often forgotten about. So the sooner 32-bit time is deprecated the better. We're already at high risk by having put off fixing time_t so long. Systems still in use will have to be found, tracked, and audited by 2037, and a real panic will occur, because swapping 32-bit time for 64-bit time is not a simple formatting change like with Y2K. Many embedded systems are simply forgotten about, and those using 32-bit time will make their presence known in 2038 if they are still running.
I'm not sure the compiler is relevant, whether GCC or Clang/LLVM. Where the problem lies is the compiled code. OpenBSD went through their own code and removed 32-bit time four years ago [undeadly.org] already. Theo de Raadt did several presentations on the time change [openbsd.org]. If I recall correctly, their efforts got everything in place by OpenBSD 5.5. I'm sure they would encourage learning from their Y2038 work.
Money is not free speech. Elections should not be auctions.
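One way to see why this is not a simple formatting change: any serialized record with a 32-bit time field simply cannot hold a post-2038 timestamp, so every such struct layout (on disk, on the wire, in firmware) has to change. A small sketch of that failure mode:

```python
import struct

t_2038 = 2**31  # one second past the signed 32-bit maximum

struct.pack("<i", 2**31 - 1)        # last second a 32-bit field can hold
try:
    struct.pack("<i", t_2038)       # the post-rollover value does not fit
except struct.error:
    print("32-bit time field overflows at 2038")

record = struct.pack("<q", t_2038)  # widening the field to 64 bits works,
print(len(record))                  # but the record layout grows to 8 bytes
```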
(Score: 1) by anubi on Monday July 30 2018, @10:10AM (4 children)
My concern is that I do things with Arduino-compatibles, with the intent to embed. It uses C++ time_t. How it will handle 2038 is of much concern to me.
Before I go off on a wild fling trying to patch, I want to see what others are doing.
Like you say, it's not a trivial changeout. Hence the query. If GCC will handle it, that will be great for me. Recompile and I'm good to go!
I am trying to make this stuff to last hundreds of years. To be embedded in something else. And just work. Without requiring anyone else's permission. Ever.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 2) by Alfred on Monday July 30 2018, @01:53PM (3 children)
(Score: 1) by anubi on Tuesday July 31 2018, @04:53AM (2 children)
You are right about not having a "real" clock. I use a DS1307, supposedly good until the year 2107, but I also have some legacy POSIX time_t variables which I am coupling to an "epoch" word to extend the time, and had to re-write some time display and compare routines.
32 bits was not enough, 64 seems way overkill. I wanted to use the sign bit, but that would screw everything up for me.
My implementation is quite messier than I really like, and was curious if anyone here was doing similar stuff - just so I did not waste a lot of my time coming up with something other than what everyone else was going to do.
I am trying to put these things into places where people expect to program them, then leave 'em be until they want to change something.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
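The "epoch word" extension described above can be sketched in a few lines. This is only an illustrative sketch of the scheme as described, with hypothetical names, not any real library's API: keep the 32-bit POSIX seconds counter, and pair it with a small extension word counting how many times the 32-bit counter has wrapped.

```python
# Hypothetical sketch: a wrap-count "epoch word" extends a 32-bit
# seconds counter past 2038 without moving to a full 64-bit time_t.
def extended_to_seconds(epoch_word, time32):
    """Combine a wrap counter and a 32-bit counter into total seconds."""
    return epoch_word * 2**32 + time32

def seconds_to_extended(seconds):
    """Split total seconds-since-1970 back into (epoch_word, time32)."""
    return seconds // 2**32, seconds % 2**32

# A timestamp shortly after the 32-bit counter has wrapped once:
s = extended_to_seconds(1, 100)
assert seconds_to_extended(s) == (1, 100)
print(s)  # 4294967396
```

The cost over plain 32-bit time is two extra bytes per stored timestamp (for a 16-bit epoch word), rather than the four that a full 64-bit word needs.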
(Score: 2) by Alfred on Tuesday July 31 2018, @09:12PM (1 child)
In microcontrollers class we had a similar problem, because we were using 8-bit chips. We had to use two bytes, where one was the overflow from the other. We were making a 16-bit int the hard way. I don't think we were counting time, so we used an interrupt for whatever we were counting. Using an interrupt for overflowing time could be bad.
A lot of it comes down to the chips behavior and feature set. If you don't need milli- or nano- second resolution maybe your chip, or external RTC, has a just plain whole seconds number you can use.
Just thoughts, use or discard at your discretion. I'm sure someone else has done this in a way better than i can imagine already.
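The "16-bit int the hard way" trick above can also be done without interrupts, by detecting the wrap when the counter is read. A hypothetical sketch (names are illustrative only), assuming the code polls at least once per low-counter period:

```python
# Extend a small hardware counter in software: whenever a raw reading is
# lower than the previous one, the low counter must have wrapped, so the
# high word is incremented. Polling replaces the overflow interrupt.
class ExtendedCounter:
    def __init__(self, bits=8):
        self.modulus = 1 << bits  # e.g. an 8-bit hardware counter
        self.high = 0             # software-maintained carry word
        self.last_low = 0

    def read(self, low):
        """Fold a raw low-counter reading into the extended count."""
        if low < self.last_low:   # the low counter wrapped since last poll
            self.high += 1
        self.last_low = low
        return self.high * self.modulus + low

c = ExtendedCounter(bits=8)
print(c.read(200))  # 200
print(c.read(50))   # 306 -- the 8-bit counter wrapped once
```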
(Score: 1) by anubi on Wednesday August 01 2018, @02:59AM
About unsigned INTs... sometimes I need to reference something that happened, which requires me to add in a negative time to get a pointer into the past. POSIX (C/Unix time_t) uses an unsigned 32 bits for this. I wish to remain compatible.
Nearly all my stuff is POSIX compatible, even with the rollover - it won't lock up, but will display the time and date referenced back to the POSIX zero of Thursday, January 1, 1970, minus 68 years... somewhere in the 1901 timeframe.
Just rattling the cage to see if anyone is using some sort of time_t extensions to get around the 2038 rollover.
Else I still use the UNIX time, but add offsets and fix the leap year and day of the week so it will display correctly.
What I came up with is sorely lacking in elegance.
That means my code only works on my stuff, and other people have to figure out what I did to code for it. Which goes against why I am basing my stuff on Arduino in the first place. It is my intention that your average high-school kid can program my stuff, with freely available tools.
Being I was brought up in the oil fields, where it was common to find stuff that had been in place for over a hundred years, still doing what it was supposed to do, I felt a need to build a technology that would do the same... simply do what it was told to do, until someone pulls it loose and tells it to do something else. The stuff I am seeing today is so full of rights enforcement and "customer lock-in" technologies that I would have a hard time designing this kind of stuff into any sort of industrial plant that has a design life of over a century. The kind of crap I am seeing today looks far more suited for something that's not designed to last more than five years. Like building temporary bleachers for a town event... why use concrete, steel, or treated wood for that kind of thing? Just use the off-the-shelf business-grade stuff of the day, and toss it when you are done.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 0) by Anonymous Coward on Tuesday July 31 2018, @06:30PM (1 child)
Lots of embedded systems might still be in the 1900s as far as they know.
Plenty of such systems don't need to know the current 4 digit year correctly.
It's just like those old VCRs or microwave ovens with the wrong time settings.
(Score: 0) by Anonymous Coward on Wednesday August 01 2018, @03:08AM
Quite right... I have a lot of stuff that has embedded microprocessors, and they are not going to die just because time_t overflows. I don't believe a single one of them even knows what time_t is. An integral part of my design involves timestamping events. Overflowing time_t does not crash anything... but makes the time and date display as a little over 136 years ago.
Maybe I should abandon POSIX for a 64-bit time word. Considering how insignificant using eight bytes to store a timestamp is, I am sorely considering recoding it. I'm open for suggestions, before I coin yet another incompatible "standard".
(Score: 0) by Anonymous Coward on Monday July 30 2018, @10:32AM (2 children)
Maybe the Japanese should get their heads out of their collective asses and simply drop the 'year of the emperor' calendar entirely. Then they should join the rest of the world in using a calendar where the numbering does not restart every time the current monarch dies.
But this is Japan. High-tech and ass-backwards at the same time. They will not do anything sane like giving up this asinine calendar system of theirs.
(Score: 4, Insightful) by AthanasiusKircher on Monday July 30 2018, @08:23PM (1 child)
Oh, absolutely. They should join the completely rational system based on the year some magician dude that raised himself from the dead was supposedly born... Er, not supposedly, because it's actually several years off because some other random dude several centuries later miscalculated.
Oh, yeah, that completely rational system that's a combination of bases 60, 24, and a random jumble of month lengths based on an old Roman system that was originally lunar but then was massively screwed up for political reasons and then standardized by imperial decree into the random jumble of 28- to 31-day months that never align with the 7-day cycles people actually organize their lives around. Oh, except for every fourth year (except for years divisible by 100, except for those divisible by 400), when we add a day NOT at the end of the year (which might make some sense) but at the end of month 2, which was the last month of the year millennia ago; but the random mash-up of resurrected dudes, erroneous monks, and emperors creating a mess of things doesn't really care about stuff like that.
Yeah, Japan obviously should switch to that calendar.
[TL;DR -- Japan's calendar is just as screwy as the one the West seems to have settled on. You're just more used to the stupidity of the one familiar to you.]
(Score: 1) by anubi on Wednesday August 01 2018, @03:20AM
There is precedent... I understand you can trace a modern railroad's track spacing all the way back to the spacing of Roman chariot wheels. [naciente.com]
I imagine reference points for keeping time are just as set by precedent. Look at ours... Anno Domini. Year of our Lord. We have to drop a reference point somewhere, and we've been referencing that point in time ever since the Church laid it down some fifteen centuries ago. Then we have POSIX time as well... referenced to January 1, 1970, which is coming up on a 32-bit signed int overflow in 2038, which has been of significant concern to me, as I am designing some stuff I intend to last long beyond that.
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]