

posted by n1 on Friday July 14 2017, @02:40AM   Printer-friendly
from the I'll-second-that! dept.

Not one to let trivia pass unnoticed, the timing of this post has a mildly interesting significance.

Some of you may be old enough to recall the Y2K bug (or may have even helped in avoiding the predicted calamity). Thanks to an incredible effort, the world survived relatively unscathed.

So we're in the clear, now. Right?

Not quite. In the land of Unix timekeeping, there is another rollover bug coming up, when the number of seconds since the Unix epoch (Jan 1, 1970) exceeds the largest value a signed 32-bit number can hold: 2,147,483,647. That happens on January 19, 2038 at 03:14:08 UTC. [See Wikipedia's Year 2038 problem entry for more details.]

The timing of this post marks a milestone on the way to that rollover: 1,500,000,000 seconds since the Unix epoch, which works out to 2017-07-14 02:40:00 UTC. (Queue Cue horns and fanfares.)
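For anyone who wants to check the arithmetic, both timestamps fall out of a few lines of Python (a quick sketch, not part of the original story):

```python
from datetime import datetime, timezone

# The milestone this post celebrates: 1.5 billion seconds since the epoch.
milestone = 1_500_000_000
print(datetime.fromtimestamp(milestone, tz=timezone.utc))  # 2017-07-14 02:40:00+00:00

# The largest value a signed 32-bit counter can hold; one second after
# this moment, a 32-bit time_t overflows.
limit = 2**31 - 1  # 2,147,483,647
print(datetime.fromtimestamp(limit, tz=timezone.utc))      # 2038-01-19 03:14:07+00:00
```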

Besides taking note of a mildly interesting timestamp, I'd like to offer for discussion: Falsehoods programmers believe about time.

What memorable time (or date) bugs have you encountered?

I once worked at a company where the DBA (DataBase Analyst) insisted that all timestamps in the database be in Eastern Time. Yes, they would fluctuate when we entered/exited Daylight Saving Time. Even better, this was a central database correlating inputs from PBXs (Private Branch Exchanges) across all four time zones in the US. No amount of discussion on my part could convince him otherwise. I finally documented the situation like crazy and left it to reality to provide the final persuasion. Unfortunately, a defect in the design of their hardware manifested at a very inopportune time, and the company ended up folding.


Original Submission

Related Stories

Y2038: It's a Threat 26 comments

Steven Bellovin, Professor of Computer Science at Columbia University writes briefly with a concrete example of how the Y2038 threat works.

[...] just as with Y2K, the problems don't start when the magic date hits; rather, they start when a computer first encounters dates after the rollover point, and that can be a lot earlier. In fact, I just had such an experience.

A colleague sent me a file from his Windows machine; looking at the contents, I saw this.

$ unzip -l zipfile.zip
Archive:  zipfile.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
  2411339  01-01-2103 00:00   Anatomy...
---------                     -------

Look at that date: it's in the next century! (No, I don't know how that happened.) But when I looked at it after extracting on my [MacOS] computer, the date was well in the past:

$ ls -l Anatomy...
-rw-r--r--@ 1 smb staff 2411339 Nov 24 1966 Anatomy...

Huh?

After a quick bit of coding, I found that the on-disk modification time of the extracted file was 4,197,067,200 seconds since the Epoch. That's larger than the limit! But it's worse than that. I translated the number to hexadecimal (base 16), which computer programmers use as an easy way to display the binary values that computers use internally. It came to FA2A29C0. (Since base 16 needs six more digits than our customary base 10, we use the letters A–F to represent them.) The first "F", in binary, is 1111. And the first of those bits is the so-called sign bit, the bit that tells whether or not the number is negative. The value of FA2A29C0, if treated as a signed, 32-bit number, is -97,900,096, or about 3.1 years before the Epoch. Yup, that corresponds exactly to the November 24, 1966 date my system displayed. (Why should +4,197,067,200 come out to -97,900,096? As I indicated, that's moderately technical, but if you want to learn the gory details, the magic search phrase is "2's complement".)
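Bellovin's arithmetic is easy to reproduce. This sketch (mine, not his) reinterprets the same value as a signed 32-bit number via Python's struct module:

```python
import struct
from datetime import datetime, timedelta

mtime = 4_197_067_200              # seconds since the epoch, as stored on disk
print(hex(mtime))                  # 0xfa2a29c0 (the FA2A29C0 above)

# Reinterpret the same 32 bits as a signed integer (two's complement).
signed = struct.unpack('<i', struct.pack('<I', mtime))[0]
print(signed)                      # -97900096

# That lands about 3.1 years before the epoch, in November 1966.
print(datetime(1970, 1, 1) + timedelta(seconds=signed))
# 1966-11-24 21:31:44
```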

While attention is paid to desktops and servers, those are easy to replace and have short lifetimes. Embedded systems, in contrast, deserve special attention already: they are seldom or never updated, and have lifespans measured in decades.

Previously:
Reducing Year 2038 Problems in curl (2018)
The Time Is... 1500000000 (2017)


Original Submission

  • (Score: 1, Funny) by Anonymous Coward on Friday July 14 2017, @02:49AM (3 children)

    by Anonymous Coward on Friday July 14 2017, @02:49AM (#538948)

    (Queue horns and fanfares.)

    Why are the sounds getting into line?

    • (Score: 5, Funny) by isostatic on Friday July 14 2017, @03:07AM

      by isostatic (365) on Friday July 14 2017, @03:07AM (#538950) Journal

      They're British

    • (Score: 3, Interesting) by FatPhil on Friday July 14 2017, @07:11AM (1 child)

      because they're waiting to be played at the appropriate time?

      Apropos of nothing, etymology online has this insight:

      queue (n.)
              late 15c., "band attached to a letter with seals dangling on the free end," from French queue "a tail," from Old French cue, coe "tail" (12c., also "penis"), from Latin coda (dialectal variant or alternative form of cauda) "tail," of unknown origin. Also in literal use in 16c. English, "tail of a beast," especially in heraldry. The Middle English metaphoric extension to "line of dancers" (c. 1500) led to extended sense of "line of people, etc." (1837). Also used 18c. in sense of "braid of hair hanging down behind" (first attested 1748).

      It's curious to see:
      a) the french frenched up their own former simple spelling "cue". Fuck french spellings. Noah was right, english spelling sucks, and it's largely the fault of french.
b) reference to a penis, I wasn't expecting that (though I was familiar with the hair definition, having a 40+cm queue myself)
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 2) by Osamabobama on Friday July 14 2017, @09:49PM

        by Osamabobama (5842) on Friday July 14 2017, @09:49PM (#539360)

        Do you wear horns in your queue?

        --
        Appended to the end of comments you post. Max: 120 chars.
  • (Score: 4, Touché) by Mykl on Friday July 14 2017, @02:59AM (7 children)

    by Mykl (1112) on Friday July 14 2017, @02:59AM (#538949)

    1,500,000,000 / 2,147,483,647 = 69.85%, not 75%.
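Mykl's figure checks out, and the "real" 75% point that FatPhil computes below is just as easy to verify (a quick sketch):

```python
from datetime import datetime, timezone

limit = 2**31 - 1                      # 2,147,483,647
print(1_500_000_000 / limit * 100)     # ~69.85%, not 75%

# The true three-quarters mark of the signed 32-bit range:
three_quarters = 3 * 2**29             # 1610612736, i.e. 3 << 29
print(datetime.fromtimestamp(three_quarters, tz=timezone.utc))
# 2021-01-14 08:25:36+00:00
```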

    • (Score: 2) by FatPhil on Friday July 14 2017, @07:14AM

      Indeed. But the real WTF is why we care about a decimal coincidence in an intrinsically binary context.
      When we reach 1610612736 seconds past the epoch (that being 3<<29, the real 75%), I might get less than half as excited as I did when 50% went past over a decade ago. Billion? Pah, non-event. Billion and a half? Less than half a non-event.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 3, Informative) by martyb on Friday July 14 2017, @11:51AM (5 children)

      by martyb (76) Subscriber Badge on Friday July 14 2017, @11:51AM (#539073) Journal

      1,500,000,000 / 2,147,483,647 = 69.85%, not 75%.

      Oh? Oh. Doh! Story updated... good catch!

      And, for those who might be curious WHEN we will reach 75% for real... http://timestamp.online/countdown/1610612735 [timestamp.online] which, at the time of writing this, reported:

      • 110,579,858 seconds
      • 1,842,997  minutes 38 seconds
      • 30,716 hours 37 minutes 38 seconds
      • 1,279 days 20 hours 37 minutes 38 seconds
      --
      Wit is intellect, dancing.
      • (Score: 2) by tangomargarine on Friday July 14 2017, @03:49PM (3 children)

        by tangomargarine (667) on Friday July 14 2017, @03:49PM (#539167)

        ...and to everybody who is *still* wondering when we'll reach 75%, that's January 13, 2021 :P

        This site can’t be reached

        timestamp.online took too long to respond.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 2) by martyb on Friday July 14 2017, @07:55PM (2 children)

          by martyb (76) Subscriber Badge on Friday July 14 2017, @07:55PM (#539298) Journal
          Strange... I just tried the link and it worked fine for me!!?!!
          --
          Wit is intellect, dancing.
          • (Score: 2) by tangomargarine on Friday July 14 2017, @09:11PM (1 child)

            by tangomargarine (667) on Friday July 14 2017, @09:11PM (#539335)

            In Chrome I'm getting timeouts, and IE11 just flat-out refuses to load the page. Don't have Firefox installed on the work machine.

            If the site worked for you, you could've just pasted the date for us into your comment, was my point.

            --
            "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
            • (Score: 2) by martyb on Saturday July 15 2017, @03:55AM

              by martyb (76) Subscriber Badge on Saturday July 15 2017, @03:55AM (#539458) Journal

              In Chrome I'm getting timeouts, and IE11 just flat-out refuses to load the page. Don't have Firefox installed on the work machine.

              If the site worked for you, you could've just pasted the date for us into your comment, was my point.

              Ahh, I see what you mean now. I had been concerned as to how long a wait until the time passed... you wanted to know what that date/time would be. I apologize for my confusion, and not answering your finally-now-clear-to-me question!

              As of THIS writing, the countdown to the 75% completed point reports [timestamp.online]:

              How Much Remains To Unix Time 1610612735 (Thu, 14 Jan 2021 09:25:35 +0100)

              -    110,521,994 seconds
              -    1,842,033  minutes 14 seconds
              -    30,700 hours 33 minutes 14 seconds
              -    1,279 days 04 hours 33 minutes 14 seconds

              AKA: Thu, 14 Jan 2021 08:25:35 UTC.

              --
              Wit is intellect, dancing.
      • (Score: 3, Insightful) by maxwell demon on Saturday July 15 2017, @04:33AM

        by maxwell demon (1608) on Saturday July 15 2017, @04:33AM (#539466) Journal

        However note that this is merely a prediction. Since the site cannot know future leap seconds, they cannot be correctly accounted for.

        --
        The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 2) by isostatic on Friday July 14 2017, @03:09AM (17 children)

    by isostatic (365) on Friday July 14 2017, @03:09AM (#538951) Journal

I remember staying up late (UK time) back on 9/9/01, watching the seconds on my 17" 4:3 CRT running Debian tick over to 1e9, with Slashdot open on the side.
Now, 500 million seconds later, I watched the counter on an Ubuntu laptop. Not quite as late as 16 years ago - I'm in Washington DC on business.

    It amazes me how much changes, but also how little things change.

    • (Score: 2) by kaszz on Friday July 14 2017, @04:08AM (12 children)

      by kaszz (4211) on Friday July 14 2017, @04:08AM (#538963) Journal

Technology changes. People, a lot less.
That's the trend to observe: static humans handling fast-developing technology.

Funny thing is that in the past people were clueless about computers because they lacked exposure. Today they have shiny computers with a 4-core gigahertz processor, gigabytes of DRAM and gigabytes of storage for the price of a bicycle. People are still just as clueless.

      • (Score: 3, Interesting) by FatPhil on Friday July 14 2017, @07:26AM (4 children)

        (minor nit - you probably meant terabytes of storage)

        One might say that people are if anything even more stupid, as there's way more available for them to know, so they know even less of what there is to know.

        And they even know less about some things than people used to know. I bet you an average teenager can't perform the kind of mental arithmetic that a baby-boomer could have done at the same age. I've seen people pull up their cellphone's calculator app to do simple things like multiplying 1.70 by 3, or subtract 5.10 from 10.00, or, worse, subtract 5.10 from 10.10. Frequently. (You may correctly conclude from that that I like giving exact change when I pay by cash.)

Them youngsters can only come onto my lawn when they've calculated its area using Pappus's theorem in their head!
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
        • (Score: 2) by isostatic on Friday July 14 2017, @02:53PM (1 child)

          by isostatic (365) on Friday July 14 2017, @02:53PM (#539130) Journal

          One might say that people are if anything even more stupid, as there's way more available for them to know, so they know even less of what there is to know.

          Back before 1E9:

          Richard Nixon's Head: That's it! You're all going to jail, and don't expect me to grant a pardon like that sissy, Ford.
          Turanga Leela: You'll never pardon anyone because you'll never get elected president. The voters of Earth aren't the pea-brained idiots they were in your time.
          Richard Nixon's Head: Oh, no? Well, listen here, missy. Computers may be twice as fast as they were in 1973, but the average voter is as drunk and stupid as ever. The only one who's changed is me. I've become more bitter and, let's face it, crazy over the years. And when I'm swept into office, I'll sell our children's organs to zoos for meat, and I'll go into people's houses at night and wreck up the place!

          • (Score: 2) by Thexalon on Friday July 14 2017, @09:04PM

            by Thexalon (636) on Friday July 14 2017, @09:04PM (#539330)

            The average voter is as drunk and stupid as ever. The only one who's changed is me. I've become more bitter and, let's face it, crazy over the years. And when I'm swept into office, I'll sell our children's organs to zoos for meat, and I'll go into people's houses at night and wreck up the place!

            So that's where Trump got his campaign pitch!

            --
            The only thing that stops a bad guy with a compiler is a good guy with a compiler.
        • (Score: 2) by kaszz on Friday July 14 2017, @05:31PM

          by kaszz (4211) on Friday July 14 2017, @05:31PM (#539215) Journal

Actually I meant gigabytes, because that's what mobile phones and flash memories hold, and that technology is cheap and portable. Sure, you can get more using a mechanical disk. The point, however, is that storage is not a problem. Unlike when you had to do it at 300 bit/s onto flimsy tapes, which required technical skill to get right, and that being demanded of non-adults without any help.
The real point is that equipment is powerful AND cheap. So there's no real excuse to be clueless about technology.

That people know less than there is to know is just a consequence of innovation moving faster than any individual. But one could have hoped that absolute knowledge levels would be somewhat better. Instead it seems the better opportunities just expose human nature all the more, because even in the past I noticed the "duh? eh bzz bzz from the wall?" type of humans. Evolution is obviously slow. They exist now too; it's just that the expression of the same mental capabilities turns out differently, not better.

        • (Score: 0) by Anonymous Coward on Saturday July 15 2017, @01:55AM

          by Anonymous Coward on Saturday July 15 2017, @01:55AM (#539430)

          > I like giving exact change when I pay by cash.

          You probably have had akin to this scenario:
          Cashier: "$5.15 please"
          You: "Ok, here's a $10 and a quarter"
          Them: "Oh that's too much, here's your quarter back, my machine says from your 10 you get back $4.85"
          You: "Uh, I kind of want bills back."
          Them: "Um." (closes the till) "Next?"

      • (Score: 3, Insightful) by TheRaven on Friday July 14 2017, @08:30AM (6 children)

        by TheRaven (270) on Friday July 14 2017, @08:30AM (#539020) Journal
        They still lack access. The sorts of computers I grew up with dropped you in a programming environment as soon as you turned them on. If you wanted to plug in new hardware, it required understanding a chunk of how the OS worked to be able to configure it. Now, they're given computers that either don't come with any kind of programming environment, or act as if they're ashamed of it and hide it, and where 99% of what they want to do can be expected to just work. It's only when they encounter the 1% that doesn't that they might start digging.
        --
        sudo mod me up
        • (Score: 0) by Anonymous Coward on Friday July 14 2017, @01:06PM (4 children)

          by Anonymous Coward on Friday July 14 2017, @01:06PM (#539100)

          Thanks to companies like Apple, it has become fashionable to actively lock users out of programming their own machines; it's not just hidden away in shame, but hidden away in malice.

          • (Score: 2) by isostatic on Friday July 14 2017, @02:50PM (3 children)

            by isostatic (365) on Friday July 14 2017, @02:50PM (#539127) Journal

            Thanks to companies like Apple, it has become fashionable to actively lock users out of programming their own machines; it's not just hidden away in shame, but hidden away in malice.

In the days of DOS, Microsoft shipped QBasic, which allowed people to play Nibbles or Gorillas and have a fiddle with the code. They stopped that in the 90s.

            OSX comes with perl, python, even ruby, by default. Type 'gcc' and it prompts you to install xcode, not sure if that costs money, but it's certainly not as much of a mountain to climb as with microsoft. OSX is also a gateway drug to a real OS.

            • (Score: 2) by tangomargarine on Friday July 14 2017, @03:46PM (1 child)

              by tangomargarine (667) on Friday July 14 2017, @03:46PM (#539164)

              but it's certainly not as much of a mountain to climb as with microsoft.

              Download Visual Studio, blindly click "OK" ten or eleven times, and wait twenty minutes? Way too complicated! I need to be able to just beat my face against the monitor until it figures out what I want!

              If you want specifically perl, python, and ruby, that's rather hyperbolic. It's easy to crank out an arbitrary executable on Windows as long as you use C++/C#/VB/etc.

              --
              "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
              • (Score: 2) by isostatic on Friday July 14 2017, @10:48PM

                by isostatic (365) on Friday July 14 2017, @10:48PM (#539379) Journal

I haven't "downloaded" software for well over a decade - that's what apt is for. In the 90s programming certainly wasn't free on Microsoft - you had to pay big bucks for a compiler. I suspect that VS was made free (if indeed it is) after Apple released OS X and what appears to be a one-click Xcode installer.

Having a set of useful, easy-to-program tools to make the novice realise that "automating this computer is actually quite easy" is native in both Linux and OS X. It's Microsoft who consider "developers" to be a different "class" of people from users, and have done for years.

            • (Score: 3, Interesting) by jrmcferren on Friday July 14 2017, @07:12PM

              by jrmcferren (5500) on Friday July 14 2017, @07:12PM (#539275) Homepage

Xcode is free, can be downloaded from the App Store, and has support for many programming languages. While the old-fashioned BASIC the old-time machines started up with isn't included, there are a few ways to get BASIC on the Mac.

              For those interested in BASIC:

Chipmunk BASIC is an interpreter available on multiple platforms; I don't think it is open source. On Mac (and possibly Windows as well) you receive two executables: one for the terminal and one for the GUI. The terminal app goes into a folder in the path that you can write to, and the GUI file goes into the Applications folder. I haven't figured out how to handle the manual, since I can't write to the manual folder even using sudo, but the idea is that you copy that file to the appropriate place and you have the manual as well.

Chipmunk BASIC is like the BASIC the old-time machines booted with. It requires line numbers unless you use an external editor. It also has some platform-specific capabilities, most notably (on Mac) text-to-speech and text-to-(audible)-Morse.

QB64 requires Xcode as, if I understand correctly, it must be compiled from source on the Mac. QB64 is a Microsoft QuickBASIC clone and is designed to create executables on modern 64-bit systems.

        • (Score: 2) by kaszz on Friday July 14 2017, @05:35PM

          by kaszz (4211) on Friday July 14 2017, @05:35PM (#539222) Journal

You have a point. But it also seems many people just lack the curiosity, even young ones.
(Which some studies have linked to depression and stress.)

          It's like assembler on machines with BASIC. You had to do something actively to get there. And assembler would do just about anything to put you off even if you got there.

    • (Score: 2) by maxwell demon on Friday July 14 2017, @05:59AM (3 children)

      by maxwell demon (1608) on Friday July 14 2017, @05:59AM (#538981) Journal

      back on 9/9/01

      Which of the two nines is the month? ;-)

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 0) by Anonymous Coward on Friday July 14 2017, @07:03AM

        by Anonymous Coward on Friday July 14 2017, @07:03AM (#539001)

        Which one of the 1s is the year?

      • (Score: 2) by Osamabobama on Friday July 14 2017, @10:00PM (1 child)

        by Osamabobama (5842) on Friday July 14 2017, @10:00PM (#539365)

        Mostly, it depends on where you are. Or, in this case, were. And, by "you," I don't mean you, but rather isostatic, unless you were together at the time, in which case I mean y'all.

        I hope that clears it up.

        --
        Appended to the end of comments you post. Max: 120 chars.
        • (Score: 0) by Anonymous Coward on Saturday July 15 2017, @01:51AM

          by Anonymous Coward on Saturday July 15 2017, @01:51AM (#539429)

          +1 to UID#5842 for very rare correct use of "y'all" instead of "one"

  • (Score: 0) by Anonymous Coward on Friday July 14 2017, @03:10AM (1 child)

    by Anonymous Coward on Friday July 14 2017, @03:10AM (#538952)

    That's just barely enough time to organize a party for the next one.

    • (Score: 0) by Anonymous Coward on Friday July 14 2017, @09:33AM

      by Anonymous Coward on Friday July 14 2017, @09:33AM (#539038)

      That's still more than Pi years!

  • (Score: 2) by krishnoid on Friday July 14 2017, @03:17AM (4 children)

    by krishnoid (1156) on Friday July 14 2017, @03:17AM (#538954)

    There are also lists of falsehoods that programmers believe about addresses [mjt.me.uk], names [kalzumeus.com], and other descriptors. I'd like to have three case studies/examples of each falsehood to refer to when decision- and schedule-makers inevitably choose to assert that these problems don't exist, don't come up in the real world, or that their behavior can be left undefined.

    We could then determine which of these issues to make a conscious decision to gloss over. Or at the least, introduce a little padding into your typical project schedule while stakeholders bikeshed the issue to death.

    • (Score: 3, Interesting) by fishybell on Friday July 14 2017, @03:41AM (2 children)

      by fishybell (3156) on Friday July 14 2017, @03:41AM (#538957)

      Gaaaaah. The addresses thing messes me up all the time.

      I live in Utah, where almost every address is a grid location (ex. 500 N 300 E), not a number plus street. (ex. 500 Main).

      Just today I couldn't pay a medical bill online because they wanted my street number, plus the street name rather than just the whole thing together.

      • (Score: 2) by kaszz on Friday July 14 2017, @04:13AM

        by kaszz (4211) on Friday July 14 2017, @04:13AM (#538965) Journal

        Street: 300E
        Number: 500N ? ;-)

        Or just make one up? "500 Gigajoules, Utah".
        Oh it doesn't exist? must be wrong on your computer. Hit it hard! If it doesn't work it might be that the Coriolis effect is upclocking your mains frequency so your Ze-Pe-You can't think fast enough!

      • (Score: 2) by VLM on Friday July 14 2017, @02:06PM

        by VLM (445) on Friday July 14 2017, @02:06PM (#539114)

It's the modern version of refusing to do business with "New Mexico" because we don't do international.

I recall once seeing an address field that refused to accept the absence of an apartment number; you had to enter 0000 to be accepted. Admittedly that was a very long time ago.

A surprisingly common mistake is refusing to accept city names containing spaces.

Another good example is ZIP codes; it seems roughly 50/50 whether ZIP+4 will be mandatory or forbidden.

        I would guess less than 1% of programmers working with addresses know about https://www.usps.com/nationalpremieraccounts/manageprocessandaddress.htm [usps.com]

    • (Score: 1, Interesting) by Anonymous Coward on Friday July 14 2017, @07:40AM

      by Anonymous Coward on Friday July 14 2017, @07:40AM (#539007)
      Most of the name myths are true. It's just that people try to substitute in other concepts, call them a "name" when really they aren't, and then claim that they aren't handled correctly. Yes that is the NTSF, I take no shame in that.

      If you don't have something which conforms to the simplified concept of a name, then you don't have a name. If your culture doesn't have something which conforms to the simplified concept of a name, then your culture doesn't have "names". Sucks to be you. Go invent your own internet to handle the contrivances that you still cling to despite their illogic.

You know what happened when Swedish census takers asked their Finnish underlings for their surnames - "what's a surname?". There was no such concept. But the Swedes demanded surnames. So the Finns just invented them (most being topographic/geographic/otherwise-place-related). Tada - henceforth, in every subsequent census and elsewhere in life, everyone had a surname. Problem solved (from the perspective of the Swedes). If your "name" isn't useful enough, get a more useful one; stop going "wah, wah, wah, our naming scheme is special, make exceptions for us, wah, wah, wah".
  • (Score: 0) by Anonymous Coward on Friday July 14 2017, @03:18AM (1 child)

    by Anonymous Coward on Friday July 14 2017, @03:18AM (#538955)

    I experienced Y2K. On a cheapie PC with a Cyrix processor (that was supposed to be equivalent to a 486), the CMOS clock would change from 2000 to 1980 after rebooting.

    Some software broke on September 9, 2001 because that was 1,000,000,000 seconds since the beginning of the UNIX epoch. I didn't experience the breakage myself.

    • (Score: 2) by maxwell demon on Friday July 14 2017, @06:16AM

      by maxwell demon (1608) on Friday July 14 2017, @06:16AM (#538988) Journal

The most stupid Y2K bug was programs that wouldn't recognize the year 2000 as a leap year, in devices that would never have to deal with the past (the case I experienced it in was a VCR). The irony is that if they had just gone with the simple divisible-by-4 rule, the first failure would have been in 2100, that is, 100 years later.
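For reference, 2000 is a leap year under both the full Gregorian rule (the century exception has its own divisible-by-400 exception) and the naive divisible-by-4 rule; only a half-applied rule that drops the 400-year exception gets it wrong. A sketch:

```python
def is_leap_full(year: int) -> bool:
    """The full Gregorian rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_naive(year: int) -> bool:
    """The simple divisible-by-4 rule; first wrong in 2100."""
    return year % 4 == 0

def is_leap_buggy(year: int) -> bool:
    """Half-applied rule: knows about centuries, forgets the 400-year exception."""
    return year % 4 == 0 and year % 100 != 0

print(is_leap_full(2000), is_leap_naive(2000), is_leap_buggy(2000))
# True True False  (only the half-applied rule misses 2000)
print(is_leap_full(2100), is_leap_naive(2100))
# False True       (the naive rule first fails in 2100)
```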

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 0) by Anonymous Coward on Friday July 14 2017, @03:46AM (1 child)

    by Anonymous Coward on Friday July 14 2017, @03:46AM (#538960)

I posted a comment in the First-Half... story, but the date (2017-09-31) on the sidebar is still not changed. And since we're talking about time here, I thought this might be on topic.

    https://soylentnews.org/meta/comments.pl?noupdate=1&sid=20497&commentsort=0&mode=threadtos&threshold=0&highlightthresh=-1&page=1&cid=537872#commentwrap [soylentnews.org]

    • (Score: 2) by martyb on Friday July 14 2017, @11:59AM

      by martyb (76) Subscriber Badge on Friday July 14 2017, @11:59AM (#539077) Journal

      I missed the earlier comment — thanks for the reminder! I have corrected the funding goal end date.

      --
      Wit is intellect, dancing.
  • (Score: 4, Informative) by kaszz on Friday July 14 2017, @04:52AM (12 children)

    by kaszz (4211) on Friday July 14 2017, @04:52AM (#538972) Journal

Some software may fail to recognize 2000 as a leap year, and thus a day might be missing, etc.

The Unix epoch signed 32-bit number problem will hit on 19 Jan 2038 at 03:14:07 UTC
(and so will SQL, PHP, Perl, Python, Java interpreters, etc.).

IBM mainframes running z/OS, relying on its 64-bit integer time, will fail on 17 September 2042.

Mac OSes prior to 10.4 will fail on 6 February 2040.
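The Mac date follows from the classic Mac OS clock, which counted seconds in an unsigned 32-bit value starting from January 1, 1904. A quick check (my sketch, assuming that epoch):

```python
from datetime import datetime, timedelta

# Classic Mac OS kept time as an unsigned 32-bit count of seconds
# since 1904-01-01, so it runs out 2**32 seconds later.
mac_epoch = datetime(1904, 1, 1)
print(mac_epoch + timedelta(seconds=2**32))
# 2040-02-06 06:28:16
```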

    • (Score: 2) by maxwell demon on Friday July 14 2017, @06:27AM (2 children)

      by maxwell demon (1608) on Friday July 14 2017, @06:27AM (#538992) Journal

      IBM mainframes running z/OS relying on the 64-bit integer time will fail 17 September 2042.

      Given that a 64-bit integer is sufficient to count the seconds of the estimated time between big bang and big rip, I wonder what definition they used so that it will fail that early.

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 2) by FatPhil on Friday July 14 2017, @06:57AM

        Don't they use nano-second granularity? (which should give you just over 4 times the range, but I don't know when their epoch is. Or signedness either, for that matter.)
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 2) by kaszz on Friday July 14 2017, @05:17PM

        by kaszz (4211) on Friday July 14 2017, @05:17PM (#539210) Journal

        Some sources specify the z/OS time as microseconds since Jan 1, 1900 in a 64-bit integer. But that doesn't add up. Wikipedia [wikipedia.org] says it's units of 0.244 nanoseconds since 1 January 1900. Which does add up. Now that is a weird unit of time!
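That weird unit makes the 2042 date come out right: one count of the register's low-order bit is 2^-12 microseconds (about 0.244 ns), so the full 64-bit register wraps after 2^64 counts, i.e. 2^52 microseconds. A quick check, assuming an unsigned register counting from 1900:

```python
from datetime import datetime, timedelta

tick = 10**-6 / 2**12                 # one TOD-clock count, in seconds
print(tick * 1e9)                     # ~0.244 ns

# 2**64 ticks of 2**-12 microseconds each = 2**52 microseconds total.
span = timedelta(microseconds=2**52)
print(datetime(1900, 1, 1) + span)
# 2042-09-17 23:53:47.370496
```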

    • (Score: 2) by TheRaven on Friday July 14 2017, @08:47AM (1 child)

      by TheRaven (270) on Friday July 14 2017, @08:47AM (#539027) Journal

      Mac OSes prior to 10.4 will fail on 6 February 2040.

      This seems off to me, macOS uses 1 January 2001 as the epoch date, so should have an extra 31 years after most UNIX systems to kill off 32-bit code. It should also not be limited to prior to 10.4, as any 32-bit apps still running on the system will suffer from epoch rollover. It's largely not an issue, because OpenStep uses double for NSTimeInterval, so most application code that's storing epoch-relative times will have sub-millisecond precision for around 10,000 years and can be migrated to use 128-bit long doubles after that.

      --
      sudo mod me up
    • (Score: 2) by tangomargarine on Friday July 14 2017, @03:37PM (2 children)

      by tangomargarine (667) on Friday July 14 2017, @03:37PM (#539157)

      The Year 2038 problem is an issue for computing and data storage situations in which time values are stored or calculated as a signed 32-bit integer, and this number is interpreted as the number of seconds since 00:00:00 UTC on 1 January 1970 (the epoch).

      One would hope that at some point in the next 19 years we'll universally be on 64-bit computers.

      Isn't this a problem that can be solved right now by just upgrading it to a 64-bit int?
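For reference, the exact wrap-around moment is easy to reproduce:

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit time_t can hold
INT32_MAX = 2**31 - 1  # 2,147,483,647

last = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last.isoformat())  # 2038-01-19T03:14:07+00:00

# One second later the counter wraps to -2**31, which a naive
# implementation interprets as a date back in 1901
wrapped = datetime.fromtimestamp(-(2**31), tz=timezone.utc)
print(wrapped.isoformat())  # 1901-12-13T20:45:52+00:00
```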

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by kaszz on Friday July 14 2017, @05:42PM

        by kaszz (4211) on Friday July 14 2017, @05:42PM (#539228) Journal

        64-bit integer seconds pretty much solves it, provided that you count seconds and not nanoseconds like, say, IBM does. The other catch is interfacing with other systems: even if you don't collect time from them, if they fail because of timing issues they might deliver other bad data, or none at all.

      • (Score: 1) by toddestan on Friday July 14 2017, @11:31PM

        by toddestan (4982) on Friday July 14 2017, @11:31PM (#539399)

        You may think 19 years is a really long time away, but it's not that uncommon to see stuff from the '90s still in use; something that exists right now could easily still be in service when 2038 rolls around.

        You're right that using a 64-bit int solves the problem, but it breaks backwards compatibility with existing software that expects a 32-bit int. Some systems have decided to break compatibility, while others have decided to maintain it, at least for now. Luckily, when going to 64-bit, most systems decided to widen time_t as well, since everything had to be recompiled anyway, so the remaining exposure is largely limited to 32-bit stuff. But that doesn't mean there aren't other assumptions made somewhere that might break.

    • (Score: 2) by tibman on Friday July 14 2017, @05:04PM (3 children)

      by tibman (134) Subscriber Badge on Friday July 14 2017, @05:04PM (#539207)

      64-bit PHP has practically infinite time; its range exceeds the age of our universe. I'd guess 64-bit builds of most interpreters will be the same.
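"Pretty much infinite" is right. Assuming time is kept as signed 64-bit seconds:

```python
SECONDS_PER_YEAR = 365.25 * 86_400

# Range of a signed 64-bit second counter, in years
range_years = (2**63 - 1) / SECONDS_PER_YEAR
universe_age_years = 13.8e9  # current best estimate

print(range_years)                       # about 2.9e11 years
print(range_years / universe_age_years)  # about 21 times the age of the universe
```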

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 2) by kaszz on Friday July 14 2017, @05:51PM (2 children)

        by kaszz (4211) on Friday July 14 2017, @05:51PM (#539236) Journal

        Provided a 32-bit PHP implementation isn't what's actually running, and the code makes use of the wider type...

        • (Score: 2) by tibman on Friday July 14 2017, @07:02PM (1 child)

          by tibman (134) Subscriber Badge on Friday July 14 2017, @07:02PM (#539269)

          True! There are some very legacy PHP installs out there.

          --
          SN won't survive on lurkers alone. Write comments.
          • (Score: 2) by kaszz on Friday July 14 2017, @09:47PM

            by kaszz (4211) on Friday July 14 2017, @09:47PM (#539358) Journal

            Makes me wonder how common vulnerable installations are, and especially whether any important ones are exposed. The nasty aspect is that it will go bad at the same time on multiple services, without warning. So the only remedy is to be proactive, which meshes badly with standard human behavior.

  • (Score: 2) by FatPhil on Friday July 14 2017, @06:54AM (6 children)

    When I got a 15-year mortgage in 1995, it didn't expire 85 years in the past, but 15 years in the future.
    Therefore, there wasn't a Y2K bug in 1995.

    I presume people getting 30 year mortgages in 1970 had an equal level of confusion, i.e. none, otherwise it would have been noticed and fixed.
    Therefore there wasn't much of a Y2K bug in 1970 either.

    Quite when were these mythical bugs actually having a real-world, not-purely-cosmetic effect?
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by sjames on Friday July 14 2017, @08:36AM (4 children)

      by sjames (2882) on Friday July 14 2017, @08:36AM (#539022) Journal

      The ones that manifested in the '70's got fixed in the '70's, DUH!

      The others would have hit right around Jan 1, 2000.

      Fortunately, the serious ones got ironed out in time, just leaving some perl CGI claiming it was Jan 1st, 19100 and such. See Wikipedia [wikipedia.org] for more examples.

      • (Score: 2) by FatPhil on Friday July 14 2017, @10:36AM (1 child)

        But the whole panic about Y2K was that it was things kicking in *on* 2000-01-01, not *because of* 2000-01-01.
        Almost everything which cares about an interval of time was fixed at/before 2000 minus the maximum value of that time interval.
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
        • (Score: 2) by sjames on Monday July 17 2017, @09:41PM

          by sjames (2882) on Monday July 17 2017, @09:41PM (#540586) Journal

          And all the rest was fixed before it got a chance to kick in *ON* 2000-01-01. It was a mad scramble in many cases, and it even saw former programmers who had retired as managers dragged out of retirement at many times their former pay to bugfix dusty COBOL decks.

          So, quit claiming it was all for nothing. You apparently weren't there.

      • (Score: 3, Informative) by gidds on Friday July 14 2017, @12:39PM (1 child)

        by gidds (589) on Friday July 14 2017, @12:39PM (#539090)

        Fortunately, the serious ones got ironed out in time

        That's the point.  Except that it wasn't just ‘fortunately’…

        If all the systems (everything from major mainframe systems down to tiny embedded ones) that were running around 1995 had continued to run unchanged, then there would probably have been widespread disruption: aircraft falling out of the sky, salaries not paid (or paid wrongly), alarms going off for no reason, lifts jamming (or dropping), and tons more.  So it's a good thing that a lot of fuss was made: it woke companies up to the dangers, and forced them to expend time and effort checking and fixing things.

        Of course, because a lot of fuss was made, most of them got fixed, and so when the time came there was (thankfully) almost no disruption. So the general public saw it as a damp squib, and thought all the fuss was unnecessary. Which is a shame, but far better than the alternative of making no fuss and suffering terrible consequences!

        (I was slightly involved in this, as I'd done some work on mainframe systems around 1994; I got called up a couple of years later and quizzed about the risks.  I was able to tell them that yes, I'd used 4-digit years, because I had this amazing ability to foresee 6 years into the future… Unlike, apparently, many of my colleagues!)

        --
        [sig redacted]
        • (Score: 2) by maxwell demon on Friday July 14 2017, @03:23PM

          by maxwell demon (1608) on Friday July 14 2017, @03:23PM (#539148) Journal

          Well, some people thought about the issues much earlier. [york.ac.uk] :-)

          --
          The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 0) by Anonymous Coward on Friday July 14 2017, @09:28AM

      by Anonymous Coward on Friday July 14 2017, @09:28AM (#539037)

      Would a VCR no longer being able to record correctly programmed TV shows qualify?

  • (Score: 2) by bradley13 on Friday July 14 2017, @09:54AM (2 children)

    by bradley13 (3053) on Friday July 14 2017, @09:54AM (#539042) Homepage Journal

    Date formats. Not only where users can enter them (that's bad enough), but also in databases and programs: sometimes different, sometimes leading to exciting and hard-to-diagnose problems. Times (and dates) with and without time zones, so everything matches expectations for several months, and then suddenly doesn't anymore.

    Dates and times are a source of nightmares when you have to interoperate with other pieces of software, and you cannot find out what assumptions they have made...

    --
    Everyone is somebody else's weirdo.
    • (Score: 0) by Anonymous Coward on Friday July 14 2017, @04:24PM (1 child)

      by Anonymous Coward on Friday July 14 2017, @04:24PM (#539180)

      I refuse to use any datetime data types in SQL for most projects. For most business applications, storing the number of seconds since the Unix epoch (Unix time) and relying on a library like Joda Time to represent the instant as something an end user can read (local time) just saves so many headaches.

      (I usually also set up stored procedures/functions/whatever's most appropriate for the RDBMS being used to convert these to the RDBMS' datetime type, but converting to local time in my view is an artifact of the presentation layer [view in MVC].)

      Going from int to bigint easily solves the Y2.038k problem this might otherwise present, and if bigint isn't big enough, I don't know, use a decimal type. If that's not big enough, well, I did say business applications. I don't see why it couldn't scale by adding on bit after bit until the type is wide enough. Maybe some kind of vardecimal, I don't know. I haven't found a situation this basic approach won't work for, even if we need to decide we're storing milliseconds or whatever instead of seconds.

      If you ever think you don't need time zones, you're wrong.
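A minimal sketch of that pattern in Python, with a fixed offset standing in for what a proper tz database (Joda Time, java.time, or Python's zoneinfo) would supply at the presentation layer:

```python
from datetime import datetime, timedelta, timezone

# Store the instant as plain integer seconds since the Unix epoch (always UTC)
stored = int(datetime(2017, 7, 14, 2, 40, tzinfo=timezone.utc).timestamp())
print(stored)  # 1500000000 -- the milestone from the article

# Convert to the user's zone only when rendering. Real code would look
# the offset up in a tz database; -04:00 stands in for US Eastern
# daylight time here.
eastern_dt = datetime.fromtimestamp(stored, tz=timezone(timedelta(hours=-4), "EDT"))
print(eastern_dt.isoformat())  # 2017-07-13T22:40:00-04:00
```

Keeping the stored value as a plain integer sidesteps the DST ambiguity described above: the instant is unambiguous, and only the rendering changes per viewer.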

      • (Score: 2) by kaszz on Friday July 14 2017, @05:48PM

        by kaszz (4211) on Friday July 14 2017, @05:48PM (#539232) Journal

        When would you say the built-in SQL dates will fail? Because some people are likely to use them...

        And what is the approach to handling stuff like leap seconds? Because a day isn't always 86,400 seconds long.

  • (Score: 2, Interesting) by John Bresnahan on Friday July 14 2017, @12:52PM

    by John Bresnahan (5989) on Friday July 14 2017, @12:52PM (#539093)

    I worked on two major systems for Fortune 100 companies that had to be turned off during Daylight Saving Time changes, since the systems used local time, rather than GMT or epoch time. The spring shutdown only had to last a few minutes, but every fall the systems had to be shut down for just over an hour to make sure the system didn't see the "same" time twice in one day.

    The other major problem I saw was when porting a legacy system to run in a virtual machine. The legacy system had been the master time server for a group of machines, and it was decided to keep that functionality on the legacy system. Unfortunately, the host computer which ran the virtual machine containing the time server used that time server to set its own clock, so any time the clock was adjusted, it would cause a feedback loop between the host and virtual machines (that is, the virtual machine would tell the host to change the time, which would cause the virtual machine's notion of the current time to change, which the vm would then treat as another time change, repeating the process).
