
SoylentNews is people

posted by martyb on Friday December 04 2015, @06:52PM   Printer-friendly
from the Is-that-you-Jonathan? dept.

Swift Is Open Source

Swift, Apple's hot new programming language, is now open source. It is available (or will be once the web site isn't so overwhelmed!) for Mac and Linux under an Apache 2.0 license.

"Swift is now open source! We are excited by this new chapter in the story of Swift. After Apple unveiled the Swift programming language, it quickly became one of the fastest growing languages in history. Swift makes it easy to write software that is incredibly fast and safe by design. Now that Swift is open source, you can help make the best general purpose programming language available everywhere. "

Apple Open-Sources Swift and Releases Linux port, as Promised

Apple's Swift programming language may eventually replace the respected but arcane Objective C as the native language for OS X and iOS development, but if you don't have a Mac you might be forgiven for not having taken an interest so far.

However, as MacRumors now reports, Apple have now delivered on their promise to open-source Swift and release a Linux port. It doesn't sound as if the Linux port is quite ready for production use just yet, but the source is out there. Does this mean that Swift is now a contender for general purpose programming?

(Note: at the time of writing, the servers at Swift.org are failing to live up to their name.)


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Funny) by dyingtolive on Friday December 04 2015, @07:02PM

    by dyingtolive (952) on Friday December 04 2015, @07:02PM (#271906)

    They're learning from Microsoft, I see.

    --
    Don't blame me, I voted for moose wang!
    • (Score: 1, Insightful) by Anonymous Coward on Friday December 04 2015, @08:17PM

      by Anonymous Coward on Friday December 04 2015, @08:17PM (#271924)

      Microsoft doesn't create such fuckugly languages. Might as well be doing Mac development in PHP.

      • (Score: 0) by Anonymous Coward on Friday December 04 2015, @08:26PM

        by Anonymous Coward on Friday December 04 2015, @08:26PM (#271931)

        Maybe Miguel was about to announce a new project.

      • (Score: 2) by dyingtolive on Friday December 04 2015, @08:26PM

        by dyingtolive (952) on Friday December 04 2015, @08:26PM (#271932)

        I have no experience with Swift, myself, and C# is surprisingly nice, I'll give you that. I don't know if you'd count it though, but what about Powershell?

        --
        Don't blame me, I voted for moose wang!
  • (Score: 1) by danaris on Friday December 04 2015, @08:29PM

    by danaris (3853) on Friday December 04 2015, @08:29PM (#271933)

    IBM has put up a sandbox page [bluemix.net], where you can try out Swift and see its results quickly without having to install any toolchain yourself.

    Obviously, it's a good idea to at least take a look at the language docs before trying it :-)

    Dan Aris

    • (Score: 2) by takyon on Friday December 04 2015, @08:34PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday December 04 2015, @08:34PM (#271938) Journal
    • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @12:12AM

      by Anonymous Coward on Saturday December 05 2015, @12:12AM (#272015)

      func fibonacci(i: Int) -> Int {
              if i <= 2 {
                      return 1
              } else {
                      return fibonacci(i - 1) + fibonacci(i - 2)
              }
      }

      No parentheses around if-conditions? Blasphemy!

      • (Score: 2) by BasilBrush on Saturday December 05 2015, @10:40PM

        by BasilBrush (3994) on Saturday December 05 2015, @10:40PM (#272287)

        Yeah, Swift tends to drop superfluous constructs. No need for semicolons where newlines separate statements anyway. No need to declare a variable's type in an assignment when the type is implicit in the RHS, etc. It makes for clean code.

        --
        Hurrah! Quoting works now!
        • (Score: 2) by vux984 on Monday December 07 2015, @04:49AM

          by vux984 (5045) on Monday December 07 2015, @04:49AM (#272745)

          It makes for clean code.

          It makes for brittle code. The extra structure in other languages isn't superfluous, it's 'error correction code'; it helps catch errors where they happen rather than where things finally break in the code.

          If I write

          int x = 11;

          and then 12 years later someone comes along and writes int x = 11.5; instant compile error, right where the change was made. If all I wrote was:

          x = 11

          and the compiler automatically assigns it as int because "it's obvious that's what I want", and then 12 years later someone comes along and writes

          x = 11.5 and the compiler automatically assigns it as float or double because "it's obvious that's what I want", then all sorts of bizarre errors crop up... suddenly some client/server IPC breaks somewhere because the data being exchanged is the wrong size and type.

          People underestimate the value of there being some structural redundancy. Yes, too much and the language just becomes needlessly verbose -- not enough and the language trips over itself trying to help you, causing more trouble than it saves.

          Python's whitespace is like this too -- it saves you some curly braces, but I find it makes the code a lot more brittle (less maintainable) and much easier to botch the program structure during maintenance, refactoring, and editing, especially if you are copying and pasting things around. Plus, I admit, I like being able to just make a mess of the code, drop some curly braces in, and then 'autoformat' so it sets up all the indentation, which I then double-check. I much prefer that to manually re-doing all the indentation myself.

          • (Score: 2) by BasilBrush on Tuesday December 08 2015, @01:33AM

            by BasilBrush (3994) on Tuesday December 08 2015, @01:33AM (#273139)

            https://en.wikipedia.org/wiki/Don%27t_repeat_yourself [wikipedia.org]

            suddenly some client/server IPC breaks somewhere because the data being exchanged is the wrong size and type.

            No because APIs define their parameter types.

            Plus I admit I like being able to just make a mess of the code, drop some curly braces in, and then 'autoformat' and it sets up all the indentation and then I double check it.

            Human beings do not understand code block structures by the number of braces, but by the indentation. You're dropping curly braces in order to get the indentation (and therefore the block structure) right. Which is backwards. Again you are suffering from repeating yourself. Mentally converting from one thing to another, then getting the editor to fix up the first thing from that.

            With Python you just deal with the primary issue - make the indentation show the block structure you want.

            I much prefer that to manually re-doing all the indentation myself.

            All you are doing is manually editing indentation via a more convoluted method.

            --
            Hurrah! Quoting works now!
            • (Score: 2) by vux984 on Wednesday December 09 2015, @05:26AM

              by vux984 (5045) on Wednesday December 09 2015, @05:26AM (#273811)

              No because APIs define their parameter types.

              Nitpicking. If you can't think of a situation where automatic type inference failed, or created problems down the road in maintenance you aren't trying.

              You're dropping curly braces in order to get the indentation

              Essentially yes.

              Which is backwards.

              I disagree.

              Again you are suffering from repeating yourself.

              No, I only enter the braces; the editor does the indentation. The information is 'repeated', but it's not extra work for me.

              With Python you just deal with the primary issue - make the indentation show the block structure you want.

              Except that whitespace is incredibly fragile, and poorly preserved by all kinds of common text-editing applications and operations. And you don't really edit it; you use it to 'push' other things around.

              Mentally converting from one thing to another

              Entering whitespace is no less a mental conversion. You want the code to be 'here', so you hit 'tab' and 'space' and 'delete' to push and pull it around, because you can't just tell it ... be HERE. Proper delimiters are at least less work to enter, and can automatically reform to an indented state if the whitespace/indenting gets mangled. And if you delete one by accident, it's trivially detected that they aren't balanced. Delete an indent level somewhere by accident and the code just does something different.

              • (Score: 2) by BasilBrush on Thursday December 10 2015, @03:58PM

                by BasilBrush (3994) on Thursday December 10 2015, @03:58PM (#274493)

                Nitpicking. If you can't think of a situation where automatic type inference failed, or created problems down the road in maintenance you aren't trying.

                It's ALWAYS possible to create bugs by editing code. The point is that repeating yourself makes it more likely rather than less.

                Except that whitespace is incredibly fragile; and poorly preserved by all kinds of common text editing applications and operations.

                Then don't use broken tools. We're not in the 1970s anymore.

                Entering whitespace is no less a mental conversion. You want the code to be 'here' so you hit 'tab' and 'space' and 'delete' to push and pull it around, because you can't just tell it ... be HERE.

                Again you have a tool problem. Or you need to learn it. You suggest the editor is clever enough to reformat code based on braces. But it doesn't have indent and unindent operations???

                --
                Hurrah! Quoting works now!
                • (Score: 2) by vux984 on Thursday December 10 2015, @07:23PM

                  by vux984 (5045) on Thursday December 10 2015, @07:23PM (#274590)

                  The point is that repeating yourself makes it more likely rather than less.

                  More likely that you make a simple compile time error; but less likely that you make a program logic error. Compile time errors are trivial to fix by comparison and worst case make you re-think program logic which is a good thing.

                  Then don't use broken tools. We're not in the 1970s anymore.

                  Quite. But I consider it a hallmark of a good language that I can work with it in notepad / vi / vim / textedit if I have to.
                  A language that can only be efficiently worked on in a special tool is broken by design. Almost as bad as languages which have proprietary non-plaintext formats for the source.

                  Again you have a tool problem. Or you need to learn it. You suggest the editor is clever enough to reformat code based on braces. But it doesn't have indent and unindent operations???

                  Not at all. Entering curly braces just requires typing single characters where needed, and then hitting the button to reformat. If I'm moving code around between indent levels but not changing the logical structure (e.g. copying code from outside a loop to inside the loop), I can just copy/paste and reformat.

                  With white space, I have to highlight the blocks that need to be indented first, that's more effort than just dropping a single curly brace. And when I'm copying code around but not changing the program structure -- if the code is being moved between indent levels I have to fix them manually.

                  Plus, if I copy/paste some code snippet or function to a forum or a Word document or an email or a chat messenger tool or IRC or Skype or anything else that ISN'T a "programming tool", and which is often quite hostile to whitespace, I infinitely prefer to have the code retain its correctness, so that the recipient can copy it back out and it just works, and they can hit a hotkey to reformat it.

                  Whitespace should not be used as anything more than an elementary delimiter. Whitespace as program-flow semantics is idiotic.
                  The improvement in "readability" from not having braces is minimal, and not worth anything to me, while the costs imposed by working with it are significant and unjustifiable.

                  • (Score: 2) by BasilBrush on Friday December 11 2015, @10:09PM

                    by BasilBrush (3994) on Friday December 11 2015, @10:09PM (#275176)

                    You quoted but didn't understand my point. You take it as an assumption that there is a "format button" that knows how to auto-format code in your chosen language. But you also take it as an assumption that the editor doesn't make indenting and unindenting and copy/pasting between indent levels easy.

                    This means that you only have experience of editors that understand C but not editors that understand Python. Or that you've never actually used Python. Either way it's a lack of experience on your part.

                    --
                    Hurrah! Quoting works now!
  • (Score: 2) by PizzaRollPlinkett on Friday December 04 2015, @08:40PM

    by PizzaRollPlinkett (4512) on Friday December 04 2015, @08:40PM (#271939)

    This Swift stuff and Microsoft's open-source Roslyn stuff would be kind of neat to play with, if you didn't have to build it from scratch. Will distros like Fedora, or any of their add-on repositories, ever build this stuff as an RPM you can just install? I looked at Swift, and don't want to learn it badly enough to figure out everything they need you to do to get it up and running. I guess I've been spoiled by package managers, because I can remember the days when I had to build emacs, gcc, and everything else from tarballs on new machines.

    --
    (E-mail me if you want a pizza roll!)
    • (Score: 2) by ticho on Friday December 04 2015, @10:54PM

      by ticho (89) on Friday December 04 2015, @10:54PM (#271986) Homepage Journal

      It's been released under the Apache license, so my guess is that it will eventually be packaged by distributions. Like everything else in open source, it will depend on when the itch bothers someone strongly enough to scratch it.

  • (Score: 0) by Anonymous Coward on Friday December 04 2015, @08:45PM

    by Anonymous Coward on Friday December 04 2015, @08:45PM (#271942)

    I've showed up here in the past, and always benefited from your collective wisdom. Here I return to ask another question.

    I'm having trouble punching through the $90K barrier (in the midwest, so adjust currency values accordingly) using my chosen skillset.

    The big issue is that I'm firmly entrenched in the GNU/Linux ecosystem (C, gcc, Perl, Python, MySQL/Postgres) doing web development (middleware) stuff. I've thought about hopping on the Mac bandwagon purely for the money, but Objective-C was horrifying.

    Although it would feel like selling out, I wonder if now is the time. Learn Swift, get up to speed on iOS development, and essentially become "one of them." I'll bet the money and job opportunities are better.

    Surely Swift is less a nightmare than Objective-C, and since there's a Linux port available, might be something I could spend some time with.

    Soylentils, is it worth it?

    • (Score: 1, Insightful) by Anonymous Coward on Friday December 04 2015, @09:10PM

      by Anonymous Coward on Friday December 04 2015, @09:10PM (#271948)

      In a couple months, the automated resume filters are going to be expecting 5+ years of production-level experience with Swift, or your CV will hit the bit bucket. So you better get started right away.

      • (Score: 0) by Anonymous Coward on Friday December 04 2015, @09:20PM

        by Anonymous Coward on Friday December 04 2015, @09:20PM (#271953)

        Hey, I can lie on resumes with the best of 'em. Who's going to interview me? Some millennial node.js kid? Bring it on!

      • (Score: 1, Informative) by Anonymous Coward on Friday December 04 2015, @10:12PM

        by Anonymous Coward on Friday December 04 2015, @10:12PM (#271964)

        The "sad but true" mod is "insightful"?

        I think the average soylentil may be cynical. (Or I may be just projecting.)

    • (Score: 2) by theluggage on Friday December 04 2015, @10:08PM

      by theluggage (1797) on Friday December 04 2015, @10:08PM (#271963)

      Surely Swift is less a nightmare than Objective-C, and since there's a Linux port available, might be something I could spend some time with.

      Well, if you're experienced in C++ and Python then learning the Swift language should be easy - it's much closer to "C meets Python" than Objective-C's "C meets Smalltalk". The hard (or at least time-consuming) bit will be getting up to speed on the iOS API. For that, I'm afraid, you'll still need a Mac.

    • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @12:20AM

      by Anonymous Coward on Saturday December 05 2015, @12:20AM (#272021)

      Is there much call for app development in the midwest?

      My understanding is that something like 95% of apps are money losers for the developers. If you are doing custom software for internal use by clients that might be a better market. But how many business clients would put their system on the ios platform instead of android? My guess is that android would be a lot more attractive since the phones and tablets are so much cheaper. But I don't work in that market at all so what do I know?

    • (Score: 2) by fleg on Saturday December 05 2015, @02:21AM

      by fleg (128) Subscriber Badge on Saturday December 05 2015, @02:21AM (#272057)

      >Surely Swift is less a nightmare than Objective-C

      havent done obj-c but i've had to learn to read a fair bit of it when looking up ways
      to do stuff in swift (cos the examples and tutorials hadnt been updated for swift yet).

      so on that basis, yes, it is nowhere near the eye-bleeding nightmare the obj-c syntax is.

      i like swift a lot, with this announcement i'm kind of hoping it will replace
      c++ long term too.

      as to yourself AC if you have c, python etc i dont think you'll have too much trouble, and
      xcode + ios make the sweetest embedded development environment i've ever seen.

      • (Score: 2) by linuxrocks123 on Saturday December 05 2015, @03:43AM

        by linuxrocks123 (2557) on Saturday December 05 2015, @03:43AM (#272074) Journal

        I doubt it will replace C++, and I certainly hope not. The raw power and runtime efficiency you can get out of C++ isn't matched by anything else out there, and it's getting better with each new standard, yet the changes are evolutionary, and done in a way where the language maintains its maturity. C++ templates are an absolute marvel; Java and C# "generics" are pale imitations in comparison.

        There's been a glut of new languages in the past few years. I'm sure some will stick around, but C++ is in a league by itself.

        • (Score: 2) by BasilBrush on Saturday December 05 2015, @10:52PM

          by BasilBrush (3994) on Saturday December 05 2015, @10:52PM (#272289)

          C++ is only faster if you ignore security and defensiveness. With no overflow or bounds checking, C++ is much faster than Swift. But compile Swift with -Ofast, which also turns off overflow and bounds checking, and Swift is just as fast as C++.

          http://www.helptouser.com/code/24101718-swift-performance-sorting-arrays.html [helptouser.com]

          --
          Hurrah! Quoting works now!
          • (Score: 3, Informative) by linuxrocks123 on Sunday December 06 2015, @12:19AM

            by linuxrocks123 (2557) on Sunday December 06 2015, @12:19AM (#272304) Journal

            http://www.primatelabs.com/blog/2014/12/swift-performance/ [primatelabs.com]

            Turning off overflow and bounds checking definitely helps, but Swift is still slower.

            Now, part of this could be because Swift's compiler is not as good. Over time, the gap might close a little more. But, for instance, Swift isn't doing generics right -- it's following the Java/C# method of not resolving them at compile time. So that will always handicap it versus C++.

            The only languages out there that can hope to match C++'s performance are C and Fortran. C because it doesn't provide conveniences that slow things down, and Fortran because it matches C's simplicity and also takes pains to eliminate the aliasing problem for compilation.

            In fact, the aliasing problem is one of the most problematic cases for C/C++ optimization, and Fortran's advantage here is enough to make Fortran code faster than C and C++ for a significant number of workloads. C and C++ are finally catching up, though, with restrict. Here's a C++ Standards Committee whitepaper on the problem: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3635.pdf [open-std.org]

            The primary advantage of C++ is that the language is almost Borglike in its appropriation of other languages' features, but it always does so in a way that is as high-performing as possible, and that doesn't penalize code that does not use the new feature. My view is that any language that wants to replace C++ will have to both match its features and also provide something new, something that C++, for some reason, can't appropriate. That will be a tall order.

            Oh, regarding bounds checking, you can definitely turn on bounds checking for vectors if you want to, as a debugging tool. I forget the details on how to do that; it's a macro or something. One thing I would like to see in C++ is a way to run all the code in an interpreter that traps invalid pointer accesses and the like. Someday, I may write an interpreter that runs off the Clang AST.

            • (Score: 2) by BasilBrush on Sunday December 06 2015, @02:45PM

              by BasilBrush (3994) on Sunday December 06 2015, @02:45PM (#272481)

              Your link is from Dec 2014, and therefore covers Swift 1.x. I specifically gave you a link for Swift 2.x, as 1.x was always about functionality more than speed. And for sure, Swift will continue to gain performance faster than C++, if only because C++ has already been optimised for so many years.

              I can see you're a true believer. But your comment "Swift isn't doing generics right" is based on the belief that C++ is right. However, what C++ does is well known, as is what Java and C# do. And Chris Lattner's specialism is compiler design. If C++ templates were the acme of generic programming then Swift would be done that way - Swift takes the best from older languages. But it's not.

              In your belief that C++ is inevitably faster than Swift, what you are missing is that one of the major design goals of Swift was to enable compiler optimisations. For example, the Swift compiler tends to know better what is constant, and what pointers may or may not be nil. And those make for optimisations.

              In fact, optionals as a first-class feature of the language, with flow-control support rather than simply a template in the standard library, are one of the great advantages of Swift over C++. They tackle the other biggest source of bugs and security defects in C/C++ besides buffer overflows: dereferencing null.

              --
              Hurrah! Quoting works now!
              • (Score: 2) by linuxrocks123 on Sunday December 06 2015, @04:11PM

                by linuxrocks123 (2557) on Sunday December 06 2015, @04:11PM (#272497) Journal

                Your link is Dec 2014, therefore is for Swift 1.x.

                It was the most complete benchmark I could find. Yours is just sorting arrays. It's rather hard to screw up sorting arrays; even Java manages to get that right. You'd have to move to scripting languages to find significant performance degradation there.

                And Chris Lattner's specialism is compiler design. If C++ templates were the acme of generic programming then Swift would be done that way - Swift takes the best from older languages. But it's not.

                My specialization is also compilers, and I actually worked in the same group as Chris Lattner, after he'd left. Let's discuss technical merits rather than appealing to authority.

                From here [austinzheng.com], it appears that I was initially wrong, and Swift does do generics (mostly) right. That's good news. Unfortunately, it does appear Swift has made a similar mistake to C# by using structs as always value types and classes as always reference types. There is no reason to force this limitation on the programmer.

                For example the Swift compiler tends to knows better what is constant.

                With constexpr in C++ since 2011, I find that assertion unlikely to be true.

                In fact optionals as a first class feature of the language, with flow control support rather than simply a template in the standard library is one of the great advantages of Swift over C++. It tackles that other biggest source of bugs and security defects in C/C++ besides buffer overflows. Dereferencing null.

                C++ has optionals. They're called pointers. Dereferencing NULL is not generally a security problem, and Swift solves buffer overflows by doing runtime checks that kill performance unless you disable them with a compiler flag. This is not impressive.

                Swift, Rust, and Go all seem to be products of a "let's recreate ML" craze. They're not by themselves bad languages, but they're also not particularly good. And, if you want ML, it's there for you, and it's fine. We don't all of a sudden need three new ML variants; it's just that Apple, Mozilla, and Google all want to create their own pet company languages, probably due to an overly developed organizational ego. This isn't likely to end particularly well for those organizations, as hipster languages tend to have short lifespans. Tell me, is node.js still in, or is that last month's fashion now?

                Don't get me wrong, they probably won't "die". It takes a lot for a language to "die". Apple can guide its lemmings off Objective C and onto Swift fairly easily. This will likely kill Objective C, since Apple is the last major user of that language. If and when Apple decides to move its lemmings elsewhere, Swift will then go the same way. But that will be up to Apple. Google can push Go on Android with the same effect. Mozilla, umm ... Mozilla can write Firefox in Rust I guess.

                • (Score: 2) by BasilBrush on Sunday December 06 2015, @10:06PM

                  by BasilBrush (3994) on Sunday December 06 2015, @10:06PM (#272597)

                  OK, you nearly sound like you know what you're talking about, even given you owning up to the generics mistake. But then you say this:

                  "C++ has optionals. They're called pointers."

                  And at that point it's clear you don't have a clue about Swift. Optionals are certainly NOT pointers. Rather, they fix the problem of pointers - that whether or not an empty value is allowed is undefined. C/C++ compilers always allow nil pointers, even when the programmer's logic assumes they can't occur. And optionals don't even have to be pointers: Swift optionals can wrap any value.

                  Actually, C++ does support optionals, but only as std::optional. And that means you don't get any compiler support.

                  Finally you go off into a rant, featuring hipsters. At which point I can completely write you off as someone who's stuck in his ways. You simply think that C++ is the best at everything and that's that. Even though it's clear to pragmatic programmers that C++ has its place, and so do many other languages.

                  --
                  Hurrah! Quoting works now!
                  • (Score: 2) by linuxrocks123 on Monday December 07 2015, @07:07AM

                    by linuxrocks123 (2557) on Monday December 07 2015, @07:07AM (#272797) Journal

                    First, there's really no need to be rude. Flames are for Slashdot. Let's be civil.

                    Optionals are certainly NOT pointers. Rather they fix the problem of pointers - that whether or not an empty value is allowed is undefined. C/C++ Compilers always allow nil pointers, even when programmer logic assumes they don't. And yet they don't even have to be pointers. Swift optionals can be any value.

                    A more complete answer than "pointers are optionals" is that there are multiple ways to express a value that can't be NULL in C++. If you're writing a function and you want parameters that can't be NULL, you should use value parameters if the object is small and references if the object is large. Unlike pointers, references can't be null. If you want something that can be NULL, well, the easiest solution is to take a pointer and check if it's NULL. Since you don't "own" the pointer in this case, and it can be on the caller's stack rather than the heap, taking a pointer is not usually a problematic thing to do.

                    Now, if you want to return an optional, then there can be issues. You can still use pointers, of course, but now the pointer has to be on the heap, and the callee has to delete it. You can use reference-counted pointers (shared_ptr), and that's pretty good, but somewhat inefficient. So, not ideal, especially for a maniacally-performance-focused language like C++. This is why C++ does have experimental support for std::optional.

                    But that means you don't get any compiler support.

                    optional is a template, so, if you violate the type rules for optional, you'll get a compiler error. There is no reason the language has to be extended for something easily implementable as a template, which is why the standards committee didn't do that. Putting optional in the standard library is the right approach.

                    Finally you go off into a rant, featuring hipsters. At which point I can completely write you off as someone who's stuck in his ways. You simply think that C++ is the best at everything and that's that. Even though it's clear to pragmatic programmers that C++ has it's place and so do many other languages.

                    Okay, the hipsters thing might have been uncalled for. But I do think Swift, Rust, and Go are three languages in a space where there really only needs to be one, and that one is probably one of Delphi, Object Pascal, or OCaml.

                    There are, in general, really just too many languages. For instance, Python and Ruby are, for all practical purposes, the same language, yet there are two communities with two sets of duplicated standard libraries and two teams working on trying to optimize two interpreters. It's just silly. Yes, we needed a "better Perl". We didn't need two of them. And, honestly, we probably didn't need Python or Ruby, because we already had TCL.

                    So, now, some people have decided they want Java, but faster and natively compiled, so we have three separate attempts to create a language in that space. They could have found one of the many existing natively-compiled safe languages and started from there. Or, at a bare minimum, they could have worked together to create this language they think needs to exist. But, no, we have three different communities triplicating their effort to create three different languages from scratch that will not be learning from past mistakes in the space they're "innovating" in, three times over.

                    Ultimately, I think it's going to be C++ that fills the "safe C++" space. If you follow a few simple coding conventions, you can already eliminate C++'s lack of safety from new code. Enforcing those conventions through syntax rules, and having a C#-like "unsafe" construct for when the rules need to be broken by old code or libraries is a logical next step. Optional safety in C++, plus an interpreter to aid development, and C++ becomes safe, fast, and easy to develop in. Best of all, or at least most, worlds.
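                    As a concrete sketch of the kind of conventions I mean (the function names here are illustrative, not taken from any particular style guide):

                    ```cpp
                    #include <memory>
                    #include <stdexcept>
                    #include <string>
                    #include <vector>

                    // Convention 1: own heap memory through smart pointers, never raw new/delete.
                    std::unique_ptr<std::string> make_name() {
                        return std::make_unique<std::string>("swift");
                    }

                    // Convention 2: take a reference when the argument must never be null.
                    std::size_t name_length(const std::string& name) {
                        return name.size();  // no null check needed; a reference can't be null
                    }

                    // Convention 3: use bounds-checked access (.at) instead of operator[].
                    int checked_get(const std::vector<int>& v, std::size_t i) {
                        return v.at(i);  // throws std::out_of_range instead of reading out of bounds
                    }
                    ```

                    Enforcing exactly these rules through syntax, with an escape hatch for legacy code, is the "optional safety" I'm describing.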

                    That's why I'm going to write a C++ interpreter. C++ is the Borg, and I will help it assimilate interpreted scripting languages :). Mwa ha ha ha.

                    • (Score: 2) by BasilBrush on Thursday December 10 2015, @04:02PM

                      by BasilBrush (3994) on Thursday December 10 2015, @04:02PM (#274495)

                      optional is a template, so, if you violate the type rules for optional, you'll get a compiler error. There is no reason the language has to be extended for something easily implementable as a template

                      Again that's flawed thinking. An optional is not a template. It happens to be implemented as a template in C++. But that doesn't mean nothing is lost: the template implementation does reduce the functionality of the optional, due to its limitations. C++ doesn't support half of the functionality of optionals in Swift.

                      The problem is you haven't really looked at Swift.

                      --
                      Hurrah! Quoting works now!
                      • (Score: 2) by linuxrocks123 on Thursday December 10 2015, @11:13PM

                        by linuxrocks123 (2557) on Thursday December 10 2015, @11:13PM (#274692) Journal

                        Again that's flawed thinking. An optional is not a template. It happens to be implemented as a template in C++. But that doesn't mean that doesn't reduce the functionality of the optional, due to the limitations of the template implementation.

                        This is an annoying thing to say. What, specifically, do you think C++ loses by implementing optional as a template? This is really a moot argument anyway, because optionals frankly aren't very important or interesting, but why don't you say, specifically, what you think C++ is missing here?

                        The problem is you haven't really looked at Swift.

                        The problem is you haven't surveyed the history of programming languages and so aren't aware of the vast number of languages already designed in the "imperative, non-GC language without pointers" space. Look at Ada, and then tell me what Swift has that's new. Now that I think of it, I was wrong to be talking about Pascal and OCaml. The small field there is so crowded, I forgot about the queen. Ada is the safe imperative language, and it's a quite good one that never really got the pickup it deserved.

                        I've looked at Swift more, in this conversation, and my initial assessment of it hasn't changed much. Ada is an excellent -- stellar -- safe, imperative language. And it got picked up by exactly the people who wanted that. Which was not many people -- basically the military and designers of software for airplane cockpits and life-critical medical equipment, and no one else. And now many of them are moving to C++, which actually makes me kind of sad, because there should probably be at least one high-performance imperative language that's not C/C++ or Fortran. But the Ada community is still small, even after all these years, because not enough people care about safe, high-performance imperative programming to make Ada's community thrive.

                        Seriously, look at Ada, then tell me again how new and awesome Swift is. I'm so glad I thought of Ada, because I was flailing around trying to find a good language to compare Swift to. Now I've got one. And I'm going home, so no link to prove it, but I would be extremely surprised to learn Ada doesn't have a way of dealing with optional values.

                        • (Score: 2) by BasilBrush on Friday December 11 2015, @02:03AM

                          by BasilBrush (3994) on Friday December 11 2015, @02:03AM (#274760)

                          Perfect.

                          Consider this article: "Null Considered Harmful"
                          http://www.adacore.com/adaanswers/gems/ada-gem-23/ [adacore.com]

                          Up until Ada 2005, recommended practice was to document whether a function could return a null. Then they extended the language so that the compiler/runtime could do checks.

                          Swift doesn't claim to do anything new, but rather to pick up best modern practice and improve on it. And so it does: optionals cover this issue and lots more that comes under the title "Null Considered Harmful".

                          What does Swift do, over and above?

                          In C/C++ you often have the construct:

                          if (foo) {
                              // do something with foo
                          }

                          If foo is a value that can't be nil (any non-optional in Swift) then that construct is superfluous, and Swift complains if you try it. Very often C/C++ code checks "just in case". There's a reason for Swift to be faster in average code right there, in addition to the code being clearer.

                          If foo is an optional, then a simple dereference is an error, because you haven't said what you want to happen if it's a nil. There are other things you can do: cover the nil in an else, tell Swift not to do an operation if it's a nil, or in extremis guarantee that although the type says it could be a nil, you happen to know it's not.

                          In other words, C++ std::optional gives you the datatype, but none of the checking. It still relies on you to remember to check for nil. In other words it has no equivalent of "if let", "!" or "?".

                          Worse, virtually all existing code and practice uses pointers in C++, whether they can conceptually contain null or not. At best they are documented, just like Ada pre-2005.

                          Now you can argue against some of this, and no doubt you will. But the fact is that a feature that is a central part of the language from its conception is bound to be better than Ada or C++'s lesser facilities, tacked on as an afterthought, and in C++'s case not even part of the language. Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                          And frankly I'm surprised that you, having been programming for some years, haven't felt frustration at some points because you don't know whether a pointer you've been given could in some circumstances be nil. Or indeed whether some poorly documented API you are calling will accept nil for parameters you don't care about. Your love for C++ seems to have blinded you to its limitations.

                          --
                          Hurrah! Quoting works now!
                          • (Score: 2) by linuxrocks123 on Saturday December 12 2015, @03:56AM

                            by linuxrocks123 (2557) on Saturday December 12 2015, @03:56AM (#275292) Journal

                            So the best complaint you can come up with about Ada versus Swift is something they fixed in 2005? Okay then.

                            If foo is a value that can't be nil (any non optional in Swift) then that construct is superfluous. Swift complains if you try. Very often C/C++ code checks "just in case". There's a reason for Swift to be faster in average code right there. In addition to the code being clearer.

                            In C++, you should never pass a pointer as a parameter that is guaranteed not to be NULL. That's what references are for. C/C++ code checking for NULL when a pointer is guaranteed to not be NULL is just bad code. You can write bad code in any language.
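                            A minimal sketch of that rule (describe and maybe_describe are made-up names):

                            ```cpp
                            #include <string>

                            // A parameter that can never be null: take a reference.
                            std::size_t describe(const std::string& s) {
                                return s.size();  // no null check; a reference always refers to an object
                            }

                            // A parameter that may legitimately be absent: take a pointer and check it.
                            std::size_t maybe_describe(const std::string* s) {
                                if (s == nullptr) return 0;  // here the check is meaningful, not "just in case"
                                return s->size();
                            }
                            ```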

                            If foo is an optional, then trying a simple dereference is an error. Because you haven't said what you want to happen if it's a nil. There are other things you can do. cover the nil in an else. Tell Swift to not do an operation if it's a nil. Or in extremis you can guarantee that although the type says it could be a nil, you happen to know it's not.

                            This is exactly equivalent to optionals in C++. If you don't know if the optional is empty, the value() and value_or() functions allow you to check. If you happen to somehow know it's not, then you can dereference it without suffering the performance penalty checking for an empty optional implies. Exactly the same as you describe in Swift.
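                            For illustration, here are the three access styles side by side, assuming C++17's std::optional (parse_digit is a made-up example):

                            ```cpp
                            #include <optional>

                            // A function that may fail to produce a value.
                            std::optional<int> parse_digit(char c) {
                                if (c >= '0' && c <= '9') return c - '0';
                                return std::nullopt;  // an empty optional signals failure
                            }

                            int checked(std::optional<int> o) {
                                return o.value();       // throws std::bad_optional_access when empty
                            }

                            int with_fallback(std::optional<int> o) {
                                return o.value_or(-1);  // never throws; returns -1 when empty
                            }

                            int unchecked(std::optional<int> o) {
                                return *o;              // no check at all; valid only when non-empty
                            }
                            ```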

                            In other words, C++ std::optional gives you the datatype, but none of the checking. It still relies on you to remember to check for nil.

                            This is merely a syntax complaint. Dereferencing an optional in C++ is saying, "I know this optional isn't empty and am overriding the checking." If you want the checking, you use value() or value_or(). If, instead of overloading operator*, the C++ library provided a function named unsafe_dereference(), you would not be complaining. Well, C++ named the "unsafe_dereference()" function as "operator*" instead. Same function, just a different name. Complaining about the name of a function is bikeshedding.

                            In other words it has no equivalent of "if let", "!" or "?".

                            Yes, it does. ! and ? work exactly as they should, because optionals are contextually convertible to bool (std::optional defines an explicit operator bool). The converted optional will be true if the optional has a value and false if it does not. You do "if let" like this:

                            try
                            {
                                    auto& varname = opt_value.value();
                                    //...
                            }
                            catch (const std::bad_optional_access& e) {}

                            Worse, virtually all existing code and practice uses pointers in C++, whether they can conceptually contain null or not. At best they are documented, just like Ada pre-2005.

                            Much C++ code was originally C, which does not have references. And, yes, good C code -- actually, good ... code -- documents what its parameters can be when it's not obvious from the function name or general description. This doesn't really affect modern C++'s comparison with Swift, except in C++'s favor, since C++ can use these legacy APIs without a foreign function interface.

                            Now you can argue against some of this, and no doubt you will. But the fact is that a feature that is a central part of the language from its conception is bound to be better than Ada or C++'s lesser facilities, tacked on as an afterthought, and in C++'s case not even part of the language. Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                            I just established that, yes, C++'s implementation of optional is basically equivalent to Swift's. That C++ is flexible enough to fully express the concept of optional without needing to change the language itself is a testament to the language's flexibility. But not that big a one, because, at its core, this optional concept you keep going on about is a struct containing memory to hold an instance of an arbitrary datatype and a bool tag indicating whether that memory is a valid object. If C++, with its strong focus on first class user-defined types and generic programming, were yet so brittle it needed to make the concept of "a struct tagged with a bool indicating whether it's initialized" part of the language itself, that would be quite concerning indeed.
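                            To make that concrete, the core of the concept fits in a few lines. This is a toy sketch, not the real std::optional, which also handles copying, conversions, and const access:

                            ```cpp
                            #include <new>
                            #include <utility>

                            // A bare-bones optional: raw storage for a T plus a bool tag.
                            template <typename T>
                            class naive_optional {
                                alignas(T) unsigned char storage_[sizeof(T)];  // memory that may hold a T
                                bool has_value_ = false;                       // tag: is storage_ a live T?
                            public:
                                naive_optional() = default;
                                explicit naive_optional(T v) : has_value_(true) {
                                    new (storage_) T(std::move(v));  // placement-new into the buffer
                                }
                                naive_optional(const naive_optional&) = delete;  // copying omitted for brevity
                                ~naive_optional() {
                                    if (has_value_) reinterpret_cast<T*>(storage_)->~T();
                                }
                                bool has_value() const { return has_value_; }
                                T& operator*() { return *reinterpret_cast<T*>(storage_); }  // unchecked
                            };
                            ```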

                            Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                            I still don't know why you are so fixated on this trivial and only marginally useful concept.

                            C++98 was expressive enough so that optional could have been added to the standard library then. The reason it only made it into std::experimental ~20 years later is because the use cases for this concept in C++ are vanishingly small. They are basically limited to places where you don't want to pass a reference as a parameter because the object is so small that the overhead of passing by reference is inefficient compared to passing a copy of the object itself plus 8 bits for a bool tag, and to places where you want to return a value from a function that may fail. In the case of a function that may fail, you can actually use std::pair instead, so optional isn't necessary in C++ anywhere. But optional is slightly more efficient and syntactically "cleaner" in a few cases. So, Boost designed an optional template, because Boost's purpose in life is to add random crap to the C++ standard library, and the standards committee shrugged and imported it into the standard after Boost showed over the course of a decade or so that the template was sometimes useful and didn't have serious flaws or cause unexpected problems.
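                            The std::pair idiom for a function that may fail looks like this (to_int is a made-up example):

                            ```cpp
                            #include <string>
                            #include <utility>

                            // The pre-optional idiom: return the value together with a success flag.
                            std::pair<int, bool> to_int(const std::string& s) {
                                if (s.empty()) return {0, false};
                                int result = 0;
                                for (char c : s) {
                                    if (c < '0' || c > '9') return {0, false};  // non-digit: fail
                                    result = result * 10 + (c - '0');
                                }
                                return {result, true};
                            }
                            ```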

                            std::optional doesn't need to be part of the language in C++. It only barely deserves to be part of the standard library.

                            And frankly I'm surprised that you, having been programming for some years, haven't felt frustration at some points because you don't know whether a pointer you've been given could in some circumstances be nil. Or indeed whether some poorly documented API you are calling will accept nil for parameters you don't care about.

                            It is a very poor API indeed that doesn't tell you what it returns or what arguments it takes. I haven't run into many of those. When I do, I blame the library author, not the language, whatever the language happens to be.

                            Your love for C++ seems to have blinded you to it's limitations.

                            Saying things like this is not constructive.

                            • (Score: 1) by linuxrocks123 on Saturday December 12 2015, @03:59AM

                              by linuxrocks123 (2557) on Saturday December 12 2015, @03:59AM (#275293) Journal

                              Oh, in case comments close on this and you want to keep talking, I made a journal for us here: https://soylentnews.org/comments.pl?sid=11091 [soylentnews.org]

                            • (Score: 2) by BasilBrush on Saturday December 12 2015, @11:57PM

                              by BasilBrush (3994) on Saturday December 12 2015, @11:57PM (#275615)

                              I still don't know why you are so fixated on this trivial and only marginally useful concept.

                              Because it's neither trivial nor marginally useful. One of the most common bugs in C-derived languages is dereferencing null. You don't see the problem because your language of choice is archaic, but you are used to it.

                              So the best complaint you can come up with about Ada versus Swift is something they fixed in 2005? Okay then.

                              No. From that article, Ada had added a facility to mark a pointer argument as not being null. That is not the complete support for optionals that Swift has.

                              allow you to check. If you happen to somehow know it's not, then you can

                              "allow", "can". That's what you get with a facility tagged on as an afterthought. All Swift code uses optionals rather than nil pointers or dummy values. And the language enforces good practice with them. That's not true of C++.

                              But it's clear that your love for C++ (and apparently Linux) won't let you see where other languages have bettered it. So this is going nowhere.

                              --
                              Hurrah! Quoting works now!
                              • (Score: 2) by linuxrocks123 on Sunday December 13 2015, @05:59AM

                                by linuxrocks123 (2557) on Sunday December 13 2015, @05:59AM (#275700) Journal

                                You seem to have missed something I said.

                                All Swift code uses optionals rather than nil pointers or dummy values. And the language enforces good practice with them.

                                This appears inconsistent with:

                                in extremis you can guarantee that although the type says it could be a nil, you happen to know it's not.

                                Assuming the "in extremis" line is correct, here is the exact equivalence between Swift optionals and C++ optionals. This is close to a constructive proof on this point.

                                - Swift operator! is equivalent to C++ std::optional::operator*.
                                - Swift "if let" / operator? equivalent to C++ the try{...} construct I mentioned previously.
                                - Swift's optional-in-if-statement syntax is equivalent to C++'s std::optional::operator bool().
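                                Assuming C++17, the three equivalences can be sketched like so (the function names are illustrative):

                                ```cpp
                                #include <optional>
                                #include <string>

                                // Swift `opt!`  ~  unchecked dereference via operator*.
                                int force_unwrap(const std::optional<int>& opt) {
                                    return *opt;  // the caller asserts non-emptiness, like Swift's `!`
                                }

                                // Swift `if let x = opt {...} else {...}`  ~  checked access via value().
                                std::string if_let(const std::optional<int>& opt) {
                                    try {
                                        int x = opt.value();   // succeeds only when a value is present
                                        return "got " + std::to_string(x);
                                    } catch (const std::bad_optional_access&) {
                                        return "empty";        // the `else` branch
                                    }
                                }

                                // Swift `if opt != nil`  ~  contextual conversion to bool.
                                bool has_value(const std::optional<int>& opt) {
                                    return static_cast<bool>(opt);  // std::optional's explicit operator bool
                                }
                                ```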

                                The two are mathematically equivalent, man. C++ didn't add optionals to the language because C++ didn't need a separate language construct for this concept. That's what happens when a language is designed with the explicit goal of having true first class user-defined types: you can put big changes in the standard library rather than needing to add new language constructs. Admit you're wrong on this point -- or don't -- but, either way, let's move on to a different point of comparison.

                                But it's clear that your love for C++ (and apparently Linux) won't let you see where other languages have bettered it. So this is going nowhere.

                                This is going nowhere because you don't appear to really be reading what I'm saying. And, I'm never going to respond to statements like this by defending myself; they are too silly to deserve a response. You have no idea what languages I've worked with in the past, or what my perspective is on programming languages in general. Even if you were right about what you're saying, the fundamental problem here is you think ad hominem attacks are logically coherent arguments. They are not [yourlogicalfallacyis.com].

                                I thought, after pointing out you were making pointless ad hominem attacks, that you would engage in the minimal amount of introspection necessary to determine that you were and would contribute more productively to the conversation from that point on. My experience in the past has been that this usually works. But you obviously need a little more XP to complete that maturity level-up. I'm at a loss as to how to proceed on this point, so, please tell me, how can I help you with that?

    • (Score: 2) by TheRaven on Saturday December 05 2015, @10:56AM

      by TheRaven (270) on Saturday December 05 2015, @10:56AM (#272141) Journal

      Objective-C was horrifying.

      Without knowing what you don't like about Objective-C, it's hard to answer. Objective-C is a very small set of extensions to C that provides a Smalltalk-like object model (late-bound dynamic dispatch, duck typing, full introspection, closures as first class objects) and Smalltalk-like syntax (named parameters everywhere) and a few extensions. If you have objections to duck typing, then you may like Swift a bit more (though the Objective-C bridging makes it leak through in places). If you have objections to reference counting with explicit weak references for cycle detection as a memory management strategy then Swift will be just as bad (so will C++). If you have an objection to object orientation in general then, again, Swift probably isn't for you. If your objection to Objective-C is that it's a simple language that has a small set of well-defined semantic and syntactic extensions to C, then you'll probably like Swift. If your objection to Objective-C is that it inherits all of the horrible design decisions in C, then you may like Swift.

      If your objection is 'OMG, syntax that is not exactly the same as C', then you're an idiot and no advice will help you.

      --
      sudo mod me up
      • (Score: 2) by BasilBrush on Saturday December 05 2015, @10:59PM

        by BasilBrush (3994) on Saturday December 05 2015, @10:59PM (#272292)

        I think most people's objection to Obj-C is that it's very verbose and filled with square brackets. And there's truth in that. Obj-C tends to be easy to understand because of the conventions for verbose identifiers and method names with always named parameters. But the verbosity does mean you tend to need several lines to accomplish what you can do in other languages with one line.

        Personally I quite like Obj-C. But Swift is an improvement in many ways.

        --
        Hurrah! Quoting works now!
  • (Score: -1, Troll) by Anonymous Coward on Friday December 04 2015, @09:03PM

    by Anonymous Coward on Friday December 04 2015, @09:03PM (#271946)

    #if os(OSX) || os(iOS) || os(watchOS) || os(tvOS)
            import Darwin
    #else
            import Glibc
    #endif

    Come on, code in comments?? HMTL was broken for this very reason. Compiler directives (it is still code) should be compiler directives, not a hyper load of comment.

    It is why children should not be running the tech world.

    • (Score: 0) by Anonymous Coward on Friday December 04 2015, @09:54PM

      by Anonymous Coward on Friday December 04 2015, @09:54PM (#271962)

      HMTL

      Hypermarkup Text Language?!

    • (Score: 0) by Anonymous Coward on Friday December 04 2015, @10:16PM

      by Anonymous Coward on Friday December 04 2015, @10:16PM (#271966)

      The code in the comments was tacked on later. HTML is for rendering rich text, not running office suites loaded from a web-server.

      • (Score: 2) by Nerdfest on Friday December 04 2015, @10:25PM

        by Nerdfest (80) on Friday December 04 2015, @10:25PM (#271971)

        I think he's referring to the annoying but ubiquitous tests for the various versions of the horror that is IE.

    • (Score: 5, Informative) by bryan on Friday December 04 2015, @10:17PM

      by bryan (29) <bryan@pipedot.org> on Friday December 04 2015, @10:17PM (#271967) Homepage Journal

      Comments in Swift [apple.com] follow the C style (/* multi-line*/) and C++ style (// single-line) form.

      The "#" is a type of preprocessor directive [apple.com] and not a comment.

      • (Score: 2) by BasilBrush on Saturday December 05 2015, @11:12PM

        by BasilBrush (3994) on Saturday December 05 2015, @11:12PM (#272300)

        Thankfully they've only implemented conditional compilation, not #defines.

        --
        Hurrah! Quoting works now!
  • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @12:08AM

    by Anonymous Coward on Saturday December 05 2015, @12:08AM (#272010)

    in . . . 3 . . . 2 . . . 1 . . .

    • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @12:25AM

      by Anonymous Coward on Saturday December 05 2015, @12:25AM (#272023)

      I'd just like to interject for a moment. What you’re referring to as open source, is in fact, free software, or as I’ve recently taken to calling it, libre software.

      Many computer users run libre software every day, without realizing it. Through a peculiar turn of events, libre software which is widely used today is often called 'open source', and many of its users are not aware that it is basically libre software, which guarantees your freedumbs. There really is open source software, and these people are using it, but that is because open source is a superset of libre software.

      t. richard m stallmang

      • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @02:13AM

        by Anonymous Coward on Saturday December 05 2015, @02:13AM (#272052)
        Not true. There are no licences that satisfy the Open Source Definition that do not satisfy the FSF as being Free Software. The principal difference between the two movements is philosophical: the Open Source movement emphasises the efficiencies achieved from having freedom and openness, while the Free Software movement emphasises the freedom aspect.
        • (Score: 1, Informative) by Anonymous Coward on Saturday December 05 2015, @02:57PM

          by Anonymous Coward on Saturday December 05 2015, @02:57PM (#272174)

          There are no licences that satisfy the Open Source Definition that do not satisfy the FSF as being Free Software.

          This part is wrong. E.g. https://en.wikipedia.org/wiki/Sybase_Open_Watcom_Public_License [wikipedia.org]

    • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @01:49AM

      by Anonymous Coward on Saturday December 05 2015, @01:49AM (#272044)

      RMS will definitely call it Free Software though and this development will probably please him greatly. It's under the Apache 2.0 License which is blessed by the FSF [gnu.org].

      This is a free software license, compatible with version 3 of the GNU GPL.

      Please note that this license is not compatible with GPL version 2, because it has some requirements that are not in that GPL version. These include certain patent termination and indemnification provisions. The patent termination provision is a good thing, which is why we recommend the Apache 2.0 license for substantial programs over other lax permissive licenses.

      (emphasis added)

      • (Score: 2) by darkfeline on Saturday December 05 2015, @05:14AM

        by darkfeline (1030) on Saturday December 05 2015, @05:14AM (#272089) Homepage

        Except that Swift only compiles for Apple's proprietary OSes, which makes it kind of pointless.

        It's like open source coffee beans, except these coffee beans can only be used in Apple's proprietary iCoffeeMakers. Kind of makes the open source or freedom aspect a moot point, don't you think?

        "Here's some FOSS code. Oh by the way, it only runs on this proprietary OS that requires you to sell us your soul. Sorry about that. Oh by the way, the proprietary OS code isn't FOSS, so you have no idea if what you think the FOSS code is doing is actually what it's doing. Sorry about that. No, we do not log all your syscalls and send them to our servers for telemetry analysis. Promise."

        --
        Join the SDF Public Access UNIX System today!
        • (Score: 2) by choose another one on Saturday December 05 2015, @10:43AM

          by choose another one (515) Subscriber Badge on Saturday December 05 2015, @10:43AM (#272137)

          Except that Swift only compiles for Apple's proprietary OSes, which makes it kind of pointless.

          And Linux was tied to the i386 and therefore also pointless, and I can quote a professor on that...

          One of the main points of FOSS, which you seem to have failed to grasp, is that if the developer goes bust, the hardware ceases production or the os goes out of support, source code remains useful in ways a binary does not. If it doesn't run on your device/os then you are free to port it, if you don't want to that is fine, the point is that you are free to do it and have the source.

          If all you want is a shiny new binary pre-built for os/device of choice, then you are looking for the wrong kind of free.

          • (Score: 2) by darkfeline on Saturday December 05 2015, @02:54PM

            by darkfeline (1030) on Saturday December 05 2015, @02:54PM (#272172) Homepage

            Right, but you don't have the source for OSX, so the exercise is pointless.

            FOSS must exist throughout the entire chain, or you still don't have freedom.

            In fact, I will email RMS right now and confirm his position on this matter.

            --
            Join the SDF Public Access UNIX System today!
            • (Score: 3, Insightful) by choose another one on Saturday December 05 2015, @09:04PM

              by choose another one (515) Subscriber Badge on Saturday December 05 2015, @09:04PM (#272270)

              rms (it's lower case btw, always has been) himself spent many years working on free software to run on proprietary OSs, I think he would be surprised to hear that it was pointless or that the users of the software didn't have freedom as a result.

            • (Score: 2) by BasilBrush on Saturday December 05 2015, @11:05PM

              by BasilBrush (3994) on Saturday December 05 2015, @11:05PM (#272296)

              What's the source code to OSX got to do with it? Swift is a language. It's a language that runs on OSX or Linux, or any other platform someone wants to port it to.

              --
              Hurrah! Quoting works now!
        • (Score: 3, Informative) by BasilBrush on Saturday December 05 2015, @11:03PM

          by BasilBrush (3994) on Saturday December 05 2015, @11:03PM (#272294)

          You didn't even read the summary. Open Source Swift is available for Linux. Not even theoretically, but right now.

          --
          Hurrah! Quoting works now!
          • (Score: 2) by darkfeline on Sunday December 06 2015, @04:59AM

            by darkfeline (1030) on Sunday December 06 2015, @04:59AM (#272389) Homepage

            Unless I'm misunderstanding, the Swift compiler runs on Linux, but only compiles code for OSX and family. The end result of your compilation doesn't run on Linux, even if you run Swift on Linux.

            --
            Join the SDF Public Access UNIX System today!
            • (Score: 2) by BasilBrush on Sunday December 06 2015, @02:10PM

              by BasilBrush (3994) on Sunday December 06 2015, @02:10PM (#272476)

              You are indeed misunderstanding. Write it on Linux, compile it on Linux, run it on Linux. And in the future any other OS someone ports it to.

              --
              Hurrah! Quoting works now!
  • (Score: 2) by fleg on Saturday December 05 2015, @02:26AM

    by fleg (128) Subscriber Badge on Saturday December 05 2015, @02:26AM (#272058)

    anyone tried it yet? would be interested to hear what its like. from installation on up.

    • (Score: 3, Informative) by theluggage on Saturday December 05 2015, @10:55AM

      by theluggage (1797) on Saturday December 05 2015, @10:55AM (#272140)

      from installation on up.

      As for installation, if you've got Ubuntu 14.04 or 15.10, there are snapshot builds - just apt-get install clang and then download the tarball, unpack it and add it to your PATH. If you haven't got Ubuntu 14.04/15.10 then you get to build your own from the git repository - or use Ubuntu in a VM. Now the source is released, proper packages for all the distros will doubtless follow.

      The Linux version doesn't have the fancy, graphical 'Playground' facility you get in XCode in OS X but it does have an interactive shell that you can play about in.

      I don't think anybody is suggesting that the Linux version is ready for production use yet. Really, the news at the moment is that Apple really have open-sourced it (as they've long promised).

  • (Score: 2) by lentilla on Saturday December 05 2015, @07:20AM

    by lentilla (1770) on Saturday December 05 2015, @07:20AM (#272118)

    at the time of writing, the servers at Swift.org are failing to live up to their name

    This isn't the first time a company's web servers have found it difficult to keep up with the load on release day. There's a perfectly good solution to this: BitTorrent. I find it hard to understand why companies seem so resistant to this. It's so very simple: publish the magnet link and seed the download. Now; instead of download speeds slowing to a crawl with each new downloader; the downloads get faster as a new downloader joins the swarm. BitTorrent is a match made in heaven for this kind of distribution.

    • (Score: 2) by theluggage on Saturday December 05 2015, @11:17AM

      by theluggage (1797) on Saturday December 05 2015, @11:17AM (#272143)

      the downloads get faster as a new downloader joins the swarm

      ...provided enough of the downloaders have set up their firewall properly (and aren't behind a company firewall), leave their client seeding after they've finished downloading and are not on an ADSL connection with a feeble 'upload' speed.

      Anyway, the download was ~90MB - I wouldn't bother with Bittorrent for less than about a gigabyte - and we don't know that that was the problem (I've been caught out in the past by a certain web-hosting package that has a "max-connections=10" default buried in one of the config files...)

      • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @03:02PM

        by Anonymous Coward on Saturday December 05 2015, @03:02PM (#272176)

        I've been caught out in the past by a certain web-hosting package that has a "max-connections=10" default buried in one of the config files...

        Was it Windows NT workstation? See https://www.fsf.org/bulletin/2007/fall/antifeatures/ [fsf.org]

    • (Score: 2) by BasilBrush on Saturday December 05 2015, @11:08PM

      by BasilBrush (3994) on Saturday December 05 2015, @11:08PM (#272298)

      For a business, having servers stay up on launch day is mandatory. For an open source project it doesn't really matter. People can download it later. It's not as if anyone is losing money over it.

      --
      Hurrah! Quoting works now!