
posted by martyb on Friday December 04 2015, @06:52PM
from the Is-that-you-Jonathan? dept.

Swift Is Open Source

Swift, Apple's hot new programming language, is now open source. It is available (or will be once the web site isn't so overwhelmed!) for Mac and Linux under an Apache 2.0 license.

"Swift is now open source! We are excited by this new chapter in the story of Swift. After Apple unveiled the Swift programming language, it quickly became one of the fastest growing languages in history. Swift makes it easy to write software that is incredibly fast and safe by design. Now that Swift is open source, you can help make the best general purpose programming language available everywhere. "

Apple Open-Sources Swift and Releases Linux port, as Promised

Apple's Swift programming language may eventually replace the respected but arcane Objective-C as the native language for OS X and iOS development, but if you don't have a Mac you might be forgiven for not having taken an interest so far.

However, as MacRumors reports, Apple has now delivered on its promise to open-source Swift and release a Linux port. It doesn't sound as if the Linux port is quite ready for production use just yet, but the source is out there. Does this mean that Swift is now a contender for general purpose programming?

(Note: at the time of writing, the servers at Swift.org are failing to live up to their name.)


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Friday December 04 2015, @08:45PM

    by Anonymous Coward on Friday December 04 2015, @08:45PM (#271942)

    I've shown up here in the past, and always benefited from your collective wisdom. Here I return to ask another question.

    I'm having trouble punching through the $90K barrier (in the midwest, so adjust currency values accordingly) using my chosen skillset.

    The big issue is that I'm firmly entrenched in the GNU/Linux ecosystem (C, gcc, Perl, Python, MySQL/Postgres) doing web development (middleware) stuff. I've thought about hopping on the Mac bandwagon purely for the money, but Objective-C was horrifying.

    Although it would feel like selling out, I wonder if now is the time. Learn Swift, get up to speed on iOS development, and essentially become "one of them." I'll bet the money and job opportunities are better.

    Surely Swift is less a nightmare than Objective-C, and since there's a Linux port available, might be something I could spend some time with.

    Soylentils, is it worth it?

  • (Score: 1, Insightful) by Anonymous Coward on Friday December 04 2015, @09:10PM

    by Anonymous Coward on Friday December 04 2015, @09:10PM (#271948)

    In a couple months, the automated resume filters are going to be expecting 5+ years of production-level experience with Swift, or your CV will hit the bit bucket. So you better get started right away.

    • (Score: 0) by Anonymous Coward on Friday December 04 2015, @09:20PM

      by Anonymous Coward on Friday December 04 2015, @09:20PM (#271953)

      Hey, I can lie on resumes with the best of 'em. Who's going to interview me? Some millennial node.js kid? Bring it on!

    • (Score: 1, Informative) by Anonymous Coward on Friday December 04 2015, @10:12PM

      by Anonymous Coward on Friday December 04 2015, @10:12PM (#271964)

      The "sad but true" mod is "insightful"?

      I think the average soylentil may be cynical. (Or I may be just projecting.)

  • (Score: 2) by theluggage on Friday December 04 2015, @10:08PM

    by theluggage (1797) on Friday December 04 2015, @10:08PM (#271963)

    Surely Swift is less a nightmare than Objective-C, and since there's a Linux port available, might be something I could spend some time with.

    Well, if you're experienced in C++ and Python then learning the Swift language should be easy - it's much closer to "C meets Python" than Objective-C's "C meets Smalltalk". The hard (or at least time-consuming) bit will be getting up to speed on the iOS API. For that, I'm afraid, you'll still need a Mac.

  • (Score: 0) by Anonymous Coward on Saturday December 05 2015, @12:20AM

    by Anonymous Coward on Saturday December 05 2015, @12:20AM (#272021)

    Is there much call for app development in the midwest?

    My understanding is that something like 95% of apps are money losers for the developers. If you are doing custom software for internal use by clients that might be a better market. But how many business clients would put their system on the ios platform instead of android? My guess is that android would be a lot more attractive since the phones and tablets are so much cheaper. But I don't work in that market at all so what do I know?

  • (Score: 2) by fleg on Saturday December 05 2015, @02:21AM

    by fleg (128) Subscriber Badge on Saturday December 05 2015, @02:21AM (#272057)

    >Surely Swift is less a nightmare than Objective-C

    havent done obj-c but i've had to learn to read a fair bit of it when looking up ways
    to do stuff in swift (cos the examples and tutorials hadnt been updated for swift yet).

    so on that basis, yes, it is nowhere near the eye-bleeding nightmare the obj-c syntax is.

    i like swift a lot, with this announcement i'm kind of hoping it will replace
    c++ long term too.

    as to yourself AC if you have c, python etc i dont think you'll have too much trouble, and
    xcode + ios make the sweetest embedded development environment i've ever seen.

    • (Score: 2) by linuxrocks123 on Saturday December 05 2015, @03:43AM

      by linuxrocks123 (2557) on Saturday December 05 2015, @03:43AM (#272074) Journal

      I doubt it will replace C++, and I certainly hope not. The raw power and runtime efficiency you can get out of C++ isn't matched by anything else out there, and it's getting better with each new standard, yet the changes are evolutionary, and done in a way where the language maintains its maturity. C++ templates are an absolute marvel; Java and C# "generics" are pale imitations in comparison.
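
      A minimal sketch of what that compile-time resolution buys you (a toy example, names made up):

      #include <iostream>
      #include <string>

      // A function template: the compiler generates a separate, fully typed
      // version of largest() for each T it is instantiated with, entirely at
      // compile time -- no boxing, no runtime dispatch.
      template <typename T>
      T largest(T a, T b)
      {
          return (a < b) ? b : a;
      }

      int main()
      {
          std::cout << largest(3, 7) << "\n";         // instantiates largest<int>
          std::cout << largest(2.5, 1.5) << "\n";     // instantiates largest<double>
          std::cout << largest(std::string("ab"), std::string("cd")) << "\n";
      }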

      There's been a glut of new languages in the past few years. I'm sure some will stick around, but C++ is in a league by itself.

      • (Score: 2) by BasilBrush on Saturday December 05 2015, @10:52PM

        by BasilBrush (3994) on Saturday December 05 2015, @10:52PM (#272289)

        C++ is only faster if you ignore security and defensiveness. With no overflow or bounds checking C++ is much faster than Swift. But compile Swift with -Ofast, which also turns off overflow and bounds checking, and Swift is just as fast as C++.

        http://www.helptouser.com/code/24101718-swift-performance-sorting-arrays.html [helptouser.com]

        --
        Hurrah! Quoting works now!
        • (Score: 3, Informative) by linuxrocks123 on Sunday December 06 2015, @12:19AM

          by linuxrocks123 (2557) on Sunday December 06 2015, @12:19AM (#272304) Journal

          http://www.primatelabs.com/blog/2014/12/swift-performance/ [primatelabs.com]

          Turning off overflow and bounds checking definitely helps, but Swift is still slower.

          Now, part of this could be because Swift's compiler is not as good. Over time, the gap might close a little more. But, for instance, Swift isn't doing generics right -- it's following the Java/C# method of not resolving them at compile time. So that will always handicap it versus C++.

          The only languages out there that can hope to match C++'s performance are C and Fortran. C because it doesn't provide conveniences that slow things down, and Fortran because it matches C's simplicity and also takes pains to eliminate the aliasing problem for compilation.

          In fact, the aliasing problem is one of the most problematic cases for C/C++ optimization, and Fortran's advantage here is enough to make Fortran code faster than C and C++ for a significant number of workloads. C and C++ are finally catching up, though, with restrict. Here's a C++ Standards Committee whitepaper on the problem: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3635.pdf [open-std.org]
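
          A tiny sketch of the kind of loop where this matters (__restrict is the common compiler-extension spelling; C99 spells it restrict, and standard C++ has no keyword for it yet):

          // Without restrict, the compiler must assume dst and src might overlap,
          // so it cannot freely keep src values in registers or vectorize the loop.
          void scale(float* dst, const float* src, int n, float k)
          {
              for (int i = 0; i < n; ++i)
                  dst[i] = src[i] * k;
          }

          // __restrict is a promise that the two arrays do not alias, which removes
          // that obstacle. (Non-standard in C++, but supported by GCC, Clang and MSVC.)
          void scale_noalias(float* __restrict dst, const float* __restrict src, int n, float k)
          {
              for (int i = 0; i < n; ++i)
                  dst[i] = src[i] * k;
          }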

          The primary advantage of C++ is that the language is almost Borglike in its appropriation of other languages' features, but it always does so in a way that is as high-performing as possible, and that doesn't penalize code that does not use the new feature. My view is that any language that wants to replace C++ will have to both match its features and also provide something new, something that C++, for some reason, can't appropriate. That will be a tall order.

          Oh, regarding bounds checking, you can definitely turn on bounds checking for vectors if you want to, as a debugging tool. I forget the details on how to do that; it's a macro or something. One thing I would like to see in C++ is a way to run all the code in an interpreter that traps invalid pointer accesses and the like. Someday, I may write an interpreter that runs off the Clang AST.
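
          For what it's worth, a sketch of the always-available per-access form, vector::at() (the library-wide switch is implementation-specific; libstdc++'s debug mode, enabled with -D_GLIBCXX_DEBUG, may be the macro I was thinking of):

          #include <iostream>
          #include <stdexcept>
          #include <vector>

          int main()
          {
              std::vector<int> v{1, 2, 3};

              // operator[] does no checking: v[10] would be undefined behaviour.
              // at() checks every access and throws std::out_of_range on a bad index.
              try {
                  std::cout << v.at(10) << "\n";
              } catch (const std::out_of_range& e) {
                  std::cout << "caught: " << e.what() << "\n";
              }
          }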

          • (Score: 2) by BasilBrush on Sunday December 06 2015, @02:45PM

            by BasilBrush (3994) on Sunday December 06 2015, @02:45PM (#272481)

            Your link is Dec 2014, therefore is for Swift 1.x. I specifically gave you a link for Swift 2.x, as 1.x was always about functionality more than speed. And for sure, Swift will continue to gain performance faster than C++, if only because C++ has already been optimised for so many years.

            I can see you're a true believer. But your comment "Swift isn't doing generics right" is based on the belief that C++ is right. However what C++ does is well known, as is what Java and C# do. And Chris Lattner's specialism is compiler design. If C++ templates were the acme of generic programming then Swift would be done that way - Swift takes the best from older languages. But it's not.

            In your belief that C++ is inevitably faster than Swift, what you are missing is that one of the major design goals of Swift was to enable compiler optimisations. For example the Swift compiler tends to know better what is constant. And what pointers may or may not be nil. And those make for optimisations.

            In fact, optionals as a first-class feature of the language, with flow control support rather than simply a template in the standard library, is one of the great advantages of Swift over C++. It tackles the other biggest source of bugs and security defects in C/C++ besides buffer overflows: dereferencing null.

            --
            Hurrah! Quoting works now!
            • (Score: 2) by linuxrocks123 on Sunday December 06 2015, @04:11PM

              by linuxrocks123 (2557) on Sunday December 06 2015, @04:11PM (#272497) Journal

              Your link is Dec 2014, therefore is for Swift 1.x.

              It was the most complete benchmark I could find. Yours is just sorting arrays. It's rather hard to screw up sorting arrays; even Java manages to get that right. You'd have to move to scripting languages to find significant performance degradation there.

              And Chris Lattner's specialism is compiler design. If C++ templates were the acme of generic programming then Swift would be done that way - Swift takes the best from older languages. But it's not.

              My specialization is also compilers, and I actually worked in the same group as Chris Lattner, after he'd left. Let's discuss technical merits rather than appealing to authority.

              From here [austinzheng.com], it appears that I was initially wrong, and Swift does do generics (mostly) right. That's good news. Unfortunately, it does appear Swift has made a similar mistake to C# by using structs as always value types and classes as always reference types. There is no reason to force this limitation on the programmer.

              For example the Swift compiler tends to know better what is constant.

              With constexpr in C++ since 2011, I find that assertion unlikely to be true.
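
              A minimal constexpr sketch (C++11-era toy example):

              // constexpr lets the compiler evaluate the call whenever the argument
              // is itself a compile-time constant.
              constexpr long factorial(int n)
              {
                  return n <= 1 ? 1 : n * factorial(n - 1);
              }

              // Evaluated entirely by the compiler:
              static_assert(factorial(5) == 120, "known at compile time");

              int main()
              {
                  int buffer[factorial(4)];   // usable as an array bound: 24 elements
                  (void)buffer;
              }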

              In fact, optionals as a first-class feature of the language, with flow control support rather than simply a template in the standard library, is one of the great advantages of Swift over C++. It tackles the other biggest source of bugs and security defects in C/C++ besides buffer overflows: dereferencing null.

              C++ has optionals. They're called pointers. Dereferencing NULL is not generally a security problem, and Swift solves buffer overflows by doing runtime checks that kill performance unless you disable them with a compiler flag. This is not impressive.

              Swift, Rust, and Go all seem to be products of a "Let's recreate ML" craze. They're not by themselves bad languages, but they're also not particularly good. And, if you want ML, it's there for you, and it's fine. We don't all of a sudden need three new ML variants; it's just that Apple, Mozilla, and Google all want to create their own pet company languages, probably due to an overly developed organizational ego. This isn't likely to end particularly well for those organizations, as hipster languages tend to have short lifespans. Tell me, is node.js still in, or is that last month's fashion now?

              Don't get me wrong, they probably won't "die". It takes a lot for a language to "die". Apple can guide its lemmings off Objective C and onto Swift fairly easily. This will likely kill Objective C, since Apple is the last major user of that language. If and when Apple decides to move its lemmings elsewhere, Swift will then go the same way. But that will be up to Apple. Google can push Go on Android with the same effect. Mozilla, umm ... Mozilla can write Firefox in Rust I guess.

              • (Score: 2) by BasilBrush on Sunday December 06 2015, @10:06PM

                by BasilBrush (3994) on Sunday December 06 2015, @10:06PM (#272597)

                OK, you nearly sound like you know what you're talking about, even given your owning up to the generics mistake. But then you say this:

                "C++ has optionals. They're called pointers."

                And at that point it's clear you don't have a clue about Swift. Optionals are certainly NOT pointers. Rather they fix the problem of pointers - that whether or not an empty value is allowed is undefined. C/C++ Compilers always allow nil pointers, even when programmer logic assumes they don't. And yet they don't even have to be pointers. Swift optionals can be any value.

                Actually C++ does support optionals, but only as std::optional. But that means you don't get any compiler support.

                Finally you go off into a rant, featuring hipsters. At which point I can completely write you off as someone who's stuck in his ways. You simply think that C++ is the best at everything and that's that. Even though it's clear to pragmatic programmers that C++ has its place and so do many other languages.

                --
                Hurrah! Quoting works now!
                • (Score: 2) by linuxrocks123 on Monday December 07 2015, @07:07AM

                  by linuxrocks123 (2557) on Monday December 07 2015, @07:07AM (#272797) Journal

                  First, there's really no need to be rude. Flames are for Slashdot. Let's be civil.

                  Optionals are certainly NOT pointers. Rather they fix the problem of pointers - that whether or not an empty value is allowed is undefined. C/C++ Compilers always allow nil pointers, even when programmer logic assumes they don't. And yet they don't even have to be pointers. Swift optionals can be any value.

                  A more complete answer than "pointers are optionals" is that there are multiple ways to express a value that can't be NULL in C++. If you're writing a function and you want parameters that can't be NULL, you should use value parameters if the object is small and references if the object is large. Unlike pointers, references can't be null. If you want something that can be NULL, well, the easiest solution is to take a pointer and check if it's NULL. Since you don't "own" the pointer in this case, and it can be on the caller's stack rather than the heap, taking a pointer is not usually a problematic thing to do.

                  Now, if you want to return an optional, then there can be issues. You can still use pointers, of course, but now the pointer has to be on the heap, and the callee has to delete it. You can use reference-counted pointers (shared_ptr), and that's pretty good, but somewhat inefficient. So, not ideal, especially for a maniacally-performance-focused language like C++. This is why C++ does have experimental support for std::optional.
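
                  A rough sketch of those alternatives (names made up):

                  #include <memory>
                  #include <string>

                  struct Widget { std::string name; };

                  // Must-not-be-null parameter: take a reference. A reference cannot be
                  // null, so no check is needed inside the function.
                  void rename(Widget& w, const std::string& newName)
                  {
                      w.name = newName;
                  }

                  // May-be-absent parameter: take a pointer and check it. The caller keeps
                  // ownership; nothing is allocated or freed here.
                  void maybe_rename(Widget* w, const std::string& newName)
                  {
                      if (w)
                          w->name = newName;
                  }

                  // Returning a "maybe" value through shared_ptr works, but pays for a heap
                  // allocation and reference counting, as noted above.
                  std::shared_ptr<Widget> find_widget(bool found)
                  {
                      if (!found)
                          return nullptr;                   // "no value"
                      return std::make_shared<Widget>();    // heap allocation + refcount
                  }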

                  But that means you don't get any compiler support.

                  optional is a template, so, if you violate the type rules for optional, you'll get a compiler error. There is no reason the language has to be extended for something easily implementable as a template, which is why the standards committee didn't do that. Putting optional in the standard library is the right approach.
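
                  A sketch of the template in use (spelled std::experimental::optional in the toolchains of the day; plain std::optional once it leaves experimental):

                  #include <experimental/optional>
                  #include <string>

                  using std::experimental::optional;

                  optional<int> parse_int(const std::string& s)
                  {
                      if (s.empty())
                          return {};            // empty optional: "no value"
                      return std::stoi(s);      // wraps the int
                  }

                  int main()
                  {
                      optional<int> n = parse_int("42");

                      // Type rules are enforced like any other template instantiation:
                      // optional<int> bad = std::string("oops");   // compile error
                      return n ? *n : 0;
                  }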

                  Finally you go off into a rant, featuring hipsters. At which point I can completely write you off as someone who's stuck in his ways. You simply think that C++ is the best at everything and that's that. Even though it's clear to pragmatic programmers that C++ has its place and so do many other languages.

                  Okay, the hipsters thing might have been uncalled for. But I do think Swift, Rust, and Go are three languages in a space where there really only needs to be one, and that one is probably one of Delphi, Object Pascal, or OCaml.

                  There are, in general, really just too many languages. For instance, Python and Ruby are, for all practical purposes, the same language, yet there are two communities with two sets of duplicated standard libraries and two teams working on trying to optimize two interpreters. It's just silly. Yes, we needed a "better Perl". We didn't need two of them. And, honestly, we probably didn't need Python or Ruby, because we already had TCL.

                  So, now, some people have decided they want Java, but faster and natively compiled, so we have three separate attempts to create a language in that space. They could have found one of the many existing natively-compiled safe languages and started from there. Or, at a bare minimum, they could have worked together to create this language they think needs to exist. But, no, we have three different communities triplicating their effort to create three different languages from scratch that will not be learning from past mistakes in the space they're "innovating" in, three times over.

                  Ultimately, I think it's going to be C++ that fills the "safe C++" space. If you follow a few simple coding conventions, you can already eliminate C++'s lack of safety from new code. Enforcing those conventions through syntax rules, and having a C#-like "unsafe" construct for when the rules need to be broken by old code or libraries is a logical next step. Optional safety in C++, plus an interpreter to aid development, and C++ becomes safe, fast, and easy to develop in. Best of all, or at least most, worlds.
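
                  Roughly the kind of conventions I mean (a sketch, not exhaustive):

                  #include <iostream>
                  #include <memory>
                  #include <string>
                  #include <vector>

                  struct Connection { std::string host; };

                  // Reference parameter: the connection must exist, and a reference
                  // cannot be null.
                  void send(Connection& c, const std::string& msg)
                  {
                      std::cout << c.host << " <- " << msg << "\n";
                  }

                  int main()
                  {
                      // No raw new/delete: unique_ptr frees the object automatically,
                      // even if an exception propagates.
                      auto conn = std::make_unique<Connection>();
                      conn->host = "example.org";

                      // Containers and range-for instead of raw arrays and pointer
                      // arithmetic.
                      std::vector<std::string> queue{"hello", "world"};
                      for (const auto& msg : queue)
                          send(*conn, msg);
                  }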

                  That's why I'm going to write a C++ interpreter. C++ is the Borg, and I will help it assimilate interpreted scripting languages :). Mwa ha ha ha.

                  • (Score: 2) by BasilBrush on Thursday December 10 2015, @04:02PM

                    by BasilBrush (3994) on Thursday December 10 2015, @04:02PM (#274495)

                    optional is a template, so, if you violate the type rules for optional, you'll get a compiler error. There is no reason the language has to be extended for something easily implementable as a template

                    Again that's flawed thinking. An optional is not a template. It happens to be implemented as a template in C++. But that doesn't mean the functionality of the optional isn't reduced by the limitations of the template implementation. It is. C++ doesn't support half of the functionality of optionals in Swift.

                    The problem is you haven't really looked at Swift.

                    --
                    Hurrah! Quoting works now!
                    • (Score: 2) by linuxrocks123 on Thursday December 10 2015, @11:13PM

                      by linuxrocks123 (2557) on Thursday December 10 2015, @11:13PM (#274692) Journal

                      Again that's flawed thinking. An optional is not a template. It happens to be implemented as a template in C++. But that doesn't mean the functionality of the optional isn't reduced by the limitations of the template implementation.

                      This is an annoying thing to say. What, specifically, do you think C++ loses by implementing optional as a template? This is really a moot argument anyway, because optionals frankly aren't very important or interesting, but why don't you say, specifically, what you think C++ is missing here?

                      The problem is you haven't really looked at Swift.

                      The problem is you haven't surveyed the history of programming languages and so aren't aware of the vast number of languages already designed in the "imperative, non-GC language without pointers" space. Look at Ada, and then tell me what Swift has that's new. Now that I think of it, I was wrong to be talking about Pascal and OCaml. The small field there is so crowded, I forgot about the queen. Ada is the safe imperative language, and it's a quite good one that never really got the pickup it deserved.

                      I've looked at Swift more, in this conversation, and my initial assessment of it hasn't changed much. Ada is an excellent -- stellar -- safe, imperative language. And it got picked up by exactly the people who wanted that. Which was not many people -- basically the military and designers of software for airplane cockpits and life-critical medical equipment, and no one else. And now many of them are moving to C++, which actually makes me kind of sad, because there should probably be at least one high-performance imperative language that's not C/C++ or Fortran. But the Ada community is still small, even after all these years, because not enough people care about safe, high-performance imperative programming to make Ada's community thrive.

                      Seriously, look at Ada, then tell me again how new and awesome Swift is. I'm so glad I thought of Ada, because I was flailing around trying to find a good language to compare Swift to. Now I've got one. And I'm going home, so no link to prove it, but I would be extremely surprised to learn Ada doesn't have a way of dealing with optional values.

                      • (Score: 2) by BasilBrush on Friday December 11 2015, @02:03AM

                        by BasilBrush (3994) on Friday December 11 2015, @02:03AM (#274760)

                        Perfect.

                        Consider this article: "Null Considered Harmful"
                        http://www.adacore.com/adaanswers/gems/ada-gem-23/ [adacore.com]

                        Up until Ada 2005, recommended practice was to document whether a function could return a null. Then they extended the language so the compiler/run time could do checks.

                        Swift doesn't claim to do anything new, but rather to pick up best modern practice and improve on it. And so it does: optionals cover this issue and lots more that comes under the title "Null Considered Harmful".

                        What does Swift do, over and above?

                        In C/C++ you often have the construct:

                        if (foo) {
                        // do something with foo
                        }

                        If foo is a value that can't be nil (any non optional in Swift) then that construct is superfluous. Swift complains if you try. Very often C/C++ code checks "just in case". There's a reason for Swift to be faster in average code right there. In addition to the code being clearer.

                        If foo is an optional, then trying a simple dereference is an error, because you haven't said what you want to happen if it's a nil. There are other things you can do: cover the nil in an else, tell Swift to not do an operation if it's a nil, or in extremis you can guarantee that although the type says it could be a nil, you happen to know it's not.

                        In other words, C++ std::optional gives you the datatype, but none of the checking. It still relies on you to remember to check for nil. In other words it has no equivalent of "if let", "!" or "?".

                        Worse: virtually all existing code and practice uses pointers in C++, whether they can conceptually contain null or not. At best they are documented, just like Ada pre-2005.

                        Now you can argue against some of this, and no doubt you will. But the fact is that a feature that is a central part of the language from its conception is bound to be better than Ada or C++'s lesser facilities, tacked on as an afterthought, and in C++'s case not even part of the language. Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                        And frankly I'm surprised that you, having been programming for some years, haven't felt frustration at some points because you don't know whether a pointer you've been given could in some circumstances be nil. Or indeed whether some poorly documented API you are calling will accept nil for parameters you don't care about. Your love for C++ seems to have blinded you to its limitations.

                        --
                        Hurrah! Quoting works now!
                        • (Score: 2) by linuxrocks123 on Saturday December 12 2015, @03:56AM

                          by linuxrocks123 (2557) on Saturday December 12 2015, @03:56AM (#275292) Journal

                          So the best complaint you can come up with about Ada versus Swift is something they fixed in 2005? Okay then.

                          If foo is a value that can't be nil (any non optional in Swift) then that construct is superfluous. Swift complains if you try. Very often C/C++ code checks "just in case". There's a reason for Swift to be faster in average code right there. In addition to the code being clearer.

                          In C++, you should never pass a pointer as a parameter that is guaranteed not to be NULL. That's what references are for. C/C++ code checking for NULL when a pointer is guaranteed to not be NULL is just bad code. You can write bad code in any language.

                          If foo is an optional, then trying a simple dereference is an error, because you haven't said what you want to happen if it's a nil. There are other things you can do: cover the nil in an else, tell Swift to not do an operation if it's a nil, or in extremis you can guarantee that although the type says it could be a nil, you happen to know it's not.

                          This is exactly equivalent to optionals in C++. If you don't know if the optional is empty, the value() and value_or() functions allow you to check. If you happen to somehow know it's not, then you can dereference it without suffering the performance penalty checking for an empty optional implies. Exactly the same as you describe in Swift.

                          In other words, C++ std::optional gives you the datatype, but none of the checking. It still relies on you to remember to check for nil.

                          This is merely a syntax complaint. Dereferencing an optional in C++ is saying, "I know this optional isn't empty and am overriding the checking." If you want the checking, you use value() or value_or(). If, instead of overloading operator*, the C++ library provided a function named unsafe_dereference(), you would not be complaining. Well, C++ named the "unsafe_dereference()" function "operator*" instead. Same function, just a different name. Complaining about the name of a function is bikeshedding.
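
                          A sketch of the distinction (std::experimental::optional as shipped around this time):

                          #include <experimental/optional>
                          #include <iostream>

                          using std::experimental::optional;

                          optional<int> lookup(bool hit)
                          {
                              return hit ? optional<int>(42) : optional<int>();
                          }

                          int main()
                          {
                              optional<int> miss = lookup(false);

                              // Checked access: value() throws on an empty optional,
                              // value_or() substitutes a default instead.
                              std::cout << miss.value_or(-1) << "\n";   // prints -1

                              // Unchecked access: operator* means "I know this isn't empty".
                              optional<int> hit = lookup(true);
                              std::cout << *hit << "\n";                // prints 42
                          }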

                          In other words it has no equivalent of "if let", "!" or "?".

                          Yes, it does. ! and ? work exactly as they should, because an optional can be tested as a bool in a condition: the test is true if the optional has a value and false if it does not. You do "if let" like this:

                          try
                          {
                                  auto& varname = opt_value.value();   // throws if opt_value is empty
                                  //...
                          }
                          catch (const std::bad_optional_access& e) {}   // std::experimental::bad_optional_access pre-C++17

                          Worse: virtually all existing code and practice uses pointers in C++, whether they can conceptually contain null or not. At best they are documented, just like Ada pre-2005.

                          Much C++ code was originally C, which does not have references. And, yes, good C code -- actually, good ... code -- documents what its parameters can be when it's not obvious from the function name or general description. This doesn't really affect modern C++'s comparison with Swift, except in C++'s favor, since C++ can use these legacy APIs without a foreign function interface.

                          Now you can argue against some of this, and no doubt you will. But the fact is that a feature that is a central part of the language from its conception is bound to be better than Ada or C++'s lesser facilities, tacked on as an afterthought, and in C++'s case not even part of the language. Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                          I just established that, yes, C++'s implementation of optional is basically equivalent to Swift's. That C++ is flexible enough to fully express the concept of optional without needing to change the language itself is a testament to the language's flexibility. But not that big a one, because, at its core, this optional concept you keep going on about is a struct containing memory to hold an instance of an arbitrary datatype and a bool tag indicating whether that memory is a valid object. If C++, with its strong focus on first class user-defined types and generic programming, were yet so brittle it needed to make the concept of "a struct tagged with a bool indicating whether it's initialized" part of the language itself, that would be quite concerning indeed.
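
                          A stripped-down sketch of that struct (a real implementation uses raw aligned storage and placement new instead of default-constructing the T, but the shape is the same):

                          #include <iostream>
                          #include <stdexcept>

                          // The core idea: storage for a T plus a bool saying whether it is set.
                          template <typename T>
                          struct simple_optional
                          {
                              T    value{};
                              bool engaged = false;

                              simple_optional() = default;
                              simple_optional(const T& v) : value(v), engaged(true) {}

                              explicit operator bool() const { return engaged; }

                              T& operator*() { return value; }     // unchecked access

                              T& checked()                         // checked access
                              {
                                  if (!engaged)
                                      throw std::logic_error("empty optional");
                                  return value;
                              }
                          };

                          int main()
                          {
                              simple_optional<int> a;      // empty
                              simple_optional<int> b(7);   // engaged

                              if (b) std::cout << *b << "\n";       // prints 7
                              std::cout << (a ? *a : -1) << "\n";   // prints -1
                          }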

                          Yet the fact that C++ and Ada have tacked these things on illustrates the need.

                          I still don't know why you are so fixated on this trivial and only marginally useful concept.

                          C++98 was expressive enough so that optional could have been added to the standard library then. The reason it only made it into std::experimental ~20 years later is because the use cases for this concept in C++ are vanishingly small. They are basically limited to places where you don't want to pass a reference as a parameter because the object is so small that the overhead of passing by reference is inefficient compared to passing a copy of the object itself plus 8 bits for a bool tag, and to places where you want to return a value from a function that may fail. In the case of a function that may fail, you can actually use std::pair instead, so optional isn't necessary in C++ anywhere. But optional is slightly more efficient and syntactically "cleaner" in a few cases. So, Boost designed an optional template, because Boost's purpose in life is to add random crap to the C++ standard library, and the standards committee shrugged and imported it into the standard after Boost showed over the course of a decade or so that the template was sometimes useful and didn't have serious flaws or cause unexpected problems.
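
                          The std::pair form mentioned above looks roughly like this (the same shape std::map::insert already returns):

                          #include <iostream>
                          #include <string>
                          #include <utility>

                          // A function that may fail, without optional: the bool says whether
                          // the int is meaningful.
                          std::pair<int, bool> to_int(const std::string& s)
                          {
                              if (s.empty() || s.find_first_not_of("0123456789") != std::string::npos)
                                  return {0, false};
                              return {std::stoi(s), true};
                          }

                          int main()
                          {
                              auto r = to_int("123");
                              if (r.second)
                                  std::cout << r.first << "\n";   // prints 123
                          }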

                          std::optional doesn't need to be part of the language in C++. It only barely deserves to be part of the standard library.

                          And frankly I'm surprised that you, having been programming for some years, haven't felt frustration at some points because you don't know whether a pointer you've been given could in some circumstances be nil. Or indeed whether some poorly documented API you are calling will accept nil for parameters you don't care about.

                          It is a very poor API indeed that doesn't tell you what it returns or what arguments it takes. I haven't run into many of those. When I do, I blame the library author, not the language, whatever the language happens to be.

                          Your love for C++ seems to have blinded you to its limitations.

                          Saying things like this is not constructive.

                          • (Score: 1) by linuxrocks123 on Saturday December 12 2015, @03:59AM

                            by linuxrocks123 (2557) on Saturday December 12 2015, @03:59AM (#275293) Journal

                            Oh, in case comments close on this and you want to keep talking, I made a journal for us here: https://soylentnews.org/comments.pl?sid=11091 [soylentnews.org]

                          • (Score: 2) by BasilBrush on Saturday December 12 2015, @11:57PM

                            by BasilBrush (3994) on Saturday December 12 2015, @11:57PM (#275615)

                            I still don't know why you are so fixated on this trivial and only marginally useful concept.

                            Because it's neither trivial nor marginally useful. One of the most common bugs in C-derived languages is dereferencing null. You don't see the problem because your language of choice is archaic, but you are used to it.

                            So the best complaint you can come up with about Ada versus Swift is something they fixed in 2005? Okay then.

                            No. From that article, Ada added a facility to mark a pointer argument as not being null. That is not the complete support for optionals that Swift has.

                            allow you to check. If you happen to somehow know it's not, then you can

                            "allow", "can". That's what you get with a facility tagged on as an afterthought. All Swift code uses optionals rather than nil pointers or dummy values. And the language enforces good practice with them. That's not true of C++.

                            But it's clear that your love for C++ (and apparently Linux) won't let you see where other languages have bettered it. So this is going nowhere.

                            --
                            Hurrah! Quoting works now!
                            • (Score: 2) by linuxrocks123 on Sunday December 13 2015, @05:59AM

                              by linuxrocks123 (2557) on Sunday December 13 2015, @05:59AM (#275700) Journal

                              You seem to have missed something I said.

                              All Swift code uses optionals rather than nil pointers or dummy values. And the language enforces good practice with them.

                              This appears inconsistent with:

                              in extremis you can guarantee that although the type says it could be a nil, you happen to know it's not.

                              Assuming the "in extremis" line is correct, here is the exact equivalence between Swift optionals and C++ optionals. This is close to a constructive proof on this point.

                              - Swift operator! is equivalent to C++ std::optional::operator*.
                              - Swift "if let" / operator? equivalent to C++ the try{...} construct I mentioned previously.
                              - Swift's optional-in-if-statement syntax is equivalent to C++'s std::optional::operator bool().

                              The two are mathematically equivalent, man. C++ didn't add optionals to the language because C++ didn't need a separate language construct for this concept. That's what happens when a language is designed with the explicit goal of having true first class user-defined types: you can put big changes in the standard library rather than needing to add new language constructs. Admit you're wrong on this point -- or don't -- but, either way, let's move on to a different point of comparison.

                              But it's clear that your love for C++ (and apparently Linux) won't let you see where other languages have bettered it. So this is going nowhere.

                              This is going nowhere because you don't appear to really be reading what I'm saying. And, I'm never going to respond to statements like this by defending myself; they are too silly to deserve a response. You have no idea what languages I've worked with in the past, or what my perspective is on programming languages in general. Even if you were right about what you're saying, the fundamental problem here is you think ad hominem attacks are logically coherent arguments. They are not [yourlogicalfallacyis.com].

                              I thought, after pointing out you were making pointless ad hominem attacks, that you would engage in the minimal amount of introspection necessary to determine that you were and would contribute more productively to the conversation from that point on. My experience in the past has been that this usually works. But you obviously need a little more XP to complete that maturity level-up. I'm at a loss as to how to proceed on this point, so, please tell me, how can I help you with that?

  • (Score: 2) by TheRaven on Saturday December 05 2015, @10:56AM

    by TheRaven (270) on Saturday December 05 2015, @10:56AM (#272141) Journal

    Objective-C was horrifying.

    Without knowing what you don't like about Objective-C, it's hard to answer. Objective-C is a very small set of extensions to C that provides a Smalltalk-like object model (late-bound dynamic dispatch, duck typing, full introspection, closures as first class objects) and Smalltalk-like syntax (named parameters everywhere) and a few extensions. If you have objections to duck typing, then you may like Swift a bit more (though the Objective-C bridging makes it leak through in places). If you have objections to reference counting with explicit weak references for cycle detection as a memory management strategy then Swift will be just as bad (so will C++). If you have an objection to object orientation in general then, again, Swift probably isn't for you. If your objection to Objective-C is that it's a simple language that has a small set of well-defined semantic and syntactic extensions to C, then you'll probably like Swift. If your objection to Objective-C is that it inherits all of the horrible design decisions in C, then you may like Swift.

    If your objection is 'OMG, syntax that is not exactly the same as C', then you're an idiot and no advice will help you.

    --
    sudo mod me up
    • (Score: 2) by BasilBrush on Saturday December 05 2015, @10:59PM

      by BasilBrush (3994) on Saturday December 05 2015, @10:59PM (#272292)

      I think most people's objection to Obj-C is that it's very verbose and filled with square brackets. And there's truth in that. Obj-C tends to be easy to understand because of the conventions for verbose identifiers and method names with always named parameters. But the verbosity does mean you tend to need several lines to accomplish what you can do in other languages with one line.

      Personally I quite like Obj-C. But Swift is an improvement in many ways.

      --
      Hurrah! Quoting works now!