
posted by martyb on Tuesday June 03 2014, @04:54PM   Printer-friendly
from the swift-language-but-not-so-swift-name dept.

Apple surprised the audience at its Worldwide Developers Conference in San Francisco on Monday with a tool that few attendees expected: a new programming language for iOS and OS X development called Swift (https://developer.apple.com/swift/). There already is a programming language called Swift (http://swift-lang.org/main/) that was developed by the National Science Foundation, some other government agencies, and the University of Chicago for use in parallel computing applications. This isn't that. Rather, it's an entirely new language that -- in the words of Apple senior VP Craig Federighi, who unveiled it during the Monday morning WWDC keynote -- aims to be "Objective-C without the baggage of C."

Some of that "baggage" will already be familiar to developers who cut their teeth on C but later moved on to scripting languages such as Python (and Federighi compared Swift to Python several times during his presentation). Like scripting languages but unlike C, Swift lets you get straight to the point: the single line println("Hello, world") is a complete program. Note, also, that you don't have to end the statement with a semicolon, as you do in C. Semicolons are optional unless you're combining multiple statements on a single line; i.e. a semicolon is a statement separator rather than a statement terminator.
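For illustration, both points in one minimal sketch (based on the println function shown in the keynote; details of the beta may change in later releases):

    // A complete Swift program: no main(), no imports, no semicolon required.
    println("Hello, world")

    // A semicolon is needed only to separate statements sharing one line.
    let greeting = "Hello"; println(greeting)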

In addition to its online documentation, Apple has released an e-book, The Swift Programming Language, that's a free download (https://itunes.apple.com/us/book/the-swift-programming-language/id881256329) from the iBooks Store. To start working with the language itself, you'll need to download the beta release of Xcode 6 (https://developer.apple.com/xcode/downloads/), which includes tutorials to get you going.

 
  • (Score: 2) by Aiwendil (531) on Tuesday June 03 2014, @05:30PM (#50699) Journal

    Semicolons are great for when CR/LF and such have been messed up in crossing platforms or in passing the code through external tools.

    The begin .. end; or { } is good for giving a visual clue about where the logical breaks are in the "flow" of the code. It also makes nesting of subprograms easier to follow, and it keeps the start and end of blocks consistent whether they belong to a loop or an if-statement (for instance, while true { } -- how is that done without the extra cruft?).
    And don't get me started on how good it is to have named blocks that can be called from an exit-statement (even when not in a loop).
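    Swift, for what it's worth, has something similar: labeled statements, which a break can name, much like an exit from a named loop. A rough sketch, using current Swift range syntax, so treat the details as illustrative:

    search: for row in 0..<10 {
        for col in 0..<10 {
            if row * col == 42 {
                println("found \(row), \(col)")
                break search   // exits the outer, named loop
            }
        }
    }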

    Also, a language that allows shortcuts sadly tends to cause headaches (implicit typecasts and allowing the use of non-defined variables are downright dangerous [imagine having a variable called "grey" to define just what shade to use; now imagine someone else starting to use "gray" in the same code, thinking it is the same]).
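    Swift itself takes the strict road on both counts; a rough sketch of what its compiler enforces (variable names made up for illustration):

    let grey = 0.5
    // gray = 0.6        // error: use of unresolved identifier 'gray'

    let count: Int = 3
    let half = Double(count) / 2.0   // the Int-to-Double conversion must be explicit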

  • (Score: 0) by Anonymous Coward on Tuesday June 03 2014, @05:40PM (#50704)

    "implicit typecasts and allowing use of non-defined variables are downright dangerous"

    Implicit typecasts, when restricted to the proper set of operations, are absolutely not dangerous (even Pascal had them!). The problem with languages like C is that they have too many of them, most of which are simply not a good idea (this includes some which at first sight look like a good idea, like the implicit conversion from signed to unsigned types).

    Restrict type conversions to the cases where they make sense, and you'll have no problems with them. Note that without type conversions, object-oriented programming would be basically impossible in statically typed languages, because you need an implicit conversion from pointer/reference-to-derived to pointer/reference-to-base.
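    In Swift terms, that safe direction is precisely the conversion that remains implicit; a minimal sketch in present-day Swift syntax (the class names are made up):

    class Shape {}
    class Circle: Shape {}

    func draw(_ shape: Shape) {}   // expects the base class
    draw(Circle())                 // a Circle is accepted as a Shape, no cast needed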

    • (Score: 3, Interesting) by Aiwendil (531) on Tuesday June 03 2014, @07:02PM (#50740) Journal

      I actually like how Ada approaches this. In Ada one defines types first and then, if needed, subtypes; conversion between subtypes that share the same base type is implicit (with a range check in case the origin subtype does not fit completely within the target subtype), but converting outside of this type requires explicit typecasting.

      I agree that typecasting is good to have (many things would indeed be impossible, or at least a lot harder, without it), but it is implicit typecasting that is bad (I'm all for explicit typecasting).
      However, I fail to see why one needs implicit conversion for OOP (i.e. I fail to see why explicit typecasting wouldn't suffice).
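      Swift can approximate that checked, explicit style with a constrained wrapper type whose failable initializer performs the range check; a sketch only, not Ada semantics:

      struct Percentage {
          let value: Int
          init?(_ raw: Int) {               // the failable init is the "range check"
              if raw < 0 || raw > 100 { return nil }
              value = raw
          }
      }

      let p = Percentage(42)   // conversion is explicit, and checked at runtime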

      • (Score: 0) by Anonymous Coward on Thursday June 05 2014, @09:41AM (#51561)

        Well, formally you could of course use only explicit typecasting in statically typed OOP as well, but it would fully defeat the point of OOP (basically every derived class would automatically violate the Liskov Substitution Principle in this regard). Here's an explicit example (using C++ syntax):

        class Base { /* ... */ };
        class Derived : public Base { /* ... */ };

        Derived* foo() { return new Derived; }  // produces a derived pointer
        void bar(Base*) {}                      // consumes a base pointer

        int main() {
            bar(foo()); // needs the implicit typecast from Derived* to Base*
        }