
posted by martyb on Sunday June 14 2015, @10:44PM   Printer-friendly
from the swift-rise-in-popularity dept.

The hype around Swift is near non-existent by Apple standards, yet the language has attracted high praise since its release last year. Swift is one of the very few Apple products representing a clear departure from the hardware-led approach Steve Jobs took to the business. If Stack Overflow's 2015 dev survey is anything to go by, it looks as if Swift has the potential to really shake things up.

Might the days of Apple programmers relying upon Objective-C be numbered?


Original Submission

Related Stories

An 18-part Series on Building a Swift HTTP Framework 3 comments

Software engineer Dave DeLong has written an 18-part series on building an HTTP framework in Swift. Apple's Swift programming language is a general-purpose, open-source, compiled programming language intended to replace Objective-C. It is licensed under the Apache 2.0 license. In his series, Dave covers an Intro to HTTP, Basic Structures, Request Bodies, Loading Requests, Testing and Mocking, Chaining Loaders, Dynamically Modifying Requests, Request Options, Resetting, Cancellation, Throttling, Retrying, Basic Authentication, OAuth Setup, OAuth, and Composite Loaders.

Over the course of this series, we've started with a simple idea and taken it to some pretty fascinating places. The idea we started with is that a network layer can be abstracted out to the idea of "I send this request, and eventually I get a response".

I started working on this approach after reading Rob Napier's blog post on protocols on protocols. In it, he makes the point that we seem to misunderstand the seminal "Protocol Oriented Programming" idea introduced by Dave Abrahams ("Crusty") at WWDC 2015. We especially miss the point when it comes to networking, and Rob's subsequent posts go into this idea further.

One of the things I hope you've realized throughout this blog post series is that nowhere in this series did I ever talk about Codable. Nothing in this series is generic (with the minor exception of making it easy to specify a request body). There is no mention of deserialization or JSON or decoding responses or anything. This is extremely deliberate.
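The "request in, response out" abstraction the series describes can be sketched as a small Swift protocol with chainable loaders. The names below (HTTPRequest, HTTPLoader, and so on) are simplified stand-ins, not necessarily DeLong's exact API:

```swift
import Foundation

// Simplified request/response value types (illustrative, not the series' API).
struct HTTPRequest {
    var path: String
    var headers: [String: String] = [:]
}

struct HTTPResponse {
    var status: Int
    var body: Data? = nil
}

// The core abstraction: a loader takes a request and eventually produces a response.
protocol HTTPLoader {
    func load(_ request: HTTPRequest, completion: @escaping (HTTPResponse) -> Void)
}

// A chaining loader modifies the request, then delegates to the next loader.
struct DefaultHeadersLoader: HTTPLoader {
    let next: HTTPLoader
    func load(_ request: HTTPRequest, completion: @escaping (HTTPResponse) -> Void) {
        var req = request
        if req.headers["User-Agent"] == nil { req.headers["User-Agent"] = "MyApp/1.0" }
        next.load(req, completion: completion)
    }
}

// A mock loader, in the spirit of the "Testing and Mocking" installment:
// it answers immediately, with no networking involved.
struct MockLoader: HTTPLoader {
    func load(_ request: HTTPRequest, completion: @escaping (HTTPResponse) -> Void) {
        completion(HTTPResponse(status: 200))
    }
}
```

Chaining a DefaultHeadersLoader in front of a MockLoader exercises the whole pipeline without touching the network.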

This discussion has been archived. No new comments can be posted.
  • (Score: 2) by kaszz on Sunday June 14 2015, @11:17PM

    by kaszz (4211) on Sunday June 14 2015, @11:17PM (#196279) Journal

    Does Swift have general and basic benefits over other languages? Will the standardization and usage be stable over decades? Does it have any real-world application benefit outside of the Apple hegemony?

    The Apple APIs are Objective-C so they will stay around for a while. But the replacement better measure up or there will be deep trouble ahead.

    • (Score: 3, Interesting) by NCommander on Sunday June 14 2015, @11:30PM

      by NCommander (2) Subscriber Badge <michael@casadevall.pro> on Sunday June 14 2015, @11:30PM (#196284) Homepage Journal

      My understanding is Swift is basically wrapping over the ObjC APIs, so it's 1:1 functionality; or, put another way, Swift is a new language with ABI compatibility with ObjC message passing. I tried to play with ObjC, and while I think in some ways it's better than C++, TBH the syntax is very hard to grok. This is compounded by the fact that a lot of stuff in ObjC is detected at runtime vs compile time.

      Please note, I'm just a hobbyist; I never did a major project in ObjC.

      --
      Still always moving
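The ABI-compatibility point above shows up directly in code: Swift literals bridge to Foundation's Objective-C classes, so the existing ObjC API surface is callable without new bindings. A minimal sketch:

```swift
import Foundation

// A Swift string literal used where an NSString is expected...
let greeting: NSString = "Hello, Swift"

// ...and the Objective-C NSString API called directly from Swift.
let tail = greeting.substring(from: 7)   // NSString method, returns "Swift"
let count = greeting.length              // NSString property, 12
print(tail, count)
```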
      • (Score: 4, Disagree) by kaszz on Sunday June 14 2015, @11:59PM

        by kaszz (4211) on Sunday June 14 2015, @11:59PM (#196288) Journal

        I'm just suspicious that this falls into:
          * Hey cool, we have written our own language that doesn't add anything other than more book reading for you, and fragmentation
          * We must change language every so often to keep up with fashion because we lack impulse control
          * Not invented here!
          * We need this extra thing so we make a whole new language
          * Our developers have skills to make use of other platforms so let's force them to use our language so they forget others

        • (Score: 0) by Anonymous Coward on Monday June 15 2015, @02:03AM

          by Anonymous Coward on Monday June 15 2015, @02:03AM (#196323)

          > Our developers have skills to make use of other platforms so lets force them to use our language so they forget others

          They are saying some of the right words [opensource.com]; deeds are still pending.

        • (Score: 5, Informative) by BasilBrush on Monday June 15 2015, @05:19AM

          by BasilBrush (3994) on Monday June 15 2015, @05:19AM (#196363)

          * Hey cool we have written our own language that doesn't add anything than more book reading for you and fragmentation

          There are a number of unique and very nice features in Swift.

          * We must change language every so often to keep up with fashion because we lack impulse control

          Apple has been using Obj-C as its major language for 15 years. OS X (including its previous incarnation as NeXTSTEP) for 26 years.

          * Not invented here!

          There is no existing language that has the fundamental requirement that it supports the existing Cocoa frameworks.

          * We need this extra thing so we make a whole new language

          Apple already has de facto control over Objective-C and has been adding extra things over the years. Including this year.

          * Our developers have skills to make use of other platforms so lets force them to use our language so they forget others

          No change over Obj-C.

          So all your suspicions are stupid.

          --
          Hurrah! Quoting works now!
          • (Score: 2) by kaszz on Tuesday June 16 2015, @06:35PM

            by kaszz (4211) on Tuesday June 16 2015, @06:35PM (#196980) Journal

            There is no existing language that has the fundamental requirement that it supports the existing Cocoa frameworks.

            Does the Cocoa framework bring any new possibilities to computing, beyond being able to do graphics in the Apple ecosystem?

      • (Score: 2) by bzipitidoo on Monday June 15 2015, @02:03PM

        by bzipitidoo (4388) on Monday June 15 2015, @02:03PM (#196486) Journal

        I took a cursory look at Swift, and got the impression it's Python with curly braces instead of indentation (or one can think of it as C/C++ without semicolons and with better looping), with much simpler syntax for using the Objective C/C++/Cocoa libraries than Objective C/C++ itself.

        I think the religious fervor over Object Oriented Programming faded years ago, and now people can see that while OOP has its points, it also has plenty of issues. I was never sure about the value of the whole idea of an inheritance hierarchy, just seemed too rigid to try to organize data that way.
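A few lines convey the feel bzipitidoo describes: C-family braces, but no semicolons, type inference, and Python-like looping (a generic sketch, nothing Apple-specific):

```swift
// No semicolons, inferred types, and a filtered for-in loop over a collection.
let names = ["ada", "grace", "barbara"]
for name in names where name.count > 4 {
    print(name.uppercased())
}

// Closures over a range, much like a Python list comprehension.
let squares = (1...5).map { $0 * $0 }
print(squares)   // [1, 4, 9, 16, 25]
```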

    • (Score: 2) by c0lo on Monday June 15 2015, @02:28AM

      by c0lo (156) Subscriber Badge on Monday June 15 2015, @02:28AM (#196332) Journal

      The Apple APIs are Objective-C so they will stay around for a while. But the replacement better measure up or there will be deep trouble ahead.

      Remember the good old MS OLE [wikipedia.org] and what it meant for VB*?
      Well, seems that Apple "innovated" their own version of VB and are pretty proud about it. Perhaps over the years they'll get to evolve it into the equivalent of Java/dontNetFramework.

      * VB as in Visual Basic, not Victoria Bitter [wikipedia.org]

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 3, Insightful) by BasilBrush on Monday June 15 2015, @05:22AM

        by BasilBrush (3994) on Monday June 15 2015, @05:22AM (#196364)

        Remember the good old MS OLE and what it meant for VB*?

        However, if you think Swift is in any way comparable to VB, you're a clueless idiot.

        --
        Hurrah! Quoting works now!
        • (Score: 1, Flamebait) by c0lo on Monday June 15 2015, @06:01AM

          by c0lo (156) Subscriber Badge on Monday June 15 2015, @06:01AM (#196374) Journal

          However, if you think Swift is in any way comparable to VB, you're a clueless idiot.

          Really? Doesn't it serve the same purpose for Apple as the old VB did for MS?
          I mean... "make the goodness of our API available to the masses. Because, you know... developers, developers, developers... memory management and whatnot is too hard for the wannabe programmer." (I still remember the days before the dotcom bust: everybody and their dog would get hired with "I know VB" on their CV, 'cause, you know, before ASPX and the other dontNet goodnesses MS came up with afterwards, VB was the "language of the Web as decreed by Microsoft".)

          Yeah, sure, maybe it's more evolved than VB, but in the end... it only runs on Apple, very much like VB used to run only on MS. Yet another point where they somehow resemble each other, don't you think?

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
          • (Score: 5, Informative) by BasilBrush on Monday June 15 2015, @07:22AM

            by BasilBrush (3994) on Monday June 15 2015, @07:22AM (#196385)

            Really? Doesn't it serve for Apple the same purpose as the old VB did for MS?

            No. This isn't an easy-to-use option for casual use. This is a replacement for the professional languages, both for apps and for systems programming.

            it only runs on Apple

            Wrong. Linux and open source is part of the v2.0 release.

            --
            Hurrah! Quoting works now!
            • (Score: 2) by c0lo on Monday June 15 2015, @08:58AM

              by c0lo (156) Subscriber Badge on Monday June 15 2015, @08:58AM (#196402) Journal

              it only runs on Apple

              Wrong. Linux and open source is part of the v2.0 release.

              It may seem pedantic, but I detect a slightly misleading use of tense. The most precise way to put it is:

              Linux and open source will be part of the v2.0 release. Sometime "later this year" [theregister.co.uk]

              On the other side, why should I be so enthused by Swift on Linux? For some +Informative mods: what does the language bring that's new or useful to Linux? (Why should I believe it's anything other than an expression of Not-Invented-Here syndrome coming from Apple?)
              Does it have some standard libraries besides OSX/Cocoa to make it useful? Multi-threading, async/futures... something?

              --
              https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
              • (Score: 2) by BasilBrush on Monday June 15 2015, @03:26PM

                by BasilBrush (3994) on Monday June 15 2015, @03:26PM (#196537)

                You shouldn't be excited at all. You hate Apple and thus won't be using it. Some of the more open minded language experimenters will though.

                --
                Hurrah! Quoting works now!
            • (Score: 4, Insightful) by c0lo on Monday June 15 2015, @11:00AM

              by c0lo (156) Subscriber Badge on Monday June 15 2015, @11:00AM (#196421) Journal

              This is a replacement of the professional languages both for apps and systems

              Groan... Really? [wikipedia.org]

              Swift uses Automatic Reference Counting (ARC) to manage memory. Apple used to require manual memory management in Objective-C, but introduced ARC in 2011 to allow for easier memory allocation and deallocation.[29] One problem with ARC is the possibility of creating a strong reference cycle, where instances of two different classes each include a reference to the other, causing them to become leaked into memory as they are never released. Swift provides the weak and unowned keywords that allow the programmer to prevent strong reference cycles from occurring. Typically a parent-child relationship would use a strong reference while a child-parent would use either weak reference, where parents and children can be unrelated, or unowned where a child always has a parent, but parent may not have a child.

              Because... you know?... there's never going to be a relation between siblings, it will always be an asymmetrical parent-child-like relationship.
              Yes, circular lists or bidi-graphs with cycles are evil constructs one is never going to need in practice.

              Those foolish youngsters with disdain for ancient wisdom [catb.org] are in for surprises:

              One day a student came to Moon and said: “I understand how to make a better garbage collector. We must keep a reference count of the pointers to each cons.”

              Moon patiently told the student the following story:

              “One day a student came to Moon and said: ‘I understand how to make a better garbage collector...

              --
              https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
              • (Score: 3, Touché) by BasilBrush on Monday June 15 2015, @03:23PM

                by BasilBrush (3994) on Monday June 15 2015, @03:23PM (#196535)

                Really.

                Because... you know?... there's never going to be a relation between siblings, it will always be an asymmetrical parent-child-like relationship.
                Yes, circular lists or bidi-graphs with cycles are evil constructs one is never going to need in practice.

                Just because it is possible to write software that has circular references doesn't mean that properly written software ever has them. Put the correct attributes on properties and there is no problem. I've been programming with ARC for years and have literally never had a bug of that nature.

                ARC is NOT a garbage collector. That's your first mistake.

                --
                Hurrah! Quoting works now!
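A minimal sketch of the weak back-pointer pattern the two posts are arguing about: with `weak` on the child's reference, the pair below is deallocated when it goes out of scope; with two strong references, neither object would be.

```swift
var deallocations = 0   // counts deinits so the effect is observable

class Parent {
    var child: Child?               // strong: the parent owns the child
    deinit { deallocations += 1 }
}

class Child {
    weak var parent: Parent?        // weak: breaks the reference cycle
    deinit { deallocations += 1 }
}

func makePair() {
    let p = Parent()
    let c = Child()
    p.child = c
    c.parent = p
}   // both objects are released here; with a strong back-pointer, neither would be

makePair()
print(deallocations)   // 2
```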
              • (Score: 2) by kaszz on Tuesday June 16 2015, @06:27PM

                by kaszz (4211) on Tuesday June 16 2015, @06:27PM (#196975) Journal

                "to allow for easier" alright, then only idiots will use it .. ;-)

                It sounds more and more like Swift might be useful to learn if creating apps for iPhone is your task. Otherwise it's just another NIH thing waiting to be scrapped.

    • (Score: 0) by Anonymous Coward on Monday June 15 2015, @10:08AM

      by Anonymous Coward on Monday June 15 2015, @10:08AM (#196413)

      Does Swift have general and basic benefits over other languages?

      I guess the advantage is that you can use it in situations where you otherwise would have been forced to use Objective C.

  • (Score: 5, Informative) by Anonymous Coward on Monday June 15 2015, @12:08AM

    by Anonymous Coward on Monday June 15 2015, @12:08AM (#196294)

    One of StackOverflow's survey questions was: which languages would you be most interested in learning over the next 12 months? (Clearly, respondents were allowed to name more than one; many apparently named several.) The top languages in the responses were:


    Swift 78 pct
    C++11 76 pct
    Rust 74 pct
    Go 73 pct
    Clojure 71 pct
    Scala 71 pct
    F# 70 pct
    Haskell 70 pct
    C# 67 pct
    Python 67 pct

    But the caption was "Most loved programming languages". See what the problem is? Swift, Rust, and Go are very new languages which relatively few coders have proficiency with. Four others on this top ten list are functional programming languages, a topic which has received plenty of buzz (warning: I'm old enough to remember a very similar phenomenon with "knowledge-based" languages such as Common Lisp, Scheme, and Prolog in the late '80s and early '90s; that's what all the cool kids were using).

    While many have proficiency with C++, C++11 brought in a raft of new changes which many haven't caught up with, but which will almost certainly replace C++98 in many of the C++ shops sometime over the next few years.

    Finally, we have C# and Python, which are indeed widely known and used; here we have programmers trying to plug obvious gaps in their current skill sets.

    So to say these are the "most loved" programming languages is a pretty bad misconception. These are languages that are on people's to-do list of stuff to learn, for various reasons including curiosity and professional survival.

    • (Score: 2) by kaszz on Monday June 15 2015, @12:48AM

      by kaszz (4211) on Monday June 15 2015, @12:48AM (#196304) Journal

      My guess is that Swift is on the interesting to learn list because developers want to program the iPhone. But it's interesting in the same way you learn liberal arts to have sex with a girl. It's a means to an end, not the other way around.

      So how is the bloat factor and overload in Swift? C++ seems to suffer from them.
       

      • (Score: 2) by c0lo on Monday June 15 2015, @02:38AM

        by c0lo (156) Subscriber Badge on Monday June 15 2015, @02:38AM (#196333) Journal

        But it's interesting in the same way you learn liberal arts to have sex with a girl.

        Except that Apple is more like an 800-pound gorilla than a girl someone would like to fuck with.

        Can you do a car analogy instead? (grin)

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 1, Funny) by Anonymous Coward on Monday June 15 2015, @03:25AM

          by Anonymous Coward on Monday June 15 2015, @03:25AM (#196341)

          Swift is like learning how variable valve timing works just to put on chrome valve covers?

      • (Score: 5, Informative) by fleg on Monday June 15 2015, @03:34AM

        by fleg (128) Subscriber Badge on Monday June 15 2015, @03:34AM (#196343)

        ok, first some context, i have about 15 years of c++, around 7 of java, couple of years c#

        My guess is that Swift is on the interesting to learn list because developers want to program the iPhone.

        that was the case for me. it just happened that the place i'm at decided to write an app at about the same time swift came out. i'd looked at obj-c before and just couldnt get past the syntax, its ugly as sin. so i decided to go with swift. i hadnt used xcode before either. wind forward almost a year and i must say its been an absolute joy. xcode doesnt have the sweet refactoring options that say netbeans and java does but its not that big a deal, because everything else makes up for it (including a vim-like plugin).

        when i first started using it i can remember thinking it was like a scripting language version of java, or put another way like they'd taken the best bits of java and c# and thrown in some ruby. but after a while you start to see the underlying c'ness to it, this is not java or c#, there is no vm underneath it, there's a linker, you dont have to write makefiles, but there are targets and so on. this is all nicely handled for you by xcode but its there.

        So how is the bloat factor and overload in Swift? C++ seems to suffer from them.

        i wouldnt say it feels bloated but it does feel like a big language in a way that java doesnt. i came to java from c++ and loved the way i could just forget a whole bunch of stuff. with swift i find myself much more in that c++ frame of mind, as in "hmmm this corner case of the language? what happens here again?" you have to remember more, its funkier than java. having said that, going back to java after doing swift is really painful, it feels incredibly verbose.

        downsides? when it dies in the arse the error messages are enough to make you weep, we're back in the land of awk "bailing out at line 1" kind of thing. the constructor inheritance rules are rather painful. its still very new, and whilst it has been getting better, there's still some clunkiness, for instance the interface to the obj-c stuff, its still a bit hit and miss on when you use swifts nil or obj-c's NSNULL. also i would have liked to see true multiple inheritance, i miss mixins.

        i dont think it will completely replace java because there is too much c++ type discipline required, but i think it may fill a very useful niche, where you dont really have the justification to write c++ but you want something that isnt as unwieldy as java yet gives you some help.
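The nil-vs-NSNull clunkiness fleg mentions comes from Swift modelling absence with Optional while ObjC collections use the NSNull sentinel; a small sketch of crossing that seam:

```swift
import Foundation

// An ObjC NSArray cannot hold nil, so "no value" arrives as NSNull.
let fromObjC: [Any] = ["a", NSNull(), "b"]

// Conditional casting turns the sentinel back into Swift's nil...
let bridged: [String?] = fromObjC.map { $0 as? String }

// ...which compactMap can then drop, leaving only real values.
let present = bridged.compactMap { $0 }
print(present)   // ["a", "b"]
```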

      • (Score: 3, Funny) by BasilBrush on Monday June 15 2015, @05:25AM

        by BasilBrush (3994) on Monday June 15 2015, @05:25AM (#196366)

        But it's interesting in the same way you learn liberal arts to have sex with a girl.

        I hope your coding technique is better than your pick-up technique.

        --
        Hurrah! Quoting works now!
  • (Score: 3, Insightful) by MichaelDavidCrawford on Monday June 15 2015, @12:18AM

    My take is that I'll be happy to learn Swift after it's an ISO standard. That's because Apple pulled Objective-C 2.0, as well as Objective-C++, out of its ass without the slightest pretense of standardization.

    There's a real good reason that language committees are so big and the process goes on for so long.

    I haven't even tried to learn Swift yet, but from what I hear it's designed to prevent the kinds of bugs that cause crashes and security holes. Given that infinitely many monkeys are submitting to the app store the very instant their obstetrician spanks their first breath into their wee little baby bottoms, I applaud that goal.

    There are some merits to Objective-C in that calling native APIs is done directly, rather than the byzantine ways one does that from Smalltalk. I enjoy Smalltalk for many good reasons but don't care much for Objective-C. If I write native code I'd far rather use either C++ or assembly.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by kaszz on Monday June 15 2015, @12:44AM

      by kaszz (4211) on Monday June 15 2015, @12:44AM (#196302) Journal

      Why not plain C for native code?

      • (Score: 2) by MichaelDavidCrawford on Monday June 15 2015, @01:04AM

        The correct use of C++ initialization lists as well as exception safe techniques such as smart pointers enables one to guarantee an object is correctly initialized or not created at all.

        For example, Haim Zamir describes his Edge Highlighter algorithm - it's patented, look it up - as more accurate when you use more memory, but it is not possible to predict the memory usage from the input parameters. So if it runs out of memory it throws an exception, then backs out as if the call to start the algorithm never took place. One then adjusts the input and tries again.

        Yes one certainly can do that in C but it is a huge pain in the ass and immensely error-prone.

        --
        Yes I Have No Bananas. [gofundme.com]
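The all-or-nothing construction guarantee described above can also be sketched in Swift, the thread's subject: a throwing initializer either completes or leaves no object behind, and the caller backs off and retries. The Buffer type and its limit here are invented purely for illustration:

```swift
enum BufferError: Error { case badCapacity }

struct Buffer {
    let capacity: Int
    init(capacity: Int) throws {
        // If validation fails, construction backs out as if never attempted.
        guard capacity > 0 && capacity <= 1_000_000 else {
            throw BufferError.badCapacity
        }
        self.capacity = capacity
    }
}

// The caller adjusts the input and tries again, as in the Edge Highlighter story.
func allocate(preferred: Int) -> Buffer? {
    var size = preferred
    while size > 0 {
        if let buf = try? Buffer(capacity: size) { return buf }
        size /= 2   // back off and retry with a smaller request
    }
    return nil
}

print(allocate(preferred: 5_000_000)?.capacity ?? 0)   // 625000
```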
        • (Score: 0) by Anonymous Coward on Monday June 15 2015, @01:44AM

          by Anonymous Coward on Monday June 15 2015, @01:44AM (#196316)

          Excellent point, but can you please tell me how to get a patent on an algorithm? Using your answer I would like to patent an algorithm for rare steak cooking, then proceed to collect royalties from every steakhouse; it's a first-to-file system, after all...

          • (Score: 2) by MichaelDavidCrawford on Monday June 15 2015, @02:00AM

            I agree strongly with Richard that there should not be frivolous patents.

            For Haim to have invented the Edge Highlighter is no different than Bell inventing the telephone. The simple fact that something is an algorithm does not mean it is not an invention one should be permitted to patent.

            I invented a compressor that I asked my employer to patent. They got acquired before the patent was filed, and the new owner dropped my invention on the floor. I have many improvements, and one may patent improvements, so quite likely I will file.

            --
            Yes I Have No Bananas. [gofundme.com]
            • (Score: 2) by bzipitidoo on Monday June 15 2015, @01:38PM

              by bzipitidoo (4388) on Monday June 15 2015, @01:38PM (#196480) Journal

              If you think algorithms should be patentable, then perhaps scientific discoveries and mathematical formulas should also be patentable? Should Einstein have been able to get a patent on e=mc^2?

              That many algorithms are genuinely innovative does not change the fundamental brokenness of the patent system. The point of patents is to encourage innovation, and the mechanism is through an artificial monopoly that, ideally, can be used to demand payment in exchange for permission to use, thereby simultaneously rewarding the inventor and determining just how valuable the invention really is. Nice, if it worked. But it doesn't. Often, the system hinders innovation. Ironic. Among the many problems, it unintentionally plays upon the fear of loss, makes otherwise sensible people cling to "their" inventions, hide them for fear of someone "stealing" the ideas. It also promotes a "mother may I" permission seeking culture that is completely unnecessary and detrimental to independent thinking. The notion that, before writing any code, you ought to search through thousands of patents to make sure you're not about to infringe some, is of course ridiculous, and is routinely ignored.

              There are other, better ways to encourage innovation. We should develop and refine them. Perhaps a system of independent crowdfunding organizations, all using different methods to raise money and different criteria to judge value. Free digital notary services would be invaluable for sorting out priority, and would be much better than the fallacious method of sending a sealed envelope to yourself to "prove" the date.

      • (Score: 3, Insightful) by fleg on Monday June 15 2015, @03:49AM

        by fleg (128) Subscriber Badge on Monday June 15 2015, @03:49AM (#196351)

        as ever, it depends on the context. where context means, among other things, how much power do you have, which cpu, how much memory, are you on baremetal or do you have an os etcetc

        so generally...

        dont write assembler if the context allows you to write c
        dont write c if the context allows you to write c++
        dont write c++ if the context allows you to write java/c#/swift
        dont write java/c#/swift if the context allows you to write perl/ruby/python

        always go to the highest level of abstraction you can.

        well, i say always, but really it depends on context :)

        • (Score: 2) by CirclesInSand on Monday June 15 2015, @12:23PM

          by CirclesInSand (2899) on Monday June 15 2015, @12:23PM (#196453)

          And don't write perl unless your context allows you to delete the source code immediately after running the program once.

          • (Score: 3, Funny) by Marand on Monday June 15 2015, @02:23PM

            by Marand (1081) on Monday June 15 2015, @02:23PM (#196497) Journal

            And don't write perl unless your context allows you to delete the source code immediately after running the program once.

            I thought it already did that. Are you saying all that gibberish in .pl files isn't bytecode? Well, shit.

        • (Score: 2) by Marand on Monday June 15 2015, @02:31PM

          by Marand (1081) on Monday June 15 2015, @02:31PM (#196502) Journal

          so generally...

          dont write assembler if the context allows you to write c
          dont write c if the context allows you to write c++
          dont write c++ if the context allows you to write java/c#/swift
          dont write java/c#/swift if the context allows you to write perl/ruby/python

          And, finally, don't write in anything else if you can use a lisp dialect instead. :)

          Took me a while to give the lisps a fair shot, but once I did I really liked them. Realised I'd been following a lot of functional patterns already in other languages without ever noticing, so I wish I'd tried sooner.

          • (Score: 2) by kaszz on Tuesday June 16 2015, @06:32PM

            by kaszz (4211) on Tuesday June 16 2015, @06:32PM (#196979) Journal

            How would you say LISP compares to the above stuff?

            • (Score: 4, Informative) by Marand on Thursday June 18 2015, @07:54PM

              by Marand (1081) on Thursday June 18 2015, @07:54PM (#197955) Journal

              I'll try to answer that as best I can, but with the caveat that it's a recent thing I've picked up so I'm still new, and I've mostly used Clojure so that's the perspective I'll be answering from. (In fact, looking for alternatives to working in Java is what got me to consider lisps again.)

              [Some general info here]
              The biggest difference is the style of programming. Functions always return values, and you're expected to avoid side effects when possible (pure functions), for optimisation purposes. There are ways to introduce side effects like IO (making a function impure), but the idea is to isolate them so the other functions can be optimised.

              Another difference is that a lisp, at its core, is extremely simple. The language grammar is very basic: everything is basically represented in lists, so (+ 2 3 4 5) is a list where the first element is the function [+] and the next elements are the arguments to the function. This can result in a ton of nested parentheses, but Clojure, at least, provides syntactic sugar to let you reduce the nesting in many cases, which I find makes it more readable. Speaking of syntactic sugar, technically most of a lisp dialect is syntactic sugar. A simple lisp only has a handful of operations implemented specially, and everything else after that gets created in the language itself using compiler macros and new function definitions to simplify common uses. The end user of the language tends to do the same thing, which seems to make larger projects evolve in a way that sort of creates a sub-dialect as they add more and more convenience functions.

              [more specific comparisons follow]
              The end result of all this, from what I've seen so far, is that there's less thinking about the language and more thinking about the problem. It feels more like problem solving than language-wrangling. That alone seems different from dealing with other languages, especially other high-level ones that include everything-and-the-kitchen-sink.

              There is also not a lot of language-mandated boilerplate (unlike, say, Java), though you end up with a lot of application-specific functions, and there's a tendency to re-use suites of functions in a CPAN-esque way, similar to Perl. So, more akin to writing Perl, Python, or Ruby in this way.

              Functions are first-class in the same way objects are first-class in Ruby. You don't need pointers or other workarounds, because you can just assign a function to a name and work with it, and you can even pass functions as arguments to other functions (either anonymous ones, aka lambdas, or named functions). In fact, in Clojure the function definition (defn) is just syntactic sugar; you can get the same behaviour using def to assign an anonymous function to a variable. You can also replace existing functions like this, again like Ruby. These sorts of things are possible to a lesser extent in Python and Perl; you can do a lot of the same things, but with more exceptions in the language.
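For comparison with the Clojure description above, the same first-class treatment of functions looks like this in Swift, the thread's subject:

```swift
func double(_ x: Int) -> Int { x * 2 }

let f: (Int) -> Int = double      // a named function bound to a variable
let g = { (x: Int) in x + 1 }     // an anonymous function (lambda)

// Functions passed as arguments to another function (map).
let results = [1, 2, 3].map(f).map(g)
print(results)   // [3, 5, 7]
```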

              Lisps lend themselves well to metaprogramming, too. Unlike most languages, the source code is a correct representation of the syntax tree, which simplifies code generation and modification. A lisp program is data, and can be modified at runtime like other data. This, again, is more on the Python/Perl/Ruby end of the spectrum, but a step further.

              I've also noticed a trend toward more interactive development with lisps. Work within the REPL, or with an editor that can interact with the REPL, and make changes on the fly, then save them. Clojure even lets you do this with Android applications: during the development process the Android app has a REPL built in (later disabled for release) that you can connect to, so you can create and modify the UI or application logic on a running device and test it in real time.

              ---

              Basically, if you're going for the "highest level of abstraction you can" like fleg suggested, you aren't going to find much higher than a lisp dialect. A lot of things lisps have done, other languages have picked up over time; as other languages attempt to become higher-level and more useful for the user, they move closer to acting like a lisp.

              Another interesting thing is, because the basic requirements for a lisp dialect are so simple, people tend to create lisps on top of other languages, either converting lisp expressions into another language's syntax (lisp-to-lua or lisp-to-javascript converters, for example), or into an equivalent representation in a language's AST (used for Hy [lisp to Python], with Clojure [Lisp to JVM] doing something similar).

              That means if you're familiar with, say, Python, you can use Hy [wikipedia.org] instead to get lisp syntax and flexibility while retaining access to the underlying language, whether for familiarity or because that language is powerful, or the only option in an area. Likewise, you can use Clojure to write Android applications. (Runtime speed seems pretty good, though there's increased startup time and memory use vs a pure Java implementation.)

              ---

              I hope that helped. I'm not sure if I answered you in a way that's satisfactory, but I did the best I could, and I'll keep responding if you have other questions that I can answer in any way.

              • (Score: 2) by kaszz on Thursday June 18 2015, @09:09PM

                by kaszz (4211) on Thursday June 18 2015, @09:09PM (#197984) Journal

                Your answer is great *thumbs up*. What is your experience with functional languages?
                Any good introduction book to Lisp?

                • (Score: 2) by Marand on Friday June 19 2015, @09:26PM

                  by Marand (1081) on Friday June 19 2015, @09:26PM (#198438) Journal

                  What is your experience with functional languages?

                  Lisp (by way of Clojure) is my first experience with a proper functional language. I'd looked at it and Haskell briefly in the past but never considered them seriously for anything because I was happy with Perl (and more recently Ruby) for most things. For what it's worth, I still like Perl and Ruby, but there are a lot of places where they just aren't practical to use. They're a pain in the ass to use for knocking together useful Android tools, for one big example.

                  What got me interested finally was a search for Java alternatives, because everything I tried (such as Scala) still felt like a superset of Java, including the parts I disliked about Java. The closest I got was Mirah, which is a Ruby-like syntax on top of the JVM, but it explicitly avoids providing a runtime as a design decision, so as much as I liked the Mirah-specific parts (mostly the syntax), it was just a thin wrapper over the warts of Java, so I started looking again. [For what it's worth, Mirah's nice and worth looking into if you don't mind Java but wouldn't mind a cleaner, more concise syntax for it. No runtime means full Java speed and small class files.]

                  I finally realised that, with the exception of Ruby's smalltalk-esque design where everything is an object and can be mangled at will, I don't really like OOP for OOP's sake, so I decided to give the "dark side" a try and started looking for info on Clojure.

                  It's definitely had a learning curve, because in addition to being functional, Clojure does some slightly odd things to make the language more concurrency-friendly, but I'm finding I like the style. In fact, as I mentioned in one of the other comments, I already did some functional-esque things in Perl code without ever realising it. I always made subroutines return their value, generally (but not always) avoided side effects though I didn't quite realise it, and I loved using anonymous subroutines. I'd stuff them into variables, pass them to other subs, dereference them, etc.

                  As it turns out, that's a big part of functional programming, though it's much cleaner and easier in a lisp than it is in Perl, and I'm finding it much more enjoyable than I ever expected, though I'm still fighting with imperative programming habits some, so I'm not as efficient as I'd like to be yet. I'm doing it as a hobby, though, so luckily I don't have to worry about getting efficient fast.

                  ---

                  Small note about Clojure:

                  As lisps go, it seems to be a contentious dialect. It does some unique, new things and there seem to be a lot of hard-liner lisp fans of the primary dialects (Common Lisp and Scheme) taking a bit of an elitist stance against it because of that. For example, you have to use transactions to change variables, in a SQL-esque way, so that variable changes are concurrency-friendly. It makes variables a little weird compared to other languages and has a learning curve to it, but it's just "this is different" rather than "this is good|bad".
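                  To make the transaction point concrete, here's a rough sketch of Clojure's software transactional memory (the names are just for illustration):

                  ```clojure
                  ;; shared mutable state goes in a ref...
                  (def balance (ref 100))

                  ;; ...and can only be changed inside a dosync transaction:
                  (dosync (alter balance + 50))
                  @balance  ;; => 150

                  ;; calling (alter balance + 50) outside dosync throws an
                  ;; IllegalStateException: no transaction running
                  ```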

                  It also provides some nice shorthand tricks like a "threading macro", called ->, that lets you eliminate excess nesting for readability, and I think that annoys some purists as well. Swiping the example from Hy's documentation, you can take a line of lisp such as (loop (print (eval (read)))) and write it instead as (-> (read) (eval) (print) (loop)), which gives it a more natural left-to-right flow. There are a couple of similar tricks for Java interop that make it possible to reduce nesting when calling Java code, too.

                  Also, I haven't used it, but there's also a way to turn Clojure into native code by way of the Gambit Scheme compiler. Gambit's a Scheme variant that can create native binaries, and someone made a tool to convert Clojure into valid Gambit Scheme, so the result is Clojure-to-native. Performance looked pretty good, though obviously you won't have access to Java-specific stuff that way, so I don't know what the caveats to this are. Still interesting...

                  (There's also a subset of Clojure called ClojureScript that compiles to JavaScript instead of JVM bytecode)

                  Any good introduction book to Lisp?

                  Well, from a Clojure perspective at least, Pragmatic Bookshelf's Programming Clojure [pragprog.com] and O'Reilly's Clojure Programming [oreilly.com] have been useful. The Pragmatic one is a gentler introduction, though I think the O'Reilly one might be more useful overall. I've been juggling both at the same time. There's also a whimsical online tutorial called Clojure for the Brave and True [braveclojure.com] that I glanced over; I saw recommendations for it, but its style didn't help me much for some reason. Still, worth a look since it costs you nothing and people learn differently.

                  Clojure itself is also extremely well documented, and self-documenting. Function definitions can supply docstrings in a Perl- or Ruby-like way and you can access the documentation from the REPL via (doc func-name), including functions you import or define yourself. You can get access to that, and more, from clojure.org's documentation page [clojure.org], which has the (doc) info + examples, a cheat sheet quick-reference for all the Clojure-provided functions, and some reference documentation about the language itself.
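                  For example (a made-up function, just to show where the docstring goes and roughly what (doc) prints):

                  ```clojure
                  (defn greet
                    "Returns a greeting for the given name."
                    [name]
                    (str "Hello, " name "!"))

                  ;; at the REPL:
                  ;; user=> (doc greet)
                  ;; -------------------------
                  ;; user/greet
                  ;; ([name])
                  ;;   Returns a greeting for the given name.
                  ```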

                  Also useful are various Clojure learning tools like 4Clojure [4clojure.com] and the Clojure koans [clojurekoans.com], which present Clojure code with brief explanations and fill-in-the-blank problems. I didn't find them useful by themselves, but combined with the above sources and searching stackexchange they're helpful.

                  If you're interested in the Lisp-on-Python, Hy, then Hy's documentation [readthedocs.org] is probably a good place to start. Haven't messed with it much because I'm more interested in Clojure, though.

                  For Common Lisp and Scheme (including Racket, a popular Scheme variant) you should be able to find examples and tutorials all over the internet, because they're the Big Two, and for oddball things like Lisp-to-Lua (l2l [github.com]) the syntax is usually small enough that there isn't much to learn.

                  For more general lisp information and learning, I've also been reading through SICP [github.com] and Paul Graham's On Lisp [yimg.com]. They're older, but more general and available free online. Of the two, I'd say I've liked On Lisp more, though that may be a minority viewpoint -- everyone seems to love SICP. They're helpful in a more abstract way because I'm not working with Common Lisp or Scheme; the function names differ in places but the concepts are the same, so there's still something to learn from them.

                  ---

                  Finally, a note about editors: get an editor that highlights matching parens and, if possible, also colourises them by nesting level. To see what I mean you can use KDE's editor, kate; it has both options built in, so if you open any sort of lisp file it'll show parens in a rainbow of colours and highlight the match to the one under your cursor. Emacs has the paren highlighting by default (show-paren-mode) and an extension (rainbow-delimiters-mode) to do the same. Not sure about other editors.

                  They're not absolutely necessary, but those two things make dealing with the nested functions a lot easier.

                  • (Score: 2) by kaszz on Friday June 19 2015, @10:31PM

                    by kaszz (4211) on Friday June 19 2015, @10:31PM (#198470) Journal

                    Thanks again for an interesting answer. The reason for my interest in functional languages is that in the beginning they were slow snails and more or less a curiosity on computers, but now there are really fast processors >4 GHz with multiple cores. That, coupled with the ability to use many physical processors on the same motherboard, should make functional languages a good match. Perhaps it's possible to scale a functional language down to 8- and 16-bit embedded microcontrollers too and let them chit-chat over an interconnect? Anyway, do you think functional languages have any particular bearing on multicore SMP?

                    Perl is btw really nice for getting things done fast, but it might show its uglier side effects when a software project gets big. Python seems nice too but has this fascist indentation regime. So is there any other scripting language that is a suitable replacement?

                    • (Score: 2) by Marand on Saturday June 20 2015, @09:45PM

                      by Marand (1081) on Saturday June 20 2015, @09:45PM (#198808) Journal

                      No problem. Since it's still a new area for me the culture shock is still fresh and I haven't forgotten how alien parts of it can be.

                      The reason for my interest in functional languages is that in the beginning they were slow snails and more or less a curiosity on computers, but now there are really fast processors >4 GHz with multiple cores.

                      There's also been work to make them faster, much like the optimisation projects we've seen for Javascript, Python, Ruby, etc. over the years. You can check things like the debian.org benchmarks game [debian.org] and see that the performance of Clojure vs. Java in most of the benchmarks is very close, with Clojure even outperforming Java in one. And that's probably not even using the experimental Skummet [clojure-android.info] compiler.

                      It helps that it's compiled, but it's not just Clojure that's seeing benefits. SBCL (Common Lisp) shows similarly good performance, and Haskell's faster than any of them in most of the benchmarks. You're right that the differences would have been a huge problem years ago, but between optimisation and hardware the disparity is small enough that you can shrug, go "good enough", and enjoy a nicer language.

                      On a related note, there's been an increase in interest in so-called transpilers [wikipedia.org] lately that turn one language into another, probably thanks to Javascript being ubiquitous but generally unpleasant to use directly. Things like Emscripten (turns anything that llvm can compile into javascript) or specific language-to-JS converters like ClojureScript (Clojure to JS), Opal (Ruby to JS), etc.

                      Sometimes those converters exist because you want to use a different language but are constrained by what language is available on a platform: Lua gets embedded into game engines, Python's popular with 3d modelers and the like, etc. Other times it's for performance: Crystal [crystal-lang.org] is a Ruby-like language that compiles to native code, as one example.

                      That, coupled with the ability to use many physical processors on the same motherboard, should make functional languages a good match. Perhaps it's possible to scale a functional language down to 8- and 16-bit embedded microcontrollers too and let them chit-chat over an interconnect? Anyway, do you think functional languages have any particular bearing on multicore SMP?

                      Can't say much about scaling down to lesser hardware; it's not an area I have much familiarity with. I'm more of the "cobble something together when needed or when it sounds fun" systems programming/sysadmin type. I've never really had to mess with embedded, and I tend to hate dealing with GUI stuff, though I've been trying to improve on that.

                      As for the rest, yeah, it seems like functional programming is in a position to shine thanks to the increased focus on multi-core and multi-threaded processors. Unless we hit some kind of epic breakthrough soon that allows massive single-core performance boosts vs. what we have now, more threads are going to be where it's at for performance for at least a while.

                      You still have to do some things differently to make your programs concurrency-friendly, but at least in the case of Clojure, the language itself does a lot to make it easier. Even if you don't explicitly do anything to take advantage of its concurrency you get some benefits out of its design, and it seems like (from my limited experience so far) the options it adds for concurrency aren't a huge cognitive hurdle like they can be in some languages.

                      There's also Erlang, which is sort of the opposite... it's designed for concurrency first, but it's closer to functional programming than imperative too, so I guess that counts as well.

                      Perl is btw really nice for getting things done fast, but it might show its uglier side effects when a software project gets big. Python seems nice too but has this fascist indentation regime. So is there any other scripting language that is a suitable replacement?

                      Of the main three, I think I like Ruby best these days. Perl's faster and CPAN's extensive selection is awesome, but it's also more of a crapshoot getting the modules to work. Ruby's like a cleaner Perl in a lot of ways, and the "gems" system (CPAN-esque) is pretty large and less error-prone. Also, I agree about Python's indentation, but Hy (the lisp-on-python) lets you bypass that and still get the benefits of Python, so that has potential as a way to make Python less onerous to use.

                      So, if a normal lisp isn't an option for some reason, I think I'd give Hy a shot for a project, though I do still like Ruby and Perl for knocking quick stuff together. If installing Hy isn't possible I'd say Ruby over the others. It's clean, easy to read, I like the smalltalk-esque object model, and the style of it works out well if you're familiar with Perl already.

                      An interesting one that isn't well known right now is Red [wikipedia.org]. It's based on Rebol, but has two components: the Red/System part is very fast and fairly low-level, while the higher-level portion is akin to Ruby/Python/Perl in features but slow -- far slower than Ruby, in fact, which is kind of scary and I hope is an optimisation issue to be addressed later. It doesn't quite seem ready for general use right now, but might be interesting once it matures.

    • (Score: 4, Funny) by fleg on Monday June 15 2015, @03:41AM

      by fleg (128) Subscriber Badge on Monday June 15 2015, @03:41AM (#196346)

      >I hear it's designed to prevent the kinds of bugs that cause crashes and security holes.

      well i cant speak on the security angle but i assure you that i have not noticed any tendency to "prevent the kinds of bugs that cause crashes" ;)

  • (Score: 2, Interesting) by BananaPhone on Monday June 15 2015, @02:53AM

    by BananaPhone (2488) on Monday June 15 2015, @02:53AM (#196335)

    I keep on seeing articles about how great Swift is.

    What does it fix that other languages don't?

    • (Score: 1, Disagree) by c0lo on Monday June 15 2015, @04:48AM

      by c0lo (156) Subscriber Badge on Monday June 15 2015, @04:48AM (#196358) Journal

      What does it fix that other languages don't?

      Others (including ObjC) were not invented by Apple.
      And, nowadays, really new gizmos are so hard to invent, and something needs inventing... so... Swift?

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0, Redundant) by BasilBrush on Monday June 15 2015, @05:37AM

      by BasilBrush (3994) on Monday June 15 2015, @05:37AM (#196368)

      Google is your friend.

      --
      Hurrah! Quoting works now!
      • (Score: 0) by Anonymous Coward on Monday June 15 2015, @10:20AM

        by Anonymous Coward on Monday June 15 2015, @10:20AM (#196414)

        So Swift fixes the issue that Google is your friend? :-)

    • (Score: 2) by VLM on Monday June 15 2015, @12:10PM

      by VLM (445) on Monday June 15 2015, @12:10PM (#196448)

      I keep on seeing articles about how great Swift is.

      This thread is useless without pictures. In my parents' generation the music industry was completely audio oriented; now it's completely visually oriented, and a musician is a hottie model who likes to pose with guitars. Given that, she is in fact pretty good at looking hot. It'll be interesting to watch her flame out like most female teen pop stars; who wants to take bets on how long till she shaves her head and does pr0n like the historical greats of her genre, etc. I'm still convinced almost all of the cheering at the apple dev conference during the swift section was drunk/high devs thinking they were getting Taylor Swift to sing the mini-concert at the end instead of the not so interesting music they actually had.

      Oh you mean the apple nih language. I'm sure the programming world is going to get really tired of taylor swift jokes. How many lines from her songs can be worked into function names and comments?

      Anyway, it's Scala with a huge dose of NIH and some Apple-specific bindings and libraries included as standard. If you're trying to do "stuff" with an iPhone then you'll need Swift; if you just want to learn the paradigms and ideas then just fool around with Scala for a while, just don't bother memorizing the syntax (somewhat fewer semicolons etc). If you can't write something functional, then learning how to do it in Scala will help immensely when "Swift version NIH" is finally released.

      Also you'll hear infinite bullshit because "scala doesn't have concurrency" because you have to explicitly "import scala.concurrent._" in order to load the concurrency package whereas supposedly new swift doesn't. Kinda like "perl can't talk to mysql" because you need to load not one but two cpan modules before you can talk to mysql. So uh, sure. In other words most of the comparisons online are crap not worth reading.

    • (Score: 0) by Anonymous Coward on Monday June 15 2015, @04:58PM

      by Anonymous Coward on Monday June 15 2015, @04:58PM (#196584)

      I keep on seeing articles about how great Swift is.

      Given the amount of product placement by Apple, and its fans, I would take all discussions of the 'greatness' of Swift with a grain of salt.

  • (Score: 3, Insightful) by PizzaRollPlinkett on Monday June 15 2015, @11:25AM

    by PizzaRollPlinkett (4512) on Monday June 15 2015, @11:25AM (#196433)

    I'm not against Swift. I want to like it. The problem is that if you're not an Apple developer, it's just Yet Another Language to throw into the pile with all the other new languages. Even if Swift is great, and even if the open source (notice they don't call it free software) version catches on, it's still yet another language competing with all the other languages. Used to be that new languages could be kind of exciting. Fun to learn. These days, though, there's a new language every few months, and nothing compelling to distinguish them. I try to use as few languages as possible, because the syntax is all basically the same, but different. It's cognitively hard to switch from one language to another, because you know exactly what you want to do, but don't remember the exact syntax. The idea of trying to learn yet another language with the same basic syntax but all the minor variations in how to do things is a pain I don't need. Swift would have to do something so revolutionary great that I couldn't live without it for me to spend time on it.

    Why not go back to industry standards? Whatever great things Swift does, why can't these great things be added to an existing language rather than creating a whole new language? Especially since there is a "shortage" of developers. Maybe the shortage wouldn't be as bad if we used industry standards and skills were more easily transferable.

    The argument against adding to an existing language is, of course, C++. Lambdas, for example, were added in a way that makes me wish they had just scrapped it and started over. The syntax is hideous. So I can see why people want to start over with a clean slate. The problem right now is there are too many of these false starts, and programmers don't have the cognitive space to handle all these languages.

    --
    (E-mail me if you want a pizza roll!)
    • (Score: 0) by Anonymous Coward on Monday June 15 2015, @02:15PM

      by Anonymous Coward on Monday June 15 2015, @02:15PM (#196491)

      Not typing your variables isn't natural!!!

  • (Score: 2) by darkfeline on Monday June 15 2015, @09:42PM

    by darkfeline (1030) on Monday June 15 2015, @09:42PM (#196657) Homepage

    Does Swift have any applications outside of Apple's walled garden? Because if it doesn't, this is non-news (for me and many others); even if Swift reads your mind and instantly produces the exact code you need without any bugs, I wouldn't touch it with a long stick.

    --
    Join the SDF Public Access UNIX System today!