
posted by n1 on Wednesday October 01 2014, @04:11AM   Printer-friendly
from the (DEFUN-HELLO-()-"HELLO-WORLD"-) dept.

raganwald.com has some interesting thoughts on the time it takes to become productive in different programming languages (and types of programming languages), as well as what it means to be productive, in an essay titled "600 Months". It starts with this thought-provoking statement:

“I have personally found that LISP is unbelievably productive if you’re willing to invest in the 600-month learning curve.” - Paul Ford

That's 50 years - nearly the entire history of LISP as a language, and far more time than most of us have available for learning a new language.

What languages took you the longest to feel "fluent" in? What language concepts do you still have trouble grasping?

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by CRCulver on Wednesday October 01 2014, @04:24AM

    by CRCulver (4390) on Wednesday October 01 2014, @04:24AM (#100290) Homepage

    Lisp takes 10 years to be proficient in, what?! As anyone who falls for Emacs can tell you, Lisp can be picked up and used for quite complicated data processing within a short amount of time. People who write up packages to automate their workflow and post them on Marmalade for others to extend might have known nothing of Lisp just a few weeks before.

    If you want to pit Lisp against widely popular platforms and claim that Lisp people aren't as productive, that's solely due to the fact that the languages developed separate libraries for separate tasks. It's not a matter of language semantics.

    • (Score: 2) by c0lo on Wednesday October 01 2014, @05:27AM

      by c0lo (156) Subscriber Badge on Wednesday October 01 2014, @05:27AM (#100303) Journal

      If you want to pit Lisp against widely popular platforms and claim that Lisp people aren't as productive, that's solely due to the fact that the languages developed separate libraries for separate tasks.

You sure the problem is with the libraries? I mean, LISP being so wonderfully functional (as in functional programming), maybe the problem lies with the (very tempting) code automation [xkcd.com]?

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 0) by Anonymous Coward on Wednesday October 01 2014, @10:59AM

        by Anonymous Coward on Wednesday October 01 2014, @10:59AM (#100389)

        Lisp isn't a functional programming language. Yeah, it has first-class functions, but then so do C#, C++, Java, and lots of other non-functional languages. Languages like SML and Haskell are functional programming languages. Lisp is not.
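To make the AC's point concrete, here is a sketch of first-class functions in Python, a language few would call functional; the `compose` helper and example functions are made up for illustration:

```python
# First-class functions: functions are values that can be stored, passed,
# and returned like any other -- which, as the comment notes, plenty of
# non-functional languages also support.

def compose(f, g):
    """Return a function computing f(g(x))."""
    return lambda x: f(g(x))

def double(x):
    return x * 2

def increment(x):
    return x + 1

double_then_increment = compose(increment, double)
print(double_then_increment(5))  # prints 11
```

Having this facility is necessary for functional programming, but not sufficient: nothing here prevents mutation or side effects, which is the AC's distinction.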

        • (Score: 3, Interesting) by CirclesInSand on Wednesday October 01 2014, @03:19PM

          by CirclesInSand (2899) on Wednesday October 01 2014, @03:19PM (#100493)

          No software development language can be 100% functional. That is because the only thing a functional language can do is output a single value. If you want to do anything persistent, such as file IO or graphics, then you have to have persistent constructs that contradict the invariance of functional languages.

          Functions are a useful part of a language, but they are not useful enough to be an entire dev language. This is why every "functional" language will have exceptions that eliminate the useful assumptions of functional composition.

          • (Score: 0) by Anonymous Coward on Wednesday October 01 2014, @10:04PM

            by Anonymous Coward on Wednesday October 01 2014, @10:04PM (#100687)

            I see that you've never used Haskell, and you don't know what monads are. Since you don't know much about this topic, maybe you should refrain from discussing it until you've acquired sufficient knowledge of it.
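For anyone following along: monads are the device Haskell uses to sequence effects (IO, failure, state) while keeping functions pure, which is the loophole the grandparent claimed can't exist. Here is a rough sketch of the Maybe monad in Python; the names `unit` and `bind` follow the usual convention, and this is illustrative only, not Haskell's actual machinery:

```python
# A minimal Maybe monad: chains computations that can fail, with no
# exceptions and no mutable state -- failure simply short-circuits.

class Maybe:
    def __init__(self, value, ok):
        self.value = value
        self.ok = ok

    @staticmethod
    def unit(value):    # Haskell's 'return': wrap a plain value
        return Maybe(value, True)

    @staticmethod
    def nothing():      # the failure case
        return Maybe(None, False)

    def bind(self, f):  # Haskell's '>>=': apply f, or propagate failure
        return f(self.value) if self.ok else self

def safe_div(x, y):
    return Maybe.unit(x / y) if y != 0 else Maybe.nothing()

ok = Maybe.unit(10).bind(lambda x: safe_div(x, 2))   # holds 5.0
bad = ok.bind(lambda x: safe_div(x, 0))              # failure propagates
print(ok.value, bad.ok)  # prints: 5.0 False
```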

            • (Score: 0) by Anonymous Coward on Thursday October 02 2014, @07:30AM

              by Anonymous Coward on Thursday October 02 2014, @07:30AM (#100846)

              Yet again the Haskell tards imagine that they invented functional programming despite the fact it preexisted them by 3 decades.

      • (Score: 2) by fadrian on Wednesday October 01 2014, @03:59PM

        by fadrian (3194) on Wednesday October 01 2014, @03:59PM (#100511) Homepage

        ... maybe the problem stays with the (very tempting) code automation?

The only problem with that comic is that it's right. And you're right that Lisp makes it a bit easier to fall into that trap because of its "code as data" nature (sorry for not using the fancy term homoiconicity, smug Clojure weenies - and, by the way, nice work with the transducer thingies). However, even most Lisp programmers know not to start writing automation immediately. Remember: once is a random occurrence, twice is a statistical anomaly, and the third time you see it, then you start thinking about automating.

        --
        That is all.
    • (Score: 3, Insightful) by TheLink on Wednesday October 01 2014, @07:15AM

      by TheLink (332) on Wednesday October 01 2014, @07:15AM (#100332) Journal

      Yeah libraries and frameworks are important.

      Languages like Lisp may be powerful and wonderful for the code you have to write. But I prefer using other languages because they are better for all the code I _don't_ have to write. The less code I write, the less code I have to type, debug, document, support, etc.

      Some programmers are doing stuff nobody has done before, so they have to write practically everything. In which case existing libraries and frameworks don't help them much.

      I'm not one of those programmers and I'm far from a top programmer. So the less code I write, the higher the average code quality of the total output if you include the libraries and frameworks ;).

    • (Score: 0) by Anonymous Coward on Wednesday October 01 2014, @10:57AM

      by Anonymous Coward on Wednesday October 01 2014, @10:57AM (#100388)

      If Lisp (or Scheme, or Clojure, or whatever variant you want to bring up) is so great and so productive, then why does it never get any real traction? Why has it stumbled around for over half a century, seeing at most only some amount of use in the mid 1980s?

      If Lisp really were that good, it would be more widely used. It wouldn't just be a niche language after so long.

    • (Score: 2) by VLM on Wednesday October 01 2014, @12:12PM

      by VLM (445) on Wednesday October 01 2014, @12:12PM (#100411)

      "claim that Lisp people aren't as productive, that's solely due to the fact that the languages developed separate libraries for separate tasks"

Yeah, you know, like Clojure, where you can't use the Java libraries despite both being JVM langs. Oh wait, you can, and it's really easy and well integrated and works great. And productivity takes a lot less than decades LOL.

      The original story is basically trolling or maybe parody.

  • (Score: 2) by kaszz on Wednesday October 01 2014, @04:39AM

    by kaszz (4211) on Wednesday October 01 2014, @04:39AM (#100293) Journal

The more important factor is likely how long it takes before the language becomes useful for solving your problems.

    • (Score: 2) by TheGratefulNet on Wednesday October 01 2014, @02:49PM

      by TheGratefulNet (659) on Wednesday October 01 2014, @02:49PM (#100478)

I'm a 25+ yr C programmer (I'm over 50) and it took quite a bit of convincing to accept python as a 'get stuff done' language. but I have to say, it's the new favorite for things that don't demand native C interfaces.

it gets as much done as perl, has a rich (or even richer) library from contributors, and the syntax is NOT awful like perl's is.

I'm lucky to be taking a python class at work (week long) by a python developer who knows his shit. makes all the difference in the world. I have now drunk the koolaid, I guess, and am ready to dive in and do work in python instead of c or bash.

the language has some strange things if you are used to C, for example. but the amount of things you can get done quickly and without too many hassles is impressive.

I expect that it will overtake java, in time. I never liked java, and now that python can be as fast (it can, amazingly) - and it has all the api support you need for networking, files, etc etc - I can't see starting new projects in java unless you have to link in other java things (and even then, there are ways around that).

again, it took a bit of effort to convince me. I'm pretty stubborn and have too many decades of C to just give it up for (say) java - but I -am- ready to do new projects (big and small) in python. the only thing I don't like is the python2 vs python3 stuff. that kind of makes things difficult, at times, but I'm sticking with python2.x for now and that seems to be good enough for most projects that I've seen.

      --
      "It is now safe to switch off your computer."
      • (Score: 2) by No.Limit on Wednesday October 01 2014, @07:01PM

        by No.Limit (1965) on Wednesday October 01 2014, @07:01PM (#100591)

I love python too (it's gotta be one of the first languages that I learned). However, I doubt that python will overtake Java anytime soon, let alone replace it.

Python is too different from Java:
- no static typing (sometimes people prefer the safety that comes with static typing; sometimes people want a function to take this type and only this type as its parameter, and they want that checked at compile time, not at runtime)
- global interpreter lock [wikipedia.org] (prevents some types of parallelism in python, which is quite important these days)
- speed: I doubt that python will ever be as fast as Java, for many reasons (less corporate support; dynamic typing, which means attributes get looked up in hash tables instead of vtables)

But python can do things that Java simply can't. For example, it's so quick to develop programs with it. Python is so great for small useful programs.

When it comes to programming languages I think the "right tool for the right job" approach is best.
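The static-typing point is easy to demonstrate: Python accepts code that Java's compiler would reject outright, and the type error only appears when the offending line actually runs. A small illustrative sketch:

```python
# Dynamic typing in practice: the bad call below loads fine;
# the TypeError only appears when the line executes.

def add_one(n):
    return n + 1            # nothing declares that n must be a number

print(add_one(41))          # prints 42

try:
    add_one("forty-one")    # Java would reject this call at compile time
except TypeError as e:      # Python only notices at runtime
    print("caught at runtime:", e)
```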

  • (Score: 1) by acharax on Wednesday October 01 2014, @04:49AM

    by acharax (4264) on Wednesday October 01 2014, @04:49AM (#100297)

Becoming productive and proficient are two very different things. You can hurl out huge quantities of code in almost any language within a mere month, but none of it will be good. The current software landscape is a testament to this.
It took me many years to become proficient with the languages that I use (C and x86 ASM), but in today's environment, with its tendency to favor the flavor-of-the-month scripting/managed language, the luxury of dedicating time and effort to a single language appears to have been all but lost on many programmers. This lack of proficiency has caused a shift toward feigning productivity by tacking multitudes of badly implemented garbage onto existing software, all in the name of some ephemeral "progress" that we never get to see (in reality they're just dropping established paradigms because those frequently get in the way of the incompetent and lazy).

    • (Score: 2) by frojack on Wednesday October 01 2014, @05:18AM

      by frojack (1554) on Wednesday October 01 2014, @05:18AM (#100302) Journal

      Great deal of truth to what you say.
About a year in, you become competent enough that your output doesn't have to be rewritten. Unless it's your first programming language.

      I've seen many projects blown up by brash young programmers who thought the latest scripting language was salvation, and managed to convince some pointy haired boss to go with it.

      --
      No, you are mistaken. I've always had this sig.
    • (Score: 1) by anubi on Wednesday October 01 2014, @05:55AM

      by anubi (2828) on Wednesday October 01 2014, @05:55AM (#100310) Journal

      I second you, Acharax.

      Admittedly, I was raised on Fortran. From there it was 8080 assembler ( IMSAI ), then 6502 basic/assembler ( Commodore64 ) then continuing onto 8086 basic/assembler ( IBM PC/AT ). Then it was 68000 assembler ( custom CPU board I built for a company ). I am now getting a lot into C++ ( Arduino/embedded)

      In all cases, I could generally write code - bad code - within days of getting the book and playing around.

I have had two years of classroom work on C/C++ ( the works... you know - file structures, data structures, numerical techniques, statistics, DSP, and how to integrate code and math - especially vector calculus and statistical functions ). This was several years ago. I note I am still learning how to do things better, and there are still many things I drop back to 8086 assembler to do on an old PC, because I had finally spent enough time on that machine that I knew pretty well what I was doing.

      As far as I can see, a computer language is yet another foreign language, maybe similar to those you already know, but there are all sorts of nuances specific to that language and ignorance of them often leads to unintended consequences.

      Yes, I can get a quickie language course and properly ask for directions to the toilet in another country. If that is all that will be expected of me, so be it.

      However, if I am designing a bridge for them, improperly specifying the concrete may pass delivery but not meet the tests of time.

Now, like most of us, I am far more critical of myself than others seem to be of me, and I tend to hem and haw around until I am satisfied it's really right, because I have all too often seen the results of prematurely released, undercooked code. However, business seems to place a time value on code as well, and undercooked code is more profitable to make than tested, stable code. Matter of fact, it seems once code becomes mature and stable, it's obsoleted, and it's off to the next moneymaking round of releasing hopeware.

      I figure its at least a decade of involvement before I consider myself half-decent in this. How long does it take a pianist or golfer to hone his skill? A few really gifted ones seem to have it come natural to them... me, I have to do it - over and over - and make mistakes each time. And hope my boss does not fire me for being incompetent, however the boss will also fire his competent people for not being flexible, so that kinda evens out the score, and explains why today we are having so much problem with hackers.

I think a lot of us would write code like the old stone masons built cathedrals... it might not go up quick, but the structures would last darned near indefinitely. More like a common language of the people rather than a fashion fad of the day. The software we have today seems to me too much like a fashion trend - it does not have to be durable; it just has to look good on the rack. Like cheap jeans. Designed from the start to be useless in a few years. Especially these days, with digital expiry enforcement mechanisms and the legislation to back them up in place. I for one am extremely disgusted seeing what our computational infrastructure could be versus what it actually is. Linux is my greatest hope, and even that is getting pushed to being way more complex than it should be.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 1) by clone141166 on Wednesday October 01 2014, @06:23AM

        by clone141166 (59) on Wednesday October 01 2014, @06:23AM (#100321)

        Really liked your last paragraph. I feel it very accurately sums up modern software development (especially in the corporate setting).

      • (Score: 3, Insightful) by frojack on Wednesday October 01 2014, @06:50AM

        by frojack (1554) on Wednesday October 01 2014, @06:50AM (#100328) Journal

In one sentence you say a computer language is just like a foreign language.
Then you say:

        I figure its at least a decade of involvement before I consider myself half-decent in this. How long does it take a pianist or golfer to hone his skill?

        So which is it, a language or a physical activity?

Your first programming language might take a year to learn to proficiency. But realistically, most languages have such limited syntax that it really isn't even as difficult as learning a natural language, and nothing at all like becoming proficient with a 9 iron.

Your second programming language is even easier, because by that time you know what to do, and it's merely a matter of assembling the new syntax.

People seem to have confused the process of learning programming with learning a programming language. You learn programming once; you learn plumbing once. You don't need to relearn programming or plumbing just because you use a new language or a new pipe wrench.

        I've swapped out of and into several programming languages, and only the first one took any significant amount of time. Never more than a year in any new language to be handling entire large corporate systems, on platforms I've never seen before, or different operating systems.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 2) by Dunbal on Wednesday October 01 2014, @07:31AM

          by Dunbal (3515) on Wednesday October 01 2014, @07:31AM (#100336)

          "So which is it, a language or a physical activity?"

          It's the difference between being a good speller and a great writer.

        • (Score: 2, Interesting) by anubi on Wednesday October 01 2014, @08:32AM

          by anubi (2828) on Wednesday October 01 2014, @08:32AM (#100357) Journal

          I think a lot of it is whether you are driving the car, or you are the mechanic.

For me, it's like learning to play a guitar, then someone hands me a saxophone. I'll get it eventually, but there are gonna be a lot of sour notes first.

          Right now, for example, I am cleaning up a mess left behind by someone who understood electronic design quickly. His design is in the field. The product is acting funny sometimes. The customers are all in a tizzy - they do not know what the heck is going on....

          What this guy did... used aluminum electrolytic capacitors as long-term timing capacitors...

          Did not put any high-frequency bypass capacitors around the 7812 voltage regulator - a lot of them "sing" should any wires be as much as moved.

          Put huge filtering capacitors at the output of the 7812, little tiny ones where the power rectifiers from the transformer feeds it.

          Yes, he learned electronic design in a year....

And I am having to explain why the circuit won't work with a new upgrade which requires at least a steady 12 volt source without 500 kHz singing. They would drive a relay coil OK, but having any sort of logic anywhere around it is going to be a circus.

          Yes, his circuit does work. Most of the time. He got paid big bucks for this.

          Now, I am supposed to clean it up for janitor wage?!!?!

          This kind of "clean up the messes the quickie design guy made" is driving me nuts.

          For him, it was physical. Cut and paste from application notes.

For me, it's understanding exactly how it works and why it sometimes doesn't; so for me it is a language, and I am trying to know all the nuances of its components.

          Maybe a better analogy to language is to study law in another language - where the slightest difference in the meanings of words can have consequences.

          --
          "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
          • (Score: 2) by VLM on Wednesday October 01 2014, @12:47PM

            by VLM (445) on Wednesday October 01 2014, @12:47PM (#100429)

            Put huge filtering capacitors at the output of the 7812

            Another classic LOL is most of the 78xx series has great reverse biasing / backfeeding protection up to 6 volts or so higher on the output than the input. So you get 5 volt supply "designers" putting 10000 uF on the output of a 7805 and it'll work forever, what with zero to 5 volts always being a bit less than 6 volts.

            Don't try that with a 7812, it might work for awhile... maybe. What with 12 volts being a bit larger than 6 volts quite often. Instant failure is not guaranteed either. There's probably a note about this somewhere on some data sheet. I see this all the time.

            A simple cheap high current protection rectifier will take care of that problem which I'm sure the guy spec'd if he cut and pasted outta the app note... errr... maybe.

For the non-EEs: the difference between a diode and a rectifier is that if you only care about reverse capacitance, switching delay, and "RF" stuff, you call it a diode; if you only care about 50 ms surge current, constant forward current, die-to-package thermal resistance, and how flat the graph of forward voltage drop vs. forward current is, you call it a rectifier. In the old days you cared about peak reverse voltage, but everything since 1985 seems to be rated to 1000 volts unless downrated for marketing, so it doesn't matter anymore. It's pretty much the same chunk of silicon and theory of operation.

      • (Score: 3, Insightful) by Wootery on Wednesday October 01 2014, @01:34PM

        by Wootery (2341) on Wednesday October 01 2014, @01:34PM (#100452)

        undercooked code is more profitable to make than tested stable code

        Generally this isn't true. Much of the cost of working with code is in maintaining it, not in creating it, and fixing bugs costs more than getting it right the first time.

Perhaps if your goal is to get the thing working just well enough that you can say the contract is fulfilled, then this sort of thinking makes sense, but if your business actually depends on software quality, then I don't see how it can.

        • (Score: 2) by Thexalon on Wednesday October 01 2014, @02:44PM

          by Thexalon (636) on Wednesday October 01 2014, @02:44PM (#100475)

          However, business seems to place a time value on code as well and undercooked code is more profitable to make than tested stable code.

          The business folks aren't entirely crazy when they make those decisions.

          Let's say you have a project which, if implemented correctly, nets you $100,000 a week. Now, if you release it with bugs, there are going to be significant problems that will end up costing you $15,000 a week. That means that delaying the release until it doesn't suck is costing you $85,000 a week. So the correct action, from a short-term business perspective, is to ship the bad version now and deal with the problems, rather than sit on it and get it right, and go back later to fix the bugs at some future date.

Ah, but you cry, "Why don't they go ahead and fix the bugs as the next project?" That's easy to explain too: fixing the bugs (for $15,000 a week in gains) is now competing against another $100,000-a-week project for resources within the business, and since $100,000 is more than $15,000, you would again be throwing away $85,000 a week for the time spent fixing the old bugs if you delay the new project.
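The arithmetic above, written out as a toy calculation using only the dollar figures from the comment:

```python
# Toy model of the ship-now-vs-polish arithmetic described above.

weekly_revenue = 100_000   # what the project nets per week once shipped
weekly_bug_cost = 15_000   # ongoing cost of shipping it with bugs

# Shipping the buggy version now still nets:
net_buggy = weekly_revenue - weekly_bug_cost           # 85,000 per week

# Later, a week spent fixing old bugs saves 15,000 but forgoes
# starting the next 100,000-a-week project:
net_fixing_instead = weekly_bug_cost - weekly_revenue  # -85,000 per week

print(net_buggy, net_fixing_instead)  # prints: 85000 -85000
```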

          It is quite possible for a business to go through many rounds of this before the code maintenance advocates can put together a case for combining a bunch of $15,000 a week maintenance projects into 1 project that is now big enough to compete with the other projects available for management to choose from.

And even better for management is when the tech team gets so fed up dealing with the $15,000 maintenance problems that they go ahead and work overtime of their own accord to fix them. Then their cost of fixing the bugs is basically $0, since tech workers are not eligible for overtime pay.

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.
          • (Score: 1) by anubi on Thursday October 02 2014, @06:17AM

            by anubi (2828) on Thursday October 02 2014, @06:17AM (#100835) Journal

            I wish I could have modded you "Insightful" for that, Thex.

            I know this as "Tragedy of the Commons", where anybody trying to "do the right thing" ( well, from a certain viewpoint, anyway ) will be economically annihilated.

Especially your last paragraph. I get so fed up with seeing "my baby" with blemishes that I will work of my own accord to clean it up, even though I know full well that had I refused to deliver "on time", I would likely have detected and fixed that before our customers had to see it.

            You know, sometimes I wish our customers would be more insistent on stuff like this, cancel the credit card payment, and simply return the blemished thing so as to give management a clear signal that crappy release is not acceptable. The fact most people accept it only goads management into even higher pressure "send the dinner to the customer before its properly cooked" antics.

            This is not fun for either the design engineer or the customer. No design engineer I have ever spoken with likes producing an ugly baby. No customer has ever liked being used as a beta tester.
             
Bounced payments hit management hard and give them the corrective feedback an engineer cannot give.

            --
            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
            • (Score: 2) by Wootery on Tuesday October 07 2014, @08:09PM

              by Wootery (2341) on Tuesday October 07 2014, @08:09PM (#103292)

              Bit of a nitpick but this isn't a tragedy of the commons. It's entirely possible that this would happen even if there were a total monopoly; indeed, it would be more likely to happen in such a case, as there'd be less pressure on software quality.

              I believe it is a tragedy of the commons regarding the consumer, though: if the consumers collectively refused to buy buggy software, then things would change, but each individual consumer benefits from buying the buggy software. (Of course, were such a boycott to happen, we might then see less worthwhile software overall, so it wouldn't necessarily be a good thing.)

      • (Score: 2) by tibman on Wednesday October 01 2014, @02:46PM

        by tibman (134) Subscriber Badge on Wednesday October 01 2014, @02:46PM (#100476)

        I'd love to get a 68k up and running. Do you have any recommendations?

        --
        SN won't survive on lurkers alone. Write comments.
        • (Score: 2) by VLM on Wednesday October 01 2014, @06:12PM

          by VLM (445) on Wednesday October 01 2014, @06:12PM (#100574)

You may or may not like this web page. It's a hobbyist project and I have no idea if bare boards are available at this time.

          http://www.s100computers.com/My%20System%20Pages/68000%20Board/68K%20CPU%20Board.htm [s100computers.com]

I'm sure you'd like everything about the book "68000 Microcomputer Systems Designing & Troubleshooting" by Alan D. Wilcox (Prentice-Hall, 1987) - everything except the price (it's kinda rare, and in eternal demand, thus expensive). I have a copy and it's pretty good. Before going all S100, the book does descend into breadboarding work. That might be a good place to start.

          Another interesting home project is here

          http://n8vem-sbc.pbworks.com/w/browse/#view=ViewFolder¶m=ECB%20mini-M68000 [pbworks.com]

          I have this stuff on my desk and not enough time in the day to mess with it.

          • (Score: 2) by tibman on Wednesday October 01 2014, @07:46PM

            by tibman (134) Subscriber Badge on Wednesday October 01 2014, @07:46PM (#100618)

            Thanks for the links. For some reason it never occurred to me that offline documentation would be better than online.

            --
            SN won't survive on lurkers alone. Write comments.
          • (Score: 2) by tibman on Thursday October 02 2014, @04:37AM

            by tibman (134) Subscriber Badge on Thursday October 02 2014, @04:37AM (#100817)

Bought the book you recommended. Breadboarding sounds like a perfect place to start.

            --
            SN won't survive on lurkers alone. Write comments.
          • (Score: 1) by anubi on Thursday October 02 2014, @06:43AM

            by anubi (2828) on Thursday October 02 2014, @06:43AM (#100837) Journal

Excellent links, VLM... same book I used. I recently gave up my 68000 design tools, as I could not see myself ever designing another one, given the amount of really super hardware available to me today. I ended up putting everything in a box and giving it to the company I designed the 68000 module for.

            If he is gonna mess with a 68000, he might wanna go for the 68HC000 - draws less power and if I remember right, is a static device and does not have a minimum processor speed.

            However, he might wanna look into the Motorola ColdFire processors. NetBurners use these. The ones I had came with Micrium's uC/OS software kernel loaded... a really nice high-reliability solution. There is yet another player in this... XCore .. but I haven't had the time to work with their board yet - but it looks like a really nice board.

I figured I would never design another 68000; rather, I would probably go NetBurner, XCore, or Raspberry Pi for anything like what I used to use the 68000s for. Not to diss a wonderful chip - I really liked those things. Both hardware and software are quite elegant in this design. It's just that no one seems to be using them for new designs much.

            One I would really like to horse around with is the Parallax next version of the Propeller chip. I feel if their design is gonna fly, it has to have at least enough ram in each cog to support a VGA frame. There are all sorts of things in robotics that chip would be useful for, and many will need enough ram to display to VGA or contain acceleration curves for stepper motors.

            Your OP might find this link useful [faqs.org]

            --
            "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 2) by c0lo on Wednesday October 01 2014, @04:53AM

    by c0lo (156) Subscriber Badge on Wednesday October 01 2014, @04:53AM (#100298) Journal

Malbolge Unshackled [wikipedia.org] - it's incredibly productive if you have an eternity [xkcd.com] to invest in the learning curve.

    But with an eternity to spare, one doesn't need time [xkcd.com] or productivity [xkcd.com] management [xkcd.com].

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 2) by Lagg on Wednesday October 01 2014, @05:59AM

    by Lagg (105) on Wednesday October 01 2014, @05:59AM (#100315) Homepage Journal

    It takes a week to be productive, it takes ten years to be competent.

    --
    http://lagg.me [lagg.me] 🗿
    • (Score: 0) by Anonymous Coward on Wednesday October 01 2014, @11:26AM

      by Anonymous Coward on Wednesday October 01 2014, @11:26AM (#100398)

It takes me a lot less than a week to replace /usr/bin/python with a symlink to clang or gcc.

  • (Score: 2) by mendax on Wednesday October 01 2014, @07:41AM

    by mendax (2840) on Wednesday October 01 2014, @07:41AM (#100340)

For example, it took me a while to learn Fortran, less time to learn Basic. Then I learned Pascal, and that took a bit of time because it has features that were important to understand and master. It's hard to believe, but pointers were difficult for me at first. C as a language was simple, but I had some trouble with aspects of the runtime libraries. C++ was very complicated because I had to overcome the cognitive hurdle of object-oriented programming, although the language itself was child's play because it's essentially C with "extra" stuff. Java was what C++ ought to have been, in my opinion, but it had its own problems because it operated in a different paradigm. Because I knew Java, Groovy was very easy... so long as I ignored the scripting language bits. But then I came to appreciate them, and now I'm decent in Groovy. When the time finally came to learn Javascript (after many years of avoiding it) it was very, very easy, much easier than I could have imagined, although Javascript's object-oriented features have driven me crazy because they are bizarre compared to my previous experience. But I'll understand them eventually.

    In short, pretty much what really matters is experience. The more experience you have, the easier it is to learn something new.

    --
    It's really quite a simple choice: Life, Death, or Los Angeles.
  • (Score: 3, Insightful) by khakipuce on Wednesday October 01 2014, @07:47AM

    by khakipuce (233) on Wednesday October 01 2014, @07:47AM (#100343)

    "What languages took you the longest to feel "fluent" in?"

    The first one! Or at least the first of a new paradigm: OO took a while coming from structured programming, but much of that was laziness and habit rather than actual difficulty in learning.

    In general I find it's not the language itself that people are slow to learn, it's the libraries and new features. I'm currently stuck on C# stuff, and my colleagues cannot be bothered with Linq even though it improves readability and productivity.

  • (Score: 1) by dltaylor on Wednesday October 01 2014, @08:02AM

    by dltaylor (4693) on Wednesday October 01 2014, @08:02AM (#100348)

    I started with the 8080/8085, then the Z80, then the 8086. Compared to the contemporary 6800/6809 and 68000, the architecture is a complete (and truly sick) joke, which, BTW, is why Intel is where it is (more in a moment). Keeping track of all of the register-specific operands and side effects of opcodes is a PITA.

    The 8088 was selected by IBM PURCHASING, over the engineer-selected Z8000 (nice little orthogonal, IBM 360-like, CPU) because Intel's CPUs were so bad that Intel was desperate for sales and would cut IBM any deal they wanted. Zilog, OTOH, was then owned by a little company called Exxon, which gave them financial heft even IBM couldn't match (remember that Exxon bought COUNTRIES). This is how the absolute worst possible, barely functional CPU architecture became the most-proliferated and financially successful, and my antipathy toward "materials people" remains to this day.

  • (Score: 2) by q.kontinuum on Wednesday October 01 2014, @08:08AM

    by q.kontinuum (532) on Wednesday October 01 2014, @08:08AM (#100349) Journal

    No, seriously. Any shell command is basically also a primitive shell script. Learn your first command, and if it fits your purpose it will increase your productivity. E.g. using the "grep" command can save you a lot of time when processing log files. Learning the basic syntax of grep takes hardly longer than 5 minutes.
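    For instance, a minimal grep session might look like this (the log file here is a throwaway stand-in, not a real log):

    ```shell
    # Make a throwaway log to search (stand-in for a real log file)
    printf 'INFO start\nERROR disk full\nINFO done\n' > app.log

    grep 'ERROR' app.log      # print matching lines
    grep -c 'ERROR' app.log   # just count them: prints 1
    grep -v 'ERROR' app.log   # invert the match: everything but the errors
    ```

    That's the whole learning curve for the common case; fancier patterns can come later.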

    On the other hand you can easily invest several months and still not be productive. E.g. if you want to learn Java to develop a Jenkins plugin, it will probably take considerably longer because you also need to learn some JavaScript, the Jenkins API, etc.

    --
    Registered IRC nick on chat.soylentnews.org: qkontinuum
    • (Score: 2) by hankwang on Wednesday October 01 2014, @04:13PM

      by hankwang (100) on Wednesday October 01 2014, @04:13PM (#100519) Homepage

      "E.g. using "grep" command can save you a lot of time when processing log files. Learning the basic syntax of grep is hardly longer than 5 minutes."

      Huh, depending on what you mean by 'basic syntax', I don't think most people will grok simple regular expressions in 5 minutes. Even I am regularly cursing at grep because I can never remember whether ( | ) { } ? need to be backslashed or not, and whether they work the same as in Perl or not.
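      Concretely, with GNU grep the basic regular expressions want backslashes where the extended ones (grep -E) drop them, which is exactly the trap:

      ```shell
      # BRE (plain grep): interval braces must be backslashed to be special
      echo 'aaa' | grep 'a\{3\}'         # matches, prints "aaa"

      # ERE (grep -E): braces and alternation work unescaped, Perl-style
      echo 'aaa' | grep -E 'a{3}'        # matches, prints "aaa"
      echo 'foobar' | grep -E 'foo|baz'  # alternation, prints "foobar"
      ```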

  • (Score: 4, Insightful) by PizzaRollPlinkett on Wednesday October 01 2014, @11:09AM

    by PizzaRollPlinkett (4512) on Wednesday October 01 2014, @11:09AM (#100392)

    What I've been confronting recently is the extreme hyper-fragmentation with so many languages. I often know what I want to do, but can't remember how to do it in a specific language. I would be incredibly productive if I could pick a language and specialize in it. Languages all have the same basic C-like syntax, but do the same things differently in small ways. (Consider getting the length of a string some time. Is it strlen(), length(), len(), obj.length, obj.size, obj.len, obj.length(), etc?) To me this causes cognitive overload. I have to look up basic library routines and methods because I can't remember how to do something in a specific language.

    I don't know how long it would take to learn a language these days. Back in the day, after a few years I became a C expert. I knew the language inside and out, and could do anything. But these days, I don't really want to become an expert in any language. I try to learn as little as possible to get things done, and not clutter my brain, because I know that next year some new language will come along, or today's language will revamp itself (like Python 3). The less I learn, the better, because things change too quickly.

    --
    (E-mail me if you want a pizza roll!)
    • (Score: 3, Interesting) by VLM on Wednesday October 01 2014, @12:21PM

      by VLM (445) on Wednesday October 01 2014, @12:21PM (#100415)

      Consider getting the length of a string some time. Is it strlen(), length(), len(), obj.length, obj.size, obj.len, obj.length(), etc?

      Sounds like a lack of standardization. Someone should create a new standard ... oh wait I've seen where this leads.

      I've got another one for you that's even worse. Does the first item in an array have an index of 0 or 1? I can think of two famous examples beginning at 1.
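      (Fortran and Lua are famous 1-based examples.) You don't even have to change languages to trip over this; within one shell session, bash and awk disagree:

      ```shell
      # bash arrays index from 0...
      arr=(first second third)
      echo "${arr[0]}"    # prints "first"

      # ...but awk numbers its fields from 1 ($0 is the whole line)
      echo 'first second third' | awk '{print $2}'   # prints "second"
      ```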

  • (Score: 3, Insightful) by VLM on Wednesday October 01 2014, @12:35PM

    by VLM (445) on Wednesday October 01 2014, @12:35PM (#100421)

    What language concepts do you still have trouble grasping?

    None of them, I waste my time on libraries and idioms.

    "Wait, I can replace this entire development project with one module from CPAN, the entire project becomes a couple line library wrapper, what?"

  • (Score: 2) by VLM on Wednesday October 01 2014, @12:54PM

    by VLM (445) on Wednesday October 01 2014, @12:54PM (#100430)

    productive in different programming language

    Productive means different things in different langs. Based on historical observation of many projects:

    You might squirt out 1000 LoC per day in Java, which sounds great, until you realize "hello world" in Java is about 20K LoC in length. Don't forget you get to troubleshoot all 20K lines.

    In Perl you might spend all week Fing around with CPAN modules that do the whole task, so you might be "less productive" as measured by LoC, maybe just one or two lines per day, but still get done quicker, with fewer bugs and more features, than the Java guys.

    • (Score: 2) by darkfeline on Wednesday October 01 2014, @07:11PM

      by darkfeline (1030) on Wednesday October 01 2014, @07:11PM (#100597) Homepage

      >until you realize "hello world" in java is about 20K LoC in length

      Ah yes, good old Enterprise Java TM (c) (r) 2014. No program can be complete without a little AbstractBeanFactoryFactory.

      --
      Join the SDF Public Access UNIX System today!
    • (Score: 0) by Anonymous Coward on Thursday October 02 2014, @04:15AM

      by Anonymous Coward on Thursday October 02 2014, @04:15AM (#100813)

      Oh yes, Perl. That big, complicated language. But, if you want to do text processing, it is so fucking powerful. Sooo much faster than C++.

  • (Score: 2, Interesting) by barnsbarns on Wednesday October 01 2014, @03:44PM

    by barnsbarns (4730) on Wednesday October 01 2014, @03:44PM (#100502)

    There's a certain level of difficulty in getting used to the stack that you're working on that lies outside of your specific compiler/interpreter as well. Once you understand how streams, pipes, files, &c are used on your stack, then you have a considerable amount of insight when working in new languages that might drive your learning and productivity. I work in data analytics, so one of the first things I'm interested in when learning a new language is how it handles opening, reading/writing, and flushing files for accessing data. I usually look at examples of file handling before going through other basic features like control structures, which are typically similar between a vast majority of major languages.
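    In shell terms the plumbing in question fits on one line (the file names here are made up for illustration):

    ```shell
    # file -> stream -> pipe -> file: the whole pipeline in one line
    printf 'b\na\nb\n' > input.txt          # stand-in data file
    sort input.txt | uniq -c > counts.txt   # read, transform through a pipe, write
    cat counts.txt                          # two lines: counts for "a" and "b"
    ```

    Once that model clicks, the equivalent open/read/write/flush calls in a new language are mostly a vocabulary lookup.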

    If you focus on specific features that you need sooner rather than later, you can become more productive faster. That may sound like common sense, but sometimes I feel as though the focus on languages becomes more about new features and less about solving problems.

    • (Score: 2) by tibman on Wednesday October 01 2014, @04:45PM

      by tibman (134) Subscriber Badge on Wednesday October 01 2014, @04:45PM (#100535)

      I like that approach. No need for closures, lambdas, or dependency injection until after learning control structures and IO. For me it is usually learning the new IDE that takes longer than the new language.

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 1) by barnsbarns on Thursday October 02 2014, @03:01PM

        by barnsbarns (4730) on Thursday October 02 2014, @03:01PM (#100973)

        Agreed! Though I had had some experience with emacs before learning a Lisp, it took some time to get the SLIME keybindings under my fingers and to feel comfortable. It's a major productivity boost to have a tool that you're very familiar with.