posted by janrinok on Monday April 07 2014, @02:51PM
from the I-forget-more-than-I-remember dept.

I've historically always tried to stick to one or two big languages, because as soon as I start deviating even for a week, I go back to my primaries and find that I have, humiliatingly, forgotten things that anyone else would be completely incapable of forgetting. Now I'm going to be learning assembly, since that kind of thing falls in line with my interests, and I'm concerned about forgetting big chunks of C while I learn. I often have the standard open in a tab despite using C since 2012, so my question is: how do you guys who are fluent in multiple languages manage to remember them? Have you been using both for almost forever? Are you all just mediocre in multiple languages rather than pro in one or two?

  • (Score: 3, Insightful) by cmn32480 on Monday April 07 2014, @02:56PM

    by cmn32480 (443) <reversethis-{moc.liamg} {ta} {08423nmc}> on Monday April 07 2014, @02:56PM (#27534) Journal

    You know that feeling where a new piece of information pushes out an old piece? It means your brain is full.

    I have the same issue when learning things that are related, but totally different, and am curious to see the results of this question.

    --
    "It's a dog eat dog world, and I'm wearing Milkbone underwear" - Norm Peterson
    • (Score: 2) by The Mighty Buzzard on Monday April 07 2014, @04:16PM

      It's simple, you don't. The more years you use something the more of it will indelibly stick but you always forget things. What you do your damnedest to remember is where to look to quickly refresh your memory so you can say 'duh' and get back to work.
      --
      My rights don't end where your fear begins.
      • (Score: 1) by rochrist on Monday April 07 2014, @06:18PM

        by rochrist (3737) on Monday April 07 2014, @06:18PM (#27649)

        Yeah, I find that with things like programming languages, I just /can't/ remember the details that I'm not using every day. The thing is, anything like that comes back extremely quickly when you start using it again. The key is to have a few references that you are familiar with.

    • (Score: 2) by davester666 on Monday April 07 2014, @06:17PM

      by davester666 (155) on Monday April 07 2014, @06:17PM (#27648)

      Do you really want to know the answer, if it means the loss of a memory of something else...

    • (Score: 1) by chewbacon on Monday April 07 2014, @06:21PM

      by chewbacon (1032) on Monday April 07 2014, @06:21PM (#27650)

      Yes, but it doesn't mean you're getting old. When my college buddies taught me what liquor was, I forgot how to drive!

  • (Score: 4, Interesting) by Geezer on Monday April 07 2014, @03:07PM

    by Geezer (511) on Monday April 07 2014, @03:07PM (#27545)

    Learned COBOL and Fortran in college. Self-taught C and Turbo Pascal to do embedded automation. Studied ladder logic and object-oriented flowcharting for PLC's. Learned HTML to do modern SCADA and HMI.

    I am by no means a 1337 h4x0r on any of them, but I get my job done.

  • (Score: 5, Interesting) by lhsi on Monday April 07 2014, @03:13PM

    by lhsi (711) on Monday April 07 2014, @03:13PM (#27551) Journal

    A couple of my professors at Uni taught the idea that you can learn how to program in general and after that you just need a reference to the specific syntax to switch between languages. A loop and a conditional statement will look similar in most languages, just with different words and symbols (some might use curly braces whereas some would use BEGIN and END for example). The claim was that relatively little regarding new concepts had been added to programming languages for a couple of decades and that you should learn the concepts and just pick up the syntax when needed.

    I think their claim makes more sense from an academic point of view, when looking at a language on its own, but nowadays a lot of the more popular languages (in industry) make use of large libraries, which are going to be different for each language they are supporting.
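
    For illustration, a minimal C sketch of that claim (purely illustrative, not from the comment; the Pascal-style renderings in the comments are only rough equivalents):

    #include <stdio.h>

    int main(void)
    {
        for (int i = 0; i < 10; i++) {      /* Pascal family: for i := 0 to 9 do begin ... end */
            if (i % 2 == 0) {               /* Pascal family: if i mod 2 = 0 then begin ... end */
                printf("%d is even\n", i);  /* Pascal family: writeln(i, ' is even') */
            }
        }
        return 0;
    }

    The control flow is the same either way; only the keywords and brackets change, which is exactly the part a quick syntax reference can supply.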

    • (Score: 5, Interesting) by iwoloschin on Monday April 07 2014, @04:35PM

      by iwoloschin (3863) on Monday April 07 2014, @04:35PM (#27610)

      ...but you're doing the same thing, regardless of how the library works, right? In which case, it's just a case of knowing where to look up the reference.

      I did a large project at work a couple of years back with embedded C. I had used it before, about 5 years earlier, and had spent much of the intervening time in MATLAB. Some things were the same, others (like typed variables or strings) were completely different and very sad to relearn. But I knew what needed to happen in each case, just not how to do it, which meant it was a matter of finding the correct reference. Now I'm moving to "embedded" Linux projects (think BeagleBone Black), which means I can be "lazy" and use Python, hardly an embedded language. Setting up an interrupt in Python is much harder than on a true embedded system, but it's a matter of figuring out how to do it, then adapting the examples to what I need. There are a lot of other things you need to learn (like how a GPIO pin is mapped into userland in Linux), but the concepts are the same, assuming you understand the basics of interrupts.

      Your professors were right. Maybe slightly off scope, but in general, I'd agree with them. Then again, my only formal programming training was AP CS in high school. After that, I went to school for an EE degree and I've learned programming on the side...so to me all software is just another tool to get the job done.

    • (Score: 3, Informative) by Koen on Monday April 07 2014, @10:42PM

      by Koen (427) on Monday April 07 2014, @10:42PM (#27834)

      A couple of my professors at Uni taught the idea that you can learn how to program in general and after that you just need a reference to the specific syntax to switch between languages. A loop and a conditional statement will look similar in most languages, just with different words and symbols (some might use curly braces whereas some would use BEGIN and END for example). The claim was that relatively little regarding new concepts had been added to programming languages for a couple of decades and that you should learn the concepts and just pick up the syntax when needed.

      That is true if you stick to the Fortran/Algol/C families of programming languages (including their Object Oriented offspring).

      Give a functional [Lisp, etc...], non-imperative language [like Prolog (which is Turing complete but does not have loops: all repetition is done by recursion) or even SQL (without procedural extensions, which is not Turing complete)] or a stack based language [such as Forth & Factor] a try: it will force you to wrap your mind in different ways.
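
      For readers who haven't met the loop-free style, here is a minimal C sketch (purely illustrative, names are mine) of the same repetition written both ways:

      #include <stdio.h>

      /* Sum 1..n with a loop, and again with recursion, which is how a
       * language without loops (Prolog-style) expresses repetition. */
      static int sum_loop(int n)
      {
          int total = 0;
          for (int i = 1; i <= n; i++)
              total += i;
          return total;
      }

      static int sum_recursive(int n)
      {
          return n == 0 ? 0 : n + sum_recursive(n - 1);  /* repetition via self-call */
      }

      int main(void)
      {
          printf("%d %d\n", sum_loop(10), sum_recursive(10));  /* both print 55 */
          return 0;
      }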

       

      I think their claim makes more sense from an academic point of view, when looking at a language on its own, but nowadays a lot of the more popular languages (in industry) make use of large libraries, which are going to be different for each language they are supporting.

      Popular libraries often have wrappers/bindings for use in other programming languages.

      --
      /. refugees on Usenet: comp.misc [comp.misc]
    • (Score: 2, Informative) by zsau on Monday April 07 2014, @11:51PM

      by zsau (2642) on Monday April 07 2014, @11:51PM (#27858)

      By day I program in PHP, SQL and Javascript; by night I program in Haskell and (at times) TeX. I can assure you that a function definition or an if condition looks nothing alike in PHP and Haskell (although it is possible to do functional-style repetition --- that is, map and fold).

      But even if this kind of stuff was the same, there's the fact that in PHP you want iteration for best performance; in many functional languages, you want tail recursion; but in Haskell iteration is bizarre and tail recursion will exceed your stack space unless you use a non-lazy operator so you use foldl (fold left: a non-tail recursive function).

      So I think you do need to know what language you're programming in. This will generally only come with experience though (for me). If you read up on a new language, they might tell you the gotchas (like some of the implications of laziness in Haskell), but I never remember them at that point. It takes time to consolidate the knowledge.

  • (Score: 5, Informative) by VLM on Monday April 07 2014, @03:14PM

    by VLM (445) on Monday April 07 2014, @03:14PM (#27552)

    It's exactly like literature or math, where the first time takes forever and it seems impossible, and the twentieth time you glance at a Google search result just to make sure and you're right back up to speed.

    Ditto big old codebases... Took days to write this three years ago and I haven't looked at it since... it'll only take an hour to totally understand it again next time around, if that, not days.

    There's also a level of detail, like memorizing geography. It's kind of important to remember printf has an octal output format character, well, kind of. But in the era of Google it's far more important to remember that it exists than to memorize that it's a lowercase "o".
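
    (As a concrete reference for that particular detail - a minimal sketch, nothing more - printf does have an octal conversion, and it is indeed the lowercase "o":)

    #include <stdio.h>

    int main(void)
    {
        int mode = 0755;          /* octal literal, e.g. a Unix permission mask */
        printf("%o\n", mode);     /* "%o" prints in octal: 755 */
        printf("%#o\n", mode);    /* the "#" flag adds the leading 0: 0755 */
        return 0;
    }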

    Fixing compile-time bugs because you forgot whether SQL INTERVAL units are plural or singular takes seconds, at most a minute. Fixing architectural bugs can take weeks or months. So invest your time and memory bandwidth appropriately. Especially domain-specific / business-analyst-type requirements need to be memorized.

    Getting the syntax perfect in an n ** 3 algorithm is pointless if you should have been storing up the brain cells to implement a log n one, maybe with a few dumb typos to fix at compile time that'll only take seconds or minutes to fix.

    Memorize features and abilities and geographies, sorta. Maybe limitations. Don't sweat the small stuff.

    It is very much like moving to a new city and trying to find your way around.

    • (Score: 1, Interesting) by Anonymous Coward on Monday April 07 2014, @04:13PM

      by Anonymous Coward on Monday April 07 2014, @04:13PM (#27593)

      Some job interviews require you to sweat the small stuff though. e.g. write a simple working C or C++ program on a whiteboard without looking stuff up...

      I can't do that and I'm not a good coder so I'd rather not work for such places anyway. BUT I often have a good idea of what should be done. For example, at a previous workplace I had just joined, they were discussing what a new system should show before it had figured out which port a client was attached to. They were initially on the side of showing something to the client ASAP in the interest of responsiveness. But I said we shouldn't - it's much better to wait: we avoid showing clients the wrong thing[1] and we avoid race conditions. Fortunately I managed to convince them to do it that way.

      [1] Yes, initially we may only need to show all clients the same thing no matter what port they are on, but once we build the system that way it's harder to change later on if we need to do things differently. Plus we still can't let clients do anything further until we figure out what port they are on anyway, so we gain little, and the users might get confused and complain (they see the first page immediately but can't actually do anything). Better not to show them anything at first.

      • (Score: 3, Insightful) by VLM on Monday April 07 2014, @04:28PM

        by VLM (445) on Monday April 07 2014, @04:28PM (#27604)

        "Some job interviews require you to sweat the small stuff"

        Invariably the kind of place that puts you through the wringer writing a syntactically correct Python implementation of a B-tree is the kind of place that wants you to write RoR CRUD apps. Not that there's anything wrong with B-trees or CRUD apps or Python or Ruby, it's just that the presence of that kind of stuff implies the actual job requirements will have nothing to do with the interview, which makes your intel gathering way harder. It's a bogosity warning indicator. It might still be a nice place to work, if it's only their interviewing that's screwed up. But it's certainly a warning sign to keep a lookout for.

        So ... I read my Sedgewick and my Knuth; now where's the completely unrelated job that this entitles me to?

  • (Score: 5, Informative) by pe1rxq on Monday April 07 2014, @03:18PM

    by pe1rxq (844) on Monday April 07 2014, @03:18PM (#27555) Homepage

    Using a language for 2 years is not a very long time. Having the standard open is a good thing.
    Please keep using it. I have been programming in C for a good decade longer and still occasionally use the standard. My assumptions usually turn out to be right, but checking once in a while never hurts.

    I would not trust code from someone who claims to never use any language standards or documentation.

    As for the assembly: if you also look at the assembly generated by your C compiler (start with small functions) you will not only improve your assembly skills but also gain a good insight into the relation between C code and the generated assembly.
    C is a great language for such a comparison as it is low-level enough that you can easily recognize what the compiler is doing.
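
    For example (a minimal sketch; "square.c" and the function name are just illustrative), compiling a tiny function with "gcc -S -O2 square.c" leaves the generated assembly in square.s, which for code this small is easy to map back to the C:

    /* square.c - small enough that the emitted assembly stays readable */
    int square(int x)
    {
        return x * x;
    }

    Clang accepts the same -S flag, and adding -fverbose-asm makes gcc annotate the output with the corresponding source expressions.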

    • (Score: 3, Interesting) by Subsentient on Monday April 07 2014, @08:51PM

      by Subsentient (1111) on Monday April 07 2014, @08:51PM (#27775) Homepage Journal

      I have a rule, "If I forget something or wonder about something, I must always immediately drop everything and look it up in the standard until I understand it".

      --
      "It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
  • (Score: 1) by DNied on Monday April 07 2014, @03:21PM

    by DNied (3409) on Monday April 07 2014, @03:21PM (#27560)

    Not with assembler, that is. It's so different that it won't clash with C (or other high-level languages) inside your brain.

    In my experience, confusion usually arises between languages whose syntaxes present similarities. Like C and PHP, for example.

    Background info: I'm mediocre in several languages, used to program in 6502 assembler in my early teens.

    • (Score: 3, Interesting) by NCommander on Monday April 07 2014, @03:28PM

      by NCommander (2) Subscriber Badge <michael@casadevall.pro> on Monday April 07 2014, @03:28PM (#27567) Homepage Journal

      Depends how many different architectures you work with; I work with ARM, AArch64 and x86-related architectures, and I still occasionally get my opcodes crossed.

      --
      Still always moving
      • (Score: 1) by GmanTerry on Monday April 07 2014, @11:43PM

        by GmanTerry (829) on Monday April 07 2014, @11:43PM (#27855)

        I used to get Z80 and 6502 mixed up. Ahh, the good old days.

        --
        Since when is "public safety" the root password to the Constitution?
    • (Score: 1) by moondoctor on Monday April 07 2014, @03:48PM

      by moondoctor (2963) on Monday April 07 2014, @03:48PM (#27575)

      'mediocre in several languages, used to program in 6502 assembler in my early teens'

      same for me.

      i totally agree, assembly is so different it doesn't really make a difference to me.

      java and c? yeah, bit like the romance language thing. can be confusing when getting up to speed on both at once. i had latin class following french when i was little and it caught me out a few times standing in front of the class babbling the wrong words.

      as others have said, 2 years is not that long. if at this point it's taking you a week to completely switch gears and get your mind spewing one language over another that's not too bad.

      in my experience it's things you use/do on a daily basis for a long time that become second nature. i'd wager if you used 2 languages extensively every day for many years, you'd be master of both and also very likely have some interesting insight on programming that people versed in only one or the other might not.

    • (Score: 2) by threedigits on Tuesday April 08 2014, @07:22AM

      by threedigits (607) on Tuesday April 08 2014, @07:22AM (#28016)

      It's so different that it won't clash with C

      Au contraire, my friend. "C" is basically an enhanced version of Assembly. You can think of it as a little coat of syntactic sugar. Learning Assembly will actually improve his C skills.

      • (Score: 0) by Anonymous Coward on Tuesday April 08 2014, @01:09PM

        by Anonymous Coward on Tuesday April 08 2014, @01:09PM (#28117)

        This may be true for early (K&R) C. But for modern C (with type-based aliasing rules and all that), thinking too low-level may actually cause you to write erroneous programs. For example, the following code is not guaranteed to give you the result you'd expect from knowing your low-level stuff:

        #include <stdio.h>

        int main()
        {
          int value = 13;
          short* ptr = (short*)&value;  /* accessing an int object through a short* breaks the aliasing rules */
          *ptr = 5;                     /* so the compiler is allowed to assume "value" is not modified here */
          printf("%i\n", value);
        }

        The C standard allows this program to print 13. And some compilers may actually produce code that does.
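
        (For comparison - a sketch of one standard-compliant way to express the same low-level intent, using memcpy rather than a cast; the exact output still depends on endianness and type sizes on the target:)

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
          int value = 13;
          short low = 5;
          /* Copying bytes with memcpy stays within the aliasing rules, unlike
           * writing through a (short*) cast of &value. */
          memcpy(&value, &low, sizeof low);
          printf("%i\n", value);   /* prints 5 on a typical little-endian machine */
        }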

  • (Score: 3, Insightful) by umafuckitt on Monday April 07 2014, @03:23PM

    by umafuckitt (20) on Monday April 07 2014, @03:23PM (#27563)

    The one language I have used consistently over the last ~15 years is MATLAB. Other than that I've jumped in and out of Perl, Python, and a little C++ over that time. I even dabbled a little with PHP and JavaScript. At one point I was starting to get relatively fluent in Perl, and even read the camel book from cover to cover, but now I've dropped it and I'd have to start again from the beginning. I've "learned" Python on three occasions so far; each time I hope I'll have a use for it and each time I end up leaving it again. The C++ is picking up quite well recently, because I have microcontroller projects that need to be pushed along. Now I'm also learning LabVIEW for work reasons, but that sucks balls and I don't enjoy it.

    What I've realized is that you have to use the languages you learn. Pick a selection of about 3 languages that complement each other and that you actually use regularly. If they're not really critical to you, you will just half-learn them for a particular job, then forget them soon after. In these circumstances it might be better to use a language you already know and polish your skills in that, rather than wasting time half-learning something new.

    • (Score: 2, Interesting) by physicsmajor on Monday April 07 2014, @04:22PM

      by physicsmajor (1471) on Monday April 07 2014, @04:22PM (#27599)

      I am a core dev on a scientific Python toolkit, and in my humble biased opinion Python, today, is far superior to Matlab in ease of use, ergonomics, and capability. Everyone I've sat down with and explained the differences has agreed, but realizing I have a biased view, would you mind elaborating on the issues you encountered with Python?

      Usability? GUI? The Microsoft Office effect (everyone uses ______, including collaborators)? Honest question. Because we need to know to direct and fix any ways we may be failing our users, or failing to attract and keep those who try Python out.

      Thanks.

      • (Score: 3, Interesting) by umafuckitt on Monday April 07 2014, @05:19PM

        by umafuckitt (20) on Monday April 07 2014, @05:19PM (#27628)

        I'd love to discuss with you, since shifting to Python would be nice. It'll make nicer GUIs (Qt) if I need that and it's a general-purpose programming language, so larger projects and stand-alone applications become easier. The OO is much nicer than even the newer system in MATLAB. I can see all those advantages. The lack of a MATLAB-like GUI isn't a problem for me, I use MATLAB from a console prompt and edit in Emacs. Two things have put me off Python as a MATLAB replacement. The first isn't Python's fault: I have a big MATLAB codebase for my current projects and it would take months to re-code it all in Python. Furthermore, my colleagues also use MATLAB so it would hurt collaboration. In the future, if I start from scratch on a new project, I will consider Python from the outset. The second problem I had was that data analysis in Python with numpy, etc, felt more clunky and less cohesive than MATLAB. Last year I messed around for a month re-implementing some of my MATLAB code in Python. I was successful with a few things but the process didn't convince me that I want to move just yet. The following are the issues I recall having:

        A. I re-wrote my MATLAB data import routines in Python. This involves parsing an XML file, loading a bunch of associated TIFFs, and turning them into a multi-page TIFF that can be viewed as a "movie." This process took a while, as I'm not very familiar with Python, but the resulting class did everything I wanted. I then went on to make a plotting function based on this class. I wanted to make animated graphs (load the multi-page TIFF file, display it as a "movie" and have some dynamic graphs alongside the "movie"). After a few hours screwing around with matplotlib I discovered that it wouldn't do this (at least not fast enough). I searched for a while and chose PyQtGraph. I recall it being a bitch to install all the dependencies for PyQtGraph on OS X. Eventually I got everything together and made my animated plots. The resulting plots arguably look better than MATLAB's and the rendering speed was faster. The annoying thing, though, was the amount of time I wasted choosing and installing PyQtGraph and then figuring out the new syntax (different from matplotlib and not intuitive) to make my plots. The fact that there are multiple plotting systems with overlapping abilities is a bonus in some ways, but it also disrupts workflow. It also slows you down having to learn the syntactic idiosyncrasies of each package.

        B. The second issue that I had involved re-implementing image registration. I use a MATLAB function from the FileExchange that performs an FFT-based image translation correction. I decided to re-write it in Python as an exercise. As I recall, there were some things I didn't like so much in Python. The linear algebra is a little more long-winded in Python; it's easier to read in MATLAB. I also had issues getting my Python code running at the same speed as the MATLAB original. I wasted quite a lot of time on that [stackoverflow.com] and had trouble finding a fast way to do the FFT. I got some speed improvements by tinkering and I learned more about programming, but I never got it as fast as the MATLAB version due to the FFT issues (I gave up at that point). Overall, it seemed like a lot more work than it was in MATLAB.

        C. The MATLAB toolboxes are often very mature, have excellent documentation, and are consistent with respect to each other. So it's very easy to get started in a new branch of analysis in MATLAB. In Python I felt like I was spending ages on Google trying to find the right package for the job, say image processing. I know there's image processing stuff out there, but there's research involved in finding the package that does what you want, is well documented, etc.

        D. I found a few profilers [stackoverflow.com] for Python, but nothing I liked as much as MATLAB's profiler, which I have used often to improve my coding practices.

        Overall, I can see that Python could work very well as a MATLAB replacement, but the amount of time I'd have to put into it (improving Python skills, researching packages, learning multiple packages where one ought to suffice) isn't worth it for me right now. In the future I hope to switch but I can't justify the time at the moment.

  • (Score: 4, Insightful) by Lagg on Monday April 07 2014, @03:31PM

    by Lagg (105) on Monday April 07 2014, @03:31PM (#27569) Homepage Journal

    I sometimes wonder if I'm in the situation you are. But then I remember that I already know programming in general and so am able to get back into rhythm regardless of language once I jog my memory and recall the relevant syntax. This is a big issue in academia and I'm happy to see that people still self-teach. You see, these days people are being taught how to write Java or whatever other crap is in fashion instead of being taught how to program, i.e. being told what pointers are, what references are, the fundamentals of OOP and so forth. There is a trend where they forego all that and instead tell people "these are the lines you should write to render a window" and nothing beyond that. It's a terrible thing really and is probably why you're concerned about forgetting the language.

    In any case, having the standard open is a good thing. And referring back to TCPL and such is also good. It keeps you in good practice. As long as you remember the fundamentals of programming and what happens when those raw opcodes are read and run, you will be fine. Considering you already write C and will be writing assembly, I don't think you'll have much issue with that. Just remember that syntax is irrelevant. After 15+ years of writing code in a plethora of languages I can easily say that sometimes you will forget things about syntax. It's just too much to retain and be able to readily reproduce.

    --
    http://lagg.me [lagg.me] 🗿
  • (Score: 4, Informative) by sl4shd0rk on Monday April 07 2014, @03:48PM

    by sl4shd0rk (613) on Monday April 07 2014, @03:48PM (#27576)

    I've spent a considerable amount of time developing in roughly 10 different languages since the late 80s. Perl is probably the craziest, both in terms of its developer community and its syntax. Objective-C had the most annoying learning curve due to its deliberate attempts to resemble something other than C. Perl continues to be the most masochistic, both in terms of syntax and community. Learning Perl is no fun either - even Google tells you to RTFM. Assembly was the scariest to learn, as reproducible errors are usually not something you want to reproduce. Java was probably the most fun to develop in, but also the most disheartening, as you realize that once the spec is complete, you now need to upgrade the hardware in production.

    Once you get a few languages under your belt, it's not too bad switching between them. The way you write code, break out methods/functions/subroutines, comment things, approach problems, design flow control, all really kind of stays the same except when you are working around a language's constraints (Python). Even languages you haven't used in a couple of years begin to come back to you after an hour or two moving around in them. It's a lot like learning to ride a bicycle. Once you get it, you've got it. Your brain just kind of remembers even when you don't.

    • (Score: 2) by VLM on Monday April 07 2014, @04:39PM

      by VLM (445) on Monday April 07 2014, @04:39PM (#27612)

      "Learning Perl is no fun either"

      FYI, the 2014 edition of "Modern Perl" was just released.

      The 2012 edition feels like my first copy of K&R back in the mid-80s: how did they put so much information into so few well-designed pages? It's got everything needed, yet so few pages. It's not a "paid by the page" 1000-page textbook.

      The worst part of learn-by-Google is you often pull up some tutorial last touched in '96 about Perl 4.0 and then get some very peculiar ideas about what modern Perl development looks like.

      The second-worst problem is people think it's funny to talk about dumb ideas. Google will find lots of blog posts along the lines of

      goal: implement OO myself in Perl without using Moose, for the sheer heck of it.

      result: sucks like the festering dung of 1000 camels so I'm going back to visual basic

      Then the noob finds that via Google and becomes terrified, while the old guru sees it and merely says "use Moose;"

      • (Score: 2) by sl4shd0rk on Monday April 07 2014, @09:32PM

        by sl4shd0rk (613) on Monday April 07 2014, @09:32PM (#27800)

        Very good point on the learn-by-google approach. I would not recommend it as a starting point for the very reasons you point out. A good book is, imho, still the best way to get up to speed in a reasonable amount of time. Typically, publishing companies have some decent talent behind the technical editing.

    • (Score: 3, Insightful) by bucc5062 on Monday April 07 2014, @08:10PM

      by bucc5062 (699) on Monday April 07 2014, @08:10PM (#27735)

      Was going to post at root, but you voiced my thoughts. I started professionally in the early 80s, but cut my teeth back in college with a professor whose focus was not on a specific language, but on the concepts of language. He taught that once you grasped those foundations, a language broke down into syntax and semantics. I know I have forgotten more than I learned, for my business career took me through COBOL, Pascal, RPG II (C and RatFor in college), IBM JCL, then the big switch to client-server and Basic, .NET stuff, and more currently JavaScript and Java.

      When I've interviewed I have tried to put across that it is not so important to be an "expert" in a language these days - ffs, they change so much now - but it is more important to know how to design good code, to listen to client requests, to think. Before Google I had a lot of books. These days Google puts at my fingertips the specifics I need for an idea or concept I already know. Maybe that makes me a hack; I don't care. I am good at what I do and I enjoy it (to some degree). What IT and languages will be in the next 10 years, who knows. I'd rather be caring for horses by then and not worrying any more about staying current on a language that will be replaced.

      --
      The more things change, the more they look the same
  • (Score: 0) by Anonymous Coward on Monday April 07 2014, @03:55PM

    by Anonymous Coward on Monday April 07 2014, @03:55PM (#27580)

    I bought ginkgo to help improve my memory but it doesn't work because I keep forgetting to take it. Any suggestions?

  • (Score: 2) by istartedi on Monday April 07 2014, @03:59PM

    by istartedi (123) on Monday April 07 2014, @03:59PM (#27582) Journal

    I learned C when some of the young ones here weren't even born. Before 'net access was good I'd look in my book called "C for Programmers" for things like format strings. Who remembers things like "%i3.*", which is probably not even valid because I don't feel like looking it up? Now that we're online there are some handy things like this [rigaux.org] or similar things that are more specific to your language. I handled VI like that too. There was a printed cheat-sheet in my cube, then I learned the common commands... then I was no longer in situations where I had to use VI (or Emacs, so let's not go there). If I had to use it again, I'd probably print out the cheat-sheet again.

    --
    Appended to the end of comments you post. Max: 120 chars.
  • (Score: 2, Informative) by Aiwendil on Monday April 07 2014, @04:27PM

    by Aiwendil (531) on Monday April 07 2014, @04:27PM (#27603) Journal

    Save a few snippets of code you can read to refresh your memory and use a good reference. Both are great for refreshing your memory and should cut the "relearn" stage down to a minimum.

    Also, do revisit old languages every now and then - no need to code in them, just read some code written in them every couple of months.

    And as many others have pointed out - if you learn the concepts rather than the syntax, there is very little difference between two languages of the same type.

    But whatever you do - get your hands on a good reference. It will be useful even when it is for your primary language.

  • (Score: 1, Informative) by Anonymous Coward on Monday April 07 2014, @04:28PM

    by Anonymous Coward on Monday April 07 2014, @04:28PM (#27606)

    Unix is a superior development environment.

    You forget some small detail, you drop out to the shell, type "man ...", check the manual, find the detail, and get back to coding.

  • (Score: 1) by pmontra on Monday April 07 2014, @04:32PM

    by pmontra (1175) on Monday April 07 2014, @04:32PM (#27607)

    I can split my developer career in two: C (without ++) and Perl in the '90s, Ruby and JavaScript (client and server side) in the last 8 years, plus a little Python (I feel dirty after that). Some project management and some Java in between, but that's out of scope. Actually, I've been programming in JavaScript since the very first day (it was called LiveScript), but back then it was mostly a tool to make sites look a little fancy.

    If you ask me how to write a C program (include, static main, argc, argv, etc) I think I have to google it. If you ask me the exact syntax for extracting a substring in Perl I think I'll do a man perlfunc. I wrote that kind of code by hearth hundreds of times, but I also computed integrals hundreds of times when I was a student (probably thousands) and I remember only the formulas for the polynomials now. Not using something makes it fade away.

    Even in Ruby and JavaScript there are some less used functions and methods that I can't remember and I have to look in the reference. I do that daily. Example: how do you invert an array in Ruby? Was that revert, invert, reverse, what? Well, I know that's a method of the Array class so 10 seconds on the language site reveal that it's reverse, or reverse! if you want to change the object itself.

    Googling an answer is often faster than trying to remember it. I usually only need to look at the short summary under the first results on the first page of Google. Furthermore, the APIs we are programming against are fairly large: there is not only the language and its standard library, but also the countless libraries that we must use so as not to reinvent a much worse wheel for every piece of functionality we need to implement. An example: I remember how to use the Paperclip Ruby gem to load an image from a web form (or Carrierwave, if you wish) but I usually end up copying the 3 or 4 configuration lines from a previous project. It saves time and I'm sure I'm not making mistakes, so more time saved.

    I don't feel ashamed. There are a number of things I can remember and knowing which tool I must use and the general idea behind it is more important than knowing the exact details of the tool.

    • (Score: 1) by germanbird on Monday April 07 2014, @05:08PM

      by germanbird (2619) on Monday April 07 2014, @05:08PM (#27623)

      ...I wrote that kind of code by hearth hundreds of times...

      I too enjoy programming by the fireplace.

      (Sorry. Couldn't resist.)

      • (Score: 1) by pmontra on Monday April 07 2014, @05:19PM

        by pmontra (1175) on Monday April 07 2014, @05:19PM (#27629)

        Thank you (really). English is not my native language. Maybe I'll remember how to spell that now, both of them :-)

    • (Score: 2) by RamiK on Monday April 07 2014, @05:41PM

      by RamiK (1813) on Monday April 07 2014, @05:41PM (#27637)

      Same here. C and post-C languages are so similar that anything with curly braces and/or semicolons just throws me off in the most embarrassing ways possible.

      But you shouldn't feel too bad about it. The fact that Go (golang) was made explicitly with the aim of staying small enough that programmers could get a handle on the whole language is proof enough that this problem is recognized as a serious concern even by its very smart and capable inventors.

      --
      compiling...
  • (Score: 2) by cosurgi on Monday April 07 2014, @04:39PM

    by cosurgi (272) on Monday April 07 2014, @04:39PM (#27613) Journal

    My basic approach is to just remember what kind of stuff exists. Not particular syntax.

    It is quite normal that I google: "python read text file", "python ternary operator", "ocaml loop", "c++ template", "mathematica conditional". Then I just look at the first example to check the right syntax and problem solved - I can go forward with coding.

    --
    #
    #\ @ ? [adom.de] Colonize Mars [kozicki.pl]
    #
  • (Score: 1) by jdccdevel on Monday April 07 2014, @04:40PM

    by jdccdevel (1329) on Monday April 07 2014, @04:40PM (#27614) Journal

    I've been programming off and on for 15 years. Switching between language syntaxes isn't (and shouldn't be) a big deal. Once you've learned the concept of how a language construct works (not just loops and arrays, but hashes, closures, anonymous functions, objects, etc.) transitioning from one to the other should be easy. Most "procedural" programming languages (C, Perl, PHP, etc.) have a similar syntax, and the same goes for "object-oriented" languages (Java, C++, Python). I haven't dabbled that much with functional languages (Lisp, Scheme, etc.) but from what I've seen the syntax is very similar within the family. Each style of programming language has a particular logic flow, and that's the part you're really reading and writing anyway.

    That being said, the hard part is the API. The API is what you need to be productive, and how you interface with the world. Each language has a very unique one, and there's a LOT of detail to remember. This is where a good reference is invaluable. Making good notes helps a lot too. So does reviewing your old code written in the same language before you start.

    Usually I find that I can READ most code easily, because what the API does is usually obvious from the way it's called. WRITING the code, on the other hand, takes lots of time. I've honestly given up on trying to memorize the API details. Maybe if I could stick to one language for a couple of years, I wouldn't need the API reference open all the time, but so far I haven't had that opportunity.

  • (Score: 1) by fadrian on Monday April 07 2014, @05:18PM

    by fadrian (3194) on Monday April 07 2014, @05:18PM (#27627) Homepage

    Just the vocabulary and syntax have changed. Plus the stupid idioms of whatever framework pidgin you happen to be trying to absorb this week.

    Anyone who's learned a couple dozen computer languages can do it. Just be happy you're not living in the early eighties, when PL research wasn't yet conflated with type theory and new programming languages (and new constructs) came out in about every journal. Not to mention UNIX being hoarded and each workstation vendor having its own OS, editors, languages, and (if you weren't lucky) UI.

    When I was in grad school, I not only had different systems for my classes, but for the teaching and research assistantships I held. And that didn't count the three other systems I had to use to do my thesis research - one for running the simulations (big iron, that), another for everyday use and getting jobs ready to submit to the big iron, and the final one for actually editing/formatting the thesis. I counted - one semester I was using ten different computers with ten different OSes and twelve different editors: a TOPS-10 system with TVEd, an HP 1000, an HP 300 with the Amigo OS, a Burroughs 5700, a CDC Cyber 175, two flavors of IBM and associated systems, DEC PDP-11s (one with RSX-11M, one with RT-11), and a couple of VAXen, one using VMS and one running UNIX. And even if I forgot one or two, it's not counting things I used infrequently like the decrepit Bendix G-20 kept by the EE department that I got to run SPICE on (along with the IBM 026 keypunch I got to use to prepare input for the same).

    I'll tell you, the world of computing has become much less interesting these days with the convergence of platforms and hardware. Various languages? FORTRAN, PL/I, Algol, Pascal, SNOBOL, COBOL, ICON, tons of assemblers (and you haven't seen weird until you've seen the CDC computers' assembly - thanks, Seymour), BASIC, Smalltalk, PROLOG, various rule-based and chaining languages, Simula, Objective-C, C++, etc., etc., etc. But that's over 30+ years in the industry.

    So, how do you keep it straight? Learn the fundamentals so you can think in the dominant paradigms without the language involved - event-driven, OO, procedural, functional, logic. Be prepared to write (or find) other code that supports the paradigm(s) that best suit the code you're writing. Google and Stack Overflow are your friends. Do that and you can pick up everything pretty quickly. You'll still need to read code in the language you're going to be writing in to pick up stuff like which language idioms lead to elegance of expression, but at least you'll have your architecture and logic chops to fall back on. Plus, do code reviews - lots and lots of them.

    But keeping them straight? It's just a lot easier to relearn syntax and libraries when you switch back.

    --
    That is all.
  • (Score: 0) by Anonymous Coward on Monday April 07 2014, @06:02PM

    by Anonymous Coward on Monday April 07 2014, @06:02PM (#27644)

    Most of the C-related syntax family is impossible to remember, at least for me. This situation is really insane if you think about it. Why are so many languages almost the same but different in details?

    Is the length of a string...

    strlen()?
    length()?
    stringobj.len? - ie a property
    stringobj.length?
    stringobj.len()? - ie a method
    stringobj.length()?
    stringobj.size()?

    Who can remember all the variations? Not me. I have to switch among C-like languages all the time, and get really confused.

  • (Score: 2) by mcgrew on Monday April 07 2014, @06:06PM

    by mcgrew (701) <publish@mcgrewbooks.com> on Monday April 07 2014, @06:06PM (#27645) Homepage Journal

    I haven't done any real programming at all for a decade. Knowing more than one language never hampered abilities in other languages provided you use them. Rust never sleeps.

    Ten years ago I was using both JavaScript and Clipper for various projects at work, almost daily. Now? Last year I dug up a fifteen-year-old page from my old web site and couldn't get the JavaScript to work for the life of me. It used to be easy.

    --
    Our nation is in deep shit, but it's illegal to say that on TV.
  • (Score: 1) by qwerty on Monday April 07 2014, @06:51PM

    by qwerty (861) on Monday April 07 2014, @06:51PM (#27665) Homepage

    My skill in a particular language definitely rusts when I don't use it for a long time. When I know I'm about to switch to another language for a while, I hit the library and check out books with good reviews and skim through them. I've also used free online courses to good effect. They effectively prod my rusting neurons back to life. The good thing about the online courses is that they often have assignments, which highlight what I've forgotten.

    BTW I've coded in C for about 1.2 million years. I recently taught myself some Python because it looked interesting. I went to a job interview last week where I was asked to write C on the whiteboard and, of course, I accidentally mixed some Python print statements into the code. I expect the interviewer assumed I was writing pseudo code. :-)

  • (Score: 2) by tibman on Monday April 07 2014, @07:20PM

    by tibman (134) Subscriber Badge on Monday April 07 2014, @07:20PM (#27674)

    The often overlooked part of this is not transitioning languages. It is transitioning IDEs!
    I'm sure many people will talk about remembering concepts over language specifics in this thread. But changing your dev tools can often be more jarring than changing the language. Especially if you have historically been a .NET/Visual Studio dev and you're going to something more primitive like a text editor and a compiler in a console. ReSharper won't help you and auto-complete may not exist as you know it. Even going from VS to Eclipse has some differences, like Tab versus Enter to select the auto-complete. Doesn't sound like a big deal, but enough of those and it can feel like you spend more time fighting the IDE than coding.

    Anyways, since you are a C developer, you'll probably be more flexible than most. Going from fancy tools down to a console can be rough. Going from a console to a fancy IDE and you'll want to shoo the help away.

    --
    SN won't survive on lurkers alone. Write comments.
  • (Score: 2) by Common Joe on Monday April 07 2014, @08:06PM

    by Common Joe (33) <reversethis-{moc ... 1010.eoj.nommoc}> on Monday April 07 2014, @08:06PM (#27722) Journal

    I think I'm mediocre and nothing special, but my previous employers really liked me. I rely a lot on IDEs to help me remember syntax because I don't have enough space in my brain to remember little stuff like that.

    I keep copies of O'Reilly's [oreilly.com] (I like their Java series) and Apress [apress.com] (C#) books around. Wrox [wrox.com] was pretty good a number of years ago, but more geared toward beginner programmers, so you had to wade through more pages to get to what you wanted or needed. I haven't really bought from them in a while.

    The reason I like books is that they have whole chapters dedicated to topics like basic syntax, inheritance, exception handling, event handling, multithreading, file I/O, etc. I've never found Google able to give me one place to go with good, organized information about any of these topics in a specific language. When I need to get into something hip deep, I pull out the proper book, read for a while, then dive into the code so I know the pitfalls. Googling after reading a chapter, or while reading it, can help supplement the information too or give me further examples to look at.

    You ask a great question and I'm glad to find out that I'm not alone. I forget a ton of stuff and frequently look stuff up. There were many days when I kept that Java doc open constantly. (I always hated MS documentation, though. I usually looked in my books or went online to find the answers.) Typically, I only remember where to look and where really bad pitfalls are in specific languages.

    I wish more prospective employers took note of this question and the answers being given by the experts here. I'm looking for a job, and employers seem to want experts in all sorts of crazy languages with perfect memory recall and a perfect online presence and no life and... sorry. I'll get off my soapbox. I'll just say that this kind of expectation from those who sign the paychecks is driving me nuts right now.

  • (Score: 2) by gringer on Monday April 07 2014, @08:27PM

    by gringer (962) on Monday April 07 2014, @08:27PM (#27758)

    I consider myself competent in over 20 programming languages, and most frequently flit between Python, R, Java, C++, Perl, JavaScript and sh. I have a good memory for language syntax, and an idea of the strengths and weaknesses of different languages, but have found that it's not particularly useful to know the specifics of functions. A general idea of easily available / usable functions is sufficient for the work that I do.

    When I want to know how to read and write CSV files in python, I'll read the python documentation on it. When I want to know how to load a pair of variables into a sorted map in C++, I'll look at the standard documentation. R has built-in help and code examples, so that makes things a bit easier. There is an extensive man page for shell scripts (useful when confirming the oddness of the for loop syntax). When I want to recall that regular expression symbol that reads ahead but doesn't capture, I look at the perl documentation (sometimes even for python code, because the perl documentation explains those better). I find the official java documentation to be wonderful, because it explains with code examples as well as explaining quirky behaviours that might impact performance. Most common languages have very good official documentation, which together with code examples, provides a great refresher when you've spent a bit too long on a single language to remember even the simple stuff.

    --
    Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
  • (Score: 1) by koja on Monday April 07 2014, @09:20PM

    by koja (3832) on Monday April 07 2014, @09:20PM (#27797)

    I am not convinced that you can keep advanced proficiency in any language without regular use AND continuous study. Transferring from one language to another while keeping a high level of expertise seems impossible to me.

    The major obstacles I experience are the continuous progress in language/library/idiom/pattern development and language-specific idioms.

    My pet language being C++, I can't see anyone coming from a different language background not appreciating books like Meyers' Effective C++ and More Effective C++ - basically a few hundred pages of C++-specific idioms and caveats.

    Sure, if we are talking functions, simple inheritance and string manipulation, it could possibly be doable. But there's more interesting stuff like template metaprogramming, functional programming or "just" using inheritance correctly in your design, which any newcomer IMHO just has to learn.

  • (Score: 2, Insightful) by nemasu on Tuesday April 08 2014, @01:32AM

    by nemasu (2059) on Tuesday April 08 2014, @01:32AM (#27891)

    "Never memorize something that you can look up."
    - Albert Einstein

    If you use it enough to keep looking it up, then you will remember it.

    This is my method, YMMV.

    --
    I made an app! Shoutium [google.com]
  • (Score: 3, Interesting) by sjames on Tuesday April 08 2014, @05:22AM

    by sjames (2882) on Tuesday April 08 2014, @05:22AM (#27963) Journal

    You WILL feel like you've forgotten a lot when you first switch back, but in fact it will all come flooding back once you start using it again. It's only natural. It happens with human languages as well; I knew someone who traveled to Germany for a few months and after returning had a few problems finding English words for a week or two.

    It gets easier after you switch a few times.

    The interesting (and VERY useful part) is that the brief re-learning time will deepen your understanding of the language because you are re-learning in the context of knowledge you didn't have the first time around.

  • (Score: 1) by MaximumFerry on Tuesday April 08 2014, @05:32AM

    by MaximumFerry (416) on Tuesday April 08 2014, @05:32AM (#27969) Journal

    A lot of cheatsheets can be helpful in this case.

  • (Score: 2) by wonkey_monkey on Tuesday April 08 2014, @11:12AM

    by wonkey_monkey (279) on Tuesday April 08 2014, @11:12AM (#28072) Homepage

    My pet weakness is remembering to switch from "=" back to "==" when going from SQL to PHP (which is even more annoying because the SQL is inside the PHP).

    --
    systemd is Roko's Basilisk