
posted by LaminatorX on Sunday March 16 2014, @03:28AM
from the premature-optimization-is-the-root-of-all-evil dept.

Subsentient writes:

"I've been writing C for quite some time, but I never followed good conventions I'm afraid, and I never payed much attention to the optimization tricks of the higher C programmers. Sure, I use const when I can, I use the pointer methods for manual string copying, I even use register for all the good that does with modern compilers, but now, I'm trying to write a C-string handling library for personal use, but I need speed, and I really don't want to use inline ASM. So, I am wondering, what would other Soylenters do to write efficient, pure, standards-compliant C?"

  • (Score: 2) by hubie on Sunday March 16 2014, @01:56PM

    I am curious what language you use for most of your work.

  • (Score: 2) by VLM on Sunday March 16 2014, @02:16PM

    Depends on "most" and "work", and on how old the legacy code base (if any) is and what it used.

    Everything runs under shell scripts launched by cron and a batching system, mostly to prevent hopeless thrashing meltdowns if cron fired off everything at once or a backlog built up. You generally clean up a mess like that once and then either make your own system or implement something off the shelf like Torque or whatever.

    Something that amounts to a couple of stereotypical Unix tools ends up as a shell script. If fundamentally all you're trying to do is run "grep -c something somefile" and then pipe that number into an email for automated system status alerts, then it's possible to turn that one line into 10 lines of Perl or 100 lines of Java, but why?

    "harder" problems that require large-ish state machines seem to live in Perl. Especially if there exists a CPAN library than makes a simple solution. Historical and compatibility reasons. Some fooling around with Ruby and other languages have made an appearance.

    You can run into definition games: a shell script runs an old Perl script to slightly cook some raw data before feeding it repeatedly into R or Octave for serious mathematical analysis, then some new Ruby eats the Octave output and produces something gnuplot likes, plus a simple HTML wrapper to reference it. Is that shell, Perl, Octave, or Ruby?

    Sometimes I get the depressing feeling that if a language package exists in Debian, it's probably system-critical somewhere here, which is annoying.