
SoylentNews is people

posted by martyb on Friday August 30 2019, @11:03PM   Printer-friendly
from the renaming-for-the-nanny-state dept.

https://www.theregister.co.uk/2019/08/28/gimp_open_source_image_editor_forked_to_fix_problematic_name/

(Emphasis in original. --Ed.)

GIMP is a longstanding project, first announced in November 1995. The name was originally an acronym for General Image Manipulation Program but this was changed to GNU Image Manipulation Program.

The new fork springs from a discussion on Gitlab, where the source code is hosted. The discussion has been hidden but is available on web archives here. A topic titled "Consider renaming GIMP to a less offensive name," opened by developer Christopher Davis, stated:

I'd like to propose renaming GIMP, due to the baggage behind the name. The most modern and often used version of the word "gimp" is an ableist insult. This is also the colloquial usage of the word. In addition to the pain of the definition, there's also the marketability issue. Acronyms are difficult to remember, and they end up pronounced instead of read as their parts. "GIMP" does not give a hint towards the function of the app, and it's hard to market something that's either used as an insult or a sex reference.

[...] The subject of the suitability of the name is not new, and is enshrined in the official FAQ:

"I don't like the name GIMP. Will you change it?"

With all due respect, no. We've been using the name GIMP for more than 20 years and it's widely known ... on top of that, we feel that in the long run, sterilization of language will do more harm than good. ... Finally, if you still have strong feelings about the name "GIMP", you should feel free to promote the use of the long form GNU Image Manipulation Program or maintain your own releases of the software under a different name.

The Glimpse project is therefore entirely within the spirit of open source. "We believe free software should be accessible to everyone, and in this case a re-brand is both a desirable and very straightforward fix that could attract a whole new generation of users and contributors," says the About page.

Is now the time to accept that, to get GIMP into the mainstream, it needs a rename?



  • (Score: 2, Informative) by Anonymous Coward on Saturday August 31 2019, @10:18AM (7 children)

    by Anonymous Coward on Saturday August 31 2019, @10:18AM (#888170)

    The human eye barely sees 8 bits. Today's LCDs cannot properly reproduce even that. Camera sensors put essentially nothing but noise in the lowest couple of bits of those 8. Any lossy image compression cuts those bits off (no choice, since they are incompressible noise) and fills them with its own artifacts when decompressing.
    And then marketdroids go and sell "deep color" to people who really should know at least part of the above if they dabble in image processing.

    What you really need is *linear* color, and yes, you do need 16 bits per channel to work with that; but nothing and no one forces you to *store* images in that representation. It is only needed for *processing* them, and processing is better done in floating point anyway: upconvert image data on reading them into a filter, downconvert before storing the final result. That takes only a negligible fraction of processing time, and everything outside the specific filters can peacefully remain 8bpc.
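    The round trip described above can be sketched in a few lines of Python. This assumes NumPy and the standard sRGB transfer curve as the 8bpc encoding; real pipelines may use other gammas or fully color-managed conversions:

```python
import numpy as np

def srgb_to_linear(u8):
    """Upconvert 8-bit sRGB-encoded samples to linear floats in [0, 1]."""
    c = u8.astype(np.float64) / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(lin):
    """Downconvert linear floats back to 8-bit sRGB for storage."""
    lin = np.clip(lin, 0.0, 1.0)
    c = np.where(lin <= 0.0031308, lin * 12.92,
                 1.055 * lin ** (1.0 / 2.4) - 0.055)
    return np.round(c * 255.0).astype(np.uint8)

# A filter works in linear float; storage before and after stays 8bpc.
img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
half_light = linear_to_srgb(srgb_to_linear(img) * 0.5)
```

    The round trip is lossless for all 256 code values, so filtering in float while keeping files at 8bpc costs nothing in storage.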

  • (Score: 3, Informative) by fyngyrz on Saturday August 31 2019, @02:59PM (6 children)

    by fyngyrz (6567) on Saturday August 31 2019, @02:59PM (#888228) Journal

    Human eye barely sees 8 bits.

    Actually, the vast majority of human eyes see less than 8 bits of monochrome brightness, if we ignore the function of the iris and consider only a steady-state dilation. However, the iris lets the eye shift its relatively limited dynamic range quite a ways up and down, so that far more than 8 bits' worth of brightness can be perceived. The catch is that those additional levels must not appear alongside considerably higher brightnesses, which would cause the iris to contract and let in less light overall, pushing the lower levels below the eye's ability to distinguish them from one another.

    In deep-color presentations, when the scene is very dark, the eye can adapt to those dark ranges and see them just fine — the iris dilates open, and bingo, you can resolve those fine distinctions in brightness. But there is a problem. When the display returns to a brighter set of imagery, then the iris must contract, and if the difference is significant (which it certainly can be), that can actually be somewhat painful.

    So in actual use as a to-the-viewer display technology, it turns out that high dynamic range imagery isn't all that great an idea after all.

    However, high dynamic range image capture is a great idea. This is because in processing that data for later display, the details one wants will be there in both gross and fine detail, and so can be processed as one chooses for actual viewing. The end product, ideally, is one well matched to the normal eye's typical full range without requiring iris dilation, which again is just slightly less than 8 bits (256 brightness levels in grayscale), and considerably less than 24 bits in color, again evaluated at a fixed dilation of the iris.

    There are other subtleties such as linearity vs. non-linearity, etc., but in the end, the representation in 8/24 bits is just fine if handled correctly.
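    In code, the "process a deep capture down to the eye's steady-state range" step described above is tone mapping. A minimal global operator in Python (the Reinhard-style curve L/(1+L) is an illustrative choice, not anything prescribed by the comment):

```python
import numpy as np

def tonemap(hdr):
    """Compress linear HDR radiance (floats, any positive range) into
    [0, 255] for an 8-bit display: highlights roll off, shadows survive."""
    mapped = hdr / (1.0 + hdr)              # global Reinhard curve L/(1+L)
    return np.round(mapped * 255.0).astype(np.uint8)

scene = np.array([0.01, 0.1, 1.0, 10.0, 100.0])  # four decades of radiance
display = tonemap(scene)                          # all five remain distinct
```

    Four decades of input land on distinct 8-bit output codes, which is the point: the capture keeps the range, and the tone map decides how to spend the 256 available levels.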

    --
    Having cats is like having living art.
    That randomly throws up on the floor.

    • (Score: 2) by sjames on Saturday August 31 2019, @05:20PM (5 children)

      by sjames (2882) on Saturday August 31 2019, @05:20PM (#888260) Journal

      Of course, the CCDs we capture the image with are likewise limited. The same image that causes the iris to shift its window will also force a shift in the CCD (handled by altering the exposure time between reads). The bright areas will saturate the cells in the sensor while the darker areas will be swamped by noise. The best you can do is choose a few helpful exposure times (each captured as a single value global to the frame) and hope the camera and the scene aren't moving enough to lose more detail than you gain.
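      That bracketing strategy can be sketched as a weighted merge: trust each pixel only in the exposures where its reading is neither saturated nor down in the noise floor. The 0.05/0.95 thresholds below are illustrative assumptions, not sensor specs:

```python
import numpy as np

def merge_exposures(frames, times, lo=0.05, hi=0.95):
    """Estimate scene radiance from bracketed exposures.
    frames: linear sensor readings normalized to [0, 1];
    times:  matching global exposure times.
    Readings near 0 (noise floor) or near 1 (saturation) are ignored;
    a trusted reading contributes radiance = reading / exposure time."""
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for f, t in zip(frames, times):
        w = ((f > lo) & (f < hi)).astype(np.float64)  # trust mid-range only
        num += w * f / t
        den += w
    return num / np.maximum(den, 1e-12)  # averaged radiance estimate
```

      A pixel of true radiance 2.0 reads 0.2 at a 0.1 s exposure and 0.8 at 0.4 s; both readings are trusted and both recover 2.0, while a pixel saturated in every frame simply yields no estimate.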

      • (Score: 2) by jmorris on Saturday August 31 2019, @11:04PM (4 children)

        by jmorris (4844) on Saturday August 31 2019, @11:04PM (#888333)

        Dunno about that. I have a digital camera over a decade old that captures 12bit linear for RAW files. Pretty safe bet new stuff can do better in both bits captured and quality of the least significant bits.

        And while 8-bit might be barely enough for end delivery, it really isn't. Watch a movie on DVD/BD on a display without any Deep Color capability and, if you are watching for it, you will often spot banding where the 8-bit limits reveal themselves. Once we push that to 10 or 12 bits the consumer world will be fine; production is going to need the full 16 bits for storage of work product, and floats for processing.

        • (Score: 2) by sjames on Sunday September 01 2019, @06:58AM

          by sjames (2882) on Sunday September 01 2019, @06:58AM (#888408) Journal

          It may well be that more than 8 bits will be useful for enhanced images, but it will still need to be mapped somehow for human viewing (for example, dynamic range compression). I suspect the artifacts you see on the DVD are compression related rather than the limits of 8 bits per color channel.

          The CCD in your camera still needed some form of exposure control to avoid saturation or loss in the noise. Some CCDs used for astronomy measure exposures in minutes (they need to be kept quite cold to reduce noise). Astronomers were never fond of the Iridium satellites because one Iridium flare could wash out the whole image.

        • (Score: 0) by Anonymous Coward on Sunday September 01 2019, @10:39AM

          by Anonymous Coward on Sunday September 01 2019, @10:39AM (#888430)

          I was hard-pressed to find a display that can faithfully reproduce even 7 bits of 8-bit color (the regular non-linear kind, naturally). Manufacturers bury that data in LCD matrix specs, far away from the public, for a reason. The marketed Flashy Name "capabilities" displayed so prominently on the pages with a "Buy" button have no proper technical description for a reason, too.

          When you are watching for it, you can always find signs of better quality proportional to money spent. $50 wine always tastes better than $5 one, even when the two bottles are filled from the same barrel: https://www.sciencedaily.com/releases/2017/08/170814092949.htm [sciencedaily.com]

        • (Score: 0) by Anonymous Coward on Sunday September 01 2019, @11:00AM

          by Anonymous Coward on Sunday September 01 2019, @11:00AM (#888431)

          These things are sold rather widely: TN matrices are 6 bit.
          https://www.pchardwarehelp.com/guides/lcd-panel-types.php [pchardwarehelp.com]
          https://www.dell.com/community/Inspiron/Inspiron-G3-15-3579-LCD-yellowish/td-p/6203118 [dell.com]
          A normal buyer is unlikely to have a clue, or even to find out that he's been had in this way, and almost no one would want to go to court over it even then. LCD costs are not worth the legal fees.

        • (Score: 2) by fyngyrz on Thursday September 19 2019, @10:16AM

          by fyngyrz (6567) on Thursday September 19 2019, @10:16AM (#896038) Journal

          if you are watching for it you will often spot banding where the 8bit limits are revealing themselves

          This is always a result of over-compression and/or crappy editing, not limits in the 24-bit space. The various compression technologies used to deliver movies often discard data, either for bandwidth reasons or, like music compression, under the assumption that "it isn't significant"; but of course, it is.

          When a presentation uses only distant-from-each-other levels to represent an image, a dark one for instance, the presentation is poorly done. It isn't the color space that's at fault; it's the dynamic range that has been encoded for display, which is a production / editing issue, not a colorspace issue.

          As long as the scene capture is done in a deep colorspace, there's no good reason whatsoever that the final production should have any visible flaws at all in a 24-bit colorspace.

          Finally, sometimes a display isn't even capable of a full 24-bit reproduction, marketing hype aside, and that can cause Mach banding to arise.
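          For the final quantization step, the usual way to keep a deep master from banding in an 8-bit delivery is dithering: add roughly half an LSB of noise before rounding, trading bands for fine grain. A toy Python sketch (the bit depth and noise amplitude are the standard textbook choices, not anything from this thread):

```python
import numpy as np

def quantize(x, bits=8, dither=False, rng=None):
    """Quantize linear values in [0, 1] to the given bit depth.
    Plain rounding turns smooth gradients into visible bands;
    +/- half-LSB noise before rounding trades the bands for grain."""
    levels = (1 << bits) - 1
    v = x * levels
    if dither:
        if rng is None:
            rng = np.random.default_rng(0)   # fixed seed for repeatability
        v = v + rng.uniform(-0.5, 0.5, size=np.shape(x))
    return np.clip(np.round(v), 0, levels) / levels

# A smooth dark ramp: undithered it collapses to a handful of bands,
# dithered it averages back to the true gradient.
ramp = np.linspace(0.0, 0.02, 10_000)
banded = quantize(ramp)
smooth = quantize(ramp, dither=True)
```

          The undithered ramp collapses to only six distinct 8-bit codes, while the dithered version preserves the gradient's mean level on average.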

          ---
          inSIGnificant