
SoylentNews is people

posted by martyb on Tuesday August 04 2020, @01:35AM   Printer-friendly
from the I-use-Lynx,-you-insensitive-clod! dept.

Firefox Browser Use Drops As Mozilla's Worst Microsoft Edge Fears Come True

Back in April, we reported that the Edge browser is quickly gaining market share now that Microsoft has transitioned from the EdgeHTML engine to the more widely used Chromium engine (which also underpins Google's Chrome browser). At the time, Edge slipped into the second-place slot for desktop web browsers, with a 7.59 percent share of the market. This dropped Mozilla's Firefox – which has long been the second-place browser behind Chrome – into third place.

Now, at the start of August, we're getting fresh numbers for the desktop browser market, and things aren't looking good for Mozilla. According to NetMarketShare, Microsoft increased its share of the browser market from 8.07 percent in June to 8.46 percent in July, while Firefox fell from 7.58 percent to 7.27 percent.

[...] As for Mozilla, the company wasn't too happy when Microsoft first announced that it was going to use Chromium for Edge way back in December 2018. Mozilla's Chris Beard at the time accused Microsoft of "giving up" by abandoning EdgeHTML in favor of Chromium. "Microsoft's decision gives Google more ability to single-handedly decide what possibilities are available to each one of us," said Beard at the time. "We compete with Google because the health of the internet and online life depend on competition and choice."

[...] Microsoft developer Kenneth Auchenberg fought back the following January, writing, "Thought: It's time for Mozilla to get down from their philosophical ivory tower. The web is dominated by Chromium, if they really *cared* about the web they would be contributing instead of building a parallel universe that's used by less than 5 percent."

Is the browser monoculture inevitable or will Firefox hang in there?

Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Insightful) by Anonymous Coward on Tuesday August 04 2020, @12:59PM (2 children)

    by Anonymous Coward on Tuesday August 04 2020, @12:59PM (#1031213)

You can't blame HTML 4.01 for shitty plugins. However, I do agree that parsing a shitty JS/HTML5 page with JSON APIs is far easier than trying to parse a website made entirely in Flash. The good thing about this modern web, from my perspective, is that websites resemble an online database if I ignore the actual HTML. It is trivial to build tools that extract the information I want as nicely formatted JSON.
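    The point above can be sketched in a few lines of Python using the standard-library json module. The response body and field names here are invented for illustration; a real tool would fetch them from whatever JSON API backs the page:

    ```python
    import json

    # Hypothetical response body from a news site's JSON API
    # (illustrative data, not a real endpoint).
    response_body = """
    {
      "articles": [
        {"title": "Firefox Browser Use Drops", "score": 42},
        {"title": "Edge Gains Market Share", "score": 17}
      ]
    }
    """

    data = json.loads(response_body)

    # Extract just the fields we care about, ignoring the HTML shell entirely.
    titles = [article["title"] for article in data["articles"]]
    print(titles)
    ```

    No HTML parsing, no DOM traversal: the data is already structured, which is exactly why these pages are easier to scrape than Flash ever was.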

I very much prefer HTML 4.01, where the webmaster knows that going beyond the standard with plugins is non-standard. In those days, before XHR, JS couldn't do things behind your back; it was confined to processing the document it came with. Now shitty webmasters can build crappy webapps that toast bread over USB and Bluetooth and still have a clear conscience that they followed the "standard".

    It baffles me how anyone can hear the term "living standard" and not start the ROFLCOPTER. It's as much of a paradox as "alternative facts".

  • (Score: 0) by Anonymous Coward on Wednesday August 05 2020, @08:34PM (1 child)

    by Anonymous Coward on Wednesday August 05 2020, @08:34PM (#1031936)

    At that time before XHR, JS couldn't do things behind your back.

It was also about as responsive as a pedal-powered moped:

    https://upload.wikimedia.org/wikipedia/commons/d/db/Honda_Hobbit.jpg [wikimedia.org]

The days of server-side graph creation and related horse crap are long gone, relegated to the 1990s, and good riddance. What is crap is downloading JS from 50 different domains, but that's a problem not with JS itself but with the actual shitty websites.

So yeah, you can have websites without CSS and JavaScript, but in that case you may as well just stick with text/plain. That works just fine for news content, though not so much for real applications. Even this site uses CSS and JavaScript.

    • (Score: 0) by Anonymous Coward on Thursday August 06 2020, @01:37PM

      by Anonymous Coward on Thursday August 06 2020, @01:37PM (#1032245)

      So yeah, you can have websites without CSS and JavaScript, but in that case you may as well just stick with text/plain. That works just fine for news content, though not so much for real applications. Even this site uses CSS and JavaScript.

      ...and that's the real issue we disagree about here. Should the browser be a webapp bazaar, or should it present hypertext? I would be very happy if we had a clear separation of the two: a browser and a search engine for websites, and another pair for the webapp bazaar.

      btw this site runs fine without JS. It's the primary reason I'm here.