
posted by LaminatorX on Tuesday October 14 2014, @09:27AM   Printer-friendly
from the MAX-FORWARDS dept.

Twenty years ago today (13 October 1994), Mosaic Communications Corporation released Mosaic Netscape, the first commercial browser for the World Wide Web. This was just six months after the company was founded by ex-Silicon Graphics CEO Jim Clark and Marc Andreessen, a recent computer science graduate of the University of Illinois at Urbana-Champaign. Andreessen had co-developed the Mosaic Web browser while working for the National Center for Supercomputing Applications (NCSA) on UIUC's campus; Clark, who had been losing a power struggle at Silicon Graphics, the company he'd founded, was restless and looking for adventure and revenge. Andreessen quickly convinced the band of programmers from UIUC he'd worked with on Mosaic and web server development to relocate to Silicon Valley.

Both the company and the browser were re-branded 'Netscape' a month after the product was released, settling a legal dispute with UIUC, which regarded Mosaic as intellectual property belonging to the university.

Andreessen and Netscape moved fast, even by the standards of the personal computing business at the time. After Microsoft entered the game (they jump-started development by licensing Spyglass's version of Mosaic), Netscape pumped out Navigator 2.0 a little more than a year later, unveiling JavaScript, frames, cookies, plug-ins, SSL (2.0, the first released version), and integrated mail and news readers. Oh, and client-side integration with a mysterious new language called Java.

Bill Gates broadcast his famous "Internet Tidal Wave" memo to the troops at Microsoft in May 1995. Internet Explorer 1.0 was released in August 1995; future versions of IE were bundled with Windows 95, as Microsoft tried (rather successfully) to "cut off Netscape's air supply", as Microsoft Vice President Paul Maritz is alleged to have ranted at the time. Microsoft's actions against Netscape and numerous other competitors in the software industry became the subject of an antitrust suit brought by the US Department of Justice.

 
  • (Score: 4, Interesting) by cafebabe on Tuesday October 14 2014, @01:38PM

    by cafebabe (894) on Tuesday October 14 2014, @01:38PM (#105922) Journal

    popular and influential

    That was in the days before pre-forking webservers and other solutions to the c10k problem [wikipedia.org]. Running httpd from inetd had a noticeable lag. And with clients making four concurrent connections and servers honoring 20 concurrent connections, it took fewer than a dozen people before a site became awkward to access. That created a rubbernecking phenomenon because people wanted to see the cause of the commotion.
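The process-per-connection model described above can be sketched in a few lines of Python. This is an illustrative toy, not actual NCSA httpd or inetd code; the port and backlog values are arbitrary:

```python
import os
import socket

# Toy sketch of the pre-c10k, fork-per-connection model: like an
# inetd-spawned httpd, every client connection costs a whole process.
def serve(port=8080, backlog=20):  # small backlogs like 20 were typical
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(backlog)
    while True:
        conn, _ = srv.accept()
        if os.fork() == 0:       # child: serve one request, then exit
            srv.close()
            conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\r\n")
            conn.close()
            os._exit(0)
        conn.close()             # parent: straight back to accept()
```

A pre-forking server instead forks a pool of workers up front, so accepting a connection no longer pays the fork() latency on every request.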

    Nowadays, servers honor a minimum of 250 connections and - due to the modern selection of content - you're unlikely to raise a big enough rabble to hit the limits.

    This inadvertent scalability has definitely reduced the consequences of bad design. Previously, if a website had particularly gaudy images, it was likely to exceed its bandwidth allowance (often 100MB per month or less) and so there were financial incentives to fix a disaster. Nowadays, this problem is "fixed" with a CDN subscription.

    Even so, it was easy to work within the limits of available hardware. About 17 years ago, Yahoo's total bandwidth was 8Mb/s. That's now less than the bandwidth of many casual home users. Likewise, servers of that era would typically have 16MB or 32MB RAM. That's now considered a low spec for an embedded system.

    --
    1702845791×2
  • (Score: 0) by Anonymous Coward on Tuesday October 14 2014, @02:04PM

    by Anonymous Coward on Tuesday October 14 2014, @02:04PM (#105932)

    Another, unrelated bottleneck was Common Gateway Interface, which was dirt simple to write apps for (in any programming language) but which forked a new process for each client connection.
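A CGI "app" really was that simple: the server forks a fresh process per request, exposes the request in environment variables, and streams the script's stdout back to the client. A minimal sketch in Python (the function name and greeting are made up for illustration):

```python
#!/usr/bin/env python3
# Hypothetical minimal CGI script. The web server forks, puts request
# data such as QUERY_STRING into the environment, runs this program,
# and relays everything written to stdout back to the browser.
import os
import sys

def respond():
    name = os.environ.get("QUERY_STRING", "") or "world"
    sys.stdout.write("Content-Type: text/plain\r\n\r\n")  # header, blank line
    sys.stdout.write("hello, %s\n" % name)                # then the body

if __name__ == "__main__":
    respond()
```

The simplicity was the appeal: any language that could read environment variables and write to stdout could serve web pages, at the cost of a fork+exec per hit.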

    I remember all the commercial web server vendors had a proprietary alternative API that didn't fork - Netscape had NSAPI, IBM had ICAPI, etc. There were two problems: the app usually had to be written in C or C++, and a buggy app (a single web page from the end user's POV) could bring down the web server.

    • (Score: 2) by WillR on Tuesday October 14 2014, @07:05PM

      by WillR (2012) on Tuesday October 14 2014, @07:05PM (#106037)
      Another instance of /bin/sh, even.

      CGI is still around, doing helpful things like making "Shellshock" a serious remote code execution vulnerability instead of an interesting design flaw.
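For anyone who missed the mechanics: CGI copies client-controlled request headers straight into the child process's environment, which is exactly how Shellshock payloads reached bash. A rough sketch of that header-to-environment step (names illustrative; the payload is the classic harmless test string, dangerous only to a pre-patch bash):

```python
# CGI maps each request header to an HTTP_* environment variable, so an
# attacker's User-Agent string lands verbatim in the environment of
# whatever the server exec()s next -- including /bin/bash or /bin/sh.
def cgi_environ(headers):
    env = {}
    for name, value in headers.items():
        env["HTTP_" + name.upper().replace("-", "_")] = value
    return env

env = cgi_environ({"User-Agent": "() { :; }; echo pwned"})
# A vulnerable bash launched with this environment parses the value as
# a function definition and executes the trailing `echo pwned`.
```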