
posted by janrinok on Thursday November 06 2014, @05:26PM   Printer-friendly
from the like-Coltrane-in-a-cutting-session dept.

Silviu Stahie at Softpedia reports

Someone was curious enough to find out what would happen in Ubuntu 14.04 LTS (which is powered by the Unity desktop environment) if he opened 100 applications at once. That's usually more than enough to bring a powerful system down, but that's not the case with Ubuntu.

"When I went crazy enough to open all of my 100 apps (100 windows) on the desktop to see the performance hit on Ubuntu 14.04 with Unity. Nothing happened really, except that Launcher and Switcher got full. You can see the good smooth animation of Switcher icons. Beautiful!" wrote Ali Najafi on discourse.ubuntu.com.[1]

He did use a powerful 2GHz Core i7 CPU, but that's not a particularly expensive part, so similar hardware should be within reach of many users. With 100 apps open at once in Ubuntu 14.04 LTS, the processor never went above 50% and RAM usage sat at about 75% (of 6GB). As you can see from the video he posted, everything runs quite smoothly, although that's one crowded Alt-Tab menu.

Silviu mentions Windows' notoriously bad memory management. So, how many apps can Redmond's minions get running simultaneously? Anyone up to the challenge?

[1] All content is behind scripts.

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Informative) by Nerdfest on Thursday November 06 2014, @05:30PM

    by Nerdfest (80) on Thursday November 06 2014, @05:30PM (#113576)

    The real test is to have them all doing something and still have the UI of the ones being used or viewed stay smooth and responsive. That's what I get on Linux most of the time, especially since the tweaks made to task scheduling in the last couple of years. I find Windows does not deal with this well.

    • (Score: 3, Interesting) by opinionated_science on Thursday November 06 2014, @05:36PM

      by opinionated_science (4031) on Thursday November 06 2014, @05:36PM (#113582)

      I, too, am using Linux, on a machine with 32GB of main memory and a 512MB Quadro card. The desktop is 4480x1600. I have (according to X and ps) 463 processes running and about 200 using the display.

      I dropped Windows years(!) ago, when even a 1024x768 desktop was too much for it. Better still, in KDE4, when a video window is playing, its icon animates too!!! Very useful for monitoring processes that are not onscreen!!

      Ok enough penguin pride....

    • (Score: 3, Interesting) by Hairyfeet on Friday November 07 2014, @01:24AM

      by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Friday November 07 2014, @01:24AM (#113711) Journal

      Obligatory XKCD [xkcd.com] which is sadly sooo true when it comes to articles like this.

      I'll get hate from the FOSSies but fuck it, time for a big dose of truth. So you can get 100 Bash prompts or whatever running at once... so the fuck what? I'm sure I could get 100 C64 emulators running on Windows 7, but that doesn't make it any less dumb, and it's about as in line with reality as saying those guys who use liquid nitrogen to cool a CPU for 60 seconds are somehow "proving" anything other than how to kill a board and CPU really quick.

      What matters is the tasks Joe and Jane Average use their PCs for, and in that regard? Linux sucks, sorry but it does. Hardware acceleration? A bad joke; in fact only by using the proprietary Nvidia blob are you gonna get close to where Windows was 5 years ago. Sound? Pulse is STILL the most fragile part of any Linux system and the reason nearly every consumer-oriented distro fails the Hairyfeet challenge after the first update. Wireless? We heard for years "oh just use Atheros, it's 'Linux-friendly hardware'"... BTW, wanna guess which wireless was broken on last year's Ubuntu release?

      There is a REASON why MSFT puts out the worst release since MS Bob and Linux doesn't gain even a single point, why every major OEM avoids Linux like the plague (even Dell buries it on the back page behind a dozen warnings like it's toxic), and it's because the higher-ups will NEVER EVER let the damned thing get even close to fucking stable! It's like welcome to bizarro world: "Oh, people like KDE 3 and Gnome 2, things am stable? That's no good! Quick, throw them out for an undercooked new release that won't have feature parity for years, this will make users am miserable, aren't we wonderful?" "Oh noes, ALSA is making more major sound work OOTB without tons of crashes, let's stick in shitty buggy Pulse, it will make sound more fragile than WinME! We are soo smart!"... I swear if I didn't know better I'd think there was somebody pulling an Elop in the Linux camp, because between the DE shits, Pulse, Torvalds fucking with major shit just for the lulz (which causes drivers that worked in Foo to break in Foo+1), and now systemd doing for the underpinnings what Pulse did for sound? Nadella really should send the devs a fruit basket; it's the least he can do for making his job soo much easier!

      I could wallpaper this page with citations backing every single thing up with multiple sources, but why bother? All I'll get in response is insults and memes, the same memes going back a decade [tmrepository.com] and which have the same basis in fact as LOLCatz. All one has to do is take just one little list of showstoppers [narod.ru] and compare it to the same list 5 years later [narod.ru] to see that what my Linux admin friend said when he went to Mac, after his wireless got shat upon for the fifth time, is true: "Linux doesn't get any better, it just gets different". It's a damned shame, but you can see the writing on the wall: the future is Google (proprietary; see the excellent article on Ars about how Google is pulling an EEE with Android), Apple (proprietary), and MSFT (proprietary) owning everything but the server, which is quickly becoming a dead-end job thanks to pre-packaged VMs making it so one guy can run a server farm.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
      • (Score: 0) by Anonymous Coward on Friday November 07 2014, @02:59PM

        by Anonymous Coward on Friday November 07 2014, @02:59PM (#113822)

        thank you! true!

      • (Score: 2) by urza9814 on Friday November 07 2014, @06:23PM

        by urza9814 (3954) on Friday November 07 2014, @06:23PM (#113903) Journal

        What matters is the tasks Joe and Jane Average use their PCs for and in that regard? Linux sucks, sorry but it does.

        Yup. Linux does perform better in the real world though. Or at least it used to -- I haven't really used Windows since XP. But part of the reason I switched to Linux was to run Freenet -- the same settings that ran perfectly fine under Linux for months at a time would cause my entire Windows PC to lock up and crash after running just a few minutes.

        Hardware acceleration? A bad joke, in fact only by using the proprietary Nvidia blob are you gonna get close to where Windows was 5 years ago

        What driver are you using on Windows that is NOT a proprietary blob...?

        sound? Pulse is STILL the most fragile part of any Linux system and why nearly every consumer oriented distro fails the Hairyfeet challenge after the first update

        Works on Arch, Fedora, SuSE, Debian, and Ubuntu on my System76, Dell, HP, and custom desktop. I haven't seen a problem with Pulse in about six years. But hey, if it's a problem for you, use ALSA or OSS or Jackd or whatever. With Windows, if it doesn't work...you're screwed.

        Wireless? We heard for years "oh just use Aetheros, its "Linux Friendly hardware"...BTW wanna guess what wireless was broken on last year's Ubuntu release?

        Another thing that I haven't seen a problem with in 6+ years, although yes it used to be a huge issue. I don't run Ubuntu though, so I dunno about whatever bugs they've had. Everything from Intel cards to the Dell branded Broadcom garbage has worked flawlessly out of the box on Linux for the past 6 years or so. And half these suckers don't even work out of the box on Windows!

        Actually, I have had one USB wifi card that wouldn't work on Linux. Spent hours trying to figure that out...then I discovered it didn't work under Windows either. No 64-bit drivers IIRC.

        Oh, and I think I'm going to start a reverse Hairyfeet challenge: upgrade Windows through several versions without everything dying. 'Cause I know when I went from XP to Win7 on my desktop I had to buy a bunch of new hardware. I'm still using a wifi-to-ethernet bridge thing instead of an actual wifi card on that system because of it; I went to BestBuy three separate times to buy and then return various wifi dongles that wouldn't work until I found that thing! And my parents are still running XP SP1 on their old Dell desktop because SP2 and 3 never worked...

        • (Score: 2) by Hairyfeet on Friday November 07 2014, @11:35PM

          by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Friday November 07 2014, @11:35PM (#113955) Journal

          You DO realize that ALL YOU JUST DID WAS SPOUT MEMES, just as I said would happen, yes? It's the same anecdotes and bullshit, from you are using the wrong hardware [tmrepository.com] / Linux friendly hardware [tmrepository.com] to works for me [tmrepository.com]; it's NOTHING but anecdotes and memes, and that is ALL YOU EVER GET from a FOSSie! Any bets that the AC who posted below you invoked the shills and trolls and vampires oh my! [tmrepository.com] meme?

          If you TRULY believe what you say, then put your money where your mouth is and take the Hairyfeet Challenge. It's celebrating its eighth year with ZERO consumer distros passing; if you think yours can, then follow the steps and post it to YouTube:

            Take ANY mainstream consumer-oriented distro (not LTS, because even Ubuntu advises against mainstream users using LTS) from FIVE years ago; this simulates a typical 5-year lifecycle. This BTW is less than HALF a Windows support cycle, so I'm cutting Linux a break. Let's say you use Ubuntu: that would be Ubuntu 9.10, which can be downloaded from their archive. Install it on ANY PC, desktop or laptop (NOT a VM, as that isn't real hardware and comes with special drivers), that has a wireless card. Wireless is required because more and more mainstream users are ditching wires, and nobody wants a laptop that doesn't have wireless, do they?

          During this phase you are the system builder, so CLI (which is usually required because Linux driver support is poor) IS ALLOWED. Once it's installed you are no longer the system builder but THE USER, so like a Windows user you are ONLY allowed to use the GUI. You then get to "enjoy the freedom" of using nothing but the GUI (because if you can't even update the thing without the CLI, it's no match for Windows, is it?) to update to current... with Ubuntu that is SEVEN RELEASES, just FYI. You will film this and post it to YouTube; you only have to upload the final install process of each release and a pic of the device manager showing working hardware, complete with wireless showing a WPA2 connection, but the complete video should be hosted on Dropbox to prove you aren't faking it.

          BTW, in case it isn't clear, working hardware means WORKING HARDWARE. It does NOT mean wireless that can't use WPA, it does NOT mean a PC with no sound or only VESA video; it means FULLY WORKING HARDWARE. And again, if you are unclear, please see the highlighted areas, as completing the challenge REQUIRES vids of the final install of each upgrade (last I checked that would be EIGHT for Ubuntu, and around SIX for most others, so be sure to have room on your SD card!) along with a 5-minute video at the end of each install showing that upon completion you could go to the hardware manager and had 100% functional hardware with NO FUTZING. After all, if you have to futz with the thing just to have functional drivers, it isn't on the same level as Windows, now is it? BTW, the first Windows that passed the challenge was Win2K (RTM to EOL with ZERO failed drivers, 10 years of support), then WinXP (14 years, ZERO fails), and both Vista and 7 can go from RTM to current with ZERO failures. So let's see them snappies!

          --
          ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
  • (Score: 2) by bob_super on Thursday November 06 2014, @05:34PM

    by bob_super (1357) on Thursday November 06 2014, @05:34PM (#113579)

    I'd like to see anything affordable run 100 instances of Supreme Commander smoothly at the same time.
    But one hundred HP-48 emulators?
    Is it somehow harder to run an "app" than the 50+ processes that are the minimum for a modern OS?

    I remember the days when you could crash the Solaris station of the guy remotely logged into yours by filling his screen with junk. That stopped working 20 years ago.

    • (Score: 2) by TheRaven on Friday November 07 2014, @09:51AM

      by TheRaven (270) on Friday November 07 2014, @09:51AM (#113770) Journal

      Yup, seems like a very odd metric. This laptop (OS X) is currently running 325 processes and around 1600 threads (the latter number changes by plus or minus 30 every few seconds). And the CPU is basically idle: 97% idle, with most of the remainder taken up by the process monitor.

      100 apps may be a problem if you run out of physical memory, but even then, if they're all instances of the same program, a lot of memory will be shared, so it's pretty hard to hit that limit. If they're all trying to run at once then you're going to get a lot more TLB misses than is healthy, but that's a CPU problem, not an OS one (although if the OS supports transparent superpages it can alleviate it slightly). Similarly with cache churn, but again that's a hardware problem. Basically, the point of this article is 'I have a fast CPU, but have no idea how to do anything that requires it'.
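      For the curious, process and thread counts like the ones quoted above can be reproduced with a couple of stock commands. A rough sketch (the thread-counting flag is Linux-specific; on OS X you would sum the per-process output of `ps -M` instead):

```shell
# Count processes and (on Linux) threads.
# ps -eLf prints one line per thread on Linux; it is not portable to OS X.
procs=$(ps ax | wc -l)
threads=$(ps -eLf 2>/dev/null | wc -l)
echo "processes: $procs"
echo "threads:   $threads"
```

      Both counts include the header line, so subtract one if you care about exact numbers.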

      --
      sudo mod me up
  • (Score: 2) by urza9814 on Thursday November 06 2014, @05:35PM

    by urza9814 (3954) on Thursday November 06 2014, @05:35PM (#113581) Journal

    Jesus, it's not uncommon for me to find X.org using >90% of my CPU, even with nothing but a browser and a terminal window open! While running Enlightenment. On Arch Linux. On a system with a 2.8GHz i7 and 12GB of RAM!

    I think the last time I looked, it seemed like a very rare bug in Enlightenment, actually. But hell, my system is powerful enough that I haven't cared to look into it further; all it does is peg a few bars on my CPU utilization graph. Otherwise I barely notice :P

    • (Score: 1) by axsdenied on Thursday November 06 2014, @10:28PM

      by axsdenied (384) on Thursday November 06 2014, @10:28PM (#113675)

      Don't forget that high CPU usage generates heaps of heat which will:
      - increase temperature in the case
      - decrease lifetime of components
      - run your fans harder and noisier

      And let's not forget the increase in your power bill.

      • (Score: 2) by urza9814 on Friday November 07 2014, @01:21PM

        by urza9814 (3954) on Friday November 07 2014, @01:21PM (#113791) Journal

        Meh, yeah, it's not that much of an issue though. It's not something that is going on all the time, usually after 1-2 weeks of uptime the CPU load will start creeping upwards. And it's a laptop, so my uptime is usually measured in days, not months.

        Also, after never owning a system that wasn't literally falling apart (because hey, Linux runs on anything, right?) I massively over-corrected. This laptop is a freakin' tank. I'm not worried about anything failing from the heat. Hell, I'm also pretty sure I could drop-kick this fucker and it'd just keep chugging along ;) And besides, might as well get it all burned in while it's still under warranty!

  • (Score: 2) by skullz on Thursday November 06 2014, @05:47PM

    by skullz (2532) on Thursday November 06 2014, @05:47PM (#113586)

    I can open 100 instances of Notepad.exe on Win 8 without an issue (slight lag opening once you get past 80 or so), but once they have had a chance to settle they all function and the rest of the system responds as normal. I tried it with Firefox but only got up to 60. There was a spike of lag while the new browsers called home to the mothership, but otherwise the OS kept working well. Memory was being used up faster (of course), and after 60 open browser windows (not tabs, actual new windows) the newer ones stopped redrawing themselves and had that hall-of-mirrors effect. The system handled it just fine.

    • (Score: 2, Interesting) by evk on Thursday November 06 2014, @07:09PM

      by evk (597) on Thursday November 06 2014, @07:09PM (#113612)

      It's not uncommon for me to have 50 applications running in Windows. Most of them would be gvim, but also several instances of Visual Studio and SQL Management Studio, Firefox, maybe some Word and Excel, and plenty of other stuff. I complain a lot about Windows, but it usually works as well as, or better than, Linux.

      • (Score: 1) by evk on Thursday November 06 2014, @07:13PM

        by evk (597) on Thursday November 06 2014, @07:13PM (#113615)

        Forgot to mention that I currently have 5 Hyper-V instances running on the same machine, a mix of Linux, Open/FreeBSD, and Win 8.1. But these are mostly for testing and isolating some tasks and don't run that many applications.

  • (Score: 2) by WizardFusion on Thursday November 06 2014, @05:50PM

    by WizardFusion (498) on Thursday November 06 2014, @05:50PM (#113588) Journal

    ...but why?
    You can only use so many windows at a time when working, so why does it matter?

    • (Score: 2) by skullz on Thursday November 06 2014, @05:52PM

      by skullz (2532) on Thursday November 06 2014, @05:52PM (#113589)

      Because it is EXTREME windows management.

  • (Score: 2) by jackb_guppy on Thursday November 06 2014, @06:08PM

    by jackb_guppy (3560) on Thursday November 06 2014, @06:08PM (#113594)

    Been doing that for years on Ubuntu 14.04, and on 12.04 and 10.04 before that. I do it on WinXP, Win7, and Win8.1 too. All machines are basic quad-core Q6600 processors with 4GB of memory.

    Oh, my test is actually a modified network scan: each window/app is pinging one of about 1100 different machines and dumping the data into IP-named files (so about 1100 open files too). The only real problem was WinXP: starting a new thread/app/window is a higher priority than servicing currently open windows. It corrects itself after about 5 minutes, once all the windows are open.

    Like Linux, creating and maintaining a SWAP/pagefile correctly is the requirement. Do not allow Windows to manage the pagefile: it believes the file can be deleted on each boot, and that fragments it over time. Create it once, create it big (2x memory, say), and make min and max the SAME.

    Now his videos are cute - better than cats.

  • (Score: 4, Insightful) by PizzaRollPlinkett on Thursday November 06 2014, @06:35PM

    by PizzaRollPlinkett (4512) on Thursday November 06 2014, @06:35PM (#113599)

    I have 217 processes active right now on my desktop box, 442 threads. And I'm not even building anything or running Oracle. If Linux has a least upper bound for the number of apps, I've never found it.

    I don't know what exactly the definition of an "app" is. Does it have to be a GUI foreground app? I've had Audacity and GIMP and umpteen Evince windows going. Not all apps are equal - Audacity chews up memory, and Oracle goes nuts using memory. I have a lot of browsers running for tests. I run sandboxed web servers, VirtualBox images, and who knows what else. I don't remember any real performance impact.

    My biggest problem is running out of swap space, but Linux lets you dynamically add files as swap now, which is nice.
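    For reference, adding a file as swap on the fly is only a few commands. A minimal sketch (needs root, and the path /swapfile is just an example):

```shell
# Create a 2GB file, mark it as swap, and enable it (run as root).
fallocate -l 2G /swapfile      # or: dd if=/dev/zero of=/swapfile bs=1M count=2048
chmod 600 /swapfile            # swap files must not be world-readable
mkswap /swapfile               # write the swap signature
swapon /swapfile               # start using it immediately
swapon --show                  # confirm the new swap area is active
```

    Undoing it is `swapoff /swapfile && rm /swapfile`; add an /etc/fstab entry only if you want the swap file to survive reboots.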

    The only thing for me that has ever crashed desktop Linux, other than hardware failure, has been graphics driver bugs. Linux locks up every few weeks because of the graphics. Nothing is wrong with the OS itself.

    Windows is about the same these days. I don't think I actually have 100 apps installed on my Win8 box, but I have several Visual Studio versions and umpteen web browsers. In the old days the resource limits in GDI were low, but that hasn't been the case for a long, long time.

    --
    (E-mail me if you want a pizza roll!)
    • (Score: 2) by cosurgi on Thursday November 06 2014, @06:56PM

      by cosurgi (272) on Thursday November 06 2014, @06:56PM (#113608) Journal

      Yep. And I also use Sawfish, because I think it's easier with it to handle 300-500 windows open across 24 desktops. I wrote a few Lisp scripts to make it even easier (like undo/redo of desktop changes). There's also a script that will undo/redo window movement, resizing, or any other WM action on windows, except closing them. Sometimes after three or six months I do "archaeology" and discover some long-forgotten gvim, gnuplot, or xterm windows with stuff I was doing half a year ago. Pretty funny :) Working hibernation plus a UPS helps with that too.

      --
      #
      #\ @ ? [adom.de] Colonize Mars [kozicki.pl]
      #
    • (Score: 0) by Anonymous Coward on Thursday November 06 2014, @08:42PM

      by Anonymous Coward on Thursday November 06 2014, @08:42PM (#113648)

      Our production system at work:
      sh-3.2$ who -u|wc -l
      380
      sh-3.2$ ps -ef|wc -l
      1750

      Actually the count is lower since it's late in the day. During the morning peak, it's not uncommon to see 450+ users and 2500 processes. Running 100 apps hardly seems like news.

  • (Score: 4, Informative) by LoRdTAW on Thursday November 06 2014, @07:30PM

    by LoRdTAW (3755) on Thursday November 06 2014, @07:30PM (#113621) Journal

    Opening 100 apps on a PC in 2014 isn't really a big deal. CPUs are plenty fast with two or more cores, and RAM is dirt cheap, with most mainstream PCs running 4-8GB and enthusiast rigs stuffed with 16+ GB. Maybe if it was the year 2000 it would be quite impressive. All you are doing is telling the OS to load a bunch of crap from disk to memory; games do that all the time. The only bottleneck is the disk, and the applications sit idle, not chewing up any CPU cycles once opened. Now, if we compared launching 100 of the same or similar applications on the same hardware testbed under different operating systems, then we'd have an article. E.g. Linux vs *BSD vs Windows 7 vs Windows 8.x vs Windows 10 preview. Then I might be interested to see how they stack up against each other when loading the same bunch of crap from disk to memory.
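    The Unix side of that comparison is easy to script. A rough harness, with /bin/true standing in for whatever application you actually want to benchmark (APP and N are placeholders):

```shell
# Time how long it takes to launch N copies of an app and have them all exit.
APP=/bin/true
N=100
start=$(date +%s)
for i in $(seq "$N"); do
    "$APP" &        # launch each copy in the background
done
wait                # block until all N launches have finished
end=$(date +%s)
echo "launched $N copies of $APP in $((end - start))s"
```

    For a real GUI app you would drop the `wait` and instead measure until all windows are mapped, which is harder to detect from a script.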

    You know what was a good demo? Time for a trip down memory lane:
    Back in the day there was a very interesting OS called BeOS. It was heavily threaded, had partial POSIX compatibility with a bash shell (though it was not Unix-based), and its scheduler was amazingly adept at managing resources. It was media-centric: simply dropping a decoder into the proper system folder meant any application could use that media format. So a simple paint program could open and save PNGs if the decoder was present. Same for audio and video: any player or recorder could manipulate a format as long as its decoder was present. The idea was that everything was abstracted by the OS API (vs the hell that was both Windows and even Linux at the time). On a single P2 400MHz you could play four or six AVI files and the UI was just as responsive as with no applications running. Instead of the videos stuttering, playback simply skipped frames and the audio continued to play. There was even an application that let you map video files onto a rotating OpenGL cube: each of the six sides could play a different video file, and you could rotate the cube or have it spin on its own (all software, no hardware video or GL). If you had an SMP box (rare back then until the ABit BP6 came around) you could open tons of applications, have them all do work, and everything stayed responsive. Dragging a window around under 100% load was smooth. The best demo was when someone happened to get their hands on an 8-way 533MHz Xeon (P3) system, installed BeOS, opened 30 videos, and they all played smoothly. That was impressive.

    The downsides to BeOS were: closed source, crap networking, no hardware OpenGL drivers, and it was single-user. The crap networking was the result of the OS pretending it was a microkernel OS when it really wasn't; JBQ once told me flat out it was marketing BS. It was a monolithic kernel that ran everything in user-space servers. Though when a server crashed because of a bad driver, the OS happily kept running and the server was restarted.

    It had a pretty good community built up around it; I used to frequent the IRC channel and everyone was pretty friendly. After Be Inc. went bankrupt and sold the remains to Palm Inc., the wind was knocked out of its sails. Be Inc. open-sourced a few components, and it lives on as a community-led effort to re-implement the entire OS as an open source clone: https://www.haiku-os.org/ [haiku-os.org] One of my favorite things about the OS was that when things went wrong, the error dialog box had a Damn button instead of an Ok button. Made much more sense.

    I ran it on systems ranging from Pentium 133's to a Pentium 3 1GHz and it always ran smooth. And that was all circa 2000. Now that was impressive.

    • (Score: 1) by novak on Thursday November 06 2014, @08:21PM

      by novak (4683) on Thursday November 06 2014, @08:21PM (#113638) Homepage

      Oh man, there's a name I haven't heard in a long time. I still have a BeOS install disk in my box of OS disks.

      --
      novak
  • (Score: 1) by novak on Thursday November 06 2014, @08:24PM

    by novak (4683) on Thursday November 06 2014, @08:24PM (#113640) Homepage

    It really is no biggie. I remember opening up over 100 Konqueror windows and comparing the memory usage to 100 Firefox windows, and that must have been at least 8 years ago, on a P4 with 1GB of RAM.

    --
    novak
    • (Score: 0) by Anonymous Coward on Friday November 07 2014, @03:23AM

      by Anonymous Coward on Friday November 07 2014, @03:23AM (#113734)

      That would have already been old news 8 years ago, except to people who were new to *nix. ;) 8 years before that, it was already old news too. In those old days ('90s), before tabbed browsing, I often had 50+ windows open, spread across a few virtual desktops. Actually, things slowed down substantially between then and now, but only for people running crappy heavyweight (post-gtk2) desktops.

      I've always been a couple years behind average with my hardware, and this has never been a real problem. It has pretty much "always" been the case that if you still have free RAM, you can open more crap and you won't notice unless it is a game that eats CPU, or some really awful software.

      Browser, GIMP, Emacs, dozens of xterms, a media player, another media player paused and minimized, a browser for the SO, filesharing client, IM, weather app, etc. And that doesn't stop a *nix user from compiling something large, like the Linux kernel. We never had to shut down programs to use other programs, unless we were out of RAM. Even if half of the programs were swapped out, performance would be pretty good, assuming you had RAM to hold the ones you were actively using. I'm talking about the '90s here.

      Some people are easily impressed.

  • (Score: 5, Insightful) by darkfeline on Thursday November 06 2014, @08:27PM

    by darkfeline (1030) on Thursday November 06 2014, @08:27PM (#113641) Homepage

    It makes me uncomfortable seeing all of the people posting here missing the point entirely. What this shows has almost nothing to do with RAM size or CPU speed, but with how well the kernel juggles memory and processes. Historically, Linux did a much better job than Windows in these areas. If the kernel is good, it can instantly schedule an idle process out of hundreds onto an available processing unit and have it respond immediately, but a poor kernel will thrash around getting it back into memory and running.

    --
    Join the SDF Public Access UNIX System today!
  • (Score: 2) by Marand on Thursday November 06 2014, @08:35PM

    by Marand (1081) on Thursday November 06 2014, @08:35PM (#113644) Journal

    If he's running 100+ different applications and only using ~4.5GB, he's clearly not doing anything with those apps. Firefox and Chrome will happily eat anywhere from 200MB to around 1GB in actual use, IntelliJ IDEA uses a similar amount, and a graphics editing app like GIMP or Krita will use multiple gigabytes of memory working on a single large image. Krita alone will easily use half or more of that 6GB if it's in use with a high-res image open.

    That's not even getting into games, which love using a gig or more at a time. Or, for the all-time best example: Minecraft, Devourer of RAM. Thanks to memleaks (including one nasty one with dimension loading), Minecraft will steadily consume your RAM until there's nothing left if you leave it open long enough, and adding mods makes it worse.

  • (Score: 1) by markus40 on Thursday November 06 2014, @10:01PM

    by markus40 (4862) on Thursday November 06 2014, @10:01PM (#113668)

    Just tried on my Lenovo 520 (16GB, 256GB SSD, Arch, Gnome 3.14):
    35 Firefox windows
    25 Gedit Windows
    25 Nautilus Windows
    25 Gnome-Terminals
    15 Libreoffice Write Windows

    6% CPU and ~8GB used; everything, including Gnome, responsive and smooth.

    Opening tasks with Plank, with autohide functioning smoothly. Closing also smooth; task switching with the overview instant and smooth.

    • (Score: 0) by Anonymous Coward on Thursday November 06 2014, @11:22PM

      by Anonymous Coward on Thursday November 06 2014, @11:22PM (#113687)

      everything, including Gnome, responsive and smooth

      So now we know what GNOME3's system requirements are!

      Lenovo 520, 16GB, 256 SSD, Arch, Gnome 3.14

  • (Score: 0) by Anonymous Coward on Friday November 07 2014, @01:06AM

    by Anonymous Coward on Friday November 07 2014, @01:06AM (#113706)

    I have 2 identical dual-core PCs: same RAM, hard drives, CPU, etc. One has Win7 and the other has Ubuntu Gnome 14.04. Winblows takes just under 2 minutes to load to a usable state; that is, you can't do anything until it gets to that point. Ubuntu takes only 20 seconds and has already checked email and weather via Conky. Winblows is sluggish, has to be restarted at least once per day, crashes, same old BS as all other versions all the way back to the Win95 days. Ubuntu is snappy and has never crashed on me, all the way back to version 10.04. Winblows takes about a minute to shut down, that is, if it doesn't pop up with the usual "something's still running" error or whatever. Ubuntu shuts down in 8 seconds. Yeah, I can't wait for the final "Windows is unable to load" BSOD; it'll get wiped and loaded with Linux when that happens. The only thing I dislike is Unity. It sucks, so I always get the Gnome version and then install gnome fallback/flashback.

  • (Score: 2) by meisterister on Friday November 07 2014, @01:28AM

    by meisterister (949) on Friday November 07 2014, @01:28AM (#113712) Journal

    I can probably run as many "apps" as I want. It's when I start running actual programs that things start getting bogged down.

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
  • (Score: 2) by halcyon1234 on Friday November 07 2014, @04:36AM

    by halcyon1234 (1082) on Friday November 07 2014, @04:36AM (#113746)
    I call bullshit. There's no way he's posting on Discourse AND doing anything else with his system. (Because it's 2014, and a forum needs 1.5GB of browser RAM to run, of course. Obviously. Fuck.)
    --
    Original Submission [thedailywtf.com]
  • (Score: 2) by shortscreen on Friday November 07 2014, @07:20AM

    by shortscreen (2252) on Friday November 07 2014, @07:20AM (#113754) Journal

    I booted Windows 2000 on an Athlon X2 desktop to see what it would take to overload it. I started off by launching just about every desktop icon and start menu program, plus a few other Windows programs I saw on the drive. About 40 different things (see below), but no games. No problems so far: 600-some MB of memory usage (out of 2GB of physical RAM).

    Then I opened 40+ instances of WordPad, and ran into a problem: "Cannot create new document." So I closed one WordPad and tried to open more instances of Pbrush. Same problem again. OK, so then I went to Opera, which seemed to be working fine, and tried to open a zillion tabs. After only 19 tabs (which isn't much really; I've had far more tabs than that open at once before, although not with 80 other programs running in the background) the right-click menu stopped appearing. And when I took a screenshot and pasted it into IrfanView, I could not save the file (until after closing something), so I guess I reached some kind of limit.

    Handles 13661
    Threads 536
    Processes 108
    Commit Charge 871368KB

    programs started:
    CrystalCPUID, Notepad, Winamp, VLC, Irfanview, MAME32Plus!, Firefox 3, ePSXe, Opera 9.25, Opera 9.6, Foxit Reader, KbMedia Player, QupZilla, Hyper Terminal, Sound Recorder, Calculator, Command Prompt, Imaging, Pbrush, HD Tach, MS GIF Animator, 3DMark2001 SE, 3DMark03 Pro, 3DMark99Max, ImgBurn, Sisoft Sandra, CINEBENCH R11.5, vStrip GUI, SmartRipper, VirtualDub, CDRWIN, JoyToKey, Media Player Classic Homecinema, Excel 97, Word 97, RivaTuner, WinPlay3, Word For Windows 2.0

    plus a few more cmd, and 40 WordPads

  • (Score: 0) by Anonymous Coward on Friday November 07 2014, @08:27AM

    by Anonymous Coward on Friday November 07 2014, @08:27AM (#113757)

    15 years ago the limit was 128 applications, which was some hard-coded limit in the X server (top-level windows). I don't know if that limit still exists.

    If you stuck to the command line, you didn't have that limit. I tried with a process using 100% CPU, and after starting a thousand of those, top reported a load average of 1000. The system was still fully responsive (I don't remember whether or not I had "niced" those 1000 processes).
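    That experiment is easy to reproduce at a politer scale. A minimal sketch (N is arbitrary; the original poster used 1000):

```shell
# Spawn N pure-CPU busy loops, watch the load average climb, then clean up.
N=4
pids=""
for i in $(seq "$N"); do
    sh -c 'while :; do :; done' &   # one spinner per iteration
    pids="$pids $!"
done
sleep 2
uptime          # the 1-minute load average heads toward N
kill $pids      # stop the spinners
```

    Since each spinner is pure CPU with no I/O, the scheduler just time-slices them; interactive processes stay responsive as long as they get scheduled, which is exactly what the poster observed.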

    That was probably with 128 MB of RAM and a 600 MHz Pentium 3. On modern hardware, even Windows should do a lot better than that. I'd actually expect Windows to do better than Linux in the GUI test, assuming that the 128 top-level windows limit in X hasn't been fixed.

  • (Score: 1) by pgc on Friday November 07 2014, @12:33PM

    by pgc (1600) on Friday November 07 2014, @12:33PM (#113785)

    This is NEWS ?!?