from the it-worked-until-it-didn't...-now-what? dept.
Apple has had an incredible decade. Since the iPhone debuted in 2007, the company's sales have jumped tenfold. The stock has soared over 700%. And up until last November, it was the world's largest publicly traded company. But two weeks ago, Apple issued a rare warning that shocked investors. For the first time since 2002, the company slashed its earnings forecast. The stock plunged 10% for its worst day in six years. This capped off a horrible few months in which Apple stock crashed about 35% from its November peak. That erased $446 billion in shareholder value—the biggest wipeout of wealth in a single stock ever.
[...] Despite the revenue growth, Apple is selling fewer iPhones every year. In fact, iPhone unit sales peaked way back in 2015. Last year, Apple sold 14 million fewer phones than it did three years ago.
[...] In 2010, you could buy a brand-new iPhone 4 for 199 bucks. In 2014, the newly released iPhone 6 cost 299 bucks. Today the cheapest model of the latest iPhone X costs $1,149! It's a 500% hike from what Apple charged eight years ago. [...] In 1984, Motorola sold the first cell phone for $4,000. The average price for a smartphone today is $320, according to research firm IDC. Cell phone prices have come down roughly 92%. And yet, Apple has hiked its smartphone prices by 500%!
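The percentage figures above are easy to sanity-check. A minimal sketch using only the prices quoted in the excerpt (note the article's "500%" is a rounded version of the actual ~477% increase from $199 to $1,149):

```python
# Sanity-check the price-change percentages quoted in the article.
# All dollar figures are the ones the article itself cites.

def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

iphone_2010 = 199       # iPhone 4 launch price quoted above (carrier-subsidized)
iphone_now = 1149       # cheapest latest-model iPhone quoted above
motorola_1984 = 4000    # first Motorola cell phone
avg_smartphone = 320    # IDC average smartphone price

print(round(pct_change(iphone_2010, iphone_now)))      # 477 (the article rounds to "500%")
print(round(pct_change(motorola_1984, avg_smartphone)))  # -92 (the "roughly 92%" decline)
```

Note that the iPhone comparison mixes a subsidized 2010 price with an unsubsidized current price, a point raised in the comments below.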
[...] Twelve years ago, only 120 million people owned a cell phone. Today over five billion people own a smartphone, according to IDC. [...] now iPhone price hikes have gone about as far as they can go. [...] A publicly traded company that makes most of its money from selling phones is no longer telling investors how many phones it sells!
Related Stories
Apple's 'courage' to remove the headphone jack has created a brave new world
It was barely two years ago when we lamented the loss of the headphone jack on the iPhone. The iPhone 7 had just arrived with a gorgeous jet black color, a solid-state home button, and a dongle in place of the 3.5mm headphone jack. At the iPhone 7 introduction, Apple VP Phil Schiller talked about having the "courage" to make the change, to leave the headphone jack behind.
At the time it was kind of cringe-worthy. Rather than try to convince the audience of the benefits of wireless audio or the annoyances of wired earphones, Schiller basically told the audience that they might not understand now, but one day they will. You could hear the snickers in the audience when he said that removing the headphone jack required the "courage to move on and do something new that betters all of us." It sounded ridiculous. All we could see was the inconvenience ahead.
But you know what? He was right.
It might have sounded like the reality distortion field on steroids, but Apple's decision to remove the headphone jack from its most popular product wasn't a flippant design whim. It was the start of a new strategy that would bring convenience, simplicity, and downright delight.
The move led to courageous sales of AirPods.
See also: Poll: Looking back now, did Apple exhibit 'courage' in removing the headphone jack from iPhones?
Related: New Moto Z Omits Analog Headphone Jack; Adds Moto Mods
Bring Back the Headphone Jack: Why USB-C Audio Still Doesn't Work
Apple on the Decline
(Score: 4, Interesting) by SomeGuy on Tuesday January 22 2019, @07:46PM (6 children)
Who would have thought. You fill a market to saturation, and eventually many people realize they don't even NEED the product. iPhones are DEAD! Bury them right next to the DEAD laptops and DEAD desktop computers! And leave DEAD pencils and paper on their grave!
And you will still have to throw it away after a couple of years.
Meanwhile, I am using the same desk phone I purchased (probably from Radio Shack) in the 1980s, and when people call me from other real phones they actually don't sound like robots trying to fuck my ear!
(Score: 4, Insightful) by bob_super on Tuesday January 22 2019, @08:12PM (5 children)
> And you will still have to throw it away after a couple of years.
I positively hate Apple, but you can expect their phones to get updates for 4 or 5 years.
Though the people who spend over a grand on a phone rarely want to ever be seen with a phone from three years ago.
All the gimmicks of making phones bigger, adding more cores, and a ridiculous number of cameras are coming to an end in the next three years. I'm not sure what the marketing guys are gonna invent next, but the overall market will look a lot different soon.
Saturation, commoditization... expected until the next "killer" app/device.
(Score: 5, Insightful) by DannyB on Tuesday January 22 2019, @08:36PM (2 children)
Like typical desktop computers, phones eventually have enough cpu, memory and storage. Once that point is achieved, adding more does not really benefit most users, but continues to increase the price.
Most PC users now could do with a PC that easily costs under a few hundred dollars. A similar thing will be true with phones.
Once most phones have plenty of capacity and capability, the thing users start looking for is for the SAME phone to get cheaper, not to have newer phones with more capacity than they need. Although cameras seem to continue to be improved. But again, I speculate that there comes a point where it is simply enough.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 3, Informative) by bzipitidoo on Tuesday January 22 2019, @09:29PM (1 child)
> Most PC users now could do with a PC that easily costs under a few hundred dollars.
This is a big reason why I've gone for low power PCs-- the stick computers, that Intel NUC, and of course the laptops and tablets. Cuts down on the noise, and still has enough power to run a desktop reasonably snappily. And can still play most high end games, albeit at lower framerates and resolutions. Though one problem I've had is that the embedded Intel HD graphics are prone to overheating. The computer can grind away at a CPU intensive math problem, but not actually use the 3D accelerated graphics, not without some adjustments.
(Score: 2) by Freeman on Wednesday January 23 2019, @05:17PM
You still get more bang for your buck building your own AMD machine. I've only built AMD over the past decade and am quite happy with my choices. None of the computers I've built have died over that time period. I had some stability issues with my current setup, but I got a 1st-gen Zen CPU and motherboard. Not sure whether the AMD updates or motherboard updates eventually fixed the issues, but during that time I know I also had some issues with Windows 10. I think I just paid the early adopter tax on the hardware and the software.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by Mykl on Tuesday January 22 2019, @10:51PM (1 child)
Agreed - though I think we have already reached the point where the specs are "good enough".
I'm running with an iPhone 6s that's now about 3.5 years old, and have no intention to change any time soon. The replaced battery is not fantastic, but I have a charger at my desk so it doesn't create huge problems. Not only are the specs good enough for me, I actually prefer the slightly smaller form factor (it is still just a little too big in my opinion - the iPhone 4 was a great size as I recall because you could reach the whole screen with your thumb when holding with one hand).
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @03:51AM
You need the new iThumb extension!!
or,
You're holding it wrong.
(Score: 5, Informative) by EvilSS on Tuesday January 22 2019, @08:30PM (1 child)
This is a bit disingenuous. Yes, you could, but that included heavy carrier subsidization and a lock-in contract. You paid full price, you just didn't realize it because the carrier built it into their plans. Full price for an iPhone 4 was $650-$750 ($770-870 in 2017 dollars) depending on specs. Yeah, still a leap compared to today, but comparing subsidized prices then to unsubsidized prices today isn't a fair comparison.
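The inflation adjustment in that comment can be roughly checked with a CPI deflator. A hypothetical sketch, assuming annual-average US CPI-U index values of 218.056 for 2010 and 245.120 for 2017 (these values and therefore the results are assumptions for this sketch; they land somewhat below the $770-870 quoted above, which presumably used a different deflator):

```python
# Rough inflation check for the unsubsidized iPhone 4 price range cited above.
# CPI-U annual averages below are assumptions for this sketch.
CPI_2010 = 218.056
CPI_2017 = 245.120

def to_2017_dollars(price_2010):
    # Scale a 2010 price by the ratio of the two CPI index values.
    return price_2010 * CPI_2017 / CPI_2010

for price in (650, 750):  # full iPhone 4 price range cited in the comment
    print(price, "->", round(to_2017_dollars(price)))  # 650 -> 731, 750 -> 843
```

Either way, the qualitative point stands: in real terms the jump from an unsubsidized iPhone 4 to today's cheapest flagship is far smaller than the headline "500%" suggests.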
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @02:18AM
Ya, that subsidized iPhone 1 vs non-subsidized iPhone Xs comparison was nothing more than shock-and-awe clickbait, but it did make me actually take a few moments to think "Waaaaaait a minute...I'm sure the original iPhone was more like $600". I just looked at my unfriendly-neighborhood cellphone company's website, and a 2-yr subsidized iPhone 6, 7, 8, & X all start at $0 (Canadian dollars), and the iPhone Xs Max is $589.
Until Apple released the X, the iPhone price hadn't increased that much since it was first released: they all started around $600-$800.
I have a 7 and won't upgrade until it dies or long after it stops receiving updates, which I am happy Apple provides for at least 5-6 years.
(Score: 5, Interesting) by DannyB on Tuesday January 22 2019, @08:33PM (12 children)
I remember when Apple was a great company. The classic Mac days. Apple led in technology. BYTE magazine wrote that the history of the microcomputer industry was an effort to keep up with Apple. Mac fanboys used to laugh at PCs. DIP switches? Motherboard jumper pins? Cryptic autoexec.bat and config.sys? The difficulty of what should be simple: memory expansion, hard drive expansion or addition, CD-ROM addition. Mice that plugged into the serial port and required configuration, and didn't just work at boot time. The fact that there was a BIOS settings screen instead of just a happy Mac face starting the boot with no opportunity to intervene. Drive letters? Really, drive letters? Drive C? Eight-character file names, and no international character sets?
Macs that had a "cpu unit" with a separate monitor could be taken apart with bare hands. No tools. Whereas PCs often required tools, and I swear the manufacturers actually PAID someone to sharpen the metal edges inside the computer's interior.
Those were fun days.
Apple tried and failed three times to build its next generation OS.
They ended up buying NeXT. (I had been hoping they would buy Be, whose BeOS was IMO a much better fit.) But no. NeXT had Steve Jobs. I guess Apple forgot that there was a reason why Steve Jobs was stripped of his power at Apple... Steve chose to leave Apple on his own. Once he left Apple, Steve built a computer (NeXT) that had everything that he wouldn't allow the Mac to have. Color. A separate cpu box from the monitor. Large memory capacity. Etc., etc.
Once Steve was back at Apple, thus was planted the seeds of Apple's eventual downfall.
Mac OS X based on NeXT. Very different, but was adapted to look more like a Mac. Incompatible with classic Mac apps. And in a single stroke, Apple abandoned compatibility with all of the expensive Macs (some were $5000 or more) that were in existence. Mac OS 9 was the end of the line.
Then came the iPhone. I thought it was great. But I noticed how Steve was going to repeat past mistakes and not license iOS to other phone manufacturers who had been in the phone business for a very long time before Apple joined. Sort of like how, if Apple had licensed the Mac OS in the 1980s, there probably would never have been a Microsoft Windows. But OTOH, in hindsight, I think Steve Jobs with concentrated power would have been worse than Bill Gates and Steve Ballmer put together.
Then came the inevitable patent wars. Because patents could be had on things like bouncy scrolling, or slide to unlock. I could say more about the patent wars, but I'll draw to a close.
After the end of Mac OS Classic, I got interested in Linux and never looked back. I looked at everything that could eventually make Linux mainstream and successful. One thing I was interested in was any type of Linux powered phone. I liked Palm's Web OS, but Palm was too inept to even understand what they had. I settled for Android becoming the winner because I could do some of the things I dreamed of. Rooting it for example. I viewed the very closed iOS as something that worked against the rise of Linux. So I no longer thought so fondly of the classic Apple.
I suspect Apple will be around a long time. Hopefully a lot less powerful. If it passes, I will have some sadness remembering the good ol' days, but I wouldn't be too sad.
Do remember this: not so many years ago Microsoft was so powerful it seemed invincible. I recognized around 2010 that there was a day coming where Microsoft's best days would be behind it. And we have long since crossed that point now. Microsoft is trying to embrace everything open source. Who would have thought.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 2) by bzipitidoo on Tuesday January 22 2019, @09:34PM (1 child)
For me, classic times were the days of the Apple II, not the Mac. In the early 80s, I had an Apple II+, with a 16K expansion for a total of 64K RAM. And then, I never made the jump to the Mac. Instead, seeing that PCs were everywhere, I switched to the PC, and that's where I still am today.
(Score: 2) by DannyB on Wednesday January 23 2019, @02:42PM
I remember the Apple ][. We used the p-System (and Pascal language). On the Apple ][ and Apple ///, we used Apple Pascal which was extremely close, except the binary formats were different. (Apple Pascal was forked from the p-System 2, and we used the p-System 4 on all other platforms, IBM PC, Corvus Concept, etc.) But it was source code compatible, which was fantastic to build a commercial accounting system and sell it on multiple platforms.
Our favorite was the Apple /// over the Apple ][. The SOS (operating system) was really nice. Ahead of its time. The 3 was a great software design, plagued by a few hardware problems.
Alas, we had to give up on the Apple ][ due to limited maximum memory. But it was fun times. We looked at the Lisa very carefully, but did not bite. When we saw pre-release Macs, and even had access to one ahead of release, we decided to get into Mac development -- which meant buying Lisas, because they were the development platform for the Mac at that time. Then we were pleasantly surprised that Lisa Pascal, and later MPW Pascal, were quite compatible with Apple's p-System based Pascal. Once I was working on the Mac I never gave the PC another look. (But I had written 8086 code for the PC to do our fancy quick snappy text window scrolling and updating within 'windows' on the text screen. That x86 code was called from p-System Pascal.)
It was sad to see the end of the classic Mac. I had my hopes on Apple's great new Mac OS which never materialized. Except in the ugly form of OS X.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 4, Insightful) by Thexalon on Tuesday January 22 2019, @09:47PM (2 children)
I've generally assumed that the overall geek-friendliness of Apple started dropping more-or-less the day Woz stopped being part of the day-to-day. Woz was always more about making cool stuff than about making oodles of money, and it showed in what he was involved in designing.
But the obvious problem now is that Steve Jobs' reality distortion field is no longer in effect, and the iPhone never was even close to as insanely great as he was capable of convincing others it was.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 2) by PartTimeZombie on Tuesday January 22 2019, @10:50PM (1 child)
To be fair, the original iPhone was pretty cool. I was supporting a bunch of Nokias and a couple of Windows phones (Win CE 6.5 maybe?) when they came out.
Setting up email for the user was an absolute doddle, and it was a huge amount better for the end user too.
The various Nokias we had were pretty good phones, but the "smart" bit was stupid as hell, and the displays were too small to do much with.
The other thing the iPhones did was show just how shit Blackberries were for the end users. We had one executive who demanded a new Blackberry every time a new model came out. After the other senior people all got iPhones they sniggered behind his back. To be fair he was an absolute bellend.
The Win CE phones (we had two I think) were shit from whatever angle you looked: poorly thought-out interface, buggy as hell, they dropped calls, missed txts, and needed restarting every morning. Just rubbish.
(Score: 3, Interesting) by PocketSizeSUn on Wednesday January 23 2019, @10:52AM
For business people (and especially those that travel outside the US), e-mail and BB PIN messaging were the killer apps. Nothing else was even worth wasting time on, and while there was a lot of 'ooh ooh shiny' uptake and people 'demanding' iPhones because they were status symbols, the bulk of the business world kept using BlackBerry because:
- e-mail arrived faster w/o blowing out the battery.
- unlimited data plans (including unlimited international data plans).
For people doing business (like your bellend) those e-mails *are* the business... and as for your iPhonys, I would stipulate that there may not have been 'bellends', but they were most probably utterly useless middle management, and it would take 6 months to notice if they just stopped showing up.
It wasn't until Android matured enough to have push Gmail that BlackBerry actually started to lose market share. That caused BB to try to 'catch up' to Android/iPhone by being more of an app-phone (like iPhone/Android). At that point their devices necessarily got data hungry, which caused the unlimited plans to go away. All that was left was the PIN-to-PIN messaging. WhatsApp happened to be where everyone went, and at that point the deal was sealed and the rest is history.
(Score: 2, Interesting) by Anonymous Coward on Wednesday January 23 2019, @02:52AM (1 child)
I don't agree with your assessment of Steve's return to Apple in the 90's. That Sculley guy almost destroyed Apple by releasing mediocre and overpriced products, like Cook is doing now. People stopped buying Macs. OS9, or OS8, or maybe both I don't remember, was a disaster. They didn't respond to Windows 95 properly. Win95 offered good enough plug-and-play that worked (most of the time), good enough stability, good enough memory management, and for $2000 you could get a high-end Pentium 200 PC that outperformed and was half the price of anything comparable that Apple had. Almost all software companies were jumping ship and developing Windows software. Apple was dying and had just enough life left to maybe barely survive into the early 2000's.
If Jobs hadn't come back then, Apple and the Mac would be just a memory today and would be in the same "What if..." category as the Amiga. Microsoft even invested a few hundred million into Apple in the 90's, just to keep the antitrust people off their back.
(Those were exciting times in personal computing....god I miss those days.)
I'm not a Jobs fanboy at all, but he is the reason why Apple went from a niche, almost bankrupt company in the late 90's to the richest company in the world based on stock valuation.
I must agree that BeOS 4 was an amazing OS. If it just would have been released 5 years earlier...who knows what might have been?
(Score: 3, Interesting) by DannyB on Wednesday January 23 2019, @02:51PM
As I seem to recall from regular MacWEEK issues of the day, Apple was almost destroyed by Sculley's successor. Can't quite remember his name.
Apple was pushing the PowerPC as the future. First they had to convince developers. And they did. Then they had to convince customers, and slowly, they did.
But Apple didn't believe it themselves. So they made a billion dollars' worth of 680x0 Macs that -- surprise -- nobody wanted to buy, because PowerPC was the future.
That billion-dollar writeoff was part of what brought Apple to its knees. Along with failures in developing its next generation OS.
Mac OS 8 and 9 came out. But clearly the new OS was nowhere to be seen. Meanwhile Apple's Mac PowerPC hardware ran BeOS which was looking pretty cool.
I do have to agree with you about that. But by that time I had seen through the reality distortion field and saw Jobs for what he really was. A huckster, IMO.
It is amusing that when Jobs returned he had given up on the errors that got him stripped of his power by Sculley. He no longer put such ridiculous artificial technical limitations on products. Like the original Mac having only 128K and never to ever have any more than that. Forever, amen. And never ever to have color. And never to have a separate cpu box and high quality monitor. And never to have expansion slots. None of those things came until Jobs was gone from Apple. Apple might have also been a distant memory if Sculley had not taken action to limit the damage Jobs was doing. Developers were screaming for memory, powerful CPUs, high end color graphics and monitors. Expansion slots. In fact, it wasn't until the Mac II (post Jobs) that the Mac really came into its own.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @06:15AM (2 children)
CMP killed BYTE.
(Score: 2) by DannyB on Wednesday January 23 2019, @02:52PM (1 child)
What is CMP? An assembly language instruction? Or some kind of PC magazine I never heard of? (But I was a Mac fanboy by then.)
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @08:11PM
CMP Publications, I believe is what the poster was referring to. Now a part of UBM Technology Group.
(Score: 4, Interesting) by ledow on Wednesday January 23 2019, @08:41AM (1 child)
You remember it differently than I do, or we had very different experiences.
The 80's - Apple was non-existent. I don't remember a single mention of them until I saw one guy who had a Macintosh (128K). The ones with the tiny integrated screen. I was mad-keen to have a look at it because it was unusual, and I had no idea what it was. The screen seemed ridiculous even back then. It was, I realise now, a status symbol. Everyone else who had a computer (which commercially wasn't many people, admittedly, unless you count those home computers we all had that plugged into the TV) thought it ridiculous. Nothing serious was ever done on it. I mean... you couldn't. Squinting at that thing for hours on end was silly. And the cost was exorbitant.
The 90's - Pretty much Apple-less. The iMac G3 was the highlight and the only thing Apple that I remember at all. It seemed a nice idea - integrate that TV that we'd all been using into the computer itself. However, PCs suddenly were everywhere. There was only a clear distinction between those people with "a PC" (meaning an x86/IBM-compatible) and those people who only had non-computers (games consoles, etc.). That was it. That was the difference. The average person didn't have a computer unless they'd used them for work. The G3 highlight was "look how pretty it is". That's why they put it in movies in kids' bedrooms, and why it came in 20+ colours. I literally never knew a single person who owned one, but I helped one friend throw one away years later. I have absolutely no memory of them ever owning it. This was still the 68000-era. Macs were completely different architectures. My university had a single Mac suite on site. And no less than four PC suites (at least, that's as many as I had access to), and thousands of PCs everywhere. The Mac suite was full of graphics and multimedia students, and serviced by a guy who did nothing else but Mac because the normal tech support wanted nothing to do with them. Not one lecturer had a Mac, everything was done on PC.
The 2000's - Again, Apple-absent. This is the decade they copied PCs and became incredibly expensive x86 PCs. They grew in terms of mobile devices and iPods, but I still didn't know anyone who had a Mac for anything more than show (i.e. they did no real work on them, and when they tried they weren't even aware that Apple Pages etc. wasn't the format to send stuff around in - which shows that they rarely, if ever, exchanged documents with the wider world). Yet by now everyone had a PC even if it was a big clunky thing in the corner of the living room. This is where Macbooks started to come out and were just incredibly expensive toys that only students seemed to be able to afford (which, having graduated before then, I found very odd as I'd literally not been able to afford even a basic laptop for my courses in the 90's).
I've never seen them as anything more than a status symbol. People buy them because other people buy them. To a man, I rarely witness the people who buy them using them as a tool. Merely a gadget. A TV-watching browsing device. They are big iPhones, basically, same as the iPad. Even the "creators" and arty people have moved away from them.
I manage IT in schools. We've ditched all Apple technology and are just using what remains on site until it dies (not just "no plans for replacement" if they break but "deliberate plans to get rid of them all and replace with something else that isn't Apple"). The only Macs in real use are in the hands of people who tell me they "trained in design". These people can barely adjust a photo in Photoshop despite literally elbowing others out of the way to "show them how it's done" because they trained on Photoshop. A few of the young kids who work for the schools (on their gap-years, teacher-training etc.) have Macs that - and I know this for a fact as I have their web logs - get used for Chrome, Netflix and iTunes. That's it. The biggest use of them is "opening things we've been sent in Apple Pages" (I kid you not) and "video-editing" using iMovie. I introduced VSDC video editor three years ago and iMovie literally died there and then. Finished. Done. Game over. Because of a bit of freeware. Not least because freeware-on-the-oldest-business-PC without any graphics acceleration literally wiped the floor with iMovie on any Mac we had.
I don't get it. Apple's "revolution" happened far away and out of the public eye, in my experience. I have seen / touched / used literally a thousand times more PCs than Macs. But, hey, I only work in IT. About the only "common" item I see is iPhones. I joke - but have also as part of the joke started to keep count and take notice - that every iPhone I ever see has a cracked screen. I think the running total (of unique iPhones that have come to me for anything - wifi settings, app install, etc.) is currently up in the 50's. That's literally 50 different devices, on the trot, without seeing one that didn't have a cracked screen. By comparison, with other brands (which have included Windows Phones!) it's the opposite - it's 50-something before I *see* one with a cracked screen. Given that I get to see everyone's devices before they can join our networks, that's rather telling.
I never got it. I never bought the hype. And everything they sell is at least 2-5 times more expensive than it needs to be. I think there is a literal "I've paid the money now, I need to tell people it wasn't a waste" element, as well as a "Look what I have" and even a "I don't know how to use it, and I have no idea what I bought, but isn't it pretty?" factor. And that has applied to every Apple device I've ever seen in my life. Which is less than one percent of all the computers that I've ever dealt with in my life.
(Score: 2) by DannyB on Wednesday January 23 2019, @02:57PM
We do have different experiences. Once I got started on the Mac in late 1984, I never looked back. I lived in an entirely Mac centric world until 1993 when I did go back to doing some cross platform work with Mac and PC Win 3.1. But I thought of Win 3.1 as quite an inferior toy. In every possible way.
But... I was impressed with VB and Access at that time. I even suggested to the company president that maybe we should not do cross platform development any more and our next platform should be VB / Access or something similar. We surveyed our customer base and even at that time it was 56% Mac. That was because our accounting system was one of the very few that ran on the Mac, so naturally we got a lot of the Mac customers who needed our system.
So we continued to be cross platform to this very day where everything is now web based. :-)
But I stopped being a Mac fanboy in about 1999 when I got my first Linux box with SuSE 5.1.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 4, Insightful) by Anonymous Coward on Tuesday January 22 2019, @08:35PM (9 children)
Apple has moved to larger phones, and gotten rid of the headphone jack. Their recent iPhones ditched the TouchID fingerprint reader as well, in favor of face-scanning.
This past Saturday, a limited stock of new iPhone SEs was put up on the clearance section of their online store for the US. These are the smaller form-factor phones with those outdated headphone jacks and archaic TouchID that Apple apparently thinks no one wants anymore. The iPhone SEs sold out within hours.
They went four years between updates of the Mac Mini. They added a Touch Bar to their MacBooks to replace function keys. It's been over five years since they updated the Mac Pro. Half their products are using USB-C connectors, half are still on Lightning cables.
If they have a strategy, it isn't reflected in their product lineup. It looks like they're thrashing around without a clue as to what to do next.
But Microsoft's still stupider, and Google's still eviler, so they got that going for them, which is nice.
(Score: 4, Interesting) by DannyB on Tuesday January 22 2019, @08:41PM
About the headphone jack . . .
I look a few years back. The Samsung Galaxy S5 seemed like the peak of phones. Consider:
* headphone jack
* replaceable battery
* WATER PROOF -- that was amazing!
Since then, it has been downhill. Even for Samsung, who made the S5. The next generations had glass backs. Non-replaceable batteries. Were not waterproof. It's like the disappearing headphone jack.
Why could they once make good phones, and now phones evolve in a direction that is actually consumer unfriendly?
Oh, and the preloaded non-removable apps.
What I now use: Google Pixel 3 XL. Was using: Google Nexus 6P. Both are nice. But expensive. Do these phones really have to be that expensive?
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 2) by takyon on Tuesday January 22 2019, @08:55PM (7 children)
The whole industry has moved towards larger phones and getting rid of the headphone jack (even OnePlus [gizmodo.com] killed the jack). But I'm sure there will always be a handful of Android phones with smaller screen sizes and/or headphone jacks.
Fingerprint scanners and face ID are both bad since cops/feds could physically force you to unlock, and just lie about what happened if called out on it in court. Don't use biometric features if they are included.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Tuesday January 22 2019, @08:59PM (3 children)
Alternatively: don't break the law.
(Score: 5, Touché) by takyon on Tuesday January 22 2019, @09:05PM
The law is whatever the man with the gun and badge says it is. Keep licking boots.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 1, Touché) by Anonymous Coward on Wednesday January 23 2019, @08:10AM
Tell that to the government.
(Score: 2) by DannyB on Wednesday January 23 2019, @03:00PM
That is good advice. Too bad the cops don't follow it.
Sorry officer, my mistake. When I saw all the police cars parked here I thought I had found the donut shop I was looking for.
If we sing a slaying song tonight, what tools will be used for the slaying?
(Score: 2, Interesting) by Anonymous Coward on Tuesday January 22 2019, @09:33PM (2 children)
Yes, the whole industry is certainly moving towards no headphone jack. When I can't get any phone with a jack, I'll get a feature phone and use my iPod classic and iPod nano (both still working) until they die. Then I'll get some standalone mp3 player with a jack.
I move from one audio source to another all day long. My phone, to the computer, to the console, sometimes even the radio -- and back. There's no way in hell I'm going to resync Bluetooth headphones that many times a day, nor will I get separate headphones for each source.
I assume one reason they're ditching the headphone jack is because they want users to stream everything to that device, and jacks make it too easy to switch audio sources. All must come through the single device, the better to charge you for it, and track what you do...
(Score: 2) by Mykl on Tuesday January 22 2019, @11:02PM (1 child)
I don't think it's that sinister:
I prefer the headphone jack too, but I can see the writing on the wall.
(Score: 0) by Anonymous Coward on Tuesday January 22 2019, @11:33PM
You're both right.
Not intentionally sinister, good reasons for Apple to have gotten rid of it, more challenging to waterproof.
But it also conveniently ties people deeper into more expensive technology; Apple didn't HAVE to design it so a jack can't fit, and waterproofing with a jack IS possible, and would have been a good feature for a company REALLY trying to look out for its users instead of just giving them more candy.
And my anecdote isn't data, but I'm with GP. My iPhone 6 Plus (which I got on contract after the 7 or 8 was out...) will be the last iDevice I own unless they bring back the jack, at least as long as any other smartphone still offers one. Lost too many Newton Dongles to care about Apple's adapter, and all the software I really care about is device-agnostic.
(Score: 2, Interesting) by Apparition on Tuesday January 22 2019, @09:15PM (10 children)
Here's the problem as I see it, and I know that this will be a bit contentious here: The iPad is still the best tablet, bar none. I'm sorry, but Android is a terrible OS in general, and an even worse OS for tablets. Is the iPad expensive? Heck yes. Is iOS a walled garden? Definitely. But, unlike an Android tablet, the apps work, and iOS spies on its users far less than Android does. If you want, you can read this article [bleepingcomputer.com].
Yes, I know. LineageOS. Root your device. I'm sorry, that's just a complete non-starter for your average user, especially since most recent Android devices sold in North America are completely unrootable.
So the options are to buy cheap Android devices that literally spy on you fourteen times per hour, or to buy expensive iOS devices that respect your privacy far more than Google does.
So until something better comes along (I was really rooting for the Jolla tablet), I'll stick with Apple and iOS.
(Score: 5, Insightful) by jmorris on Tuesday January 22 2019, @09:30PM (5 children)
You can't easily copy your own media to an iProduct, you don't get an SD card slot, you can't sideload your own apps, and you can't get root. On what planet is that "best" when it isn't even an option?
(Score: 2, Informative) by Apparition on Tuesday January 22 2019, @09:45PM
There's this thing called Nextcloud that lets you copy your own media. It even has an iOS app; you should look into it [nextcloud.com]. As for an SD card slot and not being able to get root, that's also the case for most modern Android devices, so I don't see any difference there.
(Score: 2) by PartTimeZombie on Tuesday January 22 2019, @11:01PM (3 children)
Well, goodness. Here I am agreeing with jmorris. Quick, someone check if the lion is lying down with the lamb and whatnot.
In about 1992 I worked in the pre-press industry and we began using Macs. They were better than PC's and it was not even a debate. They were better. Full stop. (Or period, if you must).
As the years went by, Windows happened, then new versions of Windows happened, and Windows became more and more stable, and worked pretty well, then very well, then Apple's products were not "better" any more. Either option was fine.
Somehow, all my friends who still work in that industry still claim Apple stuff is "better", because it "just works", even though any other option "just works" too.
My point is, that iPads are fine. They work fine.
Android tablets work just as well though. Anything you want to do with an Android tablet you can, and just as well.
It is not 1992 anymore.
(Score: 2) by Apparition on Wednesday January 23 2019, @12:09AM (1 child)
Most Android apps look and work like blown-up phone apps on tablets (which is because they are). As I pointed out above, iOS also respects its users' privacy a lot more than Android does. So while Android may also work, I'd rather use a mobile OS that respects my privacy.
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @09:14AM
Then use Replicant, even if the phone selection is limited. The problem with iOS is that it's a proprietary jail; it does not respect users' freedoms at all. And even if it does respect your privacy now, that could change at any time, given that you have no real control over it.
(Score: 2) by hendrikboom on Wednesday January 23 2019, @12:41PM
I want to use monotone to sync the documents I write between my Android tablet and my computer.
(Score: 3, Interesting) by crafoo on Tuesday January 22 2019, @11:28PM (2 children)
The Surface Pro is far superior to Apple tablets. Even encumbered with Windows 10, it's still far better.
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @03:10AM
I've played with my girlfriend's Surface and I really liked it and was very impressed. She squealed with delight as she could see how impressed I was and how much I was enjoying it.
Like most people, I am not a fan of Windows 10, but as far as the Surface goes, for a similar price to an iPad Pro you get a real laptop that also works very well as a tablet and comes with a real OS that can run real applications, not just dumb single-function apps.
This is Apple's problem: they have lost their imagination and are currently releasing products based on momentum built up during the Steve Jobs days. Their obsession with "thin" is an obsessive compulsion. A Surface-like device running a full-fledged version of OSX is something I really would have expected Apple to come out with, certainly not Microsoft. Apple today feels like the stagnant Apple of the mid-90s that almost went bankrupt. Unfortunately, there is no Steve Jobs to come to the rescue this time.
(Score: 2) by hendrikboom on Wednesday January 23 2019, @12:45PM
The Surface Pro looked very nice, indeed. But can it dual-boot Linux and Windows? Will it still do that after a Windows forced upgrade?
(Score: 2) by PocketSizeSUn on Wednesday January 23 2019, @11:12AM
Yes .. you can tell the phone not to do that .. not sure how much the google apps honor the request though.
And of course that location information could be misused, or used to incriminate.
One of the primary uses of that data (well that I use a lot) is related to maps ... like when a place is busy and traffic information.
On the other hand, if you are worried about your location history being used to discover that you went to a 'bad' part of town and spent a few hours at some 'shady establishment', then you probably ought to leave your phone at the office, because it doesn't really matter what phone you have ... that data is available to the TLAs.
(Score: 2) by arslan on Wednesday January 23 2019, @01:53AM (1 child)
That's the problem. There are so many opportunities for Apple to venture into, given the size of their current war chest. But Cook is not a visionary; he can man the ship once a direction is set, but he ain't no explorer.
It wasn't just devices with Jobs, it was things like a marketplace, i.e. App Store and iTunes, etc. Cook needs to look beyond their current direction.
(Score: 0) by Anonymous Coward on Wednesday January 23 2019, @07:00AM
Maybe nobody is. The world is currently Jobless.
(Score: 2) by shortscreen on Wednesday January 23 2019, @10:58AM (1 child)
They already sold off their spare kidneys to buy the last phone, how are they going to afford the latest one?
(Score: 1) by anubi on Thursday January 24 2019, @12:28AM
They will just stop peeing.
More time to yap on the phone!
"Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
(Score: 1) by jman on Wednesday January 23 2019, @07:56PM
In Q4 2018, Apple computer revenue - around seven and a half billion dollars - was nearly twice their tablet sales, but only around twenty percent of what the phone brought in.
That doesn't look good for them; with the planet fairly saturated with cell phones, Apple may need to consider alternate revenue streams. They should remember that they were born as a hack-friendly hardware company, and try selling some products that aren't internally glued together. It might be fun!
Last summer, my '09 MBP - a then "top of the line" Core 2 Duo running at 2.8GHz, which had cost around twenty-five hundred - started getting "tired", and that now relatively "slow" CPU, combined with a max of 8GB of memory, was making Adobe - and thus me, while operating it - unhappy. I'd gotten a lot of good use out of that laptop, but decided it was time for a new box. Nine years was the longest I had ever gone with one main rig; surely a testament to the quality of their craftsmanship.
One nice thing about that laptop was that I could service it myself. The very first thing I'd done to it was spend another $200 to upgrade from the four gigs of memory it originally had.
This time around, there was just no way I was going to spend anywhere near what I did on the old machine and not be able to upgrade RAM or storage, or perform repairs as needed.
So, I let Amazon hit me up for a little over four grand (using a new credit card with 18 months zero interest), and built a Hackintosh desktop running High Sierra.
Hardware-wise, it's comprised of:
Case: Cooler Master HAF XB EVO
Motherboard: Gigabyte Z370N
CPU: Intel i7-8700K (3.7GHz, 6 cores)
CPU Cooler: Corsair H115i (Liquid)
Memory: Corsair 32GB LPX DDR4
Power Supply: Corsair RM850x
Storage: Samsung 970 Pro 512GB (NVMe M.2, not available on the iMac)
GPU: nVidia Titan XP (12GB GDDR5X. iMacs use the Intel Iris 640, meaning no dedicated GPU, which for a supposed high-end graphics machine is just plain dumb)
Monitor: Not one, but two 28" 4K Asus PB287Q monitors (I use them in landscape mode, but they can swivel to vertical if one prefers that view)
Keyboard: Matias FK488TS (Wireless Aluminum Silver, Bluetooth)
Mouse: Microsoft Bluetooth Mobile Mouse 3600 (Bluetooth)
Speakers: Elegiant USB Powered Sound Bar (Convenient volume knob, mic & headphone ports, sits comfortably beneath and between the monitors)
Backup: Western Digital 8TB My Book Desktop (USB "spinning rust", very reliable, have used their products for years)
The board has four SATA ports, so I re-purposed the MBP's 1TB Samsung 850 Pro SSD for additional storage, and installed a couple of other drives that had previously been intermittently connected to the laptop via USB:
A 2TB "spinning rust" drive (one of LaCie's "Rugged" orange-cased portable units) holding music, video, ISO's, etc. I'd purchased it as a companion to the laptop while on the road, but it kept losing connection. Turned out that was due to it having no external power supply, so gave up on it being a travel drive and opened the case to find a 2.5" Seagate inside which, once properly powered, has worked flawlessly.
A 512 GB Samsung 850 Pro SSD for WinDoze and Linux VM's, Docker containers, etc.
I don't use WiFi, as the board has dual Gig Ethernet ports - not to mention that unless you really can't get a wire there, there are very few times you'd need WiFi on a desktop. (How many times does one have to say: "Oh, the internet's down, and I REALLY need to finish that download. Let's just tether the desktop to my phone!") But I did spring for a Broadcom M.2 WiFi/Bluetooth card (BCM94352Z, around forty-five dollars) so the keyboard and mouse would not have to be tethered. I got the Broadcom because there were no Mac drivers for the Intel M.2 card that came with the board; thankfully it was not soldered on, and was easy to swap out.
As keyboards go, this one was fairly expensive; a little over a hundred dollars. I settled upon it after test-driving models from several different manufacturers at a local consumer electronics store, and found it felt most like that of the MBP, to which my fingers had grown very accustomed over the past nine years. I do a lot of typing, and didn't want to get used to a new feel. The mouse was nothing special. (I don't go crazy over mice. Use them for GUI work, sure, but coming from the Mainframe days still prefer the keyboard. Chose the Dvorak layout around 30 years ago when deciding to learn touch-typing. The fingers thank me.)
I haven't gotten a Blu-ray burner, but that would only add another hundred or so, and I'm not sure I need one. I do have a USB optical drive, and realistically the only time I use it these days is to rip CD's I've purchased from the local record store. My current favorite USB stick, normally with no plug showing, can slide out front or back to present either a traditional A plug or a Micro plug (the latter of which lets me copy files to and from many mobile devices), and besides holding handy Windoze virus-fighting tools and various scripts, is loaded with multiple bootable distros (SystemRescue, CloneZilla, WinDoze recovery, and so on).
How about that horribly expensive GPU? Cinebench shows 130+ on OpenGL with this system (the 6-core CPU at 3.7GHz driving the Titan XP); a 4-core 2.8GHz machine with a Quadro K4000M scores about half that. For CPU, it shows 1400+, while a 12-core 2.6GHz Xeon X5650 would run a little under 1300. Yes, the system is fairly snappy. Adobe is *very* happy, and it returns the favor to me.
When running Adobe, the canvas of whatever program is active takes up one whole screen, its tool panels the other. It's very handy not having to close panels all the time so as not to obscure what you're actually working on.
So, I spent four grand on a new computer, which is blazingly fast, has an inordinate amount of screen real estate, and runs the OS I've been using as my main one for almost ten years. BSD under the hood, done Apple-style, so beautiful graphics and an actual *nix toolset - the best of both worlds!
Then again, when first acquired, the MBP appeared to be just as blazing, and when needed I was fine with plugging a 1920x1080 monitor into it for added real estate.
The MB/CPU/Mem portion of what I bought was under a quarter of the total system cost (150/350/420, respectively), so theoretically upgrading the core of the machine is possible while retaining most of the other parts, and repairing anything that goes out won't cost me the whole system again.
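For the curious, the "under a quarter" claim checks out; here's a quick back-of-the-envelope sketch using the rough component prices quoted above (all figures approximate):

```python
# Rough cost breakdown of the build (approximate prices as quoted above)
core = {"motherboard": 150, "cpu": 350, "memory": 420}
total_build = 4000  # "a little over four grand", rounded down

core_cost = sum(core.values())
print(core_cost)                # 920
print(core_cost / total_build)  # 0.23 -- under a quarter of the system cost
```

So the board, CPU, and memory together run about 23% of the total, which is what makes a future core upgrade cheap relative to replacing the whole machine.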
I know, this shiny new box will fade to a slow-poke down the road, but with what I put into it, hopefully that road will be a long one.
On the Apple side? Today you can purchase a 27" 5K iMac with a just *slightly* faster CPU, the same 32GB of memory, and a 2TB SSD for around forty-five hundred; roughly ten percent higher than my cost. Not bad considering it's quadruple the base storage I purchased, though said storage is SATA SSD, not NVMe. (A 1TB Samsung 970 would have added around two hundred twenty-five dollars to my cost; they don't make a 2TB - yet.)
Add eight hundred to double the memory to 64GB, making it twenty-five percent higher than my cost (with that crappy integrated graphics, you may wish to consider it). Seeing as how the nVidia I bought has 12GB all to itself - and GDDR5X at that - my system actually has 44GB of memory, and the flip side of the GPU having all that super-fast memory to itself is that the CPU doesn't have to share any of its 32GB with graphics processing, either.
While the new 5K iMacs can go to 64GB, the Gigabyte board I chose only supports 32. The main reason for getting that board was its recommendation from many Hackintosh websites. (Yes, before spending all this money, I did a bit of research to make sure the hardware would work with OSX!) Being able to quadruple my MBP's memory was certainly a bonus, but quite frankly, even with "only" 32GB, no matter how many programs I have running simultaneously, it hardly ever goes to swap. (Right now I have twenty-one programs running, with those that can having files open: Activity Monitor, Adobe Acrobat/Audition/Bridge/Illustrator/InDesign/Photoshop/Premiere Pro, Android Studio, Atom, Chicken (a Mac VNC client based on Chicken of the VNC), FileZilla, FireFox, Intel Power Gadget, iTerm2, LibreOffice Calc/Writer, Thunderbird, Vivaldi, and VLC, listening to a classical station out of Switzerland; swap is under two megs.)
Note I usually don't have all those Adobe programs open at once, but do often have many of the rest, along with Illustrator and InDesign, plus often the Apple Script Editor. Almost always Atom/FireFox/Illustrator/InDesign/iTerm2/LibreOffice Calc/Thunderbird/VLC plus the two monitoring apps are open.
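If you want to check swap on your own Mac, `sysctl vm.swapusage` prints it from the Terminal. Here's a small sketch that parses that output; the sample line and its exact field layout are assumptions based on my machine, so check yours:

```python
import re

# Sample output of `sysctl vm.swapusage` on macOS (format assumed; verify on your own box)
sample = "vm.swapusage: total = 2048.00M  used = 1.25M  free = 2046.75M  (encrypted)"

def parse_swapusage(line):
    """Return total/used/free swap in megabytes as floats."""
    fields = dict(re.findall(r"(total|used|free) = ([\d.]+)M", line))
    return {k: float(v) for k, v in fields.items()}

usage = parse_swapusage(sample)
print(usage["used"])  # 1.25 -- "under two megs"
```

In practice you'd feed it the real command output (e.g. via `subprocess.run(["sysctl", "vm.swapusage"], ...)`) instead of the sample string.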
Not counting the WD Time Machine drive, I currently have a total of 4TB of storage, and could easily expand that as needed. The most you can get on the iMac is 3TB, and that would be one of those horrible Fusion drives. Sticking with SSD, the most you can get is 2TB, and what you buy is what the machine will always have.
If the iMac has a hardware issue, let's just hope you have both Apple Care, the patience to trek into one of their stores at least a couple of times, and the serenity to deal with their "geniuses" without resorting to violence.
So, cost and power-wise, while you can get something from Apple comparable to what I built with their product being around ten percent more in cost, you would:
Never be able to upgrade it in any way
Lack an actual GPU
Never be able to fix it yourself (unless, unlike me, you're as good as the super-talented folk over at iFixit)
Granted, it's no longer considered a "portable", but I never could have gotten anything like the power of that nVidia card into a laptop. The case - at 16"x17"x12" - resembles a Borg ship, but is super-easy to work with, and has convenient carrying "handles" built into the left and right covers (both sides, along with the top, are removable), so from a certain viewpoint it actually *is* portable. I could see it being used for the server at a LAN party. It'll hold Micro, Mini, or full ATX boards, has two hot-swappable drive trays, and space for an additional four drives; more than the motherboard can accommodate, unless you were to add a PCIe card for additional storage (which I could not, as the board only has one such slot, and in my system it's taken up by the GPU).
So, Apple, you want to make more money? Go back to being "insanely great", and make it Apple-easy for end-users to perform their own upgrades and repairs. You could make a modular system and sell the parts for a tad more than one could get from Amazon, make a bundle, and maybe even actually get OSX market share up to respectable levels.
Speaking of market share, statcounter's website for December 2018 shows Android at almost thirty-eight percent, which is nearly two percent higher than WinDoze (thirty-six), which is nearly three times higher than iOS (thirteen), which is more than twice as high as OSX (six). Linux? Still under one percent, but that must mean they're only counting GUI installations. Seriously, if I, just a run-of-the-mill IT guy, currently maintain three CentOS boxes, the actual number must be *much* larger than that.
As phone and tablet usage goes, sorry, Apple, your stuff is very pretty, but no walled garden for me; it's Android all the way. I haven't done anything to the phone yet (it's not paid off), but on the tablet I'm running LineageOS (successor to CyanogenMod) with no Play Store in sight. It's a Samsung T813 10" with the keyboard/case, a few years old now. Tiny screen comparatively, but workable when on the road, and if I need something from home I can always dial into the box. The current phone is a Samsung Galaxy S8. I used to really love my HTC One (had the M7 & M8), and hope that company can keep up, but when it came time for a new phone some years back, their M9 just couldn't match what Samsung had to offer, so I ended up with a Galaxy S6+.
I would still be on the Galaxy S7 that I had in between the S6+ and the S8, except that I stupidly left it on the roof of the car one day and took off, getting up to around 50 on the access road when I heard a little thump and saw what looked like leaves scattering behind me. When it hit the ground, the phone had come out of its case, along with all my credit cards. I walked up and down the road for a while gathering everything back up. Never did find the debit card, which of course was where all the auto-draws lived.
The phone actually still worked for almost a week before the screen finally got to be too awful to work with. Naturally I had just paid it off less than a month prior.
Oh, that now going-on-ten-year-old MBP? Threw a 512GB SSD I had lying around into it, re-loaded El Capitan (the last version that would work on that hardware), and for a utility box, so long as I don't really try to run Adobe, it's doing just fine.