from the public-utility-or-profit-center dept.
Last January, the FCC’s Net Neutrality rules were struck down in the US Court of Appeals, a ruling that was widely expected: the FCC in 2002 classified ISPs as “information services”, which prevents the agency from exerting much control over the infrastructure that powers these networks. FCC Chairman Thomas Wheeler has indicated that the agency will hold a vote on February 26 which is likely to return ISPs to being “telecommunications services” under Title II of the Communications Act of 1934. This would give the FCC the legal authority to open access to utility poles to any interested company building Internet infrastructure, and to prevent ISPs from throttling traffic from particular remote servers or prioritizing traffic from others in exchange for payment.
http://www.techrepublic.com/article/the-fccs-possible-reclassification-of-isps-signals-hope-for-net-neutrality/
(Score: 2) by frojack on Friday January 23 2015, @12:54AM
This would give the FCC the legal authority to open access to utility poles to any interested company building Internet infrastructure, and to prevent ISPs from throttling traffic from particular remote servers or prioritizing traffic from others in exchange for payment.
Title II contains no wording about speed throttling or building Internet infrastructure. It does have verbiage about poles, but nothing about trenched-in cables, switches, fiber to the premises, etc. Not many places distribute broadband from poles any more.
The existing code in Title II will need extensive additions, and all of that promises to be a long, drawn-out process with tons of public hearings, brief filings, and posturing.
Most cable plants are not in proper shape to be shared. They are loop structures, and the cable bandwidth in many cases is already saturated. The fiber plants are in better shape, but again, in most cases they weren't built with sharing in mind.
The conduit under your sidewalks and lawns is sized to handle what it currently contains, so nobody is going to be stringing more cable through those pipes.
No, you are mistaken. I've always had this sig.
(Score: 2) by c0lo on Friday January 23 2015, @01:14AM
This, I suppose, is why the cost of building new infrastructure is high enough that only the biggest players can stay on the playground, thus reducing competition.
Once easy access to the supports for those cables is opened up, I suspect that local ISPs will start to emerge.
I (again) suppose this is not about sharing the cable/fibre itself, but about sharing the supports for running those cables (be they utility poles or underground conduit).
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 3, Interesting) by frojack on Friday January 23 2015, @01:28AM
After years of fighting it in my brain, I've come to the conclusion that like water and sewer and roads, local fiber should be owned by the municipalities, and open to all ISPs (including big cable). They can pay their way onto the distribution grid, out of the fees they charge the subscriber.
Just as long as the municipality runs it at break-even, so that it can be maintained, and upgraded as needed. No siphoning off funds to to support other causes, no running it into the ground (tragedy of the commons), and no censorship or packet inspection, or three letter agencies. Maybe time to get elected to your local PUD.
No, you are mistaken. I've always had this sig.
(Score: 2) by c0lo on Friday January 23 2015, @01:51AM
Unfortunately, data transmission is not that similar to the other utilities.
You see, neither water nor electricity upgrades its "definition" (as in "HD video") the way datacomm does. I don't see any household increasing its water/electricity consumption fivefold over 3-4 years; but at some point, you are going to saturate the existing "municipal fibre" (unless Netflix installs a CDN node in every 'burb).
Only about 5 years ago, a 20 Mbps connection was a premium plan. Nowadays, thanks to Netflix (for increasing demand) and Google Fiber (as a threat forcing reasonably priced competition), 100 Mbps is "meh, good enough, I suppose" [arstechnica.com].
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 2) by frojack on Friday January 23 2015, @04:49AM
All true, but probably not that germane.
WHOEVER maintains the plant will have that problem, and as long as the fees cover costs, it's just a technical exercise.
Not terribly unlike roads, where a traffic pattern can change near a new mall or school and affect neighborhoods for miles around.
Local (neighborhood) loops will probably handle the load. Or you break them up until they do.
Then keep putting in backhaul fiber to the neighborhood until you can handle them all, without digging up everybody's yard.
But there is a god-awful amount of bandwidth on fiber, and most of it is barely carrying the minimum.
No, you are mistaken. I've always had this sig.
(Score: 2) by urza9814 on Friday January 23 2015, @04:34PM
Some places. Other places, not so much. Most people would be fine with current speeds for quite some time, though. In many places people get upgraded without even knowing it (happened to me not too long ago: they doubled my upstream and didn't even tell me! Which was pretty nice...). Hell, my parents have actually downgraded to save a few bucks. They had 8 Mbps when I was in high school about seven years ago; now they've got six. They stream two HD TVs off that connection and never have any problem doing so.
Which isn't all that surprising, considering I can download a 2-hour Blu-ray rip on my 50 Mbps connection in 10-20 minutes. So 5-10 Mbps, if it's reasonably consistent, should be plenty for at least one HD stream. Plan to give people 50 Mbps fiber lines, and it'll likely be well over a decade before those are saturated.
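A quick sanity check on that arithmetic (a sketch; the ~6 GB rip size is an assumption, since the comment doesn't give a file size):

```python
def download_minutes(size_gb, link_mbps):
    """Ideal transfer time in minutes for size_gb gigabytes over a
    link_mbps link (decimal units, as ISPs advertise; ignores protocol overhead)."""
    size_megabits = size_gb * 8 * 1000  # 1 GB = 8,000 megabits
    return size_megabits / link_mbps / 60

# A ~6 GB Blu-ray rip (assumed size) on a 50 Mbps line:
print(round(download_minutes(6, 50)))  # 16 (minutes), within the 10-20 cited
```

The same math says a sustained 5 Mbps stream delivers ~2.25 GB per hour, which is indeed enough for one compressed HD stream.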
There's also the fact that the lines don't need to change as often as the equipment. If you've got a house built 30 years ago with 30-year-old coax, you can still pump a 100 Mbps Internet signal through those same lines. Coax is pretty much coax, and fiber is fiber. Those last-mile connections don't change much, just the hardware on the ends of them. So if you separate running the lines from running the service, the lines should be fairly cheap.
Seems to me that upgrading the municipal internet is more like upgrading the municipal recycling. They can handle five new kinds of plastic now, but you still toss it in the same bin and it still gets collected by the same truck.
(Score: 2) by monster on Friday January 23 2015, @05:36PM
Some local fiber can be owned by the ISPs but deployed on the municipality's infrastructure: the trench is shared, some fiber is local and can be leased, and there is free space left to deploy new fibers if an ISP asks for it. Each ISP pays for the space it uses and has already done most of the deployment; hooking up a new customer is just a matter of patching to the correct fiber and enabling it, plus, in the worst case, deploying a few meters of fiber to the house.
(Score: 1, Interesting) by Anonymous Coward on Friday January 23 2015, @01:29AM
There are a variety of tariff structures used in telecom. As a Title II company, Cc may have been bucking for a 'sender pays'-type tariff system the whole time. IOW, it is entirely possible that they were being pricks in public while feigning resistance to Title II on the sly. (Oh no, don't throw me in that briar patch!) Such a system would likely just formalize their current system of shakedowns and further entrench the monopolies.
And the FCC will say, "Look what we did, we protected the Internet!" Then Congress will inform everybody that the Cc/TWC merger is good for business because of governing dynamics, and that will be that. In turn, Cc/TWC will provide Congress cheap political advertising based on their vast consumer-analytics databases, so that all the brand-R citizens get conservative content and all the brand-D citizens get liberal content, with no cross-pollination and no debate. Provided, of course, that both views are in conformance with the new Cc/TWC political-advertising guidelines.
Welcome to Potemkin Democracy.
(Score: 0) by Anonymous Coward on Friday January 23 2015, @02:19AM
Huh?!
Verizon just admitted that FiOS was financed under Title II. I can't see how having some of the money on telecom (aka cable) bills go toward improving infrastructure could be a bad thing.
The monopolies exist. Title II isn't going to do much about that, but if it does anything, it will be in the other direction, as third-party resellers will become possible. So, even with tariffs and infrastructure improvements, we might see reductions in bills.
(Score: 0) by Anonymous Coward on Friday January 23 2015, @12:38PM
If you look at how long distance tariffs work, you'll get an idea of what the original comment under this title was saying.
Now, instead of picking and choosing which content originators to shake down, the new regulations will likely make shaking down ALL content providers the new normal. This would give Cc/TWC even more control over the marketplace, since they are the dominant controllers of the last mile. There might be more competition at OSI layer 1, but at layer 3 and above there would be more private-sector constraint on free speech and free trade, not less.
It really depends on how they structure the regs. But since we know who the sucker is, we can speculate to some degree. I expect FERC to be a likely analog to this experiment, with equivalent results. (rolling blackouts anyone?)
The MPAA would probably support a sender-pays system, as it raises the barrier to entry for new content providers. Since Cc/TWC/Amazon would seem to be allies of the MPAA, it is reasonable to expect their support as well. This is probably going to be a significant backroom issue in the 2016 election, much like the DOJ vs. Microsoft lawsuit was a backroom issue in the Bush vs. Gore election. (Or was I the only one to notice that the lawsuit got dropped as soon as Bush took office?) The good news is there will be no need for another Diebold. Engineering the 2016 electoral outcome will be much cheaper than the Diebold system was, since half of the East Coast's content will be within the reach of one company's filters.
(Score: 2) by SlimmPickens on Friday January 23 2015, @03:49AM
AFAIK every significant network employs shaping. IIRC the debate started when ISPs began deprioritizing file-sharing traffic.
So does it just mean not charging netflix and sharers more? Or do people truly mean all packets are equal?
Anyone working in an ISP think all packets can be equal?
(Score: 3, Informative) by GeminiDomino on Friday January 23 2015, @04:38AM
So does it just mean not charging netflix and sharers more? Or do people truly mean all packets are equal? Anyone working in an ISP think all packets can be equal?
At this point, this "all packets can be equal" meme is so widespread that I can't tell anymore whether someone's full of shit or they just believed the lie.
Giving you the benefit of the doubt: "Network Neutrality" has nothing to do with all *packets* being equal; it's about *endpoints* being equal (Netflix = Hulu = Xfinity (Comcast), e.g.). And it's absolutely necessary, since we've let ourselves get into a situation where every major ISP has put itself in position for a nice, juicy conflict of interest: they not only offer the pipes, but their own content services, too. In an industry not exactly rife with ethics, it's no surprise that they started interfering with connections to their content competition.
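To illustrate the distinction: ordinary traffic shaping is rate-based and endpoint-agnostic. A minimal token-bucket sketch (hypothetical, for illustration only, not any ISP's actual implementation; the clock is passed in explicitly so the behavior is deterministic):

```python
class TokenBucket:
    """Minimal token-bucket shaper sketch (hypothetical).

    Admits traffic at `rate_bps` bits/second on average, with bursts up
    to `burst_bits`. Note it never looks at where a packet came from or
    is going, only at how fast traffic is arriving.
    """

    def __init__(self, rate_bps, burst_bits):
        self.rate = rate_bps
        self.capacity = burst_bits
        self.tokens = burst_bits  # start with a full bucket
        self.last = 0.0

    def allow(self, packet_bits, now):
        """`now` is a monotonic timestamp in seconds."""
        # Refill tokens for the time elapsed since the last packet.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bits:
            self.tokens -= packet_bits
            return True   # forward the packet
        return False      # drop or queue it
```

A shaper like this, applied uniformly, is the uncontroversial kind of "not all packets are equal"; the neutrality objection is to rules keyed on the endpoint (e.g. "if the destination competes with our video service, rate-limit harder").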
"We've been attacked by the intelligent, educated segment of our culture"
(Score: 2) by SlimmPickens on Friday January 23 2015, @05:39AM
It might be mostly about endpoints now, but it started with Gnutella and later BitTorrent, before there ever was a Hulu or a Netflix, and that was clearly more about a particular use case than a particular endpoint.
If all packets being equal is a stupid meme (which I didn't know; I just said it), then what type of shaping IS acceptable? Is it OK to slow down BitTorrent somewhat on a network that is not sold as 1:1? Or to slow down heavy users? Where is the line?
(Score: 1) by GeminiDomino on Sunday January 25 2015, @05:11AM
Where is the line?
Nowhere near anything you're talking about. The whole "QoS breaks network neutrality" thing is, and always has been, a misdirection.
"We've been attacked by the intelligent, educated segment of our culture"
(Score: 2) by SlimmPickens on Sunday January 25 2015, @10:09AM
Maybe you didn't realize, but I don't even slightly support that point of view.
(Score: 1) by GeminiDomino on Sunday January 25 2015, @10:28PM
I don't even know what point of view you're talking about, but don't worry: I'm not attributing any support of any position to you, I'm talking about the history of the argument.
Back before "Title II hasn't caught up with the times" (which is, admittedly, true), the QoS argument was the go-to for the Anti-Neutrality camp. Originally, it seemed like a misunderstanding run amok, the way things tend to in the interTubez, but when you see the same names repeating it despite being corrected the last N-1 times, cynicism tends to kick in.
"We've been attacked by the intelligent, educated segment of our culture"
(Score: 3, Insightful) by NotSanguine on Friday January 23 2015, @08:56AM
Part of what net neutrality means is that I should be able to buy a dumb pipe that gives a best-effort minimum level of bandwidth (which includes appropriate upgrades to keep providing it).
No proxies, no throttling, and no protocol restrictions. Which is, actually, what I have now. Unfortunately, that means I'm not using any of the fiber providers (not that there is any fiber available to me, in one of the largest and wealthiest cities in the world) or cable providers, since they are server-unfriendly, block lots of ports, and surreptitiously throttle connections. Which means my connection, while acceptable for many uses, is of limited bandwidth.
I do understand why many ISPs block SMTP. That is reasonable to an extent, as most people don't need to run their own relays, and botnets are rife on most consumer ISPs. However, all the other port blocking, connection throttling, transparent proxying, and abusive TOS conditions WRT servers are just that: abusive.
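For the curious, you can see this blocking for yourself by attempting a plain TCP connection from your consumer line (a small hypothetical helper; the host and port you probe are up to you):

```python
import socket

def port_open(host, port, timeout=5.0):
    """Return True if a plain TCP connection to host:port completes."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, ...
        return False

# e.g. port_open("smtp.example.net", 25) -- a hypothetical mail host;
# expect False on many consumer lines where the ISP blocks outbound port 25.
```

A timeout (rather than an immediate refusal) is the typical signature of an ISP silently dropping the traffic.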
IMHO, the biggest promise of the Internet lies in its peer-to-peer nature. Just because the big boys have tried to force it into a client/server paradigm, doesn't mean that's the best (or even the right) way to make it work.
Symmetric bandwidth with no server restrictions (with the caveat that if spam is coming from your host(s), you will be cut off without notice until you can demonstrate that you are no longer sending it), along with peer-to-peer applications for file sharing, social networks, distributed workloads, and ubiquitous encryption, could be the most liberty-enabling set of capabilities since the invention of movable type.
Unfortunately, as others on this thread have pointed out, the big ISPs, the government, and the corporate world don't want that. They want to limit what you can do with your connection, for several reasons. It allows them to oversubscribe their links and make their own content offerings more attractive. And it allows them to limit your ability to share personal information only with those you choose, without being spied upon by the government, your network provider, and the rapacious marketing establishment that wants to know everything you do or say so it can sell you more worthless crap.
Title II reclassification won't solve these problems. Nor will municipal (or quasi-public/non-profit) ownership of the last mile. But both of those things can (not a sure thing by any means) move us toward a better, more equitable, freer and more robust Internet.
No, no, you're not thinking; you're just being logical. --Niels Bohr