The Russian parliament has approved a law creating a domestic network that is separate from the Internet. This Russian network of networks will be fully isolatable, meaning the country's communications will be autonomous and able to keep functioning even if the plug is pulled on Russia's connections to the Internet at large. Concerns are growing that the move will be used more for control of content, or plain censorship, and will make any attempt to circumvent restrictions much more difficult. The law is expected to take effect November 1st. Russia has already banned certain services, such as Telegram.
One of the law's goals is to keep as much of the data exchanged between Russian internet users within the country's borders as possible. This aim may sound like a move to protect Russian users from external threats, but rights groups have warned that the new measures could ultimately be directed at Kremlin critics rather than international adversaries.
The idea of increasing the government's control over the internet is part of a more long-term national policy trend. In 2017, officials said they wanted 95% of internet traffic to be routed locally by 2020. Since 2016, a law has required social networks to store data about Russian users on servers within the country. The law was officially presented as an anti-terrorism measure — but many criticized it as an attempt to control online platforms that can be used to organize anti-government demonstrations.
Also at Silicon: Russian Parliament Passes Bill To Isolate Internet.
(Score: 1) by shrewdsheep on Wednesday April 17 2019, @03:31PM (3 children)
Hm... this would imply that Russia is actually partitioned into unconnected network zones that reach each other only through external links. That seems rather unlikely, and suggests ignorance of basic internet principles.
(Score: 3, Insightful) by HiThere on Wednesday April 17 2019, @04:42PM (2 children)
But if their goal is in opposition to "basic Internet principles", why wouldn't they do it?
Personally, this doesn't surprise me at all. It does mean that the "regional net" wouldn't be able to use services made available externally, but OTOH it would foster local development of (near) equivalents. They might need to temporarily go back to web spiders, but search engines aren't impossibly difficult to write. If they're serious about this, they'll adopt incompatible protocols. It's not that difficult.
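For what a "web spider" amounts to, here's a minimal sketch: a breadth-first crawler that extracts links with the standard library's HTML parser and builds a url-to-links index. Everything here (the `fetch` callable, the page limit) is a generic illustration, not anything specific to the Russian proposal; injecting `fetch` just keeps the spider testable without touching the network.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(seed, fetch, max_pages=100):
    """Breadth-first crawl from `seed`; `fetch(url)` returns page HTML.

    Returns a dict mapping each visited URL to its outgoing links --
    the raw material a search engine would then rank and index."""
    seen = {seed}
    queue = deque([seed])
    index = {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            continue  # skip unreachable pages
        parser = LinkExtractor(url)
        parser.feed(html)
        index[url] = parser.links
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```

Pointing `fetch` at a real HTTP client turns this into a live spider; the genuinely hard parts of a search engine (ranking, scale, freshness) sit on top of an index like this one.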
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 0) by Anonymous Coward on Wednesday April 17 2019, @05:50PM (1 child)
Yandex.ru is a search engine, just like Baidu is.
(Score: 2) by HiThere on Sunday April 21 2019, @01:07AM
Yes, but it currently uses the same protocols as the other search engines. Incompatible protocols are quite possible, though, and not even that difficult. (The hard part is getting agreement, especially when correcting spec errors.)