Legislation key to US President Barack Obama's trade agenda has passed a key hurdle in the Senate, just two weeks after it appeared to have failed.
The bill known as the Trade Promotion Authority (TPA) or, more commonly, Fast Track, makes it easier for presidents to negotiate trade deals.
Supporters see it as critical to the success of a 12-nation trade deal known as the Trans-Pacific Partnership (TPP).
The bill is expected to pass a final vote in the Senate on Wednesday.
Tuesday's 60-37 vote - just barely meeting the required 60-vote threshold - is the result of the combined efforts of the White House and many congressional Republicans to push the bill through Congress, despite the opposition of many Democrats.
This is primarily a tech news site, and it's generally good to avoid political news, but the TPP is a huge trade deal, negotiated in secret, with large ramifications for the world economy that affects us all, and with large implications for the accountability of major world governments to their citizens.
(Score: 2) by Non Sequor on Thursday June 25 2015, @02:04AM
There's a pretty major problem with using any large character set with ambiguous glyphs in an application where the characters form identifiers that are meant to be unambiguously readable by both humans and machines. Honestly, it seems like the real solution is to break Unicode into language-specific (not necessarily disjoint) subsets and forbid identifiers that mix and match from multiple subsets. Allowing such identifiers will surely lead to pestilence and crop failures.
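A minimal sketch of such a per-script check, in Python: it uses the first word of each character's Unicode name as a crude script tag (the function names are my own, and a real implementation would use proper script properties rather than this name heuristic):

```python
import unicodedata

def char_script(ch):
    """Crude script detection: the first word of the Unicode character
    name, e.g. 'LATIN', 'CYRILLIC', 'GREEK'."""
    try:
        return unicodedata.name(ch).split()[0]
    except ValueError:
        return "UNKNOWN"

def is_single_script(identifier):
    """Reject identifiers that mix letters from multiple scripts."""
    scripts = {char_script(c) for c in identifier if c.isalpha()}
    return len(scripts) <= 1

print(is_single_script("paypal"))   # True  - pure Latin
print(is_single_script("pаypal"))   # False - Cyrillic 'а' mixed with Latin
```

A "pаypal" with a Cyrillic а looks identical to the Latin original on screen, which is exactly why the mixed-script form gets rejected.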
Write your congressman. Tell him he sucks.
(Score: 2) by VortexCortex on Thursday June 25 2015, @05:20AM
Good points, but some of this has been addressed.
A novel solution I've experimented with is to render the Unicode string, then run it through an OCR program that favors output of the "expected" characters, typically via multi-character recognition, which acts as a sort of on-the-fly language detection. Compare the raw input URL chars to the OCR'd output chars; if any char doesn't match, it's likely an impostor URL using Unicode to mask itself. However, this is not a foolproof solution. Without language detection it can generate false positives, as there are valid uses of Unicode in URLs / domain names.

For instance, Unicode can be included in domain names via Punycode's ToASCII(), specified by Internationalized Domain Names. [wikipedia.org] (RFC3492 [ietf.org], required by RFC5890 [ietf.org], etc.) The Punycode function is complex as it attempts to avoid using common look-alike characters. Rehash should be using this to sanity-check domain names, and not worry about the Unicode included after the domain name, as that's not important. Punycode is yet another encoding rather than the escaped UTF-8 used in URL encoding; it's used mostly because of the 63-octet limit on DNS labels, but also to remain somewhat human readable (naming it Punnycode would be too punny).
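As a sketch of that ToASCII() round trip: Python's stdlib ships an "idna" codec implementing the RFC 3490-era IDNA conversion, and a name whose ACE form fails to round-trip cleanly is one cheap thing to flag before display:

```python
# Encode an internationalized domain name to its ASCII-compatible
# (Punycode/ACE) form and decode it back using the stdlib idna codec.
name = "bücher.example"
ace = name.encode("idna")
print(ace)                  # b'xn--bcher-kva.example'
print(ace.decode("idna"))   # bücher.example

# A domain that fails to encode, or whose ACE form decodes to something
# other than the original, is worth flagging as suspicious.
```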
However, the complexity of Unicode encodings in URLs has little to do with the failure to sanitize the anchor tag's title attribute and the following "[domain.name]" via something like Perl's CGI module's escapeHTML() function prior to output. Without that escaping, soylentnews.org is vulnerable to HTML and/or JavaScript injection...
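In Python terms, the stdlib's html.escape plays the role CGI's escapeHTML plays in Perl; the payload below is a hypothetical example of an attribute-breakout attempt:

```python
import html

# An attacker-supplied link title trying to break out of the attribute
title = '"><script>alert(1)</script>'

# Escape before interpolating into markup; quote=True also escapes quotes
safe = html.escape(title, quote=True)
print(safe)   # &quot;&gt;&lt;script&gt;alert(1)&lt;/script&gt;
```

Escaped this way, the payload renders as inert text instead of closing the attribute and injecting a script element.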
(Score: 2) by TheLink on Thursday June 25 2015, @08:19AM
More than a decade ago I proposed an HTML tag that would disable active/fancy stuff (the opening and closing tags would need a matching random string, so injected content couldn't forge the closing tag).
That way, even if the browser makers or the HTML bunch come up with new features, the new stuff would still be disabled.
It's ridiculous to have a car with only "GO" pedals and not a single "STOP" or brake pedal, and to stop the car you are required to make sure that all the "GO" pedals are not pressed.
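The idea might look something like this sketch (the tag name and token are entirely hypothetical, not any shipped feature; the point is that untrusted content can't guess the random token needed to close the block):

```html
<!-- Hypothetical syntax: the random token in the tag name means injected
     markup cannot forge the closing tag to re-enable active content -->
<noactive-k3q9x7rf2m>
  ...untrusted user-submitted HTML: scripts, event handlers, and plugins
  inside this block would all be ignored by the browser...
</noactive-k3q9x7rf2m>
```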
Apparently in recent years Mozilla has come up with something called CSP ( https://developer.mozilla.org/en-US/docs/Web/Security/CSP [mozilla.org] ) which supposedly would help do this sort of thing and is better than my original suggestion.
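For reference, a policy in that spirit is a single response header; this example is my own, not taken from the linked page, and it restricts scripts to the site's own origin while blocking inline script and plugins:

```
Content-Security-Policy: default-src 'self'; script-src 'self'; object-src 'none'
```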
FWIW if they had implemented my suggestion, many of those XSS worms wouldn't have worked.
And by now people could say "add another brake pedal option" to stop such stuff - since the concept of brake pedals would no longer be novel in the browser/HTML arena.