posted by cmn32480 on Wednesday June 24 2015, @05:39PM   Printer-friendly
from the pass-it-to-know-what-is-in-it dept.

The BBC reports:

Legislation key to US President Barack Obama's trade agenda has cleared a major hurdle in the Senate, just two weeks after it appeared to have failed.

The bill known as the Trade Promotion Authority (TPA) or, more commonly, Fast Track, makes it easier for presidents to negotiate trade deals.

Supporters see it as critical to the success of a 12-nation trade deal known as the Trans-Pacific Partnership (TPP).

The bill is expected to pass a final vote in the Senate on Wednesday.

Tuesday's 60-37 vote - just barely meeting the required 60 vote threshold - is the result of the combined efforts of the White House and many congressional Republicans to push the bill through Congress, despite the opposition of many Democrats.

This is primarily a tech news site, and it's generally good to avoid political news, but the TPP is a huge trade deal, negotiated in secret, with large ramifications for a world economy that affects us all, and with serious implications for the accountability of major world governments to their citizens.


Original Submission

 
  • (Score: 2) by VortexCortex (4067) on Thursday June 25 2015, @05:20AM (#200798)

    Good points, but some of this has been addressed.

    A novel solution I've experimented with is to render the Unicode string, then run it through an OCR program that favors output of the "expected" characters, typically via multi-character recognition, which works a bit like on-the-fly language detection. Compare the raw input URL characters to the OCR'd output characters: if any character doesn't match, it's likely an impostor URL using Unicode to mask itself. However, this is not a foolproof solution. Without language detection it can generate false positives, as there are valid uses of Unicode in URLs and domain names.

    For instance, Unicode can be included in domain names via Punycode's ToASCII(), as specified by Internationalized Domain Names. [wikipedia.org] (RFC3492 [ietf.org], required by RFC5890 [ietf.org], etc.) The IDNA rules are complex partly because they attempt to restrict common look-alike characters. Rehash should be using this to sanity-check domain names, and not worry about Unicode appearing after the domain name, as that's not important. Punycode is yet another encoding, used instead of the escaped UTF-8 of URL encoding mostly because DNS labels are limited to 63 octets, but also to remain somewhat human-readable. (Naming it Punnycode would be too punny.)
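    A lighter-weight check along the same lines can be sketched with Python's built-in IDNA codec. This is a hedged illustration, not the OCR approach described above: the script-mixing heuristic and the function names are invented for this sketch, and real checkers use full confusable tables rather than script mixing alone.

```python
import unicodedata


def label_scripts(label: str) -> set:
    """Crude per-character script detection: take the first word of each
    character's Unicode name (e.g. 'LATIN', 'CYRILLIC'); digits and
    hyphens are ignored."""
    return {unicodedata.name(ch).split()[0] for ch in label if ch.isalpha()}


def looks_spoofed(domain: str) -> bool:
    """Flag a domain whose decoded (ToUnicode) form mixes scripts inside
    a single label -- the classic homoglyph trick, e.g. a Cyrillic 'a'
    inside an otherwise-Latin name. Expects the ASCII wire form."""
    unicode_form = domain.encode("ascii").decode("idna")  # Punycode -> Unicode
    return any(len(label_scripts(label)) > 1 for label in unicode_form.split("."))


# 'xn--pple-43d.com' decodes to 'apple.com' with a Cyrillic first letter:
print(looks_spoofed("xn--pple-43d.com"))  # True
print(looks_spoofed("example.com"))       # False
```

    Note the limitation: a spoof written entirely in one script (such as the all-Cyrillic look-alike of a Latin name) passes this check, which is why the Unicode confusable tables exist.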

    However, the complexity of Unicode encodings in URLs has little to do with the real failure here: the anchor tag's title attribute and the following "[domain.name]" are not sanitized via something like Perl's CGI module's escapeHTML() prior to output. Without that, soylentnews.org is vulnerable to HTML and/or JavaScript injection...
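    For illustration, here is the same escape-before-output idea in Python rather than Rehash's Perl (html.escape() plays the role the commenter assigns to CGI's escapeHTML(); the attack string is invented):

```python
from html import escape

# An attacker-controlled link title, as it might arrive in a comment:
title = '"><script>alert(document.cookie)</script>'

# Interpolating it raw into markup produces an injection:
unsafe = '<a href="#" title="%s">link</a>' % title

# Escaping HTML metacharacters first (including quotes) renders it inert:
safe = '<a href="#" title="%s">link</a>' % escape(title, quote=True)

print(safe)
# <a href="#" title="&quot;&gt;&lt;script&gt;alert(document.cookie)&lt;/script&gt;">link</a>
```

    The quote=True argument matters here: the payload escapes the attribute with a double quote, so escaping only angle brackets would not be enough.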

  • (Score: 2) by TheLink (332) on Thursday June 25 2015, @08:19AM (#200842) Journal


    More than a decade ago I proposed an HTML tag that would disable active/fancy content (the opening and closing tags would need to carry a matching random string, so injected content couldn't close the region).

    That way even if the browser makers or HTML bunch come up with new features, the new stuff would still be disabled.

    It's ridiculous to have a car with only "GO" pedals and not a single "STOP" or brake pedal, where the only way to stop the car is to make sure that none of the "GO" pedals is pressed.
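    The proposal might have looked something like this (hypothetical markup; no browser implements it, and the tag name and token here are invented for illustration):

```html
<!-- Everything inside is rendered inert: no scripts, no event handlers,
     no future "active" features. The closing tag must repeat the random
     token, so injected user content cannot break out of the sandbox. -->
<disable-active token="k3v9qz27">
  ...untrusted comment text goes here...
</disable-active token="k3v9qz27">
```

    A closing tag carrying an attribute is not valid HTML today, and that departure is the point of the proposal: without knowing the unguessable token, injected markup cannot close the region.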

    Apparently in recent years Mozilla has come up with something called CSP (Content Security Policy: https://developer.mozilla.org/en-US/docs/Web/Security/CSP [mozilla.org] ), which supposedly would help do this sort of thing and is better than my original suggestion.
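    For reference, a CSP policy is delivered as an HTTP response header; a strict one like the following (directive names per the CSP spec) restricts script loading to the page's own origin:

```
Content-Security-Policy: default-src 'self'; script-src 'self'
```

    With this policy, inline script blocks and on* event-handler attributes are refused by the browser unless the page explicitly opts back in with 'unsafe-inline', which is exactly the "brake pedal" behavior described above.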

    FWIW if they had implemented my suggestion, many of those XSS worms wouldn't have worked.

    And by now people would be able to say "add another brake-pedal option" to stop such stuff, since the concept of brake pedals would no longer be novel in the browser/HTML arena.