Researchers have proposed a cross-browser fingerprinting technique that uses OS- and hardware-level features. They claim to have successfully identified 99.24% of users in their dataset, compared to 90.84% for state-of-the-art single-browser fingerprinting.
Researchers have recently developed the first reliable technique for websites to track visitors even when they use two or more different browsers. This shatters a key defense against sites that identify visitors based on the digital fingerprint their browsers leave behind.
State-of-the-art fingerprinting techniques are highly effective at identifying users when they use browsers with default or commonly used settings. For instance, the Electronic Frontier Foundation's privacy tool, known as Panopticlick, found that only one in about 77,691 browsers had the same characteristics as the one commonly used by this reporter. Such fingerprints are the result of specific settings and customizations found in a specific browser installation, including the list of plugins, the selected time zone, whether a "do not track" option is turned on, and whether an adblocker is being used.
Until now, however, the tracking has been limited to a single browser. This constraint made it infeasible to tie, say, the fingerprint left behind by a Firefox browser to the fingerprint from a Chrome or Edge installation running on the same machine. The new technique—outlined in a research paper titled (Cross-)Browser Fingerprinting via OS and Hardware Level Features—not only works across multiple browsers; it's also more accurate than previous single-browser fingerprinting.
(Score: 1, Informative) by Anonymous Coward on Wednesday February 15 2017, @03:39AM
Why is it always JavaScript?!
In the paper, we propose a (cross-)browser fingerprinting technique based on many novel OS- and hardware-level features, e.g., those from the graphics card, CPU, audio stack, and installed writing scripts. Specifically, because many of these OS- and hardware-level functions are exposed to JavaScript...
(Score: 2) by takyon on Wednesday February 15 2017, @03:45AM
I'm thinking <canvas>, <video>, and <audio> have something to do with it.
What advanced font feature could be responsible? Is it in CSS?
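For illustration, here's a minimal sketch of the hashing half of a canvas-style fingerprint. In a browser the input would be something like canvas.toDataURL() after drawing text and shapes (the pixels differ subtly per GPU/driver/font stack); here the readout is just a placeholder string, and FNV-1a stands in for whatever hash a real tracker uses:

```javascript
// FNV-1a 32-bit: collapse a long rendering readout into a short ID.
function fnv1a(str) {
  let h = 0x811c9dc5; // FNV offset basis
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0; // multiply by FNV prime, mod 2^32
  }
  return h >>> 0;
}

// Stand-in for canvas.toDataURL(); real readouts vary per machine.
const readout = "data:image/png;base64,iVBORw0KGgo...";
console.log(fnv1a(readout).toString(16));
```

Two machines that render the same drawing instructions even one pixel differently produce entirely different hashes, which is exactly what makes the readout a fingerprint.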
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Interesting) by Pino P on Wednesday February 15 2017, @05:45AM
Exact spacing of rendered web fonts (TTF, WOFF, etc.) can affect word wrapping and thereby cause element height to hit breakpoints.
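A sketch of how a script could exploit that: measure a test string in each candidate font with a generic fallback, and any width that differs from the fallback means the candidate font actually rendered. The measurement itself needs a browser (CanvasRenderingContext2D.measureText or offsetWidth); the comparison step is just:

```javascript
// Given the fallback font's measured width and each candidate font's
// measured width for the same test string, a differing width implies
// the candidate font is installed and was used for rendering.
function detectFonts(fallbackWidth, candidateWidths) {
  return Object.entries(candidateWidths)
    .filter(([, width]) => width !== fallbackWidth)
    .map(([name]) => name)
    .sort();
}

// Hypothetical measurements, in CSS pixels:
console.log(detectFonts(100, { Arial: 97.3, Garamond: 100, Futura: 104.8 }));
// → [ 'Arial', 'Futura' ]
```

The resulting list of installed fonts is itself a high-entropy fingerprint component, and it's largely the same regardless of which browser does the measuring.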
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @06:14AM
List of installed fonts?
(Score: 1) by DECbot on Wednesday February 15 2017, @03:47AM
What VMs do people commonly use to run a browser? Not that it would make much of a difference.
You know, perhaps the macbook users are a step ahead, as they have fewer hardware choices. Now you just need to uncouple your Apple hardware from your Apple ID without removing Apple's software. And then accept all of Apple's default choices.
Perhaps I should start browsing the web with just telnet. But then I'd be identified by my typing rate and recurring typos.
You know, we might just be too damn smart for our own good. We should have been clever instead of smart and, when presented with the challenge, responded with a "sounds plausible in a small group setting, but across society I don't think that is possible given the availability of commodity hardware and automatic updates and so forth" *wink* *wink* — instead of going, "sure, I'd love to identify myself, friends, family, and everyone for better marketing and government tracking! Sounds like a great idea, because everybody makes rational, reasonable decisions firmly based on logic, and everybody is like me." What comic books have taught me is that logical, smart people are often the most sinister and misguided.
Netcraft confirms it, privacy is dead.
cats~$ sudo chown -R us /home/base
(Score: 2) by Snotnose on Wednesday February 15 2017, @03:49AM
I use Chrome. Vons.com and tdameritrade don't work with Chrome, for reasons. I use Firefox for both. Not like you're going to find me on match.com with both browsers; won't happen.
For all I know vons and td ameritrade fixed their websites a couple years ago. I won't know, I haven't tried.
When the dust settled America realized it was saved by a porn star.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @03:59AM
It is the ad networks that tie all the tracking together. Those little web bugs from Indeed, LinkedIn, Facebook, Twitter, etc. Then the ad networks themselves have them. That they got it up to 99% accuracy does not surprise me. Heck, most of the time just your IP and time of day alone is enough.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @10:47AM
Well, those won't get any data from me as I won't even allow my browser to connect to them. If my browser doesn't even send any IP packets to them, there's no way they could find out anything about me, not even my IP address.
(Score: 2) by Snotnose on Wednesday February 15 2017, @04:17AM
Run a script blocker; maybe half the sites that don't work right get temporary JavaScript permissions.
When the dust settled America realized it was saved by a porn star.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @06:13AM
(Score: 1, Touché) by Anonymous Coward on Wednesday February 15 2017, @09:23AM
Maybe, maybe not. That's a lot better than 'definitely'.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @05:14AM
The big vulnerability here is WebGL.
Who even needs that for anything but games?
So turn off JavaScript globally and turn off WebGL.
That way when you re-enable javascript on specific sites, they still won't have access to WebGL.
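In Firefox, for example, both can be done from about:config (pref names as of the time of writing; the per-site JavaScript part is more comfortable with an extension like NoScript):

```
// about:config
webgl.disabled = true        // blocks WebGL even on sites allowed to run JS
javascript.enabled = false   // global off; re-enable per-site via NoScript
```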
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @05:45AM
There's also WebRTC (that video chat protocol) and the geolocation API and ...
We really need to bring control of the browser back to the user, just as free Unix freed us from M$ domination.
(Score: 4, Touché) by Anonymous Coward on Wednesday February 15 2017, @05:55AM
just as free Unix freed us from M$ domination
Yep, free as in www.apple.com (iOS/BSD) and www.google.com (Android/Linux). Two companies that almost exclusively use Unix/Linux/BSD. They are the *very* definition of open, right?
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @03:47PM
"open" is your word.
Anyway, they are not M$.
(Score: 1) by butthurt on Wednesday February 15 2017, @09:36PM
OpenDarwin and the Android Open Source Project--possibly named by Apple and Google--come to mind.
https://en.wikipedia.org/wiki/OpenDarwin#OpenDarwin [wikipedia.org]
https://en.wikipedia.org/wiki/Android_Open_Source_Project#Open-source_community [wikipedia.org]
(Score: 3, Informative) by Wootery on Wednesday February 15 2017, @09:15AM
Not even a mention of WebGL, the technology behind most of the new developments.
"uses OS and hardware-level features" really doesn't cut it. SN is intended for techies.
The entire blockquote tells me nothing I didn't gather from the headline.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @11:08AM
Let's say your ISP installs a caching web proxy (like in ye olde days of 15 years ago to speed up the internet), would these methods fail?
(Score: 2) by Pino P on Wednesday February 15 2017, @02:22PM
If it's a traditional HTTP proxy, the user would have to configure the browser to connect to the proxy rather than directly to web sites. And now that HTTPS requests to public web sites exceed cleartext HTTP requests, as reported on Troy Hunt's blog [troyhunt.com] via the green site [slashdot.org], the majority of your traffic will fall back to a CONNECT method, which leaves caching with too small a hit rate to bother with.
Or did you specifically mean an intercepting caching web proxy? In that case, the user could avoid it by visiting primarily sites that use HTTPS, such as Google-owned sites, Facebook, Twitter, and anyone else who has a Let's Encrypt or SSLs.com certificate. If you try intercepting HTTPS like the ISP of Firefox bug 460374's reporter did [mozilla.org], and the user hasn't specifically enabled your proxy, the user will see certificate errors.
Or did you mean that the ISP would require all subscribers to install its MITM proxy's root certificate? That would make the tech news for sure.
(Score: 0) by Anonymous Coward on Wednesday February 15 2017, @03:10PM
I meant in the traditional sense, without TLS (I thought that was clear, and I know the user has to manually configure it). Would this fingerprinting technique fail if multiple people configured their browsers to use the proxy (even if the caching feature isn't used)?
(Score: 2) by Pino P on Wednesday February 15 2017, @11:47PM
I meant in the traditional sense, without TLS (I thought that was clear, and I know the user has to manually configure it).
If trends continue such that 90 percent of HTTP traffic uses TLS, anonymizing the 10 percent that doesn't won't help much by itself.
Would this fingerprinting technique fail if multiple people would configure their browser to use the proxy
A script can still observe the precise results of rendering and relay them back to the site. Or a site can exploit subtle differences in page layout that trigger the retrieval or non-retrieval of images and other resources. Barring that, a site can put everything past the abstract behind a "free reg. req." wall if it can't gather enough data to identify a user's demographic.
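That second trick works even with scripting restricted: font metrics decide whether a fixed-width paragraph wraps, wrapping changes the element's height, and the page fetches a different resource per outcome. A minimal sketch of the decision (the beacon URLs are hypothetical; in a browser the height would come from element.offsetHeight after rendering):

```javascript
// If font metrics push a paragraph past a wrap point, its height jumps;
// fetching a different beacon per bucket leaks that fact to the server
// in its access logs, no script-to-server channel required.
function probeBeacon(measuredHeightPx, wrapThresholdPx) {
  return measuredHeightPx > wrapThresholdPx
    ? "/probe/wrapped.gif"   // hypothetical beacon URLs
    : "/probe/unwrapped.gif";
}

console.log(probeBeacon(120, 100)); // → "/probe/wrapped.gif"
```

Dozens of such probes, one per candidate font or metric, add up to the same fingerprint a script would have collected directly.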
(Score: 2) by darnkitten on Thursday February 16 2017, @12:58AM
Is there anything effective at the moment?