
posted by hubie on Wednesday March 29, @07:57PM
from the pixels-pixels-everywhere dept.

State governments might be inadvertently helping a Chinese-owned app with data collection:

More than two dozen state government websites contain web-tracking code made by TikTok parent ByteDance Ltd., according to a new report from a cybersecurity company, illustrating the difficulties U.S. regulators face in curtailing data-collection efforts by the popular Chinese-owned app.

A review of the websites of more than 3,500 companies, organizations and government entities by the Toronto-based company Feroot Security found that so-called tracking pixels from the TikTok parent company were present in 30 U.S. state-government websites across 27 states, including some where the app has been banned from state networks and devices. Feroot collected the data in January and February of this year.

[...] Site administrators usually place such pixels on the government websites to help measure the effectiveness of advertising they have purchased on TikTok. It helps government agencies determine how many people saw an ad on the social-media app and took some action—such as visiting a website or signing up for a service. The pixels' proliferation offers another vector for data collection beyond TikTok's popular mobile app, which is increasingly under fire in Washington as a possible way for the Chinese government to collect data on Americans.

[...] "Like other platforms, the data we receive from advertisers is used to improve the effectiveness of our advertising services," a TikTok spokeswoman said in a statement. "Our terms instruct advertisers not to share certain data with us, and we continuously work with our partners to avoid inadvertent transmission of such data."

[...] Tracking pixels, also called web beacons, are ubiquitous on commercial websites. The free bits of software code are intended to support digital marketing and advertising by logging a visitor's interactions with the site, such as what is clicked on and the duration of a visit.

While the web-tracking pixels ostensibly aim to better pinpoint advertising, they also pose threats for privacy, security experts have said. They can sometimes be configured to collect data that users enter on websites, such as usernames, addresses and other sensitive information. With enough pixels on enough websites, the companies running them can begin to piece together the browsing behavior of individual users as they move from domain to domain, building detailed profiles on their interests and online habits.

[...] Beyond TikTok, Feroot also found tracking pixels from Chinese-owned companies such as Tencent Holdings Ltd., which owns WeChat, Weibo Corp., and Alibaba Group Holding Ltd. on some state-government websites, as well as Russian-owned pixels from companies such as cybersecurity company Kaspersky, which had its products banned from civilian and military federal U.S. networks during the Trump administration due to espionage fears.

[...] Feroot found that the average website it studied had more than 13 embedded pixels. Google's were far and away the most common, with 92% of websites examined having some sort of Google tracking pixel embedded. About 50% of the websites the firm examined had Microsoft Corp. or Facebook pixels. TikTok had a presence in less than 10% of sites examined.

Privacy advocates have long raised concerns about the proliferation of pixels, whatever their provenance. Alan Butler, the executive director of the Electronic Privacy Information Center, said the data can be used to identify individuals, track them physically and digitally, and subject them to common cybersecurity threats, such as phishing attempts and disinformation.

"Any social media platform, data broker, or ad service that is using tracking pixels to monitor people's browsing across the web is violating the privacy of users visiting those websites," Mr. Butler said. "This is especially troubling on government websites where individuals are being tracked even as they try to access information and services that are essential."

I'm sure it's fine...


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 2) by gznork26 on Wednesday March 29, @08:25PM (7 children)

    by gznork26 (1159) on Wednesday March 29, @08:25PM (#1298707) Homepage Journal

    It's been a while since I had my fingers in website code, so I'm not up on the methods currently in use. TFS doesn't speak about how they operate, but I suspect there's an included bit of code that reports user details to the mothership when the server sends particular page content to a user's browser. What's the method used to get that code on the server, and what can be done to find and remove the code?

    • (Score: 3, Interesting) by Barenflimski on Wednesday March 29, @08:36PM (2 children)

      by Barenflimski (6836) on Wednesday March 29, @08:36PM (#1298708)

      I 2nd this question.

      The last time I put together a site a couple of years ago, it was clear how I ended up with google trackers all over it.

      It isn't clear to me where a TikTok tracker would come from. Would love to know what app or plugin folks would add for that. I'm assuming this was intentional by some developer?

      • (Score: 2) by mcgrew on Wednesday March 29, @11:57PM (1 child)

        by mcgrew (701) <publish@mcgrewbooks.com> on Wednesday March 29, @11:57PM (#1298752) Homepage Journal

        I'm not sure what you call "google trackers," web bugs that are housed on Google's servers? If so, your evil host put them there, or tricked you into doing it for them, maybe with a visitor counter or some other javascript.

        If you mean spiders, they leave behind less information than a visitor with a browser does. A spider just tells a search engine what your site says.

        --
        Carbon, The only element in the known universe to ever gain sentience
        • (Score: 3, Informative) by RS3 on Thursday March 30, @04:00AM

          by RS3 (6367) on Thursday March 30, @04:00AM (#1298785)

          There are many kinds of trackers, but a common one is a simple one-pixel image (usually .gif) that's hosted by the tracking site's server.

          Every time you visit a site or page that includes, for example, a link to a google tracking pixel, google's web servers log your access, along with whatever data they can glean from your browser, generally called the "user agent" or browser "fingerprint".
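
          To make that concrete, here's a minimal sketch of how such a beacon typically gets fired from a page. The tracker hostname and query parameters are made up for illustration; the point is that the request itself hands the tracker your IP address, User-Agent, the page you were on (via the Referer header), and any cookies previously set for the tracker's domain.

          // Hypothetical tracking-pixel request (TypeScript/JavaScript).
          // "tracker.example" stands in for whatever analytics or ad company
          // the site owner signed up with.
          const beacon = new Image(1, 1);
          beacon.src =
            "https://tracker.example/pixel.gif" +
            "?event=pageview" +
            "&page=" + encodeURIComponent(location.href) +
            "&ref=" + encodeURIComponent(document.referrer);
          // Nothing visible renders; the 1x1 GIF exists only so the browser
          // makes the request and the tracker's server can log it.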

          Your browser might have a very unique "fingerprint". As I posted below, you can test your browser. A really good test site is: https://coveryourtracks.eff.org [eff.org]

          It will test your browser / computer and tell you how unique your browser's "fingerprint" is. Unique is very bad. The more unique it is, the easier for someone like google to cross-correlate your browser's fingerprint in their logs and know all the sites you've visited.

          My browser setup passes the tests with flying colors, except for one bug: my browser gives a very unique but very incorrect screen resolution. I'll figure that out and fix it, but generally my browser (Vivaldi) is blocking all trackers and most if not all ads (ignorance is bliss dept: I don't know what I don't know. :)

    • (Score: 4, Informative) by RS3 on Wednesday March 29, @09:39PM (1 child)

      by RS3 (6367) on Wednesday March 29, @09:39PM (#1298717)

      Some of the data comes from the browser <-> server interaction. I don't have time right now, but you can search for sites that test your browser's security, what it reveals, etc. Search on something like "browser security audit" or some such. And that's just the browser's self-identifying data. You can go much deeper with tests that run extensive javascript which can dig into all kinds of stuff.
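
      As a rough illustration, here's the sort of self-identifying data an ordinary script can read with no special permissions (all standard browser APIs; it's the combination, not any single value, that makes a fingerprint):

      // Minimal fingerprinting sketch: each value is mundane on its own,
      // but together they can be unique enough to recognize a browser
      // across unrelated sites.
      const fingerprint = {
        userAgent: navigator.userAgent,                              // browser + OS string
        languages: navigator.languages,                              // preferred languages
        screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,
        timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
        cpuCores: navigator.hardwareConcurrency,
        touchPoints: navigator.maxTouchPoints,
      };
      console.log(fingerprint);                                      // roughly what a test site reports back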

      As such, I generally browse a sketchy site with Old Opera, with javascript and cookies turned off. Maybe I'll look at the source, watch IP traffic to see what the site wants to do, and see how many 3rd-party sites the page is trying to communicate with (Old Opera easily gives a list of servers the site is trying to link to).
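
      You can get a similar list without watching IP traffic, straight from the browser's own resource-timing data. A minimal sketch (paste into the developer console on any page):

      // Enumerate every host this page pulled resources from (scripts, images,
      // pixels, frames). Anything that isn't the page's own origin is a third
      // party that has, at minimum, seen your IP address and User-Agent.
      const hosts = new Set(
        performance.getEntriesByType("resource")
          .map((entry) => new URL(entry.name).hostname)
      );
      hosts.forEach((host) => {
        if (host !== location.hostname) {
          console.log("third-party:", host);
        }
      });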

    • (Score: 3, Informative) by hopdevil on Wednesday March 29, @10:29PM

      by hopdevil (3356) on Wednesday March 29, @10:29PM (#1298723)

      These tracking things can get installed in many ways, but typically it is intentionally done by the developers of the website (at the behest of the marketing dept or such). Individual trackers (pixel/JS) could be installed, but more likely they come as part of a snippet of JS included in the website source which then loads a whole suite of other bullshit. The site is essentially handing the tracking company code execution in the browser of anyone who happens across the website.
      Like the article says, this is done to measure the effectiveness of advertising... by literally tracking what your journey was to get to the website and what you do on it. But that is only the beginning of what the trackers can do, and they certainly take liberties by going all the way once you install (include) them.
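
      For what it's worth, the loader snippets described above usually look something like this sketch (the vendor host and site ID are hypothetical). The site owner pastes it once, and from then on it runs whatever code the vendor currently serves, with full access to the page:

      // Typical shape of a third-party "tag" loader. It injects a <script>
      // element pointing at the vendor's server; that script can then load
      // pixels, set cookies, and watch everything the visitor does on the page.
      (function () {
        const s = document.createElement("script");
        s.async = true;
        s.src = "https://tags.vendor.example/loader.js?id=SITE-1234"; // made-up vendor URL
        document.head.appendChild(s);
      })();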

    • (Score: 2) by mcgrew on Wednesday March 29, @11:52PM

      by mcgrew (701) <publish@mcgrewbooks.com> on Wednesday March 29, @11:52PM (#1298750) Homepage Journal

      It's an app, not a website. Apps are virtually unlimited; trust is as paramount as with any other computer program.

      --
      Carbon, The only element in the known universe to ever gain sentience
  • (Score: 5, Insightful) by Runaway1956 on Wednesday March 29, @10:35PM (2 children)

    by Runaway1956 (2926) Subscriber Badge on Wednesday March 29, @10:35PM (#1298724) Homepage Journal

    The real problem is that ANYONE can track you on a government website. Google shouldn't be able to do it. Or Apple, or Amazon, or Microsoft, or - fill in the blank with whichever tech company you want to hit on.

    When I visit a .gov site, ALL INFORMATION should be extremely confidential. In fact, the government's definition of confidential isn't really quite high enough - it should be secret. Sharing personal information from or through a government website should be punishable by a couple years in prison. Forget about fines, I want to see prison time.

    This is something that congress needs to address. Seriously - who in the fuck authorized any of these tech companies to harvest data through government websites?

    On a related note . . . some congress members are all over the ATF right now, for assuming authorities which the ATF does not have. Watch for a journal post soon.

    --
    Abortion is the number one killer of children in the United States.
    • (Score: 2) by Gaaark on Wednesday March 29, @11:36PM (1 child)

      by Gaaark (41) Subscriber Badge on Wednesday March 29, @11:36PM (#1298742) Journal

      Yep: why are they allowing social media on government sites? They should be locked down as tight as possible. (But so should banking/financial sites as well, so go figure.) Damn, we are a pathetic species: we're so hooked on social media that we let it pwn our data with our permission, and nothing is done about it....

      --
      --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
      • (Score: 2) by RS3 on Thursday March 30, @03:19AM

        by RS3 (6367) on Thursday March 30, @03:19AM (#1298778)

        Disclaimer: sorry if I'm cynical or worse here. Over the years I've used this and that federal and state government website, including the state DOT. From the site design, and from looking at the page code, my take is that there are people, web developers among them, ripping off the government. Sadly, the various govt. agencies trust the developers, usually outside companies / contractors, to do a good job. What I often find is a developer's playground: getting paid to play with unnecessary stuff, all to pad their CV.

        One site in particular required Adobe Flash for state assistance (medicaid, food stamps, welfare, etc.; I was helping someone). Several thoughts came to mind: 1) the required version of Flash needed a fairly decent / recent computer and OS, and what poor person can afford a new computer? 2) Flash was pretty insecure. 3) Flash was proprietary, not clear text or html. 4) Flash was going to be deprecated soon. 5) The Flash app was clumsy, awkward, not forgiving, and couldn't go backward to a previous page. 6) The obvious: the needs tests could and should have easily been done in html.

        All that aside, many (most?) webpage generators, especially webpage-based editor/generators, automatically add the various trackers.

        Years ago I was required to use Adobe Dreamweaver (now I hear a song in my head) and it made buggy code and embedded some kind of something that tracked back to Adobe. I always hand edited the crappy pages before posting/publishing them.

  • (Score: 2) by Mojibake Tengu on Wednesday March 29, @11:48PM (1 child)

    by Mojibake Tengu (8598) on Wednesday March 29, @11:48PM (#1298748) Journal

    Loading any non-origin content into the web page is an abyssal design defect of the browser protocol architecture.

    It is impossible to secure anything with that mechanical flaw, whatever meaning of security you venerate.
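
    A site operator who cares can at least opt out of non-origin loads with a Content-Security-Policy header. A minimal sketch using Node's built-in http module (the page body is just a placeholder):

    // Sending "default-src 'self'" tells the browser to refuse scripts,
    // images (tracking pixels included), frames, etc. from any origin
    // other than the site's own.
    import * as http from "http";

    http.createServer((_req, res) => {
      res.setHeader("Content-Security-Policy", "default-src 'self'");
      res.setHeader("Content-Type", "text/html");
      res.end("<p>No third-party trackers will load on this page.</p>");
    }).listen(8080);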

    Did you know there is a special cursed Hell Ring readily created, attended only by the damned Netscape/Mozilla engineers?

    --
    The edge of 太玄 cannot be defined, for it is beyond every aspect of design
    • (Score: 5, Interesting) by RS3 on Thursday March 30, @03:33AM

      by RS3 (6367) on Thursday March 30, @03:33AM (#1298782)

      All good and correct points. To be fair, a ton (far too much) of OS and code design is overly optimistic and trusting that nobody will do wrong things.

      All that said, there are more and more browsers which can block any 3rd-party URL/URN/URI accesses, and in some cases you can whitelist any you trust.

      I still run Old Opera (NOT chrome-based) and one of the many things I like about it: it has site blocking, blacklists and whitelists for URLs, cookies, per-site javascript on or off, etc. For sure some sites won't load because it doesn't have the latest TLS level (1.2 is max). Some will load but get badly screwed up due to the older css rendering engine, but you can turn css off and see a flat html rendering. But it's much safer than most current browsers, so it's my main browser, with a heavily customized / configured Vivaldi for the rest.

      Last summer I suddenly had huge problems with ebay / paypal. Many phone calls and chats mostly resulted in the typical "blame the user" crap, with most of them trying to get me to "link my bank account". That's not going to happen!! None were smart enough to see the technical details: the new version of Vivaldi had added very strong built-in tracker and ad blocking. I didn't even know it was there. It's on a per-site basis. None of the geniuses at ebay or paypal could figure it out, but I somehow noticed it, turned it off for ebay and paypal, and everything worked. I tried to let them know what I found, and the response was <crickets>.
