Google agrees to settle Chrome incognito mode class action lawsuit
2020 lawsuit accused Google of tracking incognito activity, tying it to users’ profiles.

  • whatwhatwhatwhat@lemmy.world · 11 months ago

    I actually saw a video once where the argument was that phones aren’t listening. Rather, Google (and Meta and the like) have so many other data points on you that they don’t need to listen. Listening to you would be far less efficient and far less insightful than relying on their vast network of other data they have on you. Even if you don’t use a single Google product, you’re still not safe.

    Reminds me of the story where Target knew a customer was pregnant before she did. They started sending her ads for pregnancy/baby products before she even knew she was pregnant, all because they had so much data on her.

    In my opinion, this is way more terrifying and problematic than if they were listening to us.

    • paraphrand@lemmy.world · 11 months ago

      In my opinion, this is way more terrifying and problematic than if they were listening to us.

      Exactly. This is what I try to explain to people when they bring up the listening.

      Somehow it’s difficult for many to comprehend this. They find the listening easier to understand.

      • whatwhatwhatwhat@lemmy.world · 11 months ago

        I described it to my dad like this: “They don’t need to listen to your conversations because they’re already able to simulate your thoughts.”

        Kinda a stretch, but it worked for him.

        • vexikron@lemmy.zip · 11 months ago

          Shhh shh shh, you can’t just tell people that the vast majority of them are so predictable in so many ways that most of their life choices can be determined to reasonable degrees of accuracy such that they functionally /are/ NPCs to those with decent models and a huge training dataset.

          It’ll upset them.

    • thejml@lemm.ee · 11 months ago

      This is really what’s likely happening. Running a microphone + speech decoding 24x7 on a device with limited battery and limited, metered bandwidth is quite a proposition, especially when there are so many ways to prevent microphone usage (not to mention how wrong Siri and Google and Alexa get things). It’s far easier to just gather data that people willingly provide and extrapolate.
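      A quick back-of-envelope calculation makes the bandwidth point concrete. The figures are assumed for illustration (a speech codec like Opus at 16 kbit/s, a 30-day month), not measurements from any real device:

```python
# Back-of-envelope: data cost of streaming a phone's microphone 24/7.
# Assumed, illustrative figures: compressed speech at 16 kbit/s, 30-day month.
BITRATE_BPS = 16_000            # compressed speech, bits per second
SECONDS_PER_DAY = 24 * 60 * 60

bytes_per_day = BITRATE_BPS / 8 * SECONDS_PER_DAY
gb_per_month = bytes_per_day * 30 / 1e9

print(f"{bytes_per_day / 1e6:.0f} MB/day, {gb_per_month:.1f} GB/month")
```

      Even heavily compressed, that is gigabytes of upload per month, which would be hard to hide on a metered data plan, before counting the battery cost of always-on capture and encoding.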

    • vexikron@lemmy.zip · 11 months ago

      Oh I agree that is happening … as well.

      See my argument for /the phones really are listening and watching/ is pretty simple.

      Parental Control mode.

      A phone with parental control mode on does NOT let the user know it is on.

      And it can listen, know your location, see what you are doing, etc.

      It will even lie to the user and tell them location tracking is disabled… when it actually is not.

      It will even lie to the user and tell them processes are disabled when they are not: clear the cache, disable the thing, turn off wifi, and the process will update and re-enable itself unless you’ve pulled out the SIM card or turned airplane mode on. But then your phone isn’t a phone any more, and they all reactivate once you let it be a phone again.

      The only way you can get a hint this is going on is by installing an app that lets you monitor per-app, per-process network activity in detail.

      If the OS has the capability to do these things and lie to the user about it… only showing up as strange bursts of google and android services firing off network activity when there shouldn’t be any…

      … then we know the OS has the ability to listen to you. Voice recognition isn’t perfect for many reasons, but all your phone has to do is hear enough key words and make them out, and then fire off a little network burst of the key words to something like AdSense, or via a closed-source API directly into an app.
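      A rough sketch of why such a burst would be easy to miss in normal traffic — everything here (the keyword list, function name, and payload shape) is hypothetical and illustrates the idea, not any real API:

```python
# Hypothetical illustration of the "keyword burst" idea: instead of uploading
# audio, match an on-device transcript against a keyword list and send only a
# tiny payload of hits. The keyword set and payload format are made up.
import json

AD_KEYWORDS = {"vacation", "mortgage", "sneakers", "pregnant"}

def keyword_burst(transcript: str) -> bytes:
    """Return a compact payload of ad-relevant keywords heard, or b'' if none."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    hits = sorted(words & AD_KEYWORDS)
    return json.dumps({"kw": hits}).encode() if hits else b""

payload = keyword_burst("We were talking about a vacation and new sneakers.")
print(payload, len(payload), "bytes")  # a few dozen bytes
```

      A payload that small blends into the constant background chatter of system services, which is why only detailed per-process network monitoring would surface it.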

      Then all that has to happen is for tech companies to lie to their users and assume either that they will never be definitively caught, or that if they are caught, they’ll survive the fine — and to use other market tactics to keep competitors who do not spy on users from ever gaining market share, including just buying them out.

      Google gets money from selling your transcribed voice data directly via AdSense, and from selling it to apps that access the secret API to serve their own ads; other apps benefit from increased ad accuracy, as well as other info perhaps more pertinent to them specifically.

      To me it’s similar to the argument around most anti-cheat engines in games: nearly all of them are closed-source, networked rootkits. It’s why to this day many online games cannot be played on Linux: devs often seemingly just assume that Linux players are hackers by default. Never mind that this is not true at all, and that the vast, vast majority of actual hacks exploit vulnerabilities in Windows and are sold on shady forums and darknet sites.

      What you end up with is the current anti-cheat paradigm at many large game studios: forcing PC players to use anti-cheat software that causes massive performance issues and has full kernel-level access to your system, and you can’t reveal anything about the code because then the hackers would know how to hack better! … Even though anti-cheat software does not actually stop hacks effectively in any decently popular game for very long.

      Again, the argument is: these are tech companies that would gain significantly by knowing as much about their players as possible for market research, and we know the code is capable of spying.

      Basically: motive, opportunity, and a murder weapon present at the scene — but it cannot be determined whether it was actually used, because the suspect will not let anyone examine it. The suspect has a track record of convictions for killing people with similar weapons, and would stand to gain financially from committing the murder.

      Obviously not a perfect analogy, but maybe reframing it like that gives a bit different perspective?

      Absolute proof? No. Extremely suspicious and plausible? Yes.