• jnod4@lemmy.ca · 9 points · 2 days ago

      Just put it in gay ppl and give them a shock everytime they have uncouth thoughts. We’d have so many more children to grow into mindless drones for the billionaires if everyone bred!

  • thatradomguy@lemmy.world · 6 points · 2 days ago

    They won’t need polys for Top Secret clearance anymore. They’ll just force you to get one of these and BAM.

  • hansolo@lemmy.today · 18 points · 3 days ago

    Making this a privacy issue is about a decade or two too early.

    These devices require a surgical implant and extensive training. There’s no “thoughtcrime” potential here. This is for giving paraplegic people, victims of neurological disease, and potentially non-verbal autistic people the ability to speak.

    The need to use a “public thought password” is because the AI training and system is not good enough yet to recognize the difference between internal monologue and thoughts directed at others. Likely because the people involved in the research have been rendered mute for so long they don’t know the difference anymore either.

    Sure, it means both come from the same source and both can be translated. But in some SciFi future world where the cops plonk a thought helmet on you, just do what I do all the time anyway, and think “boobs boobs boobs…” over and over.

    • marcie (she/her)@lemmy.ml · 5 points · 2 days ago

      i think the issue comes when the state needs information and is willing to surgically implant things in you and force you through a series of tests to establish a baseline.

      • hansolo@lemmy.today · 2 points · 2 days ago

        That’s expensive and takes work.

        Not sure if you’ve seen a government lately, but something taking work isn’t how it works. What’s the blunt, immediate, let the lawyers sort it out version? That’s what will be done.

        Let’s consider some post-9/11-style Abu Ghraib Prison scenario where captors try this instead of breaking someone with 24-hour music and waterboarding torture. This gets all your thoughts, right? OK, well, all my thoughts at that time would be “F you, F you, you’re all liars, I don’t believe anything you say, F you, go jump in the ocean and get eaten by sharks, F you, F you…” You have to think the thing they want you to think for this to work.

        Anyone anticipating that might be what could happen if detained should know how to control their thoughts. Sing the same 3 songs over and over in your head. Repeat lines from your favorite movies. Think in another language. Name things you can see in the room and think through the construction methods of the building. It will very, very quickly prove useless against anyone with even moderate willpower.

        A genuine sociopath would excel at using it to manipulate their jailers. If anything, the tables have turned and they’re forced to listen to everything you think - now you’re torturing them!

    • daydrinkingchickadee@lemmy.ml (OP) · 13 points · 3 days ago

      There’s no “thoughtcrime” potential here. This is for giving paraplegic people, victims of neurological disease, and potentially non-verbal autistic people the ability to speak.

      Good on you for looking at the upside I suppose. In my view, the framing of this technology as a “medical miracle” is a very convenient smokescreen.

      • hansolo@lemmy.today · 5 points · 3 days ago

        It genuinely is a medical miracle, though.

        The predictive nature of monitoring your internet and money use is less invasive, easier, cheaper, and overall good enough to build a law enforcement paradigm around, surveilling hundreds of millions of people. Thoughtcrime can be predicated on your searches. Why add extra steps?

          • hansolo@lemmy.today · 5 points · 3 days ago

            You’re missing the entire point. Directly reading thoughts is complicated, delicate, and expensive. This is high-end stuff.

            Meanwhile, in reality…

            Google already knows what you are thinking by tracking everything you do online. Well enough that the debate among advertisers from as far back as 2010 was how to not “spook the customer.” A large amount of the “my phone is listening to me!” panic is just the predictive nature of advertising that tracks everything you do.

            There’s 15 years of success in this already. Then - THEN 6 months ago Google started using browser fingerprinting to track everything everything everything you do, everywhere you go, everyone you talk to online. EV.AH.RE.th.ha.iiiiiing. Why spend money to get invasive and weird about it when you’ve got the data you need in high fidelity? Why do anything physical when you get what you need from people’s phones?

            • daydrinkingchickadee@lemmy.ml (OP) · 1 point · 1 day ago

              Why spend money to get invasive and weird about it when you’ve got the data you need in high fidelity? Why do anything physical when you get what you need from people’s phones?

              Same answer as before, to read minds. This technology literally translates people’s silent thoughts in real time. Does anything you’ve mentioned here do that? Sorry but no, not even close.

              • hansolo@lemmy.today · 1 point · 22 hours ago

                Do you seriously just not understand? Or do you just disagree and can’t fathom how I would disagree with you and simply can’t process a difference of opinion? I’ve been clear and consistent. I understand your point and don’t agree fully.

                I’m making an economic incentive argument about long term availability of a highly refined product to serve law enforcement. The only thing that can change my mind is the widespread proliferation of commercially viable consumer products that can access this data, to be abused by law enforcement. Which may happen, who knows.

                But just like the 23andMe privacy panic, it really hasn’t borne itself out, because commercial viability was limited to novelty. There were brain-training toys 15 years ago, and this was the conspiracy theory about them. That fizzled. In 2000-2005 people thought we would all have RFID chips implanted in us to buy things at the store, not realizing that the existing prevalence of credit cards and the soon-to-be prevalence of smartphones meant no one was right about that trend at all.

                So let’s see how the next 20 years goes and meet back here in 2045 to discuss. OK?

                • daydrinkingchickadee@lemmy.ml (OP) · 1 point · 19 hours ago

                  This technology is capable of translating a person’s silent thoughts in real time. I’m sorry but nothing you’ve discussed comes even close to that.

    • upstroke4448@lemmy.dbzer0.com · 2 points · 2 days ago (edited)

      Not sure how you can believe on one hand that there isn’t potential for “thoughtcrime” but on the other recognize that the technology has a hard time differentiating between internal monologues and thoughts directed at others.

      At best it just can’t recognize consent but at worst there will be a point where people can filter for internal thoughts. The more fine tuned this gets the easier it will become to abuse.

      Granted I agree with you that the prospect of this being a nefarious technology doesn’t seem imminent and could also have huge upsides if used in good faith.

      • hansolo@lemmy.today · 1 point · 2 days ago

        I’m trying to scream from the rooftops that the horse is already out of the barn. We’re PAST this point already with existing technology.

        Let’s say that it’s illegal to own a tiger where you live. You spend a month searching online “how to feed a tiger” “what does a tiger eat” “tiger bedding” and shopping for enclosures. Maybe you even buy tiger enclosure panels and bedding and a blinged-out tiger-sized collar. But never actually buy a tiger. Especially in the most sophisticated surveillance states, that gets close enough that you might get arrested. A crime prediction algorithm would put you as high risk for committing a crime, and if local legislation permits it, you might be charged with a crime you haven’t actually committed. “Conspiracy to endanger wildlife” even though the conspiracy is between you and Google.

        However, you’re telling me that I need to be worried about the far-flung future edge case where, after going through a surgical procedure, I spend 5 months training an AI tailored to my brain in particular, and then at that point I think “You know, I think I really want to get a tiger” and would be arrested. You’re going to die on the hill that it’s possible. Sure, maybe one day it will be less invasive and the training on your own brain will take hours instead of months. But for now, we have everything that’s needed to do this to a reasonable enough degree that it’s shocking to me how complacent people are.

        • upstroke4448@lemmy.dbzer0.com · 2 points · 2 days ago (edited)

          You can be concerned about a possible future issue and a current issue at the same time. To dismiss a possible issue because you are already concerned with something else seems silly. Maybe I’m misunderstanding what you’re getting at.

          • hansolo@lemmy.today · 1 point · 1 day ago (edited)

            We’re agreeing to disagree, and that’s ok. I don’t think that this will ever get to the point of the scifi plot device you think it will be. It’s too invasive and requires too much training to be effective.

            So it needs a lot of development and patience and literal brain surgery on willing subjects to test this specific use. Medical uses will follow their own niche, making this a point where clandestine research will have to fork it for their own purposes soon. The overhead is massive here, pardon the pun.

            Meanwhile, an effective method for getting close enough already exists and is used to prosecute people.

            Where is the incentive to spend years of research to develop this specialized tool further, tailored for the “precrime” purpose? Especially when it’s not just quick and easy, but cheap to do this already? There’s no urgency or need to do anything else. This doesn’t do anything we can’t already do.

            Edit: You’re suggesting something as invasive as the famously unsuccessful MK Ultra program. Meanwhile, money and blackmail remain widely used ways to manipulate someone. The bar is already set high by low-tech means.

            Ooooh, did you listen to the EFF podcast and that’s why you’re all about this?

  • irmadlad@lemmy.world · 5 points · 2 days ago (edited)

    I firmly believe that, given enough time and technology, and barring that we destroy ourselves like the dumbasses of the highest order that mankind is… I think we can achieve just about anything we can possibly think of. Which would be a beautiful picture if we weren’t so short-sighted.

    I have a brain injury from impact. Tho I am not a paraplegic or in a wheelchair, I have suffered some mental/neural maladies. I am pretty mobile, and I read a ton of content off the internet. So, I function with some deficiencies. Imagine those far worse off than I, like a paraplegic with significant brain/spinal damage. I always think, ‘I wonder if they have any kind of inner monologue?’

    In a general sense, we lose very few of our memories over our lifetime. Yes, there are caveats, but a majority still exist; it’s our retrieval system that lags. I realize it’s a lot more complicated than this simplistic explanation. If we could somehow tap into that lost, archived data bank, it would open up a whole litany of issues. Solving crimes comes to mind.

  • Zerush@lemmy.ml · 12 points · 3 days ago

    “You have the right to remain silent, because everything you think can be used against you in a trial…”

  • cerebralhawks@lemmy.dbzer0.com · 6 points · 3 days ago

    Thoughtcrime issues? You have thoughts about harming someone, you get punished for it even if you don’t take action?

    I suppose the upshot is it could be used to detect and diagnose mental illnesses?

    Furthermore, going into the future, it could ostensibly be used to control parts of the body that are damaged or otherwise not working, or emulate their function. For example, someone with damaged vocal cords could use it to speak through connected speakers. Someone who is paralysed could use it to walk with a mechanical exoskeleton.

    The problem with something like that is, it would have to be privacy focused. Samsung, one of the most popular smartphone makers, updated their Health app a year or two ago to where you had to agree to allow them to sell or give away your medical information if you want to continue to use Samsung Health. And that’s a Korean company. Apple Health is still private, but Apple is an American company, so the question is begged, “but for how long?”.

    So, the question is, are the powers that be/fascists in charge going to use it to weed out LGBTQ+ and put them in concentration camps, or what?

  • latenightnoir@lemmy.blahaj.zone · 6 points · 3 days ago (edited)

    Sorry to be THAT guy, but… Well, Technically™, this idea has been analysed for, I dare say, centuries at this point.

    Edit: and in many (not all, but very, very many) scenarios, it ends up being a profoundly shitty idea.

    Not against actual medical use, but unless absolutely needed, it should be a “fuck no.” Imho.

    Edit 2/unprompted musings: I think we, as a species, are simply too immature for such technologies. We’re still at the “age” where the first thing we’ll do with it would be to figure out how to best weaponise it.

    Because there are ways in which this might actually help if we place some heavy-duty limits and restrictions on the tech itself - I’m thinking about how much it might facilitate comprehension in a scenario where one could transmit the unmolested concept directly, instead of butchering it through language first. But that’s still such a hhhhyooooj responsibility, that I simply don’t think we could handle as we are now collectively.

    • ScoffingLizard@lemmy.dbzer0.com · 2 points · 3 days ago

      This makes me wonder if there is a silver lining to the fact that education is going down the shitter. Science is starting to destroy society. I would rather just be dumb and farm shit.

      • Thorned_Rose@sh.itjust.works · 5 points · 3 days ago

        It’s not science that’s destroying society - it’s the misuse of science and, worse, greed that wears a veneer of science, or rather, $cience.

        It’s the usual story of capitalism, oligarchy and big corp ruining everything for everyone but themselves.

          • latenightnoir@lemmy.blahaj.zone · 0 points · 2 days ago

            Thing is, technology just… is, neither bad nor good. The principles behind every single technological breakthrough have always existed, we just figure out how they work.

            Like a knife, science is a tool. The ones who misuse the tool (i.e. us) are to blame for the damage done, not the tool itself.