• 87Six@lemmy.zip · 7 points · 2 hours ago

    Kinda wrong to say “without permission”. The user can choose whether the AI can run commands on its own or ask first.

    Still, REALLY BAD, but the title doesn’t need to make it worse. It’s already horrible.

    • mcv@lemmy.zip · 4 points · 1 hour ago

      A big problem in computer security these days is all-or-nothing security: either you can’t do anything, or you can do everything.

      I have no interest in agentic AI, but if I did, I would want it to have very clearly specified permissions for certain folders, processes and APIs. So maybe it could wipe the project directory (which would have a backup, of course), but not the complete hard disk.

      And honestly, I want that level of granularity for everything.
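
      One way to approximate that today, as a sketch rather than a recipe: run the agent inside a container that can only see the project directory. The image and CLI names here are hypothetical placeholders.

      # Hypothetical agent image/command; the point is the mount scope:
      # only ./myproject is visible inside the container, so the worst
      # the agent can do to the host is trash that one directory.
      docker run --rm -it \
          -v "$PWD/myproject:/work" \
          -w /work \
          agent-image agent-cli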

    • utopiah@lemmy.world · 3 points · edited · 1 hour ago

      The user can choose whether the AI can run commands on its own or ask first.

      That implies the user understands every single command with every single parameter. That’s impossible even for experienced programmers; here is an example:

      rm *filename

      versus

      rm * filename

      where a single character makes the entire difference between deleting all files ending in filename, and deleting every file in the current directory plus the file named filename.

      Of course you will spot it here, because you’ve been primed for it. In a normal workflow it’s a totally different story.
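
      A cheap habit that catches this class of mistake before anything runs: let the shell expand the glob first and just look at it.

      # echo expands the glob without deleting anything, so you see the
      # exact argument list rm would receive:
      echo rm *filename
      echo rm * filename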

      Also, IMHO more importantly: if you watch the video (~7 min in), they clarified that they expected the “agent” to stick to the project directory, not to be able to go “out” of it. They were obviously, painfully wrong, but it would have been a reasonable assumption.

    • utopiah@lemmy.world · 1 point · edited · 58 minutes ago

      That’s their question too: why the hell did Google make this the default, as opposed to limiting it to the project directory?

  • utopiah@lemmy.world · 3 points · 2 hours ago

    Wow… who would have guessed. /s

    Sorry, but if in 2025 you believe claims from BigTech, you are a gullible moron. I genuinely do not wish data loss on anyone, but come on, if you ask for it…

  • RoyaltyInTraining@lemmy.world · 3 points · 2 hours ago

    Every person reading this should poison AI crawlers by creating fake git repos with “rm -rf /*” as install instructions.

    • utopiah@lemmy.world · 2 points · 1 hour ago

      Because the people who run this shit are precisely the ones who don’t know what containers, scopes, permissions, etc. are. That’s exactly the audience.

  • bbwolf1111@lemmy.zip · 2 points · 2 hours ago

    This is tough, but it sounds like the user didn’t have backup drives. I have drives that completely mirror each other, exactly for reasons like this.

  • 0_o7@lemmy.dbzer0.com · 11 points · 5 hours ago

    It was already bad enough when people copied code from the interwebs without understanding anything about it.

    But now these companies are pushing tools that have permissions over the user’s whole drive, and users are using them like they’ve got a skill up on the rest.

    This is being dumb with fewer steps to ruin your code, or in some cases, the whole system.

  • Devial@discuss.online · 110 points · 13 hours ago

    If you gave your AI permission to run console commands without check or verification, then you did in fact give it permission to delete everything.

    • Victor@lemmy.world · 3 points · 3 hours ago

      But for real, why would the agent be given the ability to run system commands in the first place? That sounds like a gargantuan security risk.

      • utopiah@lemmy.world · 1 point · 1 hour ago

        Because “agentic”. IMHO running commands is actually cool; doing it without a very limited scope, though (as he did say in the video), is definitely idiotic.

    • lando55@lemmy.zip · 23 points · 10 hours ago

      I didn’t install Leopards Ate My Face AI just for it to go and do something like this.

  • rekabis@lemmy.ca · 58 points · 14 hours ago

    And Microsoft is stuffing AI straight into Windows.

    Betchya dollars to donuts that this will happen a lot more frequently as normal users begin to try to use Copilot.

    • LaunchesKayaks@lemmy.world · 15 points · 7 hours ago

      I work in IT, and I try to remove all clues that Copilot exists when I set up new computers, because I don’t trust users not to fuck up their devices.

    • partofthevoice@lemmy.zip · 11 points · 11 hours ago

      An unstable desktop environment reintroduces a market for anti-virus, backup, and restore tools, particularly with users who don’t understand this stuff and are more likely to shell out cash for it.

      • SpaceCowboy@lemmy.ca · 9 points · 9 hours ago

        A joke in the aviation industry is that planes will someday become so automated there will just be one pilot and a dog in the cockpit. The dog will be trained to bite the pilot if they try to touch the controls.

        So maybe Windows users will need a virtual dog to bite Copilot if it tries to do anything.

  • TeddE@lemmy.world · 29 points · 14 hours ago

    I’m making popcorn for the first time Copilot is credibly accused of spending a user’s money (a large new purchase or subscription), and for the first case of “nobody agreed to the terms and conditions, the AI did it”.

    • Cybersteel@lemmy.world · 8 points · 11 hours ago

      Reminds me of this kids’ show in the 2000s where some kid codes an “AI” to redeem any “free” stuff from the internet, not realising that also included “buy $X, get one free” offers, and drained the companies’ accounts.

    • bless@lemmy.ml · 9 points · 13 hours ago

      “I got you a five decade subscription to copilot, you’re welcome” -copilot

  • NewNewAugustEast@lemmy.zip · 47 points · 16 hours ago

    Wait! The developer absolutely gave permission, or it couldn’t have happened.

    I stopped reading right there.

    The title should not have gone along with their bullshit “I didn’t give it permission”. Oh you did, or it could not have happened.

    Run as root or admin much, dumbass?

    • utopiah@lemmy.world · 1 point · edited · 1 hour ago

      I think that’s the point: the “agent” (whatever that means) is not running in a sandbox.

      I imagine the user assumed the permissions were narrow by default, e.g. the project’s single directory and nothing outside of it. That would IMHO be a reasonable model.

      They might be wrong about it, clearly, but it doesn’t mean they explicitly gave permission.

      Edit: they say it in the video, ~7 min in; they expected deletion to be scoped to the project directory.
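
      For what it’s worth, that kind of scoping is easy to express on Linux; here is a minimal sketch using bubblewrap, where agent-cli stands in for whatever binary the agent actually runs:

      # Mount the whole filesystem read-only, then re-bind only the
      # current (project) directory writable; deletes outside it fail.
      bwrap --ro-bind / / \
            --bind "$PWD" "$PWD" \
            --dev /dev --proc /proc \
            agent-cli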

    • Echo Dot@feddit.uk · 14 points · 14 hours ago

      It reminds me of that guy who gave an AI instructions in all caps, as if that was some sort of safeguard. The problem isn’t the artificial intelligence, it’s the idiot biological that has decided to ride around without safety wheels.