

Chat GPT, generate me an image of an eyeball.


It might have changed but there is a setting for it now.
Pretty annoying that I'm only just learning that setting no signature did nothing, since they added a second signature option for sending from mobile and enabled it by default.


Is it stealing if that's why it's there?
Especially if you put a link in a comment for attribution.


I think the blog article you link implies you do not get your own VM. LLMs are stateless; the previous conversation is fed back in as part of the prompt.
You send your message, which is E2E encrypted. The LLM runs in an environment where it can decrypt your message and run it through the model, then send a response back to you. Then it takes the next user's message and replies to them.
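In code terms it's roughly this (a minimal sketch assuming an OpenAI-style chat endpoint; the URL and model name are placeholders, not whatever this service actually uses):

```python
# Sketch of why "the previous conversation is fed in as part of the prompt".
# The client keeps the history; the model sees it again on every request.
import requests

history = []

def ask(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    resp = requests.post(
        "https://example.invalid/v1/chat/completions",  # placeholder endpoint
        json={"model": "some-model", "messages": history},
        timeout=60,
    )
    answer = resp.json()["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    return answer
```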
The key part is that the LLM is running inside an encrypted environment not accessible to the host system, so no one can watch as it decrypts your message.
That’s what I get from reading your links.


If big tech are the issue, then try this robots.txt (yes, it's on GitHub…): https://github.com/ai-robots-txt/ai.robots.txt
My issue is with the scrapers pretending to be something they aren't. Tens of thousands of requests, spread across many IPs, mostly from China and Singapore but increasingly from South America.
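That repo is essentially a maintained list of AI crawler user agents, so if robots.txt gets ignored you can also enforce it at the application layer. A rough sketch (Flask here purely for illustration, the list is a small sample, and it obviously won't catch the ones spoofing browser user agents):

```python
# Refuse requests whose User-Agent matches known AI crawlers.
# Small sample list; ai.robots.txt maintains the full one.
from flask import Flask, abort, request

AI_CRAWLERS = {"GPTBot", "CCBot", "ClaudeBot", "Bytespider", "Amazonbot"}

app = Flask(__name__)

@app.before_request
def block_ai_crawlers():
    ua = request.headers.get("User-Agent", "").lower()
    if any(bot.lower() in ua for bot in AI_CRAWLERS):
        abort(403)  # crawlers that ignore robots.txt get a hard refusal

@app.route("/")
def index():
    return "hello"
```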


Well if you do, contribute it to Home Assistant and I'll install it 😆. It's actually a little surprising conversions aren't supported natively, but I guess there is a lot to cover and they'll get there eventually.


Home Assistant has an automation action that lets you set the conversation response, but you're already past my ability haha, so I can't tell you how to pull the result in from an external service.
It may well be worth building it as a Home Assistant integration rather than just custom-sentence-triggered automations.
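If you do go the integration route, the rough shape is a custom intent handler that calls out to whatever service and then sets the spoken reply (a sketch from memory, so treat the details as approximate; the intent name and hard-coded answer are made up):

```python
# Sketch of a custom intent handler inside a Home Assistant integration.
# "ConvertUnits" and the placeholder reply are illustrative only.
from homeassistant.core import HomeAssistant
from homeassistant.helpers import intent


class ConvertUnitsIntent(intent.IntentHandler):
    intent_type = "ConvertUnits"

    async def async_handle(self, intent_obj: intent.Intent) -> intent.IntentResponse:
        # ...call your external conversion service here...
        result = "That is about 250 grams"  # placeholder answer
        response = intent_obj.create_response()
        response.async_set_speech(result)
        return response


async def async_setup(hass: HomeAssistant, config: dict) -> bool:
    intent.async_register(hass, ConvertUnitsIntent())
    return True
```

You'd still need custom sentences pointing at the intent, but the reply can then come from wherever you like.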


Worth noting you need both the speaker part and the server part. Home Assistant sells both as ready to go out of the box, but you do need both parts.
It's also worth noting it's a Preview Edition, as in not yet consumer ready.
It works, but you will find quirks: things it can't do that you'd expect it to, and things it can do that others can't.
It’s also very customisable, if you’re a bit technical (honestly you don’t need to be that technical these days, it has come a long way).


Do you have a plan? I have a Home Assistant Voice Preview Edition and it’s great but I don’t think it can do unit conversions without connecting it to an LLM. Timers work locally.
I guess if it's a simple equation you could add an automation to pick up on the phrase and reply with the conversion, but that would need each unit to be done manually and wouldn't work for things like currency conversion that need live data.
Also arbitrary things would be challenging, like converting tablespoons of butter into grams or grams of rice into cups.
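To put the "done manually" part in concrete terms, the offline version is basically a hand-maintained lookup table, and the arbitrary stuff only works if you also hard-code rough densities (the numbers below are approximate and just for illustration):

```python
# Why offline conversions mean hand-maintained tables: every pair you want
# has to be entered, and ingredient conversions need weight-per-spoon data too.
UNIT_FACTORS = {
    ("inch", "cm"): 2.54,
    ("lb", "kg"): 0.4536,
    ("tbsp", "ml"): 14.79,  # US tablespoon
}

GRAMS_PER_TBSP = {
    "butter": 14.2,  # approximate
    "flour": 7.8,    # approximate
}

def convert(value: float, unit_from: str, unit_to: str) -> float:
    return value * UNIT_FACTORS[(unit_from, unit_to)]

def tbsp_to_grams(ingredient: str, tbsp: float) -> float:
    return tbsp * GRAMS_PER_TBSP[ingredient]

print(convert(3, "inch", "cm"))    # 7.62
print(tbsp_to_grams("butter", 2))  # ~28 g
# Currency needs a live rate, so it can't live in a static table like this.
```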


So far no one has mentioned this, but typically images or other uploads only exist on the original server. When lemm.ee went down, all the content those users uploaded was lost.
The text content of posts and comments is copied across all the linked servers, but the images aren't. Some instances will proxy images via a short-term cache, but it's far too expensive to store them all permanently.


Couldn't most people here do it in one? The code is in the post and the newlines can be removed, like in minified JavaScript.
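Just to illustrate the idea (in Python, nothing to do with the actual code in the post), most newlines are only there for readability:

```python
# Multi-line version...
x = 6
y = 7
print(x * y)

# ...and the same thing squashed onto one line, minified-JS style.
x = 6; y = 7; print(x * y)
```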


As someone with a public-facing website, I can say there are still significant volumes of scraping happening. But it largely appears to come out of South East Asia and South America, and they take steps to hide who they are, so it's not clear who is doing it or why; like you say, it doesn't appear to be OpenAI, Google, etc.
It doesn't appear to be web search indexing either; the scraping is aggressive, and the volume will bring down a Lemmy server no matter how powerful the hardware.


What's your solution to this problem for the rest of your digital life?


So testing on animals?


Yeah that’s a pretty good argument for it.


Why group it by language instead of, say, a 'web' directory or 'android'/'mobile'?
I'm just curious, I am more of a 'throw everything in one directory and hope I remember what I'm looking for' sort of organiser.


Multiple people in this topic say they organise into directories by programming language, something I have never considered, and for some reason I can't explain I find it an odd way of organising.
Where do you put a project with a JavaScript frontend and a Python backend?


Ah thanks!


Don't forget the Breezy live wallpaper, which shows a wallpaper based on the current weather.


Well, the containers are grouped into services. I'd easily have 15 services running; some run a separate Postgres or Redis while others use an internal SQLite, so it's hard to say (I'm not somewhere I can look rn).
If we’re counting containers then between Nextcloud and Home Assistant I’m probably over 20 already lol.