I’m beautiful and tough like a diamond…or beef jerky in a ball gown.

  • 154 Posts
  • 344 Comments
Joined 8 months ago
Cake day: July 15th, 2025


  • Audio transcribing should be the little “waveform” icon at the right of the text input.

    Image generation I’m not sure about, as that’s not a use case I have, and I don’t think the small-ish models I run are even capable of it.

    I’m not sure how audio transcribing works in OpenWebUI (I think it has built-in models for that?), but image generation is a “capability” that needs to be both part of the model and enabled in the model’s settings (Admin => Settings => Models).
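To illustrate the idea of gating features on model capabilities, here's a minimal Python sketch. The JSON shape and the `capabilities` field are assumptions for illustration only, not OpenWebUI's actual schema:

```python
# Hypothetical model-metadata records; the "capabilities" list is an
# assumed shape for illustration, not a documented OpenWebUI schema.
def supports_capability(model_info: dict, capability: str) -> bool:
    """Return True if the model's metadata lists the given capability."""
    return capability in model_info.get("capabilities", [])

text_model = {"name": "llama3", "capabilities": ["completion"]}
image_model = {"name": "sdxl", "capabilities": ["completion", "image_generation"]}

print(supports_capability(text_model, "image_generation"))   # False
print(supports_capability(image_model, "image_generation"))  # True
```

The point is just that the UI only offers a feature when both the model advertises it and the admin has enabled it.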

  • 🤚 Totally guilty there.

    I wish there were a way to mute or turn off replies; if there were, I might post more. Sometimes (often, really) I’ll want to post something but definitely do not want to be bombarded with the comments it would generate. Other times, I’ll like a community but not be involved enough in the hobby to post anything myself, while still enjoying other people’s work (e.g., ham radio, sewing, etc.).

  • Disclaimer: all of my LLM experience is with local models in Ollama on extremely modest hardware (an old laptop with Nvidia graphics), so I can’t speak to the technical reasons the context window isn’t infinite, or at least larger, on the big players’ models. My understanding is that the context window is basically the model’s short-term memory. In humans, short-term memory is also fairly limited in capacity, but unlike humans, the LLM can’t really see (or hold) the big picture in its mind.

    But yeah, everything you said is correct. Expanding on that, if you try to get it to generate something long-form, such as a novel, it’s basically just generating endless chapters, using the previous chapter (or as much of the history as fits into its context window) as the reference for the next. That means, at minimum, it’s going to be full of plot holes and will never reach a conclusion unless explicitly directed to wrap things up. And even then, given the limited context window, the ending will be full of plot holes and essentially based only on the previous chapter or two.
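The mechanism above can be sketched in a few lines of Python: keep only the most recent history that fits in a fixed context budget, and everything older simply falls out of "memory." Token counts here are crudely approximated by word count, which is an oversimplification (real models use subword tokenizers):

```python
def trim_history(messages: list[str], budget: int) -> list[str]:
    """Keep only the most recent messages whose combined (approximate)
    token count fits within the context budget; older ones are dropped."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):   # walk from newest to oldest
        cost = len(msg.split())      # crude token estimate: word count
        if used + cost > budget:
            break                    # everything older is "forgotten"
        kept.append(msg)
        used += cost
    return list(reversed(kept))      # restore chronological order

chapters = ["chapter one " * 50, "chapter two " * 50, "chapter three " * 50]
# With a small budget, only the latest chapter survives as context,
# so the next chapter is generated with no memory of the earlier ones.
context = trim_history(chapters, budget=120)
```

This is why the "novel" drifts: each new chapter is conditioned only on whatever survived the trim.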

    It’s funny because I recently found an old backup drive from high school with some half-written Jurassic Park fan fiction on it, so I tasked an LLM with fleshing it out, mostly for shits and giggles. The result is pure slop that seems like it’s building to something and ultimately goes nowhere. The other funny thing is that it reads almost exactly like a season of Camp Cretaceous / Chaos Theory (the animated kids’ JP series), and I now fully believe those are also LLM-generated.

  • I used to buy their stuff and use tuya-convert to flash Tasmota onto them, but they kept updating the firmware to lock that out, and I eventually hit a batch of 15 smart plugs where none would flash. They were too much of a PITA to crack open and flash the ESP8266 manually, so I returned the whole batch as defective, left a scathing review, and blackballed the whole brand.