• 0 Posts
  • 67 Comments
Joined 7 days ago
Cake day: September 14th, 2025

  • I’m pretty sure that that guide is one of those AI-generated spam sites. In this case, it appears to use a character where the LLM involved wasn’t too sure about whether the character is a house painter or an artistic painter. Which doesn’t mean that the information on it is necessarily wrong, just that I’d be cautious as to errors. If you want information from an LLM, probably better in terms of response quality to just, well, go ask an LLM yourself without the distortion from a spammer trying to have the LLM role-play some character.



  • I believe that the fediverse.observer site can list any Fediverse instance type by number of users (though not active users).

    checks

    Oh, they do do active users.

    https://peertube.fediverse.observer/list

    Looks like the top one is phijkchu.com, at 8074 active users.

    EDIT: There’s also fedidb.com:

    https://fedidb.com/servers

    Choose “PeerTube” as server type, and they’ll give you some data on instances too.

    EDIT2: Note that another way to explore PeerTube, which may be to your taste, is that Google Video indexes PeerTube servers, though I don’t know of a way to restrict it to only PeerTube servers aside from using something like site:phijkchu.com to restrict the search on an instance-by-instance basis. But if you search and it’s on PeerTube, and Google has indexed it, it should come up there.

    Kagi also indexes videos, and lets one restrict the search by source of videos, with “PeerTube” being one.

    EDIT3: Adding “peertube” as a search term on Google Video isn’t ideal, but it did result in videos on PeerTube hosts at the top, so maybe that could be kind of an ad-hoc way of searching on Google Video.

    EDIT4: libera.site doesn’t appear to provide sortability, but it does list a video count per instance, as well as a bunch of other graphed data. Never seen it before now, though.

    https://libera.site/channel/peertube


  • Kids and their chats today have it easy, man.

    https://home.nps.gov/people/hettie-ogle.htm

    Hettie moved to Johnstown in 1869 to manage the Western Union telegraph office, where she was employed on the day of the flood. Her residence was 110 Washington Street, next to the Cambria County Library. This also served as the Western Union office. Unlike many other telegraph operators associated with messaging on the day of the flood, Hettie was not employed by the Pennsylvania Railroad. She was a commercial operator. Three women were employed by Hettie: Grace Garman, Mary Jane Watkins, and her daughter Minnie. They all died in the flood, including Hettie.

    A timeline of Hettie’s activity on May 31, 1889:
    7:44 a.m. - She sent a river reading. The water level was 14 feet.
    10:44 a.m. - The river level was 20 feet.
    11:00 a.m. - She wired the following message to Pittsburgh: “Rain gauge carried away.”
    12:30 p.m. - She wired “Water higher than ever known. Can’t give exact measurement” to Pittsburgh.
    1:00 p.m. - Hettie moved to the second floor of her home due to the rising water.
    3:00 p.m. - Hettie alerted Pittsburgh about the dam after receiving a warning from South Fork that the dam “may possibly go.” She wired “this is my last message.” The water was grounding her wires. A piece of sheet music titled “My Last Message” was published after the flood.

    Hettie’s house on Washington Street was struck by the flood wave shortly after 4:00 p.m.

    https://en.wikipedia.org/wiki/Halifax_Explosion

    The death toll could have been worse had it not been for the self-sacrifice of an Intercolonial Railway dispatcher, Patrick Vincent (Vince) Coleman, operating at the railyard about 230 metres (750 ft) from Pier 6, where the explosion occurred. He and his co-worker, William Lovett, learned of the dangerous cargo aboard the burning Mont-Blanc from a sailor and began to flee. Coleman remembered that an incoming passenger train from Saint John, New Brunswick, was due to arrive at the railyard within minutes. He returned to his post alone and continued to send out urgent telegraph messages to stop the train. Several variations of the message have been reported, among them this from the Maritime Museum of the Atlantic: “Hold up the train. Ammunition ship afire in harbor making for Pier 6 and will explode. Guess this will be my last message. Good-bye boys.” Coleman’s message was responsible for bringing all incoming trains around Halifax to a halt. It was heard by other stations all along the Intercolonial Railway, helping railway officials to respond immediately.[71][72] Passenger Train No. 10, the overnight train from Saint John, is believed to have heeded the warning and stopped a safe distance from the blast at Rockingham, saving the lives of about 300 railway passengers. Coleman was killed at his post.[71]







  • tal@olio.cafe to Ask Lemmy@lemmy.world · AI error message

    Assuming that you’re just using their website, I’d guess a problem on their end.

    That being said, could be something you’ve done that’s tripped it.

    You could try reloading the webpage, see if that magically makes the issue go away.

    Could disable all browser extensions, and try that.

    Could try a simpler character and see if the error shows up even with that. Don’t upload an image or use variables in descriptions or whatever it supports.

    I’m not familiar with that website, but I understand that there are various formats in which characters may be exported. If it has the ability to do so and you’re trying to import a pre-created character card, could be something wrong with that character card.

    Could report it to them if they have a route to take reported issues.

    EDIT: They appear to have a support community on the Threadiverse, which you can find at !perchance@lemmy.world.


  • I’d also bet against the CMOS battery, if the pre-reboot logs were off by 10 days.

    The CMOS battery is used to maintain the clock when the PC is powered off. But he has a discrepancy between current time and pre-reboot logs. He shouldn’t see that if the clock only got messed up during the power loss.

    I’d think that the time was off by 10 days prior to power loss.

    I don’t know why it’d be off by 10 days. I don’t know the uptime of the system, but that seems like an implausible amount of drift for a PC RTC, from what I see online about likely RTC drift rates.
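    Back-of-the-envelope arithmetic on that implausibility (assuming a typical consumer RTC crystal error of roughly 20 ppm, i.e. about 1.7 seconds/day — the ppm figure is my assumption, not from the thread):

    ```python
    # How long would a typical RTC take to drift 10 days off?
    # Assumption: ~20 ppm crystal error, a common ballpark for consumer RTCs.
    drift_ppm = 20
    seconds_per_day = 86_400
    drift_per_day = seconds_per_day * drift_ppm / 1e6  # ~1.73 s/day
    target_drift = 10 * seconds_per_day                # 10 days, in seconds
    years_needed = target_drift / drift_per_day / 365
    print(f"{years_needed:.0f} years")  # ≈ 1370 years of uptime
    ```

    So ordinary crystal drift can’t plausibly explain a 10-day offset; something set the clock wrong.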

    It might be that somehow, the system was set up to use some other time source, and that was off.

    It looks like chrony is using the Debian NTP pool at boot, though, and I don’t know why it’d change.

    Can DHCP serve an NTP server, maybe?

    kagis

    This says that it can, and at least when the comment was written, 12 years ago, Linux used it.

    https://superuser.com/questions/656695/which-clients-accept-dhcp-option-42-to-configure-their-ntp-servers

    The ISC DHCP client (which is used in almost any Linux distribution) and its variants accept the NTP field. There isn’t another well-known/universal client that accepts this value.

    If I have to guess why neither OS X nor Windows supports this option, I would say it’s due to the various flaws that the base DHCP protocol has, like no authentication method, since malicious DHCP servers could change your systems’ clocks, etc. Also, there aren’t lots of DHCP clients out there (I only know Windows and ISC-based clients), so that leaves little (or no) options to pick from.

    Maybe OS X allows you to install another DHCP client; Windows isn’t so easy, but you can be sure that Linux does.

    My Debian trixie system has the ISC DHCP client installed in 2025, so it might still be a factor. Maybe a consumer broadband router on your network was configured to tell the Proxmox box to use it as an NTP server or something? I mean, it’s a bit of a long shot, but nothing else that would change the NTP time source immediately comes to mind, unless you changed the NTP config earlier and didn’t restart chrony, and the reboot after the power loss applied it.
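    As a rough illustration of what’s in play: the ISC client only honors DHCP option 42 if `ntp-servers` appears in its `request` statement. A self-contained sketch of that check (the sample config text is made up to stand in for `/etc/dhcp/dhclient.conf`):

    ```python
    import re

    # Sample text standing in for /etc/dhcp/dhclient.conf; the ntp-servers
    # entry in the request list is what makes the client accept option 42.
    sample_conf = """
    request subnet-mask, broadcast-address, time-offset, routers,
            domain-name, domain-name-servers, ntp-servers;
    """

    def requests_ntp(conf_text: str) -> bool:
        # Strip comments, find the request statement, look for the option name.
        body = re.sub(r"#.*", "", conf_text)
        m = re.search(r"request\s+([^;]+);", body)
        return bool(m) and "ntp-servers" in m.group(1)

    print(requests_ntp(sample_conf))  # True: this client would accept option 42
    ```

    On a real box you’d grep the actual config and then compare against what `chronyc sources` reports as the active time sources.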




    Yeah, in all honesty, it’s not really my ideal quote to capture the idea. Among other things, it compares what are, for the quoted person, household tasks versus employment, whereas I’d generally prefer employment vs. employment for most of these.

    And for the quoted person, the issue is that AI is doing work that we tend to think of as potentially-desirable, rather than in the context I’m writing about, where it’s more that science fiction often portrays AI-driven sex robots that perform for humans (think Blade Runner or A.I. Artificial Intelligence (2001)), but doesn’t really examine humans performing for AIs.

    Still, it was the closest popular quote I could think of to address the idea that the split between AI and human roles in a world with AIs is not that which we might have anticipated.



  • In the broad sense that understanding of spatial relationships and objects is just kind of limited in general with LLMs, sure, nature of the system.

    If you mean that models simply don’t have a training corpus that incorporates adequate erotic literature, I suppose that it depends on what one is up to and the bar one has. No generative AI in 2025 is going to match a human author.

    If you’re running locally, where many people use a relatively short context size on systems with limited VRAM, I’d suggest a long context length for generating erotic literature involving bondage implements like chastity cages. Otherwise, once information about the “on/off” status of the implement passes out of the context window, the LLM loses track of the implement’s state, which can lead to it generating text incompatible with that state. If you can’t afford the VRAM to do that, you might look into altering the story such that a character using such an item does not change state over the lifetime of the story, if that works for you. Or, whenever the status of the item changes, at appropriate points in the story, manually update its status in the system prompt/character info/world info/lorebook/whatever your frontend calls its system to inject static text into the context at each prompt.

    My own feeling is that relative to current systems, there’s probably room for considerably more sophisticated frontend processing of objects, and storing state and injecting state about it efficiently into the system prompt. The text of a story is not an efficient representation of world state. Like, maybe use an LLM itself to summarize world state and then inject that summary into the context. Or, for specific games written to run atop an LLM, have some sort of Javascript module that runs in a sandbox, runs on each prompt and response to update its world state, and dynamically generates text to insert into the context.
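    A minimal sketch of the idea (all names hypothetical; a real frontend would parse state changes out of the LLM’s responses rather than take explicit events):

    ```python
    # Frontend-side world-state tracking: instead of relying on old story text
    # staying in the context window, keep key object state in a dict and render
    # it into the system prompt on every turn.
    state = {"chastity_cage": "locked"}

    def apply_event(event: str) -> None:
        # Accept an explicit "object:status" event; a real implementation
        # would extract these from the generated story text.
        obj, _, new_status = event.partition(":")
        state[obj] = new_status

    def render_state_block() -> str:
        # Text injected into the context each turn, so state survives even
        # after the original narration scrolls out of the context window.
        lines = [f"- {obj}: {status}" for obj, status in sorted(state.items())]
        return "Current world state:\n" + "\n".join(lines)

    apply_event("chastity_cage:unlocked")
    print(render_state_block())
    ```

    The point is that the injected block is a compact, always-current representation of world state, rather than hoping the relevant sentence is still inside the window.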

    I expect that game developers will sort a lot of this out and develop conventions, and my guess is that the LLM itself probably isn’t the limiting factor on this today, but rather how well we generate context text for it.



  • I have to say that the basic concept of having Meta pay human adult content performers to perform to teach an AI about sexual performance would be kind of surreal.

    “So what do you do for work?”

    “I’m an exotic dancer.”

    “Straight or gay establishment?”

    “Err…I perform for an artificial intelligence.”

    You know what the biggest problem with pushing all-things-AI is? Wrong direction. I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes.

    — Joanna Maciejewska

    I expect that Joanna would not be enthused about humans stripping for machines.