• Asafum@feddit.nl
    3 days ago

    I always laugh at the absurd confidence it has when it’s wrong… A while ago I was messing with ChatGPT, asking it for upgrade suggestions for a weapon in a game to see what knowledge it had of the game. It absolutely knew names, but it kept saying “here’s a suggestion: go from xyz (my greatsword) and upgrade it to abc (a longsword).”

    “…Ok, but the name you gave me is not a greatsword.”

    “Oh yes, thank you for your correction! Here’s another suggestion (a bow).”

    “That’s not a greatsword either.”

    “You’re right! Thank you again for your correction! Here’s another suggestion (another longsword).”

    “That’s a longsword again.”

    “Thank you for your patience! After your corrections I have finally arrived at the correct answer! Names (a bow).”

    Lol, at least they were all real names from the game I was playing, I guess.

  • Wispy2891@lemmy.world
    3 days ago

    Isn’t it always something like:

    You are absolutely right, and I apologize for that error. Thank you for the correction, it helps me learn and improve. Let’s try this again with the right information…