Elvith Ma'for
Former Reddfugee, found a new home on feddit.de. Server errors made me switch to discuss.tchncs.de. Now finally @ home on feddit.org.
Likes music, tech, programming, board games and video games. Oh… and coffee, lots of coffee!
I ❤️ Unicode!
- 0 Posts
- 7 Comments
But some people don’t seem to be able to read one goddamn paragraph, ever.
I had a problem with my car. It felt strange while driving and made some unusual noises. A bit later, the engine warning light came on.
I went to the garage and told them about the warning light, what I had noticed before it came on, what I suspected, and so on. A short while later, the mechanic came to me and asked for a few details, as my description “wasn’t helpful” and the repair would be much faster with more details telling them where to look. Turns out the guy who checked in my car had only noted “a warning light is on” and nothing else of my ramblings.
So sometimes it’s also about paying attention to what might be important and relaying that information.
I have a Copilot license at work. We also have an in-house “ChatGPT clone” - basically a private deployment of the model, so that (hopefully) no input data gets used to train the models.
There are some use cases that are neat. E.g., we’re a multilingual team, so having it transcribe, translate, and summarize a meeting makes it easier to finalize and check the minutes. Coming back from a vacation and asking it to summarize everything you missed in a specific area of your work (to get on track before going through everything chronologically) can be nice, too.
We also fine-tuned a model to assist us in writing and explaining code in a domain-specific language that we use for one of our tools. It has many strange quirks and poor support from off-the-shelf LLMs.
But all of these cases have one thing in common: they do not replace the actual work, and the results get checked anyway (even the code one - we know there are still many flaws, and it’s usually great at explaining the code now, but not so much at writing it). It’s just a convenient aid for checking your own work, and LLM hallucinations will usually be caught in that process.