The pathological need to find something to use LLMs for is so bizarre.
It’s like the opposite of classic ML: relatively tiny special-purpose models trained for something critical, out of desperation, because it just can’t be done well conventionally.
But this:
AI-enhanced tab groups. Powered by a local AI model, these groups identify related tabs and suggest names for them. There is even a “Suggest more tabs for group” button that users can click to get recommendations.
Take out the word AI.
Enhanced tab groups. Powered by a local algorithm, these groups identify related tabs and suggest names for them. There is even a “Suggest more tabs for group” button that users can click to get recommendations.
If this feature took, say, a gigabyte of RAM and a bunch of CPU, it would be laughed out of the room. But somehow it ships because it has the word AI in it? That makes no sense.
I am a massive local LLM advocate. I like “generative” ML, within reason and ethics. But this is just stupid.
When I’m browsing around with multiple tabs open, the last thing I want is something to start moving them around and messing my flow up. This is a solution looking for a problem.
Yup
Auto-naming functionality is neat in some cases, like the AI chat UI itself:
- It’s convenient to have names when toggling between a few recent chats or searching through 10s or 100s of chats later on
- I spawn new chats often and it’s tedious to name them all
- I don’t have a strong preference for what the title is as long as it’s clear what the chat was about
Tab groups don’t hit those points at all:
- I’ll have a handful of tab groups
- I don’t make them often
- I have a strong preference for what it’s called, and the AI will have trouble figuring out exactly what I’m using those sites for
The pathological need to find something to use LLMs for is so bizarre.
Venture capital dumped so much money into the tech without understanding the full scope of what it was capable of. Now they’re in so deep that they desperately NEED to find something profitable it can do, otherwise they’ll lose the farm.
Literally no one on this green earth asked for this shit. In fact, we’ve been pretty direct about how much we don’t want it.
It’s exhausting.
browser.ml.chat.enabled false
I hate how many of these you have to do on any new installation of Firefox.
That only disables the chat. The overall setting seems to be
browser.ml.enable
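For anyone who wants these to stick across profiles and updates, the same prefs can go in a user.js file in the Firefox profile directory. A minimal sketch, using only the two preference names quoted in this thread (whether browser.ml.enable really gates everything is per the comment above, not something I've verified):

// user.js: applied at startup, overrides about:config
user_pref("browser.ml.chat.enabled", false); // turns off the AI chatbot sidebar
user_pref("browser.ml.enable", false);       // reportedly the overall local-ML switch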
TBH, even though I don’t like this specific idea, nor use Firefox directly, I do like the use of local inference vs. sending your data to a third party to do AI.
They just needed to make it OPT IN, not OPT OUT.