this post was submitted on 26 Jun 2024
153 points (94.7% liked)

Firefox


A place to discuss the news and latest developments on the open-source browser Firefox


TL;DR: They're adding opt-in alt-text generation for blind people and an opt-in AI chat sidebar where you can choose the model used (including self-hosted ones).

all 38 comments
[–] slazer2au@lemmy.world 66 points 1 year ago (2 children)
[–] dustyData@lemmy.world 28 points 1 year ago (1 children)

Self-hosted and locally run models also go a long way. 90% of LLM applications don't require users to surrender their devices, data, privacy, and security to big corporations. But that is exactly how the space is being run right now.

[–] LWD@lemm.ee 6 points 1 year ago* (last edited 1 week ago) (2 children)
[–] xor@lemmy.blahaj.zone 5 points 1 year ago (1 children)

The alternative is only supporting self hosted LLMs, though, right?

Imagine the scenario: you're a visually impaired, non-technical user. You want to use the alt-text generation. You're not going to go and host your own LLM, you're just going to give up and leave it.

In the same way, Firefox supports search engines that sell your data, because a normal, non-technical user just wants to Google stuff, not read a series of blog posts about why they should actually be using something else.

[–] LWD@lemm.ee 4 points 1 year ago* (last edited 1 week ago) (1 children)
[–] xor@lemmy.blahaj.zone 2 points 1 year ago (1 children)

Ah, I missed that alt text specifically is local. But the point stands: allowing (opt-in) access to a third-party service is reasonable, even if that service doesn't have the same privacy standards as Mozilla itself.

To pretty much every non-technical user, an AI sidebar that won't work with ChatGPT (the equivalent of Google Search in my earlier example) may as well not be there at all.

They don't want to self-host an LLM; they want the box where ChatGPT goes.

[–] LWD@lemm.ee 2 points 1 year ago* (last edited 1 week ago)

deleted by creator

[–] LWD@lemm.ee 11 points 1 year ago* (last edited 1 week ago) (2 children)
[–] barryamelton@lemmy.ml 12 points 1 year ago (2 children)

If it were truly opt-in, it could be an extension. They should not be bundling this with the browser, bloating it further in the process.

The extension API doesn't have enough access for this.

You technically can run your own local AI, but out of the box they hook up to the big data-hungry ones.

Even though it is opt-in and disabled by default, that is the real problem.

[–] LWD@lemm.ee 3 points 1 year ago* (last edited 1 week ago)

deleted by creator

[–] slazer2au@lemmy.world 8 points 1 year ago

Look at the Firefox subreddit. One month ago, people were criticizing the thought of adding AI to Firefox. Two months ago, same thing. Look at the Firefox community. See how many times people requested AI.

I believe what most people were concerned about, myself included, was the AI features being enabled automatically and users then having to disable them, as every other application does to inflate metrics.

Because this is opt-in, as the blog says, I'm OK with it being there, disabled.

[–] ScreaminOctopus@sh.itjust.works 13 points 1 year ago (1 children)

Will you need your own account for the proprietary ones? Mozilla paying for these doesn't feel sustainable long term, which is worrying.

[–] Blisterexe@lemmy.zip 12 points 1 year ago

The proprietary ones are free.

[–] Xuderis@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

But what does it DO? How is it actually useful? An accessibility-focused PDF reader is nice, but AI can do more than that.

Our initial offering will include ChatGPT, Google Gemini, HuggingChat, and Le Chat Mistral

This is great, but again, what for?

[–] Blisterexe@lemmy.zip 5 points 1 year ago (1 children)

A lot of people use LLMs a lot, so it's useful for them. But it's also nice for summarizing long articles you don't have time to read: not as good as reading them, but better than skimming.

[–] rgbd@ursal.zone 2 points 1 year ago (1 children)

@Blisterexe @Xuderis It's true; as a researcher, these models have helped me a lot in speeding up reading and finding specific information in scientific articles. As long as it is privacy-respecting, I view this implementation favorably.

[–] Blisterexe@lemmy.zip 3 points 1 year ago (1 children)

It lets you use any model: while it lets you use ChatGPT, it also lets you use a self-hosted model if you edit about:config.
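
The exact pref names aren't given in this thread, so treat the following as a sketch: these names match the initial rollout and may change between Firefox versions, and the localhost URL assumes you are already running a local model server on port 8080.

```js
// Sketch of about:config (or user.js) settings for pointing Firefox's AI
// sidebar at a locally hosted model. Pref names as of the initial rollout;
// verify against your Firefox version before relying on them.
user_pref("browser.ml.chat.enabled", true);
// Localhost providers are hidden by default; unhide them
user_pref("browser.ml.chat.hideLocalhost", false);
// Assumed: a local OpenAI-compatible server (e.g. llama.cpp) on port 8080
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```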

[–] Xuderis@lemmy.world 1 points 1 year ago (1 children)

But what does using that in my browser get me? If I’m running llama2, I can already copy and paste text into the terminal if I want. Is this just saving me that step?
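
Roughly, yes: the sidebar automates that copy-and-paste round trip, feeding selected page text to the model for you. As a sketch (not Firefox's actual code) of what a "summarize selection" action amounts to, assuming a local server exposing the OpenAI-compatible /v1/chat/completions route (llama.cpp and Ollama both offer one); the endpoint, port, and model name here are placeholders:

```python
# Sketch: wrap selected page text in a summarization prompt and POST it to
# a local model endpoint. Endpoint URL and model name are assumptions.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"  # assumed port


def build_payload(selection: str, model: str = "llama2") -> dict:
    """Wrap the user's page selection in a chat-completion request body."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Summarize the following web page text."},
            {"role": "user", "content": selection},
        ],
    }


def summarize(selection: str) -> str:
    """Send the selection to the local server and return the model's reply."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(build_payload(selection)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The browser integration adds one thing the terminal workflow lacks: it can pull the page text (or your selection) itself instead of you shuttling it over by hand.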