What’s next for Mozilla? (techcrunch.com)
I agree completely. I think AI can be a valuable tool if you use it correctly, but that requires you to be able to prompt it properly, to use its output in the right way, and to know what it's good at and what it's not. Like you said, it's great for things like brainstorming or looking for inspiration. And while its artistic output is very derivative - both because it's literally derived from all the art it's been trained on, and because there's enough other AI art out there that it doesn't really have a unique "voice" most of the time - you could easily use it as a foundation to create your own art.
To expand on asking it questions: the kinds of questions I find it useful for are ones like "what are some reasons why people may do x?" or "what are some of the differences between y and z?". Or an actual question I asked ChatGPT a couple of months ago, based on a conversation I'd been having with a few people: "what is an example of a font I could use that looks somewhat professional but that would make readers feel slightly uncomfortable?" (After a little back and forth, it ended up suggesting a perfect font.)
Basically, it's good for open-ended questions - divergent questions, evaluative questions, inferential questions, and so on - where you can use its response to simulate asking a variety of people (or to save yourself from digging through old AskReddit and Quora posts...), or just to get different ideas to consider; and it's good for suggestions. Then, of course, you decide which answers are useful/appropriate. I definitely wouldn't take anything "factual" it says as correct, although it can be good for giving you additional things to look into.
As for writing code: I've only used it for simple-ish scripts so far. I can't write code, but I'm just about knowledgeable enough to read code and see what it's doing, and I can make my own basic edits. I'm perfectly okay at following the logic of most code; it's just that I don't know the syntax. So I'm able to explain to ChatGPT exactly what I want my code to do, how it should work, etc., and it can write it for me. I've had some issues, but I've (so far) always been able to troubleshoot them and eventually find a solution. I'm aware that if I want to do anything more complex, I'll need to expand my coding knowledge! But so far, I've been able to use it to write scripts that are already beyond my own personal coding capabilities, which I think is impressive.
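To make that concrete, here's a minimal sketch of the kind of "simple-ish script" I mean - the task (bulk-renaming text files) is purely hypothetical, not one of the scripts I actually had it write, but it's the sort of thing where I can follow every line even though I couldn't have written it from scratch:

```python
# Hypothetical example: the kind of script an LLM can produce from a
# plain-English request like "put today's date at the start of every
# .txt filename in my notes folder". Task and folder name are made up.
import datetime
from pathlib import Path

folder = Path("notes")                      # folder to process
today = datetime.date.today().isoformat()   # e.g. "2025-01-31"

for file in folder.glob("*.txt"):
    new_name = f"{today}_{file.name}"       # prepend the date
    file.rename(file.with_name(new_name))   # rename the file in place
    print(f"{file.name} -> {new_name}")
```

Even without knowing the syntax well enough to write this myself, it's easy to follow the logic and tweak things like the folder name or the file extension.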
I generally see LLMs as similar to predictive text or Google searches, in that they're a tool where the user needs to:

- know how to phrase their input to get useful results;
- know what the tool is and isn't good at; and
- judge the output critically before relying on it.
And just like how having access to predictive text or Google doesn't make everyone's spelling/grammar/punctuation/sentence structure perfect or make everyone really knowledgeable, AIs/LLMs aren't going to magically make everyone good at everything either. But if someone uses them correctly, they can absolutely enhance that person's own output (be it their productivity, their creativity, their presentation or something else).