submitted 1 year ago* (last edited 1 year ago) by NevermindNoMind@lemmy.world to c/chatgpt@lemmy.world

According to the analytics firm's report, worldwide desktop and mobile web traffic to ChatGPT dropped by 9.7% from May to June, and by 10.3% in the US alone. Users are also spending less time on the site overall: the amount of time visitors spent on chat.openai.com was down 8.5%.

The decline, according to David F. Carr, senior insights manager at Similarweb, is an indication of a drop in interest in ChatGPT and that the novelty of AI chat has worn off. "Chatbots will have to prove their worth, rather than taking it for granted, from here on out," Carr wrote in the report.

Personally, I've noticed a sharp decline in my usage. What felt like a massive shift in technology a few months ago now feels mostly like a novelty. For my work, there just isn't much ChatGPT can help me with that I can't do better myself, and with less frustration. I can't trust it for factual information or research. The written material it generates is so generic and formal, and so often missing the nuances I need, that I either end up rewriting it or spend more time instructing ChatGPT on the changes I need than it would have taken me to just write it myself in the first place. It's not great at questions involving logic or any kind of grey area. It's sometimes useful for brainstorming, but that's about it. ChatGPT has just naturally fallen out of my workflow. That's my experience, anyway.

[-] Rhaedas@kbin.social 1 points 1 year ago

There are a number of them now, but I've run the Vicuna 13B one on my Windows side before. I'm trying to get it working on Ubuntu so it can use the GPU, but it's being difficult. Look up TheBloke on github; they have a large selection of models that can be used through the text-generation web UI.

I may have misspoken in saying "better", as it looks like it's a few percentage points below in comparisons. I thought I had seen some local varieties compared that rated higher, though, such as on AI Explained's channel.

[-] Zeth0s@reddthat.com 0 points 1 year ago

Thanks! I tried vicuna, but I didn't find it very good for programming. I will keep searching :)

[-] Rhaedas@kbin.social 2 points 1 year ago

I didn't either, actually. It seems to me that where LLMs excel is in situations where there's a large consensus on a topic, so the training weights get close to 100%. Anyone who has read through or Googled for answers to programming questions across the various sources online has seen how, among the correct answers, there are lots of deviations that muddy the waters even for a human browsing. That's where the specialized fine-tuned versions that hone in on a domain and eliminate a lot of the training noise come in handy.
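The consensus idea above can be sketched as a toy majority vote: among many scraped answers to the same question, the dominant answer tends to win out despite noisy deviations. This is just an illustrative sketch with made-up data, not anything from an actual training pipeline:

```python
from collections import Counter

# Hypothetical scraped answers to one programming question: the
# correct answer appears most often, with a few noisy deviations.
answers = [
    "use str.join", "use str.join", "use str.join",
    "concatenate with +", "use reduce",
]

def consensus(candidates):
    """Return the most common answer and its share of all votes."""
    best, count = Counter(candidates).most_common(1)[0]
    return best, count / len(candidates)

best, share = consensus(answers)
print(best, share)  # -> use str.join 0.6
```

When the vote share is high, as here, the "weights" strongly favor one answer; when answers are split, no clear winner emerges, which roughly mirrors why LLM output gets shakier on contested or niche topics.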

this post was submitted on 06 Jul 2023
40 points (100.0% liked)
