It depends on what you use ChatGPT for and whether you know how to use it productively. For example, if I ask ChatGPT coding questions it's often very helpful; if I ask it history questions it constantly makes things up.

You also need to know how to use it. People who claim ChatGPT isn't helpful for coding, when you ask how they use it, turn out to basically ask it to do their whole project for them, and when that fails they call it useless. That's not the productive way to use it. The productive way is as a replacement for StackOverflow, or to get examples of how to use some library, things like that, not doing your whole project for you.

Of course, people often use it incorrectly, so it's probably not a good idea to allow its use in the workplace, but for individual use it can be very helpful.
For coding it heavily depends on the language. For example, it's quite decent at writing C#, but whenever I ask it any question about Rust, it's either flat-out wrong or the code doesn't even fucking compile.
I've also found it most useful when I know exactly what I want but just don't know the syntax, like when I was writing C# code generation for the first time. It also, unsurprisingly, sucks at working with libraries.
And thank god it doesn't get them all the way there, because if it could do everything accurately from the kind of ambiguous prompts a layperson gives it, anyone technical would essentially be out of a job.
And honestly, the world would be better off not making people complacent end users of everything; they should have to have at least a modicum of understanding of what they're doing.
I used to think it was just neophobia on my part, watching kids use smartphones and touch screens for everything at increasingly earlier ages, but it's like they only know how to use and consume things, with never an inkling of tinkering, repurposing the mechanisms, or figuring out how things work (to be fair, everything now is super integrated and much harder to repair).
It just doesn't bode well to me when the future labor force seems so disconnected from the underlying systems it uses.
I used it today to figure out how to do something on my Juniper that would have taken 45 minutes of sifting through bullshit documentation. One question and I had the answer in 2 minutes.
This is similar to Gabe Newell's idea about piracy: it's a convenience issue, and GPT solves some of it.
In my use case, the hallucinations are a good thing. I write fiction, in a fictional setting that will probably never actually become a book. If I like what GPT makes up, I might keep it.
Usually I'll have a conversation going into detail about a subject (me explaining it to GPT), then have GPT summarize everything it learned. I plug that summary into my wiki of lore that nobody will ever see, then move on to the next subject. GPT can also identify potential connections between subjects that I hadn't thought of, and wouldn't have if it didn't hallucinate them.
Gippity is pretty good at getting me 90% of the way there.
It usually sets me up with at least all the terms and such that I now know to Google, whereas before I wouldn't even know what I was looking for in the first place.
Also, not gonna lie, search engines are often even worse than gippity for accuracy.
And I've had to fight with so many cases of garbage documentation lately that gippity genuinely does the job better, because it has all the random comments from issues and their solutions in its training data.
Usually once I have the key terms I need to dig into, I can use YouTube/Google to get more specific information, and that's the last 10%.
Remember when you had to have extremely niche knowledge of "banks" in a microcontroller to be able to use PWM on 2 pins with different frequencies?
Yes, I remember what a pile of shit it was to try to figure out why xyz wasn't working while x, y, and z each worked on their own. GPT usually gets me there after a few tries. Not to mention how much faster most of the code gets written, from A to Z, with only a little tweaking to get it where I want (since I don't want to be hyper-specific, and/or it gets those details wrong anyway, as would a human without massive context).
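For contrast, this is roughly what the same task looks like today. A minimal MicroPython sketch, assuming an RP2040-style board; the pin numbers are placeholders, and machine.PWM hides all the timer/bank plumbing:

```python
from machine import Pin, PWM

# Two PWM outputs at different frequencies, no register-level bank fiddling.
# Caveat: on the RP2040, pins sharing a PWM "slice" also share a frequency,
# so these placeholder pins are chosen from different slices.
servo = PWM(Pin(2))
servo.freq(50)          # 50 Hz, typical for hobby servos
servo.duty_u16(4915)    # ~7.5% duty cycle (roughly centre position)

led = PWM(Pin(5))
led.freq(1000)          # 1 kHz, fine for LED dimming
led.duty_u16(32768)     # ~50% brightness
```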
ChatGPT has been really good for teaching me code. As long as I write the code myself and just ask it for clarity or best practices, I haven't had any bad hallucinations.
For example, I wanted to replace a character in a string with another one, but it gave some error about data types that was way out of my league. Anyway, apparently I needed to run list(string) first, even though string[5] will return the character (sketch below).
However, that's in Python, which I assume is well understood thanks to the ton of StackOverflow questions and alternative docs. I once asked it to do something in Google Apps Script and it had no idea what was going on and just hoped it worked. Fair enough, I also had no idea what was going on.
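For what it's worth, that error comes from Python strings being immutable; here's a minimal sketch of the list(string) workaround described above:

```python
s = "hello!"
print(s[5])         # reading by index works fine: '!'

# s[5] = "?"        # TypeError: 'str' object does not support item assignment

chars = list(s)     # convert to a mutable list of characters
chars[5] = "?"
s = "".join(chars)  # rebuild the string
print(s)            # 'hello?'
```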
You have to understand it well enough to know what you can rely on. On the other hand, nowadays the answers often come with sources, so it's easy to check.
What are you talking about? I don’t verify anything that ChatGPT gives me.
bold of u to assume there are docs
Or the docs are far too extensive... reading the ImageMagick docs is like reading through some old tech wizard's personal diary: "I was inspired to shape this spell like this because of such and such..." Like, bro, come on, I just want the command, the args, and some examples... 🤷♂️
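For instance, the kind of thing most people actually come for. A hedged sketch with placeholder filenames; on ImageMagick 6 the binary is convert rather than magick:

```bash
# Shrink input.jpg to fit within 800x600 (aspect ratio preserved)
# and write a JPEG at quality 85.
magick input.jpg -resize 800x600 -quality 85 output.jpg
```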
I usually tell it "using only information found on applicationwebsite.com", and that works pretty well, at least to get me in the ballpark of the answer I'm looking for.