India is an addict. It's hooked on cheap Russian oil and gas. It's going to have a hard time when it has to go cold turkey.
wewbull
It's not an add-on feature. The LLM produces something with the best score it can. Things that increase the score:
- Things appropriate to the tokens in the request
- Things which look like what it's been trained on.
So that includes:
- Relevant facts
- Grammatically correct language
- A friendly style of writing
- etc.
If it has no relevant facts, it will maximise the others to get a good score. Hence you get confidently wrong statements, because sounding like it knows what it's talking about scores higher than actually giving correct information.
This process is inherent to machine learning at its current level though. It's like a "fake it until you make it" person who will never admit they're wrong.
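To make the point concrete, here's a toy sketch (not a real LLM, and the scores are made-up numbers): if every candidate answer has zero factual support, the fluent, confident-sounding one wins on style alone.

```python
# Toy model: pick the highest-scoring continuation.
# "fluency" and "factual" are illustrative, hand-assigned features.
candidates = {
    "The capital of Atlantis is Poseidonia.": {"fluency": 0.9, "factual": 0.0},
    "I don't know the capital of Atlantis.":  {"fluency": 0.6, "factual": 0.0},
}

def score(features):
    # Confident-sounding text scores well even with no factual support.
    return features["fluency"] + features["factual"]

best = max(candidates, key=lambda c: score(candidates[c]))
print(best)  # the confident fabrication wins
```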
In this thread
🤯...🤯...🤯🤯...🤯
Beat me to it.
No wonder the birth rate has been down.
End-to-end encryption of an interaction with a chat-bot would mean the company doesn't decrypt your messages to it, operates on the encrypted text, gets an encrypted response which only you can decrypt, and sends it to you. You then decrypt the response.
So yes. It would require operating on encrypted data.
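Operating on encrypted data is possible in limited forms. A classic example is textbook RSA's multiplicative homomorphism: multiplying two ciphertexts yields the ciphertext of the product, so a server can compute on data it can't read. This toy sketch uses tiny primes and no padding, so it's an illustration only, nothing like a practical scheme.

```python
# Textbook RSA with deliberately tiny primes (insecure, demo only).
p, q = 61, 53
n = p * q                            # modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

c1, c2 = enc(6), enc(7)
c_prod = (c1 * c2) % n   # "server" multiplies ciphertexts, never decrypts
print(dec(c_prod))       # 42: the product, recovered only by the key holder
```

Fully homomorphic encryption extends this idea to arbitrary computation, but at enormous cost; running an LLM that way is nowhere near practical today.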
"Burning it out" still leaves contamination. You need to remove it.
I think it's different. The fundamental operation of all these models is multiplying big matrices of numbers together. GPUs are already optimised for this. Crypto was trying to make the algorithm fit the GPU rather than it being a natural fit.
With FPGAs you take a 10x loss in clock speed but can have precisely the algorithm you want. ASICs then give you the clock speed back.
GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a backwards step.
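For reference, the "fundamental operation" in question is just dense matrix multiply. A minimal pure-Python sketch of it (GPUs and their tensor cores exist to run exactly this triple loop in bulk):

```python
# Naive dense matrix multiply: C = A @ B.
# This is the inner loop that GPU hardware is purpose-built to accelerate.
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```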
If an AI can work on encrypted data, it's not encrypted.
It's when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, and it's all driven by fewer than 10 companies. Nvidia is lapping up the money of course, but once the AI companies stop buying GPUs in crazy numbers it's going to be a rocky ride down.
...and quite warm.