Wait, what? AI-generated reviews? Please tell me this is a joke. What the fuck is the point of that? Ideally, I want to read about the experience a buyer actually had with a product. I already expect, at a bare minimum, that half of the reviews are fake. Why would you admit you absolutely shouldn't trust any of them?
From the bit of the screenshot we can see, I think he means the AI-generated summary of the reviews. So not really a review itself.
Particularly since summarizing text is something LLMs are actually decent at, it makes sense to use them for that. They're unreliable at generating new content, but asking for a summary of text that's sitting right below it is reasonable.
As always with these things, it ends up being garbage in garbage out.
Ok, that makes a bit more sense on the surface.