Yeah, if you’re relying on them to be right about anything, you’re using it wrong.
A fine-tuned model will go a lot further if you’re looking for something specific, but they mostly excel at summarizing text or brainstorming ideas.
For instance, if you’re a Dungeon Master in D&D and the group goes off script, you can quickly generate the back story of some random character that you didn’t expect the players to do a deep dive on.
It’s not just about the environmental impact.
If you’re an expert in a specific field, you should interrogate these LLMs to see how accurate they actually are.
When you see how fucking wrong they are about shit you have a firm grasp on, you will immediately stop trusting it regarding ANYTHING.