QuentinCallaghan@sopuli.xyz to Political Memes@lemmy.ca · 3 months ago
Meanwhile at DeepSeek [image]
474D@lemmy.world · 3 months ago
I mean, obviously you need to run a lower-parameter model locally; that's not a fault of the model, it's just that you don't have the same computational power.
AtHeartEngineer@lemmy.world · 3 months ago
In both cases I was talking about local models: the DeepSeek-R1 32B-parameter distill vs an equivalent uncensored model from Hugging Face.