Only Bayes Can Judge Me

  • 32 Posts
  • 1.26K Comments
Joined 2 years ago
Cake day: July 4th, 2023

  • Oof that’s the good stuff. Chuds with overly self-inflated egos co-opting eastern philosophy for tech shit is pretty well known around these parts. It’s refreshing to see it from a slightly different white guy.

    Also, my usual muckraking bore unexpected fruit:

    I’m gonna believe it. The Candace Owens part is disputed, and I daresay debunked, though.

    text of tweet inside image

    From @BootsRiley:

    Now is as good a time as any to tell people that Rick Rubin is a behind-the-scenes rightwinger who tries to recruit music industry folks to Q-anon type stuff and is who (according to Kanye) convinced Kanye to meet Candace Owens and endorse Trump.

    He just looks like a hippie.

  • In the current chapter of "I go looking on linkedin for sneer-bait and not jobs, oh hey literally the first thing I see is a pile of shit"

    text in image

    Can ChatGPT pick every 3rd letter in "umbrella"?

    You'd expect "b" and "l". Easy, right?

    Nope. It will get it wrong.

    Why? Because it doesn’t see letters the way we do.

    We see:

    u-m-b-r-e-l-l-a

    ChatGPT sees something like:

    "umb" | "rell" | "a"

    These are tokens — chunks of text that aren’t always full words or letters.

    So when you ask for "every 3rd letter," it has to decode the prompt, map it to tokens, simulate how you might count, and then guess what you really meant.

    Spoiler: if it's not given a chance to decode tokens into individual letters as a separate step, it will stumble.

    Why does this matter?

    Because the better we understand how LLMs think, the better results we’ll get.
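
    For anyone who'd rather check the tokenization claim than take a LinkedIn post's word for it, here's a minimal sketch using the tiktoken package (assuming it's installed; the exact splits depend on which encoding you pick, so the "umb" | "rell" | "a" breakdown above is illustrative at best):

    ```python
    # Minimal sketch, assuming the third-party `tiktoken` package is installed.
    # Prints the chunks a GPT-style BPE tokenizer actually produces for "umbrella",
    # which is what the post above is gesturing at.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/4-era models

    word = "umbrella"
    for tid in enc.encode(word):
        chunk = enc.decode_single_token_bytes(tid).decode("utf-8")
        print(tid, repr(chunk))

    # Compare with what "every 3rd letter" means at the character level:
    print([word[i] for i in range(2, len(word), 3)])  # -> ['b', 'l']
    ```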