Can’t wait to see what happens when a program calls this function every clock cycle lmao
This right here is giving me flashbacks of working with the dumbest people in existence in college because I thought I was too dumb for CS and defected to Comp Info Systems.
BRB abusing LD_PRELOAD, recompiling Linux, pushing to prod, and taking a sabbatical in Alaska.

One of the things I’ve noticed is that there are people who earnestly take up CS as something they’re interested in, but every time tech booms there’s a sudden influx of people who would be B- marketing/business majors coming into computer science. Some of them even do OK, but holy shit do they say the most “I am trying to sell something and will make stuff up” things.
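The gag above is a real mechanism: LD_PRELOAD lets you interpose your own malloc in front of glibc’s for any dynamically linked program. A minimal, glibc/Linux-specific sketch (the filename shim.c and the log message are made up for illustration):

```c
/* Minimal malloc interposer, glibc/Linux-specific.
 * dlsym(RTLD_NEXT, "malloc") resolves the *next* malloc in link
 * order, i.e. the real one in libc. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdlib.h>
#include <unistd.h>

static void *(*real_malloc)(size_t);

void *malloc(size_t size) {
    if (!real_malloc)
        real_malloc = (void *(*)(size_t))dlsym(RTLD_NEXT, "malloc");
    /* write(), not printf(): printf may itself call malloc and recurse */
    write(2, "malloc!\n", 8);
    return real_malloc(size);
}
```

Build it as a shared object and inject it: `gcc -shared -fPIC shim.c -o shim.so && LD_PRELOAD=./shim.so ls` — every allocation the victim program makes now logs to stderr.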
Can we make a simulation of a CPU by replacing each transistor with an LLM instance?
Sure it’ll take the entire world’s energy output but it’ll be bazinga af
why do addition when you can simply do 400 billion multiply accumulates
When do we get HexbearPlusAI?
BeanisBot
mallocPlusAI
That made me think of…
molochPlusAI("Load human sacrifice baby for tokens")
I’d just like to interject for a moment. What you’re referring to as molochPlusAI, is in fact, GNU/molochPlusAI, or as I’ve recently taken to calling it, GNUplusMolochPlusAI.
GNUplusMolochPlusAI!
GNUplusMolochPlusAI!
GNUplusMolochPlusAI!

Now is the time to dance!
GNUplusMolochPlusAI!
GNUplusMolochPlusAI!
GNUplusMolochPlusAI!
To a drum and bass beat
GNUplusMolochPlusAIplusDrumPlusBassPlusBeat
loop-loop-loop-loop-loop-loop-loop-loop-loop-loop-loop-loop-loop
I’d legit listen to this.
See also: https://pypi.org/project/jit-implementation/
Quality bits.
Uncritical support for these AI bros refusing to learn CS and thereby making the CS nerds that actually know stuff more employable.
But to recognize people who know something, you too need to know something, and techbros are very often bazingabrained AI-worshippers.
Wait, is this not a joke?
deleted by creator
Chatgeepeetee please solve the halting problem for me.
modern CS is taking a perfectly functional algorithm and making it a million times slower for no reason
Given a small allocation takes only a few hundred cycles (roughly 100 ns) and this would take at minimum a few seconds, it’s more like tens of millions of times slower.
inventing more and more creative ways to burn excess cpu cycles for the demiurge
“INTEGER SIZE DEPENDS ON ARCHITECTURE!”
That will also be deduced by AI
idk why boomers decided to call integers int/short/word/double word/long/long long
you ever heard of numbers? you think maybe a number might be a little more descriptive than playing “guess how wide i am?”
yet another thing boomers ruined
Found javascript’s burner account.
Word size was architecture dependent
lets add full seconds of latency to malloc with a non-determinate result this is a great amazing awesome idea it’s not like we measure the processing speeds of computers in gigahertz or anything
sorry every element of this application is going to have to query a third party server that might literally just undershoot it and now we have an overflow issue oops oops oops woops oh no oh fuck
want to run an application? better have internet fucko, the idea guys have to burn down the amazon rainforest to puzzle out the answer to the question of the meaning of life, the universe, and everything: how many bits does a 32-bit integer need to have
new memory leak just dropped: the geepeetee says the persistent element ‘close button’ needs a terabyte of RAM to render, the linear algebra homunculus said so, so we’re crashing your computer, you fucking nerd
the way I kinda know this is the product of C-Suite and not a low-level software engineer is that the syntax is mallocPlusAI and not aimalloc or gptmalloc or llmalloc.
and it’s malloc, why are we doing this for things we’re ultimately just putting on the heap? overshoot a little–if you don’t know already, it’s not going to be perfect no matter what. if you’re going to be this annoying about memory (which is not a bad thing) learn rust dipshit. they made a whole language about it
if you’re going to be this annoying about memory (which is not a bad thing) learn rust dipshit. they made a whole language about it
holy fuck that’s so good
wait is this just the e = mc2 + AI bit repackaged
I think this is the first solo I’ve seen and it’s beautiful
the syntax is mallocPlusAI and not aimalloc or gptmalloc or llmalloc
if they’re proposing it as a C stdlib-adjacent method (given they’re saying it should be an alternative to malloc [memory allocate]) it absolutely should be lowercase. plus is redundant because you just append the extra functionality to the name by concatenating it to the original name. mallocai [memory allocate ai] feels wrong, so ai should be first.
if this method idea wasn’t an abomination in and of itself that’s how it would probably be named. it currently looks straight out of Java. and at that point why are we abbreviating malloc. why not go the distance and say largeLanguageModelQueryingMemoryAllocator
I might call it like, malloc_ai so it’s like, I INVOKE MALACHI lol
This is simply revolutionary. I think once OpenAI adopts this in their own codebase and all queries to ChatGPT cause millions of recursive queries to ChatGPT, we will finally reach the singularity.
There was a paper about improving llm arithmetic a while back (spoiler: its accuracy outside of the training set is… less than 100%) and I was giggling at the thought of AI getting worse for the unexpected reason that it uses an llm for matrix multiplication.
Yeah lol this is a weakness of LLMs that’s been very apparent since their inception. I have to wonder how different they’d be if they did have the capacity to stop using the LLM as the output for a second, switched to a deterministic algorithm to handle anything logical or arithmetical, then fed that back to the LLM.
I’m pretty sure some of the newer ChatGPT-like products (the consumer-facing interface, not the raw LLM) do in fact do this. They try to detect certain types of inputs (e.g. math problems or requests for the current weather) and convert them into an API request to some other service, returning that result instead of an LLM output. Frankly, it comes across to me as an attempt to make the “AI” seem smarter than it really is by covering up its weaknesses.
I think chatgpt passes mathematical input to Wolfram alpha
Yeah, Siri has been capable of doing that for a long time, but my actual hope would be that, rather than just handing the user the API response, the LLM could keep operating on that response and do more with it, composing several API calls. But that’s probably prohibitively expensive to train, since you’d have to do it billions of times to get the plagiarism machine to learn how to delegate work to an API properly.
bit idea: the singularity but the singularity just crushes us with the colossal pressure past the event horizon of a black hole.
Society is 12 hours of internet outage away from chaos.
Coming soon to Netflix?
Chaos Day (2025)
Tagline: 12 hours without the net

Please Iran, detonate an EMP over the US