puzzling out the proofs for concepts so utterly fundamental to math by myself that it’s like if Genesis 1:3 was And God said, ‘Let there be integer,’ and there was integer
I remember my discrete math professor getting to Aleph Numbers and thinking “My God, it finally happened! THEY RAN OUT OF GREEK LETTERS!”
Wait until you hear about ω₁^CK
Algebra is the great innovation of humankind, prove me wrong.
And then you study foundational Maths and you are convinced of the Demiurge.
Cantor's diagonal argument and the continuum of the reals is my demiurge. It’d be one thing if it was like a weird tangential fact about the reals, but no, you have to accept choice to construct them in the first place, and then that means that there has to be a well ordering on any subset, and of course, wtf is a well ordering on (0, 1)
It took following the 1940s logical empiricists down their “how do we dodge Gödel’s theorem, maybe we can use probability or restrict proofs to a subset or something, idk” line of thought for me to truly realise how perverse Maths is.
i found out that cantor’s diagonal argument is more of a persuasive argument than an actual proof and it’s been sort of driving me a bit insane since. math is truly a perverse spiral.
I’ve been obsessing over axiom of determinacy as a potential replacement for axiom of choice.
If you are already familiar with calculus but not with topology, I recommend you take a look at the latter.
I’d suggest doing introductory analysis prior to topology. Having a bit of concrete experience with the topology of R helps motivate a lot of the basic definitions and results.
I’d suggest doing introductory analysis prior to topology
Pretty sure that is covered under ‘calculus’ in English-speaking countries. Is that not so?
Only if it’s the math major version of the course at elite institutions, at least in the US. Typical versions of calculus will probably at best discuss the epsilon-delta definition of a limit. They won’t discuss topics like connectedness or compactness, and when covering the Riemann integral they will use a version that only works for continuous functions (and can be extended to piecewise continuous ones), but that definition can’t answer some basic questions like “is this function Riemann integrable?”
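(Classic illustration of that last point, my addition rather than anything from the course descriptions: take f on [0, 1] with f(x) = 1 when x is rational and f(x) = 0 when x is irrational. It’s bounded, but every upper sum is 1 and every lower sum is 0, so it isn’t Riemann integrable, and you need the full upper/lower-sum definition to even pose the question; the “continuous functions only” version never gets that far.)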
Only if it’s the math major version of the course at elite institutions, at least in the US
Huh? Wow, I guess the west is this barbarous. Seriously, those topics were covered in the first semester in my case, with the primary textbook also taking a topological approach (without introducing topology explicitly - just working with the metric notion of open sets, though).
Commercialised access to higher education has been a scourge upon your education, or so it seems.
I want to understand math but my adhd makes it hard to study (plus i don’t even know like, how to study effectively on my own with no direction)
That was the only math class I did well in. Everything else was straight Bs lol
I barely scraped by in Discrete Mathematics, but it was definitely neat when I understood what the hell was going on. I love that aspect of it–that you’re basically bootstrapping a logical framework for doing math. But boy does it feel bad when you’re taking an exam, staring at the proof prompt, and going, “I have absolutely no idea where to even start.” My experience was that in Algorithms I could at least fudge an answer for partial credit, but I got plenty of big fat zeros in Discrete Math.
If you’re doing CS and enjoy the math aspect, definitely take a gander at a cryptography elective if that’s an option! Formal math wasn’t my strong point but I still loved that class for helping me actually understand the mathematical primitives behind modern crypto. Not math based, but I also enjoyed compilers for that same bootstrappy aspect (admittedly I am also one of those masochists who enjoys working with assembly).
On the note of CS and the rest of math, there are also computer graphics and artificial neural networks.
Computer Graphics was also a lot of fun! It’s amazing how satisfying it is when you’ve wrestled with your twenty lines of GLSL for hours and then you’re finally like, “Holy shit! My teapot is reflective now!!” Definitely gave me a newfound appreciation for people who work with graphics down at that level. I only learned the basics, but it’s definitely a topic I’d be interested in learning more about at some point.
linear algebra was my favourite class in CS. I also loved assembly:)
Ooh, that was another good one! I didn’t find it to be too difficult, and there’s something super satisfying about doing all those matrix operations by hand. It was really cool to take cryptography and computer graphics later on and see just how powerful a tool linear algebra can be!
“God created the integers, all else is the work of man”
What the heck is abstract mathematics?
Are they like “a number that might be 2 + the number that most invokes the season of fall = 14”
Edit: and discreet mathematics?
“Okay, I’m not saying 2 was added to 4, I have no comment on whether that happened or not, but I am saying =6”
it’s when you ask “okay but how does division work on a fundamental level, it’s definitely physically intuitive yeah but what’s the maths behind it, like multiplying is adding a number to itself x amount of times, dividing is unmultiplying by a number, but it’s not subtracting a number multiple times, at least not one that’s always present in the equation, what’s going on here, how did this happen” and it all goes downhill from there
I’m too stoned for this
but it’s not subtracting a number multiple times, at least not one that’s always present in the equation
isn’t it, though? subtract 4 from 12 three times and you’re left with the additive identity
exactly, it’s anti-multiplication – you find how many times you need to subtract a number out of a product to get another number, but it’s comparatively hard to compute because of that abstraction. The reason I bring this up is that, for such a fundamental operation of math, computers absolutely suck at it on a basic level compared to multiplication
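(If it helps to see it spelled out, here’s a minimal sketch in C of that “count the subtractions” idea; the function name is just made up for illustration.)

```c
#include <stdio.h>

/* Division as "anti-multiplication": keep subtracting the divisor from
 * the dividend and count how many times it fits. Works for nonnegative
 * dividend and positive divisor. */
unsigned divide_by_subtraction(unsigned dividend, unsigned divisor) {
    unsigned quotient = 0;
    while (dividend >= divisor) {   /* e.g. 12 -> 8 -> 4 -> 0 for divisor 4 */
        dividend -= divisor;
        quotient++;
    }
    return quotient;                /* whatever is left in dividend is the remainder */
}

int main(void) {
    printf("%u\n", divide_by_subtraction(12, 4));  /* prints 3 */
    return 0;
}
```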
i see. yes, that is weird. now that you’ve experienced gnosis, why do you think it’s so much harder to compute?
because the fundamental definition of divisibility is whether or not the chosen divisor b can be multiplied by some number c within the set of numbers you are working with to get the dividend a. The output of division is c.

Therefore, the brute force way of dividing a number would be to iterate through the entire set of possible numbers and return the number that, when multiplied by what you are dividing, outputs the value you want to divide from; or, to have the multiplication table as a persistent hash map in memory, pre-computing all possible products. It’s not implemented like this because that would be horrifically slow/bloated. See Low Level Learning’s 5 minute video “computers suck at division (a painful discovery)” to see how it’s implemented in modern processors; it’s very, very unintuitive.
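(A rough C sketch of that brute-force “iterate through all the candidates” idea, purely for illustration; as said above, nothing real does division this way, and the multiply here can overflow for large inputs.)

```c
#include <stdio.h>

/* Brute-force division: try every candidate c and return the largest one
 * whose product with the divisor still fits under the dividend.
 * Small inputs only; (c + 1) * divisor can overflow for big ones. */
unsigned divide_brute_force(unsigned dividend, unsigned divisor) {
    unsigned c = 0;
    while ((c + 1) * divisor <= dividend)   /* stop before we overshoot */
        c++;
    return c;
}

int main(void) {
    printf("%u\n", divide_brute_force(100, 9));  /* prints 11 */
    return 0;
}
```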
or, to have the multiplication table as a persistent hash map in memory, pre-computing all possible products.
huh, yeah, i guess that’s kinda how humans do long division?
See Low Level Learning’s 5 minute video
he’s hopping right past the most interesting part, though. sure, doing math with byte logic is tricky, so you have to make approximations. so in order to divide by 9 the processor does some fixed point math and says to divide by 9 we’re actually going to multiply by 2^33 / 9 which equals that long number. but how does the processor know what 2^33/9 equals? how’s it doing that? sounds like it’s very good at division, because it would take me a while to work out that value even if i started with trying to find 8589934592/9, y’know?
more to the point, how does it do 0b1000000000000000000000000000000000 / 0b1001 = 0b111000111000111000111000111001 so quickly?
You should understand that “computers very bad at division” is a relative term: on modern desktop x86 chips a single 64-bit division takes somewhere on the order of 20-40 clock cycles, which seems pretty fast when you think about how they’re running at like 4 billion cycles per second, but these same chips can do integer multiplication in 2-3 cycles, so division is painfully slow by comparison.
The reason multiplying by 2^33 / 9 is faster is because you don’t have to actually compute what 2^33 is and then divide by 9 every single time - the compiler can compute that value the “slow” way a single time and bake that into the machine code, and then when the program is running it only has to deal with multiplication.
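(For what it’s worth, here’s roughly what that baked-in constant looks like from C; 954437177 is ceil(2^33 / 9), and the loop just spot-checks it against ordinary division. Real compilers pick the constant, shift, and operand width per divisor with more care than this sketch.)

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Divide a 32-bit unsigned value by 9 using multiply-and-shift.
 * 954437177 = ceil(2^33 / 9); computed once, ahead of time, so at
 * runtime the program only ever multiplies and shifts. */
static uint32_t div9(uint32_t x) {
    return (uint32_t)(((uint64_t)x * 954437177u) >> 33);
}

int main(void) {
    for (uint32_t x = 0; x < 1000000; x++)      /* spot-check against real division */
        assert(div9(x) == x / 9);
    printf("%u\n", div9(4294967295u));          /* prints 477218588, same as 4294967295 / 9 */
    return 0;
}
```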
While we don’t know exactly what techniques are being used in a modern Intel/AMD chip, as far as I know the current best approach is basically just long division.
EDIT: The reason long division is slow is that all the partial results depend on the results of previous steps. This means that your division circuit ends up being a really long line of comparator circuits chained end-to-end, and long circuits are bad because it takes a long time for the signal to actually reach the end. Multiplication is fast by comparison because you can compute all the partial products in parallel, and then add them together in a kind of tree shape. The end result is a circuit which is super wide, but the “propagation delay” (the time it takes until the last input signal reaches the end) is pretty low since there’s no path which passes through more than a few adders.
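(A software sketch of the binary long division being described, to make the serial dependency visible: each quotient bit needs the remainder left over from the previous step before it can be decided.)

```c
#include <stdint.h>
#include <stdio.h>

/* Binary long division, one quotient bit per iteration (divisor must be
 * nonzero). Each pass depends on the remainder from the previous pass,
 * which is the chain of dependent stages described above. */
static uint32_t long_division(uint32_t dividend, uint32_t divisor) {
    uint32_t remainder = 0, quotient = 0;
    for (int i = 31; i >= 0; i--) {
        remainder = (remainder << 1) | ((dividend >> i) & 1);  /* bring down the next bit */
        quotient <<= 1;
        if (remainder >= divisor) {   /* the "does the divisor fit?" comparison */
            remainder -= divisor;
            quotient |= 1;
        }
    }
    return quotient;
}

int main(void) {
    printf("%u\n", long_division(4294967295u, 9));  /* prints 477218588 */
    return 0;
}
```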
Really getting into studying things like that always feels so good. Once I almost understood the Maxwell equations. Feels good. Grasp the divine!
I really liked my “Sequences, Series, and Foundations” course which was the core proofs course for my University.
i remember the lecture i decided to ignore and instead see if i could figure out the golden ratio just knowing a golden rectangle could be subdivided into another golden rectangle + a perfect square. it took me basically the whole class and that was just simple algebra and substitution, the shit you’re talking about may as well be hieroglyphs to me
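(For anyone curious, the algebra is short: call the ratio φ and take a golden rectangle with sides 1 and φ. Cutting off a 1 × 1 square leaves a 1 × (φ − 1) rectangle, and for that to be golden too you need φ/1 = 1/(φ − 1), so φ² − φ − 1 = 0 and φ = (1 + √5)/2 ≈ 1.618.)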
As someone specializing in graph theory I am greatly enjoying this thread 😄 what a treat
That’s so cool. I hope you enjoy it!
I really lament not being mathematically talented enough to get that far with math and feel that divine aspect of it. I have other strong suits though. Can you please recommend any books discussing philosophy of math, if you know any?
discrete maths is fun as hell
it’s all like lil puzzles and stuff (indeed most puzzles can be described as problems in discrete maths), it’s real satisfying