Tom Veatch

Flames of inspiration often leave smoke signals behind.
From mine, these.


Math

Mean Thoughts

Thoughts about the Pythagorean means: the arithmetic mean (the everyday average), the geometric mean, and the harmonic mean. This is surprisingly deep for so simple a topic. Understanding at least 2/3 of it is part of your path to wealth and meaningfulness.
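
For quick reference, here is a summary in my own notation (not necessarily the essay's): for positive a and b,

    arithmetic mean   A = (a + b) / 2
    geometric mean    G = sqrt(a * b)
    harmonic mean     H = 2ab / (a + b)

with H <= G <= A always, and G^2 = A * H, so the geometric mean is itself the geometric mean of the other two.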

Negative Dimensionality

Abstracting geometry.

Hilbert's axiomatization of geometry, full of redundancy, led me to a generalization which makes geometric dimensionality a characteristic that can be counted up (point to line to plane to space, etc.) and down (space to plane to line to point, etc.). Geometries, by intersecting, create lower-dimensional geometries; for example, two intersecting 2D planes create a 1D line. Geometries, by projecting, create higher-dimensional geometries; for example, two 0D points project a 1D line.

But if there is no upper limit, perhaps there is also no lower limit. The idea that a geometry might have negative dimensionality seems absurd, considered within the assumptions of spatial thinking, yet it derives from the same less-redundant axiom set as the geometries we understand. Suggestions for intuition and use of this idea are also given.
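
One standard incidence-geometry identity (my illustration here, not necessarily the essay's axiom set) already hints at the idea. For subspaces of a projective space,

    dim(join) + dim(meet) = dim(A) + dim(B)

Two planes in 3-space: 2 + 2 = 3 + dim(meet), so they meet in a line, dimension 1. But two distinct points: 0 + 0 = 1 + dim(meet), so their meet, the empty geometry, must have dimension -1: one step below the point.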

A More General Theory of the Syllogism

Abstracting logic. Aristotle's list of syllogisms missed half of them; there's nothing to them (Ha!); and we can do better without them.

Still, it is pretty fun and cool, considering this was the intellectual pinnacle of humanity for 2000 years. And I'd say it is not a bad introduction to "term logic"; it might be suggested reading for students of computer science, philosophy, classics, and/or math.
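
To give the flavor of term logic in modern dress, here is a toy model-checker (my sketch, not the essay's construction): terms are sets, the four categorical forms A/E/I/O are set predicates, and a syllogism is valid iff no interpretation over a small universe makes its premises true and its conclusion false.

    from itertools import combinations

    # The four categorical forms as set predicates.
    def A(S, P): return S <= P          # All S are P
    def E(S, P): return not (S & P)     # No S is P
    def I(S, P): return bool(S & P)     # Some S is P
    def O(S, P): return bool(S - P)     # Some S is not P

    def subsets(u):
        xs = list(u)
        return [set(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

    def valid(premises, conclusion, universe=(0, 1, 2)):
        # Brute force over all assignments of terms S, M, P to subsets.
        return all(conclusion(S, M, P)
                   for S in subsets(universe)
                   for M in subsets(universe)
                   for P in subsets(universe)
                   if premises(S, M, P))

    # Barbara: All M are P, All S are M  |=  All S are P
    print(valid(lambda S, M, P: A(M, P) and A(S, M),
                lambda S, M, P: A(S, P)))   # True

    # Undistributed middle: All P are M, All S are M  |=/=  All S are P
    print(valid(lambda S, M, P: A(P, M) and A(S, M),
                lambda S, M, P: A(S, P)))   # False

A three-element universe is enough to expose counterexamples for the classical forms, which is why the brute force works.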

Bliss Theory: Emotion in General

On a mathematical representation of emotion, decomposed into component functions, including Identification, which turns out to have a central role.

Math as Language

Underlying intuition reads out as discrete expression.

Math Tutor

Learn your times table.

Neural Networks + Fuzzy Logic + Space

A careful, accessible introduction to neural networks assuming only high school algebra and a little geometry and differentiation. NNs are defined mathematically, along with how to run them, how to train them (by the usual gradient descent), and how to train them better (or so I suppose: using 'Newton-Raphson', which really ought to kill!). I also discuss how to understand the training algorithm's implicit reasoning about the adjustments it decides to make; I share an interpretation that backpropagation is like an Anti-Dunning-Kruger learning system (and therefore morally superior to most men?). Then I give a whole Fuzzy Logic re-interpretation of NNs, along with suggestions on how to enhance their logical reasoning capabilities.

I tried the Wikipedia page, and got so frustrated I wrote my own introduction. So yes, I suggest reading this if you want to really understand neural networks, and if your other resources have made it seem inscrutable. It's a few pages of actual math, yes, but all the steps are laid out: no leaps! It's not short, but you don't have to be a math major to follow along. I encourage your study here if you are interested in really knowing how neural nets work.
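
To make the gradient-descent step concrete, here is a minimal sketch (my own, not the essay's notation or code): a tiny two-layer sigmoid network trained on XOR by backpropagation.

    import numpy as np

    # A 2-4-1 sigmoid network trained on XOR by plain gradient descent.
    # Sizes, learning rate, and seed are my choices; convergence
    # depends on the random initialization.
    rng = np.random.default_rng(1)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
    lr = 0.5

    for step in range(20000):
        # Forward pass: run the network.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: chain rule on squared error, layer by layer.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent updates.
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    print(out.round(2))  # should approach [[0], [1], [1], [0]]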

This also adds Fuzzy Logic to neural networks, including how to train them. Finally, it goes into Space Representing Neural Networks, so that robots can represent space, or so that humans' representation of space can be better understood. Three months of work is in here.
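
For flavor, one common way to make fuzzy connectives differentiable, and hence trainable by the same gradient descent (my assumption here; the essay's formulation may differ), is the product t-norm family:

    # Truth values are reals in [0, 1]; these gates are smooth,
    # so gradients flow through them during training.
    def f_not(a):
        return 1.0 - a

    def f_and(a, b):        # product t-norm
        return a * b

    def f_or(a, b):         # probabilistic sum (the dual co-norm)
        return a + b - a * b

    print(f_and(0.8, 0.6))  # 0.48
    print(f_or(0.8, 0.6))   # 0.92
    print(f_not(0.3))       # 0.7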



Copyright © 2000-2021, Thomas C. Veatch. All rights reserved.
Modified: 12/20/2021