Tom Veatch

Flames of inspiration often leave smoke signals behind.
From mine, these.


Software

Tea Lady

Science can be fun and easy!

Tea Lady is a web app for testing whether an assertion holds up statistically. It makes for great, easy, accessible, quick observational experiments by anybody, anywhere.

Tea Lady applies Fisher's Exact Test repeatedly as you do live data collection.

A predicate, for some category X, and some property Y, is the statement that X's are Y. That could include anything. The truth of it, though, can be empirically tested with the Tea Lady on-line tool. You watch and classify: Observe the X's and Not-X's that occur or come by, and record whether they are Y's or not Y's. Tea Lady keeps count and gives you a P-value or significance measure.
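
To make the statistics concrete, here is a minimal sketch in Python with SciPy of the test underneath. It is not the Tea Lady code itself, and the counts are invented for illustration.

    from scipy.stats import fisher_exact

    # Hedged sketch of the statistics underneath, not the Tea Lady implementation.
    # Rows: X vs. not-X as observed; columns: judged Y vs. not-Y. Counts invented.
    counts = [[9, 2],   # X:     9 were Y, 2 were not
              [3, 8]]   # not-X: 3 were Y, 8 were not

    odds_ratio, p_value = fisher_exact(counts, alternative="two-sided")
    print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.4f}")
    # A small P-value says the X/Y association is unlikely under chance alone;
    # recompute after each new observation for a live, running significance measure.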

Real Time Scrolling Spectrograph

Watching a time-frequency analysis unroll on the screen as you speak, whistle, or sing is perhaps the coolest thing in all of speech science. I used one of these in grad school, so I wrote one when I had the chance. So amazing, so much fun! Speak slowly. Whistle! Do a scale. Say "why" slowly and watch your vowels evolve. Compare "you" and "yow". Compare high and low pitch. Go play!
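
For orientation, here is a minimal NumPy sketch (not the app itself) of what each new screen column is: the FFT magnitude of one short, windowed slice of the signal. The sample rate and the chirp signal are made up for the example.

    import numpy as np

    # Hedged sketch of the underlying analysis, not the scrolling app itself:
    # each new screen column of a spectrogram is the FFT magnitude of one
    # short, windowed slice of the incoming signal.
    fs = 8000                                          # assumed sample rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    signal = np.sin(2 * np.pi * (500 + 400 * t) * t)   # a rising chirp, a stand-in whistle

    win, hop = 256, 128                                # window length and hop, in samples
    window = np.hanning(win)
    columns = []
    for start in range(0, len(signal) - win, hop):
        frame = signal[start:start + win] * window
        columns.append(np.abs(np.fft.rfft(frame)))     # one spectral slice
    spectrogram = np.array(columns).T                  # rows = frequency bins, columns = time
    print(spectrogram.shape)                           # (129, number of frames)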

Translation Graphs

A project for representing media and documents along with their translations.

Version 1.0 as of 5/29/2024: super-easy multilinear text display tables. Download code, how-to & examples.

A Parallel Fourier Transform

Wow. This can't be true: an O(10) time algorithm for a 1024-point Fourier transform.

Let me set the stage. Gauss may well be the greatest mathematician in history. I've always considered Tukey the greatest algorithm-making mind in history. How in the world could you wrap your mind around the FFT?! They had the same idea (Gauss 1805, Tukey 1965): what we call the Fast Fourier Transform. The FFT is the most famous algorithm in history; it is almost incomprehensible just to follow, so how much harder it must have been to invent. And the fastest FFT has always run up against a hard limit of O(N*log(N)) sequential time, with N the number of samples in the input.

The conclusion? My self-imposed weekend Divacon homework exercise yielded a 12-symbol expression that computes a Parallel Fourier Transform in O(log(N)) time, ignoring communication: with enough processors, that is 1024x faster than Gauss and Tukey on 1024-sample data.

The story: My friend George Mou's 1990 PhD thesis was about how to think about parallel algorithms. It offered a new, orthogonal structure for the analysis of computation, a formalism for algorithm expression, and a programming construct that imposes a particular way of thinking: an extremely general parallel decomposition structure for your problem, one which yields world-beating results. If you can develop this skill, you can conquer the world.

George is what I'd call a world-historical genius, and I'm amazed to know him and lucky to be his friend. So I thought I'd document my education in Divacon so far, and maybe offer some bits of tutorial explanation of it, as well as of the calculations of, and the intuitions about, the Fourier transform, which I've worked on previously.

It was supposed to be a simple, outcome-uncertain homework exercise. George had told me he had a three-line FFT written in Divacon, which presumably, like all the others, took him a half hour to write 30 years ago; but it wasn't in his thesis or published anywhere I could find, so I figured I would have to do honest homework to write an FFT in it. I figured that if I matched what he said he had done, I would surely know Divacon better, and believe me, I'm impressed with the power of what that means. It was a Divacon exercise, not so much FFT algorithm research. So I puzzled on it for a few days, thinking about the geometry of the elements of the Fourier transform equation, going back and forth doing an n'th partial re-reading of George's thesis, and finally just reading the Numerical Recipes section on the FFT, which used a 1942 version of the math. Then it was like plug and play: pick out the divide and the combine from the usual suspects, the ones that matched; use the regular base predicate everyone always uses for everything; then discover that the base function was also the standard one, which was a big surprise, but I'm okay with that. Then, just looking at the equation to try to see the difference between one layer and the next, I could write the postadjustment function as a plain and simple old multiply-add, and by then it was essentially done.

A one-line expression came out of this, and it says \(PFT \sim O(\log N)\) is its time of computation.
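
Not the Divacon expression itself, but for orientation, here is a sketch of the same divide / combine / multiply-add structure written in ordinary sequential Python (the standard radix-2 recursion). The parallel-time claim comes from the recursion being only log2(N) levels deep.

    import numpy as np

    # Hedged sequential sketch of the same divide / combine / multiply-add
    # structure (the standard radix-2 recursion), not the Divacon expression.
    def fft_divide_combine(x):
        N = len(x)
        if N == 1:                              # base predicate: one sample left
            return x                            # base function: identity
        evens = fft_divide_combine(x[0::2])     # divide: even-indexed half
        odds  = fft_divide_combine(x[1::2])     # divide: odd-indexed half
        # postadjustment: one multiply-add per output pair (the "butterfly")
        w = np.exp(-2j * np.pi * np.arange(N // 2) / N)
        t = w * odds
        return np.concatenate([evens + t, evens - t])   # combine

    x = np.random.rand(1024)
    assert np.allclose(fft_divide_combine(x), np.fft.fft(x))
    # The recursion is log2(1024) = 10 levels deep; with one processor per
    # butterfly, every multiply-add within a level can run at the same time,
    # which is where the O(log N) parallel-time claim comes from
    # (communication costs ignored).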

Interested yet? I hope that if you are a computer scientist, or even a mere programmer, you will find this super motivating and empowering to read about and then learn yourself. Divacon is a big deal, and it is stupid what a big secret it still is today.

Auditory User Interfaces

AUI is to GUI as ears are to eyes. In 27 years, little progress. This aspect of the Metaverse was conceived in 1995: a vision of how computers can improve our auditory environments, to benefit not just the blind but everyone with ears. The auditory channel carries more than meaningless or musical background; it can bear information about place, source, and content, so we could, and should, enable our software to interact with us intelligently through those informational channels. Why not? Conceptual introduction, design specifications, killer applications, etc.

Teachionary language trainer

Fast, effective, and free audio-based vocabulary training in 18 languages: Iraqi/Arabic, Canadian French/Quebecois, Cantonese, Czech, English, Farsi/Persian, German, Hebrew, Hindi/Urdu, Japanese, Korean, Malayalam, Pashto, Russian, Spanish, Tamil, Turkish, and Uzbek.

IS: Information Structure

The IS cloud app generator creates a working single page web app for single-table database-connected tasks including Search, Create, Read, Update, and Delete. Used in examples like Dog and PM, Water Meter Readings, Tx transaction/receipts recorder, and soon HumBeep.
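
As a rough illustration of the single-table CRUD pattern such an app is built on, here is a sketch using Python's sqlite3; it is not the IS generator's own code, and the table and column names are invented.

    import sqlite3

    # Hedged sketch of the single-table Search/Create/Read/Update/Delete pattern,
    # not the IS generator's own code; the table and column names are invented.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE dog (id INTEGER PRIMARY KEY, name TEXT, owner TEXT)")

    def create(name, owner):
        return db.execute("INSERT INTO dog (name, owner) VALUES (?, ?)",
                          (name, owner)).lastrowid

    def read(dog_id):
        return db.execute("SELECT * FROM dog WHERE id = ?", (dog_id,)).fetchone()

    def update(dog_id, **fields):
        sets = ", ".join(f"{k} = ?" for k in fields)
        db.execute(f"UPDATE dog SET {sets} WHERE id = ?", (*fields.values(), dog_id))

    def delete(dog_id):
        db.execute("DELETE FROM dog WHERE id = ?", (dog_id,))

    def search(name_like):
        return db.execute("SELECT * FROM dog WHERE name LIKE ?",
                          (f"%{name_like}%",)).fetchall()

    rex = create("Rex", "Tom")
    update(rex, owner="Mary")
    print(read(rex), search("Re"))
    delete(rex)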

Community Capitalism

How to organize constituents to achieve shared goals: money-driven but not exactly capitalistic. There are idea specifiers, contributors, agents, engineers, and producers of various sorts, and of course recipients. Long before Kickstarter, this was an idea for crowd-funded prizes for crowd-implemented goals. Medical research, open source software, public goods of any kind can be made to work with Community Capitalism.

Dog Name Database

Put your dog's name in here; see everyone's dog's names.

Notes from a Forehead on the Ground

After a week, a Friday night program, after the chanting, after the meditation, after darshan, comes a contemplation of oneness. A subset of Darshan Notes, selected for accessibility. Click "Choose One For Me Now".

Tom's Inventions

15 inventions you might like.

Hum Beep: the Buy Protocol

A way to use digital money more safely, via a crypto-enhancing protocol and system implementing the purchase and sale process.

Study Notes: Haskell

Aaron Vargo thought I should learn functional programming with Haskell. So I went as far as this cram sheet. Perhaps it'll help others as a brief summary. More likely, it'll sow confusion and doubt. Aaron first thought to say 'It could be more wrong', but later corrected himself to say, 'It couldn't be more wrong.' I await more specific corrections.

Study notes for Julia

Julia is a nice new computer language. Here's my cram sheet.

Study Notes: Ruby

Ruby is a popular language on the web. Here is my cram sheet.

Anti Software

Where Tom goes off on a rant against using software to run a ping-pong tournament.

GitHub.com/tcveatch

My GitHub repo, little used.

A More General Theory of the Syllogism

Abstracting logic. Aristotle's list of syllogisms missed half of them; there's nothing to them (H!); and we can do better without them.

Still, it is pretty fun and cool, considering this was the intellectual pinnacle of humanity for 2000 years. Plus, I'd say this is not a bad introduction to "term logic", and might be suggested reading for students of computer science, philosophy, classics, and/or math.
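
For readers who like code, here is a small sketch, in Python rather than the paper's own formalism, of brute-force validity checking for syllogisms: the four sentence types only see which of the eight Venn regions over the three terms are occupied, so 256 candidate models settle any form.

    from itertools import product

    # Hedged sketch, not the paper's formalism: brute-force validity checking
    # for syllogisms on the modern reading (no existential import). A model is
    # fixed, as far as A/E/I/O sentences can see, by which of the 8 Venn
    # regions over the terms S, M, P are occupied.
    REGIONS = list(product([False, True], repeat=3))   # membership in (S, M, P)

    def holds(form, x, y, occupied):
        some_xy     = any(occupied[r] for r in REGIONS if r[x] and r[y])
        some_x_noty = any(occupied[r] for r in REGIONS if r[x] and not r[y])
        return {"A": not some_x_noty,    # All x are y
                "E": not some_xy,        # No x are y
                "I": some_xy,            # Some x are y
                "O": some_x_noty}[form]  # Some x are not y

    def valid(major, minor, conclusion):
        """Each argument is (form, subject, predicate), with 0=S, 1=M, 2=P."""
        for bits in product([False, True], repeat=8):
            occupied = dict(zip(REGIONS, bits))
            if holds(*major, occupied) and holds(*minor, occupied) \
               and not holds(*conclusion, occupied):
                return False                     # found a countermodel
        return True

    # Barbara: All M are P, All S are M, therefore All S are P.
    print(valid(("A", 1, 2), ("A", 0, 1), ("A", 0, 2)))   # True
    # Undistributed middle: All S are M, All P are M, therefore All S are P.
    print(valid(("A", 0, 1), ("A", 2, 1), ("A", 0, 2)))   # False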

Neural Networks + Fuzzy Logic + Space

A careful, accessible introduction to neural networks assuming only high school algebra and a little geometry and differentiation. NNs are defined mathematically, along with how to run them, how to train them (by the usual gradient descent), and how to train them better (I suppose by Newton-Raphson, which really ought to kill!). I also discuss how to understand the training algorithm's implicit reasoning about the adjustments it decides to make; I share an interpretation that backpropagation is like an Anti-Dunning-Kruger learning system (and therefore morally superior to most men?). Then I give a whole Fuzzy Logic re-interpretation of NNs, along with suggestions on how to enhance their logical reasoning capabilities. I tried the Wikipedia page and got so frustrated that I wrote my own introduction. So yes, I suggest reading this if you want to really understand neural networks, and if your other resources have made the subject seem inscrutable. It's a few pages of actual math, yes, but all the steps are laid out: no leaps! It's not short, but you don't have to be a math major to follow along. I encourage your study here if you are interested in really knowing how neural nets work.
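
As a companion to the math, here is a minimal NumPy sketch of the define / run / train-by-gradient-descent steps on a toy XOR problem. The architecture, learning rate, and data are assumptions for illustration, not taken from the write-up.

    import numpy as np

    # Hedged sketch, not the write-up's notation: a one-hidden-layer network
    # trained by plain gradient descent (backpropagation) on XOR.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
    sigmoid = lambda z: 1 / (1 + np.exp(-z))

    lr = 1.0
    for step in range(10000):
        h = sigmoid(X @ W1 + b1)                    # forward pass: run the net
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)         # chain rule, squared error
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)   # gradient descent
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    print(out.round(2))   # should approach [0, 1, 1, 0]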

The write-up also adds Fuzzy Logic to neural networks, including how to train them. Finally, it goes into Space-Representing Neural Networks, so that robots can represent space, or so that humans' representation of space can be better understood. Three months of work is in here.
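
And as a hint of the fuzzy-logic direction, here is a tiny sketch of one common route (product t-norm and its dual, not necessarily the write-up's formulation) to making logical operations differentiable, so they can sit inside a trainable network.

    # Hedged sketch of one common route (product t-norm and its dual), not
    # necessarily the write-up's formulation: fuzzy logic gates that are smooth,
    # so gradients flow through them and a "logical" layer can be trained by
    # the same backpropagation as the weights.
    def fuzzy_and(a, b):      # product t-norm
        return a * b

    def fuzzy_or(a, b):       # probabilistic sum, the dual t-conorm
        return a + b - a * b

    def fuzzy_not(a):
        return 1.0 - a

    a, b = 0.9, 0.2           # truth degrees in [0, 1]
    print(fuzzy_and(a, b), fuzzy_or(a, b), fuzzy_not(a))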



Copyright © 2000-2021, Thomas C. Veatch. All rights reserved.
Modified: 12/20/2021