(http://www.santafe.edu/projects/CompMech/papers/ECA54TitlePage.html)
I wish I were smart like Cosma Shalizi. He's cool. (He works on CAs, among other things.)
I take back what I said about recommending "Computational Mechanics of Cellular Automata: an Example". I mean, I understand it's pioneering and breaking new ground (published 1995), but the obsession with "synchronization" is really bad. Saying that a finite automaton "effectively builds a finite queue" is really bad. And what's up with those 4 and 8 "unrecognized defects"? It's unprofessional to have unrecognized defects - couldn't you have mentioned them earlier, like in the list of particle interactions?
(Of course, it does have the delightful terms "space-time machine" and "gamma gun" - as I said before, it has plenty of merit.)
I would like to see the XOR of a solid domain with a space-time diagram. There's probably a group structure of some sort on the set of possible "phases" of the regular domain (a checkerboard domain has two phases; at the locations where one phase has white squares, the other has black squares, and vice versa). (Translations of the plane quotiented by the translations that leave the regular domain identical?) Each particle is associated with some element of this group, describing how the phase changes as you walk from one side of the particle to the other. This constrains the possible particle interactions:
Like, hypothetically, an interaction might be: alpha + alpha + gamma -> beta + beta.
The product of the group elements associated with the particles on the left would have to equal the product of the group elements associated with the particles on the right. If g is the function from particles to group elements:
g(alpha)*g(alpha)*g(gamma)=g(beta)*g(beta)
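As a toy sketch of checking this constraint, suppose the phase group is Z_2 (the checkerboard case: two phases, so the group "product" is addition mod 2). The particle-to-phase assignments below are made up for illustration, not taken from the paper:

```python
# Hypothetical phase shifts for each particle type, as elements of Z_2.
# These values are assumptions for the sake of the example.
g = {"alpha": 1, "beta": 1, "gamma": 0}

def consistent(lhs, rhs, modulus=2):
    """An interaction lhs -> rhs is allowed only if the total phase
    shift on both sides agrees (the group products match)."""
    return sum(g[p] for p in lhs) % modulus == sum(g[p] for p in rhs) % modulus

# alpha + alpha + gamma -> beta + beta:
#   g(alpha)+g(alpha)+g(gamma) = 1+1+0 = 0 (mod 2)
#   g(beta)+g(beta)            = 1+1   = 0 (mod 2)
print(consistent(["alpha", "alpha", "gamma"], ["beta", "beta"]))  # True
```

With these (made-up) assignments the hypothetical interaction passes, while something like alpha -> gamma would fail, since the phase shifts don't balance.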
This is exciting. I think I could go off and compute these right now! But I'm not going to.
They call their process a "particle filter", which is mildly amusing to me. Once I thought "particle filter" meant "it filters out the particles" (like a Brita or something). Then recently I read about "particle filter" meaning "a signal processor that uses (virtual) particles". This is a "particle filter" that adds particles.
My idea for how to build a particle filter would be to grow equivalence relations, with equivalence meaning "in the same domain". You could use Tarjan's union-find algorithm, and eventually you'd get a partition of the space-time diagram, and the big equivalence classes would be the domains. I think I know how to do it. Maybe I don't know what I'm talking about, or maybe it'd be inefficient (I mean, finite automata are linear time, and that's hard to beat!).
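The union-find idea might look something like this. As a stand-in for the real domain-membership test (which would involve the domain's regular language), I'm just assuming neighboring cells belong to the same domain when they carry the same symbol:

```python
def find(parent, x):
    """Find the root of x's equivalence class, with path compression."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(parent, a, b):
    """Merge the classes containing a and b."""
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def domains(diagram):
    """Partition a space-time diagram (list of strings) into classes,
    merging adjacent cells that pass the (stand-in) same-domain test."""
    rows, cols = len(diagram), len(diagram[0])
    parent = {(r, c): (r, c) for r in range(rows) for c in range(cols)}
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols and diagram[r][c] == diagram[nr][nc]:
                    union(parent, (r, c), (nr, nc))
    classes = {}
    for cell in parent:
        classes.setdefault(find(parent, cell), []).append(cell)
    return list(classes.values())

# A tiny diagram with two uniform regions; the big equivalence
# classes would be the candidate domains.
d = domains(["0011",
             "0011"])
print(sorted(len(c) for c in d))  # [4, 4]
```

In a real version, the "same symbol" check would be replaced by whatever test decides that two adjacent cells sit in the same regular domain, and the small leftover classes would be the particle/defect cells.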
I recommend "Classifying Cellular Automata Automatically". It has a much better explanation of LMC complexity ("medium spikiness of histograms") than the LMC complexity paper (that I talked about previously), even though it doesn't mention LMC complexity.