
Absolutely burned my brain out on this Shainline shit yesterday. I'm supposed to resume writing at "The correspondence between computational measure theory and computation-centric characters of cosmological reproductive fitness optimization:"

Guh. Something about, uh, Shainline's fine-tuning for the chemistries of computer technologies being explicable via simulationism, but only to a limited extent, and then I have to explain why computational cosmological measure theory is wrong despite that??


@faun from what I can tell, computational measure theory is not a real field

imo the problems with this work are more philosophical: it's trying to bayes its way out of having no sane priors. you can't just assume [uniform] distributions on things you don't understand

@migratory Field? By "measure theory" I mean the theory of cosmological measure, per the measure problem: an answer to how to divide things up, how to say "this exists more / exists at all / occurs more often / is real, relative to this other thing", usually with respect to time, though there are other questions.

Those problems are insoluble, but I find that assuming a uniform distribution somewhere, without any principled justification, is totally inescapable if you want to be able to reason at all.
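(For concreteness, the standard worry about "assume a uniform distribution somewhere" is that uniformity is relative to a parameterization. A sketch, using the cosmological constant purely as a stand-in unknown:)

```latex
% A flat prior on \Lambda over (0, \Lambda_{\max}]:
p(\Lambda) = \frac{1}{\Lambda_{\max}}
% Re-express the same belief in u = \log\Lambda:
p(u) = p(\Lambda)\,\left|\frac{d\Lambda}{du}\right| = \frac{e^{u}}{\Lambda_{\max}}
% ...which is not flat in u. Conversely, a flat prior on \log\Lambda means
p(\Lambda) \propto \frac{1}{\Lambda}
% so "uniform" silently privileges one coordinate system on the unknown.
```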

@migratory Simulationism maximalists believe in computational measure theory, the idea that a thing's degree of existence is proportionate to its computational uniqueness/pattern count/interestingness, or something, because it's computers all the way down and computers don't repeat redundant work. But I'm not really writing for them. Maybe I should address them, though. Maybe in an appendix.

Hmm, actually, uh, what if Stephen Wolfram is one of these... maybe I should engage.

@faun do you mean something like "simulations running on a given substrate have a trivial most-efficient option for computing operations that their substrate implements natively (and as a result need not implement some redundant, slower, method of computing that operation)"?

I think of computers per se as spending most of their time performing redundant work, because intermediate values are not content-addressed and proving equivalence of partial intermediates is often more work than recomputing them from scratch
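(A toy sketch of what content-addressing the intermediates would look like, if a runtime did pay the hashing cost; everything here is illustrative, not a real system:)

```python
import hashlib
import pickle

# Toy content-addressed cache: results are keyed by a hash of the
# function's name and its pickled arguments, so structurally identical
# intermediate computations are only ever performed once.
_cache: dict = {}

def content_addressed(fn):
    def wrapper(*args):
        key = hashlib.sha256(pickle.dumps((fn.__qualname__, args))).hexdigest()
        if key not in _cache:      # recompute only on a cache miss
            _cache[key] = fn(*args)
        return _cache[key]
    return wrapper

@content_addressed
def fib(n):
    # naive fib redundantly recomputes subproblems exponentially often;
    # content-addressing collapses them, at the price of hashing every call
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(60))  # 1548008755920, fast because repeated subcalls hit the cache
```

(The tradeoff named in the post is exactly the `sha256(pickle(...))` line: for cheap operations, the hash costs more than just recomputing.)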

@migratory I guess that's one of the things that might be going on.

I'm not really sure how to articulate a reduction of computational measure accounts that makes sense, because they generally don't make sense, and aren't right. There is no continuous regress of simulations. There is exactly one level of regress, and we cannot get a cosmological measure account from that.

@faun to me, the entire question of evaluating theories in terms of whether they say that our single data point is likely or unlikely is insane, because probability is a unidirectional tool: it can tell you what outcomes are likely given a situation with unknowns and some statistical assumptions on those unknowns, but cannot tell you anything about a specific event that did occur. this kind of probabilistic anthropic reasoning seems fundamentally misguided
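(The inversion being objected to, written out: Bayes' rule does run probability "backwards", but only by consuming a prior over theories, which is where the earlier "no sane priors" complaint bites:)

```latex
P(\text{theory} \mid \text{data})
  = \frac{P(\text{data} \mid \text{theory})\; P(\text{theory})}{P(\text{data})}
% with a single anthropic data point, the likelihood term may be arguable,
% but the posterior is dominated by the choice of P(theory).
```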

@migratory I think you're being led by an intuition that I should shut up and go get more data, which would usually be right, but this is a case where it's impossible at the metaphysical level to get more data. We'll only ever have one observation of a uniformly sampled anthropic moment (our present moment).

So deal with it. If you don't think the epistemologies we have can deal with that gracefully, I guess you'll just have to create new epistemologies.

@migratory Reasons we have no choice but to grapple with this one data point: we need to be able to say whether an artificial brain experiences being, we need to figure out what kind of universe we're in (so that we can plan for the way it will end), and we need to figure out how to make new universes with happy living things in them, assuming that the black hole → new universe thing is true.

@faun there's a type error here: "uniformly sampled" refers to a sampling procedure, but there's no reason to believe a sampling procedure created our world. acting as if the parameterization that seems intuitive from inside (pick a scalar cosmological constant, etc.) has anything to do with the space our world was drawn from (were it sampled) is like looking at your first-pick Monty Hall goat to see whether the fabled "car" is more likely to have long or short hair

@faun ok, I understood "measure theory" as the mathematical field, i.e. "well-behaved sigma-algebras", a formalization of probability

@migratory Yeah. I haven't properly studied that, but I probably should, and this is going to keep being a problem. Initially I was calling them "models" instead of theories, but figured that'd be even more irritating to non-Bayesians.

@faun per Wikipedia it seems they're just called different measures, and you could discuss classes of measure if you need to generalize
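(e.g. two measures on the same measurable space, disagreeing about the same event, just to pin down the terminology:)

```latex
% On ([0,1], \text{Borel sets}), compare Lebesgue measure with a tilted one:
\mu(A) = \int_A 1\,dx, \qquad \nu(A) = \int_A 2x\,dx
% Both are probability measures, but they weigh the same event differently:
\mu\!\left([0,\tfrac{1}{2}]\right) = \tfrac{1}{2}, \qquad
\nu\!\left([0,\tfrac{1}{2}]\right) = \tfrac{1}{4}
% A cosmological measure proposal is a choice of one such \nu; the measure
% problem is that nothing obviously singles one out.
```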
