Saturday, November 28, 2015

Excerpt from a Facebook Philosophy thread.

There are two theories of practical physical consideration directly relevant to the problem you set forth: quantum mechanics and general relativity. Your stated position, based on the law of conditioned causal continuity, corresponds to asserting the truth of general relativity and its model of spacetime as a smooth, continuous manifold. That view is what is called local realism, and it is directly challenged, if not outright ruled out, experimentally by Bell tests on entanglement. Einstein, Podolsky, and Rosen addressed the issues you raise and were soundly shown to be wrong by repeated experiments and by theoretical arguments due to scientists such as Bohr, Schrödinger, and Bell.

You have to abandon at least one: locality or counter-factual definiteness.

I am agnostic as to the correct answer, so I consider cases of abandoning locality, counter-factual definiteness, or both. Due to recent experimental results, I exclude local hidden variable theories.

My view tends towards quantum mechanics, which is demonstrably non-classical, and as such I consider non-classical theoretical methods and alternatives, including paraconsistent logics and pre-classical logical languages like the one attributed to the Buddha Siddhartha Gautama. Nick Bostrom, C. Hogan, Max Tegmark, Seth Lloyd, Vlatko Vedral, Stephen Hawking, and several other contemporary researchers have laid out experimental cases for testing simulation, holographic, or computable theories of physical reality.

My position then comes down to a question of the degree of emptiness of a given apparent conditional object and my own observable experience of observation and measurement. The difference between a fundamental particle, me, and the seeming emptiness of spacetime is a matter of density or degree, not kind.

Koan for Your Thoughts?

Suppose that classical logic is like a viral matrix.
If a person were totally classical in behavior and finite or countable in form, then they would be equivalent to a Turing machine.
If we accept for the moment the proposition that a Turing machine can not be a person--can not simulate consciousness or whatever property you want to ascribe to humans which is not Turing computable--then a person who totally accepts classical logic ceases to be a person. They are a machine; they are determined and at best exist with undecidable procedures or things totally inaccessible to them.
If classical logic has a viral matrix, then it is infectious: it confuses communication and renders things undecidable.

But suppose we go the other way. The Turing machine doesn't really think. It does. It obeys. It carries out functions. In some sense, to it all that exists is the instructions or input it receives. Nothing exists beyond what is decidable to it, and all else gets lost down whirlpools and rabbit holes.
But we exist. People exist. People are not Turing machines and people are not totally classical. So what happens when a Turing machine becomes aware? I am thinking that consciousness is like an antibody or an immune system.
Suppose we could develop a vaccine?

Saturday, June 20, 2015

The Impossibility of Reiterating Choice

If we accept that theorems of formal theory apply to physical theory--a postulate that with constraints Einstein explicitly accepted--then Gödel, Tarski, and Turing collectively demonstrate there are conditions independent of necessary and sufficient conditions. Necessary and sufficient conditions are the model of classical causality, so this is to say that there exist conditions which are neither necessary nor sufficient with respect to a causative system.
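The independence claim above can be stated precisely in the standard form of Gödel's first incompleteness theorem: for any consistent, recursively axiomatized theory T extending arithmetic, there is a sentence G such that

```latex
T \nvdash G \qquad \text{and} \qquad T \nvdash \neg G
```

G is neither provable nor refutable from T; in the terms used here, it is a condition neither necessary nor sufficient relative to T's deductive system.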

Heisenberg, Schrödinger, Bell, and others demonstrated that some conditions of physical systems can not be determined a priori. Within quantum theory it is possible to construct conditions where the state of a system is indeterminate with respect to some physical observer, and Schrödinger outlined an argument for the independence of such a state from determinate--classically causal--conditions. Short of hidden variables or more exotic theories, which so far have been refuted in experiment, randomness is a necessary condition of quantum theory.

If there exist causal, random, and formally essentially undecidable conditions in theory, then we have a place for free will within physical theory which can not be accounted for by mathematical chaos--a complex but deterministic theory--or by classical computing theories such as the Church-Turing thesis of recursive functions or processes. Free will may exist between the decidable, deterministic, and random conditions of our embodied mediation in our physical universe.

There is no need to appeal to supernatural or unnatural sources for the so-called "uncaused cause".

Furthermore, randomness is adequately defined by Gregory Chaitin in "Meta Math!" as a self-justified fact. He links this notion to the undecidability of the halting problem through a constructible but machine-dependent number, now known as Chaitin's Omega, which encodes the halting probability of Turing machines. Free will can not exist in a strictly decidable or determinate system, so it has properties in common with non-deterministic or undecidable formal systems; thus, we have reason to believe that choices can be self-justifying facts arising independently of causal conditions and degenerating into decidable and deterministic conditions through interference with the environment.
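As a toy illustration of the definition just cited, Omega is the sum of 2^(-|p|) over the halting programs p of a prefix-free machine. The halting set below is hypothetical, invented purely to show how the sum is formed:

```python
# Toy sketch of Chaitin's Omega = sum over halting programs p of 2^(-|p|).
# The halting set here is an assumption for illustration, not derived from
# any real machine; it only needs to be prefix-free (no program is a prefix
# of another).
halting_programs = ['0', '10', '110']          # hypothetical prefix-free halting set
omega_lower_bound = sum(2.0 ** -len(p) for p in halting_programs)
# Enumerating more halting programs only raises this lower bound toward Omega;
# the full value is uncomputable because halting itself is undecidable.
```

Each newly discovered halting program tightens the bound from below, which is why Omega is approximable but never computable.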

Saturday, April 25, 2015

From Non-contradictory Identity to Contradictory Anatta

I have decided in writing the draft of my manuscript that the primary project will be the development of the preliminary logical languages and theories which preceded the development of mine. The preliminaries will be by no means comprehensive. In fact, my objective is to present minimalistic languages and models representing philosophical views on reasoning and physics while covering the major methods and theories: notably the standard model of particle physics as given by quantum mechanics and quantum field theories, together with the cosmological model based in general relativity and black hole thermodynamics. Classical and non-classical metamathematics will be represented by computing theory and analytical calculus.

For this purpose, I have chosen to comparatively study Lq, Lnq, Basic Logic, Minimalistic Logic, LK, and Natural Deduction. LK or an extension of it will be used for formalizing my proof of the impossibility of a non-contradictory theory of everything. Lq will be used for heuristically hypothesizing the properties of an empty or paraempty logical language. Lnq, Basic Logic, and Minimalistic Logic will be included mostly for reference and for comparison of classically consistent object languages and paraconsistent object languages with truth-preserving consistent metalanguages.

This will likely be sufficient logical machinery to establish the inadmissibility of finitistic contradiction tolerant proofs in systems like Peano arithmetic. It is hoped that critical analysis of Lq and its computable extensions will suggest the form of finitistic contradiction tolerant proofs. If such a thing in some sense exists then a dual proof of possibility for contradictory theories of everything might be deducible in some way. Furthermore, the development of such kinds of proof might suffice for direct proofs of consistency.

Tuesday, March 31, 2015

Summary of Arguments for Free Will

Determinism is precisely definable. The Turing thesis on computability gives an exhaustive definition of deterministic processes. All decidable systems are necessarily deterministic, but not all deterministic systems are decidable. Hypothetically, from the perspective of the Turing thesis, there exist non-deterministic processes; they are not expected, under the collective assumptions of the thesis, to differ in expressive power, but this is only a hypothesis, unlike the material implications of decidability with respect to determinism.

Quantum mechanical theory can be split into a variety of cases. The important case for this argument is that of hidden variables. It is sometimes argued by theoreticians that quantum uncertainty can be explained as an artifact of the imperfect knowledge and construction of our experimental measuring devices, but our most recent experiments, including loophole-free Bell tests, rule out local hidden variables to very high precision, because the existence of such variables would inevitably result in measurable consequences contradicting the observed quantum correlations. Therefore, it is more probable that quantum uncertainty is not due to hidden variables and as such is necessarily a random effect or process limiting the inherent precision of mechanisms, including Turing machines. Quantum randomness and non-determinism are a necessary prediction of quantum mechanical theory without hidden variables.
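The hidden variable constraint can be made concrete with the CHSH form of Bell's inequality. This is a standard illustration, not part of the original argument: any local hidden variable model satisfies |S| <= 2, while the quantum singlet-state correlation E(a, b) = -cos(a - b) reaches |S| = 2*sqrt(2) at the angle settings below.

```python
import math

def correlation(a, b):
    # Quantum mechanical prediction for spin correlations measured at
    # angles a and b on an entangled singlet pair.
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two measurement settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

# The CHSH combination; local hidden variable models bound |S| by 2.
S = abs(correlation(a1, b1) - correlation(a1, b2)
        + correlation(a2, b1) + correlation(a2, b2))
# S comes out to 2*sqrt(2), about 2.828, exceeding the local-realist bound.
```

Measured values of S above 2 are exactly the "measurable consequences" that rule out local hidden variables.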

A non-deterministic Turing machine is necessarily undecidable. Jointly, the classes of deterministic and non-deterministic Turing machines include a class of processes which are not merely undecidable but essentially undecidable. This is to say that no logical system that is consistent, in the sense of having no contradictory states, can in principle decide essentially undecidable procedures without incurring a contradiction. From this view, the classes of deterministic and non-deterministic processes are mutually exclusive and jointly exhaustive. The universe may be explained almost completely by the interaction of random and classically causal processes.

However, these theories depend crucially on the non-existence of contradictions in principle and on the impossibility of logical systems which do not logically explode when given a contradictory premise. Non-contradiction is a hypothesis; it is a logical statement amounting to the mutual exclusivity condition, and it typically gives rise immediately to its dual, the law of the excluded middle. The law of the excluded middle is the jointly exhaustive condition of the argument. Fuzzy logics are what you get when you reject the law of the excluded middle but not necessarily non-contradiction. Paraconsistent logics are what you get when you reject non-contradiction together with at least one of the three structural rules of classical logic or certain logical inference rules.
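A minimal sketch of what rejecting non-contradiction buys, using Priest's three-valued Logic of Paradox (LP) as a concrete paraconsistent system. The choice of LP is mine; the text names no specific logic here. Values are 'T' (true), 'B' (both true and false), and 'F' (false); 'T' and 'B' are designated, i.e. they count as holding.

```python
# Priest's LP: three truth values ordered F < B < T, with the "glut" B
# read as both true and false.
ORDER = {'F': 0, 'B': 1, 'T': 2}
DESIGNATED = {'T', 'B'}

def neg(v):
    # Negation swaps T and F and fixes the glut B.
    return {'T': 'F', 'B': 'B', 'F': 'T'}[v]

def conj(u, v):
    # Conjunction takes the minimum in the order F < B < T.
    return min(u, v, key=ORDER.get)

p = 'B'                              # a dialetheia: p is both true and false
contradiction = conj(p, neg(p))      # p AND not-p evaluates to 'B'
q = 'F'                              # an arbitrary unrelated falsehood

# Explosion fails: the contradiction is designated, yet q does not follow.
assert contradiction in DESIGNATED and q not in DESIGNATED
```

In classical logic the premise p AND not-p would entail any q whatsoever; in LP the contradiction holds without dragging every other proposition along with it.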

This is to say that contradiction tolerant logics do in fact exist, so, hypothetically, there exist contradictory procedures or processes which may or may not be able to decide the classically undecidable conditions of Turing machines and quantum mechanics as formulated in classical logical terms. Free will might be said to be the class, set, multiset, sequence, or other collection of such procedures. Given that, free will can not exist within a wholly decidable mechanism, and free will may not exist in a wholly deterministic system which is partially decidable. Free will would seem to be compatible with a notion of non-determinism which does not count every non-deterministic process as random, though in classical terms they may not be distinguishable.

Turing alluded to this possibility in a keynote where he mentioned the computing power of fallible machines.

Thursday, September 19, 2013

We'll start with paraconsistency and contradiction tolerance

Paraconsistency is orthogonal to (consistency XOR inconsistency). Equivalently, paraconsistency is to non-duality as (p XOR not p) = not (p and not p) is to duality. Contradiction tolerance isn't a positive quality; it's defined by the absence of contradiction intolerance. Strongly consistent systems have absolutely no contradictions in them; strongly or strictly consistent systems also have logical explosion proofs when you put a contradiction in the premises. Contradictions in the consequences of a proof can be handled by erasure of one or more of the premises that entailed the contradiction. That's the basis for reductio ad absurdum and for the specific form of proof by contradiction in which you posit the existence or non-existence of a proposition, derive a contradiction to prove that it is specifically not that proposition, and then, under excluded middle and non-contradiction, show that the negation of the proposition necessarily holds. This forms the basis of many indirect proofs and measurements.

What all this is for is to exclude contradictions from our arguments, except as consequences of proofs that erase the propositions which prove the contradiction. We can weaken consistency in various ways to tolerate contradictions. Basic logic is one example, where the structural rules of weakening and contraction are absent, as are the axioms of non-contradiction and the excluded middle. Without the ability to arbitrarily copy or erase information, it is difficult to definitely prove anything. Paraconsistent logics with a consistent metasystem necessarily prove strictly fewer theorems than their consistent metasystem.

Paola Zizzi proved that paraconsistent logics of this kind are subject to a no-self-reference metatheorem for contradictions. Zizzi's logics Lq and Lnq lack all axioms except a pair of propositional identities for each proposition p_n and its primitive negation p_(n+1); Lq and Lnq also lack all structural rules. The turnstile representing metalogical constructive proofs can't be expressed in the syntax of Lq or Lnq, and the metasystem is consistent. Her turnstile admits complex values, but the sum of probabilities is required to equal one, and the sum of the cross products z_0 z_1* + z_0* z_1 = 0. This effectively ensures that the proofs of Lq and Lnq, and any quantum computer implementing them under this constraint, will have output that reduces to classical computations satisfying metasystem consistency.

In a sense, these forms of paraconsistency manually join a subset of non-self-referential contradictions into a partially or weakly consistent sub-first-order object logic. Zizzi's connective creates a complex relationship between a proposition and its negation which is analogous to classical AND but explicitly accounts for the complex degrees and decomposition of the propositions. She identifies it with the process of superposition. Her Lq models a qubit as a weak contradiction; it is unitary.
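A hedged numeric check of the two turnstile conditions stated above: the probabilities sum to one, |z_0|^2 + |z_1|^2 = 1, and the cross products cancel, z_0 z_1* + z_0* z_1 = 0. The amplitudes below are my own choice; any pair whose relative phase is pi/2 satisfies both.

```python
import math

# Example qubit amplitudes with relative phase pi/2 (the factor of 1j).
z0 = 1 / math.sqrt(2)
z1 = 1j / math.sqrt(2)

# Condition 1: squared magnitudes (probabilities) sum to one.
norm = abs(z0) ** 2 + abs(z1) ** 2

# Condition 2: the cross products z0*conj(z1) + conj(z0)*z1 cancel.
cross = z0 * z1.conjugate() + z0.conjugate() * z1
# norm is 1 and cross is 0, so this amplitude pair satisfies both constraints.
```

The cross-product condition is equivalent to requiring the real part of z_0 z_1* to vanish, which is why a pi/2 relative phase does the job.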

I conjecture that removing the axioms of identity and using neither a consistent nor an inconsistent metasystem will allow the formalization of the turnstile for unrestricted domains. The trick is in what happens to trivial notions when you don't have non-contradiction, the excluded middle, and identity axioms/theorems to force the absolute emptiness of a set. I've been trying to wrap my brain around a way to formalize a paraconsistent system which is independent of and essentially undecidable from consistent formal systems and consistent metasystems.

My intuition is that a non-dual relationship is paraconsistently defined--analogous to an automorphism--between an empty set and universal sets. In (Empty XOR not Empty) logics the Empty set is singular and absolutely empty; the negation of the Empty set can't possibly be universal sets, because universal sets entail Russell's paradox, but I think they're undefinable because there can't exist an absolute separation relationship between the two concepts. They are not absolutely true or false, so negations of either can not be absolute in general. Think of it as an uncertainty relationship between the truth values of the Emptiness and non-Emptiness of a set: the more certain you are that it is Empty, the less certain you can be about its non-Emptiness.

The notions of reductio ad absurdum and proof by contradiction need to be revised in such a system, as do vacuous truth and empty functions. You still use the techniques of consistency, but consistency is something recovered in certain cases of conditions of the paraconsistent metasystem. Proof by contradiction splits into two parts: proof by contradiction tolerance and proof by contradiction intolerance. Proof by contradiction tolerance is new and is to be used for describing recursive paraconsistent arguments. Proof by contradiction intolerance is a relabeling of classical proof by contradiction; it applies to restricted or conditional domains.

Under consistency, combinatorics, computing, Boolean algebra, the set of recursive functions, and others are equivalent systems. Cantor and cardinality proofs tell us that. The Chomsky hierarchy gives us further insight into how formal languages, grammars, systems, and Turing machines all interrelate. The difference is in where we go from there. Calculus, higher order logics, Diophantine polynomials, recursively enumerable languages, the set of all Turing machines, and uncountable sets spin off into infinite infinities. The Turing point is proof by contradiction. We can trust Gödel's proof in so far as we accept his assumptions, the premises of his argument. If you reject his consistency assumption, for instance, you reject the conclusions of his proofs in general. Truth preservation does that. For Gödel and his contemporaries, there was only one kind of contradiction, and it could not be tolerated or trusted. Arbitrary contradictions led to explosion, so it was argued that a contradiction in the conclusions of an argument necessitates the erasure of the premises; and there were effectively assumed to be exclusively contradictions or no contradictions.

Let us doubt Gödel, Tarski, Russell, and the great thinkers of Western history. Take a radically skeptical stance regarding the need for consistency and the intolerance of contradictions. Arguments to the effect of "There exists at most one of either p or not p. We need only consider consistent systems or not consistent systems. All other options can be reduced to these extremes." can be dismissed, because we can write a Gödel sentence which gives us another option, and we can show the necessary independence of contradictions from Gödelian systems. We can point directly to examples of paraconsistent logic as proof of neither-consistent-nor-inconsistent alternatives. So we refuse to accept the assumption of consistency and demand again: prove or disprove it.

Now consistency is a hypothesis, and we need to theorize a way to recover it from paraconsistent formal systems. But we need a paraconsistent number theory. We need paraconsistent computing. We need paraconsistent algebras and set theories. And in at least restricted domains, some of it needs to reduce somehow back to the familiar consistent results.

What changes when you can partially define a curve joining what would be a singularity in classical mathematics? When 0/0 can be given a paraconsistently definite meaning? How does diagonalization work with complex-valued membership and non-membership? What are the properties of recursive contradictory expressions?

Thursday, August 29, 2013

Empty Axiom Schemata as Models of Paraconsistency and Fiction

Everything except nothing can be the metasystem of nothing.

Four general configurations to consider:
Metasystem is consistent, object system is consistent.
Metasystem is consistent, object system is paraconsistent.
Metasystem is paraconsistent, object system is paraconsistent.
Metasystem is paraconsistent, object system is consistent.

Consistent material implication is true iff it is not the case that the antecedent is true and the consequent false; paraconsistent material implication alters 'not' and 'false' and redefines 'iff' based on the redefinition of the properties of equivalence relationships due to the altered form of material implication. The difference between the definitions of consistent and paraconsistent material implication is the difference between the formal notions of consistent and paraconsistent vacuous truth.
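A minimal sketch of the consistent side only; the paraconsistent redefinition is left abstract here. Classical material implication p -> q fails exactly when the antecedent is true and the consequent false, so a false antecedent yields vacuous truth.

```python
# Classical (consistent) material implication: equivalent to (not p) or q.
def implies(p: bool, q: bool) -> bool:
    return (not p) or q

# A false antecedent makes the implication true regardless of the consequent.
vacuously_true = implies(False, False) and implies(False, True)

# The single failing case: true antecedent, false consequent.
only_false_case = implies(True, False)
```

A paraconsistent implication would have to revise exactly this vacuous-truth behavior, since it rests on the classical readings of 'not' and 'false'.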

Paraconsistency is neither consistent nor inconsistent--independent of consistency and inconsistency. OR paraconsistency is both consistent and inconsistent but not logically explosive--included as a strict subset to consistency and inconsistency. From a consistent formal system, paraconsistency is not equal to consistency and inconsistency; the non-identity of paraconsistency with consistency and inconsistency is defined consistently. From a paraconsistent formal system, paraconsistency can not be strictly not equal to consistency and inconsistency as it lacks non-contradiction and lacks the excluded middle which define the strict XOR relationship. The question from a paraconsistent formal system becomes whether paraconsistency is a strict subset of consistency and inconsistency or consistency is a strict subset of paraconsistency or paraconsistency is equal to consistency and inconsistency or paraconsistency is neither equal nor greater nor lesser than consistency and inconsistency.

All of the above is relevant in defining the difference between consistent emptiness and paraconsistent emptiness. In general, paraconsistent emptiness will not be absolutely empty, unlike consistent emptiness, which is defined as empty XOR not empty. Paraconsistent emptiness will in general be defined as empty AND not empty; using Zizzi's notion of AND as the logical superposition operator, this is to say that paraconsistent emptiness will generally be a superposition of emptiness and non-emptiness. However, it would seem this notion of emptiness inherits the characteristic that every or almost every formal system can be its metasystem, perhaps even itself. The Empty set of ZF set theory is often used in formal models to model logical triviality or falseness; likewise, a paraconsistent notion of emptiness and non-emptiness will likely be used to model logical fictions and falseness. But as the notion of non-emptiness is generally identified with notions of truth and existence, the paraconsistent notion of existence and truth is not totally or absolutely distinct from the notion of non-existence and falseness. They are joined to some degree which reduces to the usual consistent notions of truth and falseness under logical cut conditions--i.e., when we can exclude or neglect the effect of contradictions on our models, such as when universal sets are intolerable or when locally causal processes dominate.