What is the Incommensuration Theorem?

It is a statement about the nature of the relationship between what is meant by symmetry and what is meant by continuity. Insofar as the usage of these terms, 'symmetry' and 'continuity', is at the core of many other areas of thinking, the Incommensuration Theorem (ICT) is one of the most important results of Analytic Metaphysics (1).

Why are the notions of symmetry and continuity so important?

These are both very basic concepts, occurring in direct connection to the foundations of a domain, and as such, are defining of the most primal kinds of structure occurring within that domain. In effect, everything within a given domain of study or consideration ends up being governed, directly or indirectly, by the notions of symmetry and continuity, and the inter-relationship between them.

What do you mean by a 'domain'?

In common usage, the meaning of the term "domain" is similar to the notions of "objective reality", "world", "universe", and "dimension". They all refer to a notion of 'a space' in which things happen.

Here, the concept of domain extends to include any realm or class of thought, theory, and/or imagination. For example, any single language is a domain (ie, English, Ansi-C, HTML, etc). Each system of mathematics or field of scientific study is a domain.

As such, the notion of domain is more about being a 'field of study' than about there being things that exist that can be studied.

In more technical considerations, a 'domain' is that which contains (includes), but which is itself uncontained and uncontainable. In other words, it is inherent in the notion or idea of a 'domain' that it can be regarded independently, as if not included or embedded in anything else -- any other domain.

So the notion of a 'domain' is like a set that can contain no other sets?

Almost. The notion of 'set' is a good one, insofar as it gets at the essence of what is meant by the term 'domain' as used here. However, the idea of 'domain' refers more to the 'top level set': a set that is explicitly defined as being of such a nature that it cannot be 'validly' included in any other set. It is not a matter of establishing a limit on that top level set itself such that it can contain no other sets.

As such, the top level set can contain both elements and child sets, which themselves can contain other child elements and sets, etc. However, the top level set is considered as being 'exclusive' in that it cannot be a member of any other set (ie, it does not and cannot include itself, etc).

Are you defining it this way to prevent recursion, and the types of paradoxes that occur thereby, such as the Barber Paradox?

Not really. The reason for the 'non-embedding' idea is to appropriately reconcile our own relation, our own participation, as philosophers imagining the concept of a set, to the notion of a set in itself. In particular, the emphasis is to be first on the relation between parts and wholes, directly, and then second on considering some of the specific implications of what it means to be a 'part' or a 'whole'.

As such, it is necessary to consider what it means to be an 'element' in a 'container' -- a kind of 'inclusion relationship' -- prior to the important considerations of what is meant by the notions of 'element' and 'container' themselves. Jumping directly to a recursive schema where 'containers' themselves become 'elements' means that, by conflating their usage, we also end up overlooking important considerations regarding the distinctions inherent in the ideas themselves.

So you are explicitly asserting that elements are not containers and containers are not elements?

Yes. Each of these three concepts -- 'element', 'container', and 'the relation between' -- has its own distinct ontological class.

Wait -- what do you mean by 'ontological class'?

Just that the three terms -- the three concepts -- are each inherently basic to the 'domain' of consideration with which we are currently concerned. In other words, the three concepts of 'element', 'set', and 'the relation of elements being inside of sets' are in some ultimate, basic sense distinct, inseparable, and non-interchangeable.

In effect, the notion of 'element' is a 'concept role' that is played in relation to the two other concept roles, even though we may, at some point for the sake of interest, consider the implications of more elaborate embeddings. Even when we consider the idea of a 'set within a set', the set which is being contained within the other set is acting in the role of being an element within that larger set. No matter how much we may use the term 'set' in a confused way, and no matter how confused we may be ourselves about this usage, it is still the case that the 'interior set' stands in the role relation to the 'exterior set' of being an 'element'. The notion of 'inherent ontological classes' is an explicit reflection of this truth.

Ok, so I get the distinction. Why was it so important to mention right now? It seems like so much semantic pedantry.

You may have noticed that I used the word 'domain' in connection with the notion of 'set theory' as being about the relations of the three concepts 'elements', 'sets', and 'the relations between elements and sets'. If we regard the idea of 'set theory' as being an example of 'a domain', then it is necessary to clarify that the notion of 'a domain' is not equivalent to the notion of 'a set', even though both seem identical in the sense of being an 'inclusive container'. The concepts are subtly different, and this difference is crucial in understanding what the real meanings of these concepts actually are.

If we are regarding the pure idea of 'a container', inclusion relations, etc, as a kind of ultimate Platonic Ideal, then by extension we are considering the relation between the notions of singularity and plurality. In effect, there can be only one container -- one "totality" of all that is -- all that there is. It does not really make sense to have a notion of 'totality' that is plural, for if there were multiple totalities, then some 'meta' concept could be created and conceived of that included all of these together, thus making an even more ultimately 'true' totality.

Similarly, the notion of 'an element' is considered to be atomic -- inherently indivisible. If it were possible (even just conceptually) to divide an element in some manner into a collection of sub-elements, the real situation could be regarded as functionally equivalent to the main set simply directly including all of the sub-elements as elements. The relationships between the elements (or sub-elements) are not important insofar as the notion of 'a set' regards only that these elements are included in, and members of, the set.

However, it is in this latter, more specific technical sense of 'inclusive container', that it must also be regarded that the notion of 'a domain' is in reference to a construction and collection -- a constellation -- of specific concepts, rather than of 'things' (ie, elements).

To see this, notice that the notion of 'a domain' was bound to the notion of 'set theory', rather than to any one of the member elements of what 'set theory' consists of: a collection of concepts regarding 'elements', 'collections of elements called sets', and 'the relationship of inclusion; having elements be in sets'.

As such, the notion of 'a domain' must be regarded as inherently an abstract reference, rather than as a concrete instance of something that 'exists'. A domain is not a container or context in which things of a certain type are content; rather, it is a reference to the combination of a certain set of ideas. Therefore, a domain (as a notion) does not refer to a total collection of things so much as it refers to a common context (or type) of consideration of three or more mutually associated (and usually fundamental) defining concepts.

Is this what you meant when you said that a domain was a 'field of study'?

Yes.

The specific confusion that occurs frequently is that people tend to use the word 'domain' as a kind or type of existing world, and as such, tend to connect 'a domain' much too strongly to the connotation of a collection of 'existing things', or more ultimately, of 'a universe'.

While the notion of 'universe' gets us closer to the notion of a domain as 'that which contains, but which itself is uncontainable', it implies the mistake that the 'elements' of that universe, as existing things, materials, matter, etc, are the 'members of the set' that are of interest. However, within the lexicon of the IM, that is explicitly not the case.

'The Universe', as an example of a domain, is actually the collection of only and exactly three necessary and sufficient concepts: the notion of creation (as that which establishes that there are existing things), the notion of existence (as that which has actually been created), and the notion of interaction (as that which exists and can at least potentially be mutually influencing).

In effect, to consider the 'domain of the universe' is to consider the meanings and the inter-relations of those three concepts, neither more nor less, rather than to be considering any specific collection of 'existing stuff'. To understand the notion of universe, or of this "our" specific universe in particular, it is not sufficient to simply list and enumerate all of the existing stuff -- it is also necessary to account for how that stuff came to be (ie, 'theory of creation'), and how that stuff interacts and changes (ie, 'theory of physics'). Beyond necessity, it is the case that in having a complete accounting of the totality of existence, creation, and interaction, we would also have achieved sufficiency -- nothing else would be needed in addition to just that to have encompassed a 'complete' understanding of the universe.

It is in this very specific, non-trivial way, that the notion of domains, as always being about some minimum set of necessary and sufficient basis concepts, is fundamentally different from how it is usually conceived.

Ok, so when you were earlier asserting that the concepts of symmetry and continuity were 'low level concepts in a domain', you were also implicitly stating that these notions were particularly important in considering the inter-relationships between existence, creation, and interaction, insofar as 'our universe' is an example of 'a domain'. So how do you define the terms 'symmetry' and 'continuity'?

These concepts are defined in terms of comparison. Therefore, we will first need to consider the concept of comparison.

Why is comparison so important?

Two key reasons: 1; the notion of comparison is deeply and inherently connected with the notions of 'measurement', 'interaction', and 'signaling/communication', as used within the field of study called 'physics', and 2; the notion of comparison is explicitly and exactly analyzable into an explicit and finite set of concepts, hereafter called the 'intrinsics of comparison', which are known to be necessary and sufficient to that concept.

To unpack this a bit, it can be regarded that all of the theories of physics can be described and modeled as 'allowed relations of signaling' -- as information flowing between 'existing things' in the form of 'events'. An equation is simply a model and description of a particular information flow, an identified connection and shape of a network of explicit measurements.

Beyond just defining what connections exist -- what sorts of interactions occur and are possible, as specified in multiple equations -- some physical theories will also define what sorts of connections do not occur, as a sort of (set of) 'access control limits' -- as specifying what we cannot measure.

For example, the theory of General Relativity (hereafter 'GR') defines this sort of limit very explicitly with its 'speed of light' constraint. In effect, any (presumably existent) portion of the universe that is 'farther away' than can be reached by light traveling in free (empty) space in a given unit of time is part of the 'absolute elsewhere' -- a region of time and space that is inherently and completely inaccessible to any (presumably all) ordinary causal signal paths. Similarly, events occurring wholly within the 'event horizon' of a black hole are regarded, conceptually, absolutely, and in principle, as being fully and completely inaccessible -- a kind of perfected 'one way mirror' of total information security.

No hacker, no matter how powerful or fast their tools and computers, no matter how insightful they are or what technology or mathematical tricks they may use (ie, even inclusive of our wildest hopes for what might be achievable using Quantum Computing), will ever be able to breach the 'security boundary' of data stored within the context of a black hole. The theory of GR, in this regard, is defining very powerful principled limits on the allowed signaling, communication, and measurement paths that might otherwise have 'existed' within 'the physical universe'.
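As a back-of-the-envelope sketch of this constraint (my own illustration; the constant and the light-cone framing are standard special relativity, not drawn from the source), two events are causally connectable only if their spatial separation can be covered by light in the elapsed time:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def causally_accessible(dt_s: float, dx_m: float, dy_m: float, dz_m: float) -> bool:
    """True if a light-speed signal could connect two events separated by
    dt_s seconds in time and (dx, dy, dz) meters in space; False means the
    second event lies in the first event's 'absolute elsewhere'."""
    spatial_separation = math.sqrt(dx_m**2 + dy_m**2 + dz_m**2)
    return spatial_separation <= C * abs(dt_s)

# One light-second away in space, but only half a second away in time:
# no causal signal path exists, so the event is absolutely elsewhere.
print(causally_accessible(0.5, C * 1.0, 0.0, 0.0))  # False
```

The same check with two seconds of elapsed time returns True, since light then has time to cover the separation.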

Quantum Mechanics (hereafter QM) also defines some very stringent -- equally fundamental -- 'absolute signaling access control limits' in the form of the Heisenberg Uncertainty Principle. For any portion of the universe which is very much 'too close' -- ie, defined at the scales of the very small -- there are limits on what sorts of measurements may simultaneously be made. If I make a measurement of one type, then I am unable to obtain, by any conceptual means whatsoever, information regarding what would have happened if I had made a measurement of a contrasting type. In QM, the entire world of accessible measurements is divided into contrasting types, and even the very notion of there being an actual 'definite state' beyond what we can measure is hotly contested, since total access to such a world is explicitly and forever limited.
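The quantitative form of this limit is easy to state (a sketch of the standard inequality Δx·Δp ≥ ħ/2; the numbers below are illustrative, not from the source):

```python
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def respects_uncertainty(delta_x: float, delta_p: float) -> bool:
    """Check whether a simultaneous pair of spreads (position in meters,
    momentum in kg*m/s) is allowed by the Heisenberg Uncertainty Principle."""
    return delta_x * delta_p >= HBAR / 2.0

def min_momentum_spread(delta_x: float) -> float:
    """The smallest momentum spread compatible with a given position spread."""
    return HBAR / (2.0 * delta_x)

# Confining a particle to ~0.1 nm forces a minimum momentum uncertainty:
print(min_momentum_spread(1e-10))  # ~5.3e-25 kg*m/s
```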

As such, at a very abstract level, when considering the theories of physics in terms of the inter-relations of absolute primary concepts such as causality, signaling, information, measurement, etc, the notions of 'interaction', 'communication' and 'comparison' are strictly functionally equivalent in their usage and application. In the language of the IM, such terms are regarded as being 'isomorphs' -- literally 'one shape' -- insofar as the role that these concepts play in the conceptions of their domains is everywhere the same.

You did not mention Newton -- did he set access controls also?

No. So far as I am aware, his three laws of motion are just statements of relation -- of what signaling paths exist, of what concepts/measurements are connected and networked together -- rather than specifications of what possible connective paths do not or cannot exist. Similar can be said about his law of gravitation -- it defines mathematical connections between otherwise unrelated ideas and corresponding measurements.

In effect, Newton's theories are ones made with a background assumption of 'total access'. His concept of gravitation -- which remains useful to this day for many practical calculations -- is still one of saying that all matter influences all other matter (at least within the same time/space frame, and implicitly within any given single moment; the existence of future masses in various positions does not apparently influence or interact with the masses positioned in places in the past).

Defining theories only in terms of 'what is connected' is natural, especially considering that at the time of Newton, explicitly establishing any connections at all -- using formal equations, actual absolute links -- was something of a novelty. It was not yet known what might be connected to what, or in what way, so everything was assumed to be at least possible, even if pragmatically seeming very unlikely.

Remember that Newton, not having any established bias or reason otherwise, was just as interested in looking for explicit connections in what we would now regard as purely mystical and occult matters. It is only from our 'modern' experience and perspective that we might regard these as 'inaccessible' and 'intractable' fields of study (ie, as areas of 'no access'), and thus have an established habit of ignoring these things as unimportant.

Similarly, each of the other basic theories of Modern Physics, such as those described by Thermodynamics, Electromagnetism, etc, is describable as defining explicit connections between concepts and methods of measurement that would otherwise be 'undefined'. This analysis of the connections and relations of each major foundation of physics, considered as a model of an information process, was developed by B. Roy Frieden in his book 'Physics from Fisher Information'. In this sense, each theory of physics defines a kind of API -- an Application Programming Interface -- of what sorts of causal relations can be invoked, and consistently relied upon, when developing various kinds of technology.

If we extend this metaphor to consider the/this universe -- as an instance of a domain -- as being a kind of 'operating system', then as students learning about the system for the first time, we would first become aware of and interested in what we can do -- what APIs exist -- before we become fully aware of what we cannot do -- what APIs do not (yet?) exist. Moreover, we can recognize that the concept of a domain could be defined as a kind of programming language environment, ultimately considered in terms of statements ('commands'), syntax (how to compose commands), and semantics (what those commands are intended to mean/do, or, more hopefully, what they actually do).

I can see how every measurement must ultimately be regarded as an interaction (or one or more interactions), and I can even see how every interaction might be regarded as 'a measurement' (at least in an abstract sense that all interactions can be modeled as an exchange of information). How is the concept of comparison connected to all of this?

Every measurement is inherently and irreducibly a comparison. There are no measurements that are not also comparisons.

In some basic and ultimate sense, the notion of measurement is, and will always be, a measure. Implied in this, again in some abstract sense, are some ideas and notions of what can be called a numeric concept, which themselves imply things like the notion of a zero, a unit of measure, and an extent.

Consider, as an example, what it means to 'make a measurement of temperature'. Presumably, we would use an instrument that in some way functionally resembles a thermometer. Moreover, this 'thermo-meter' would have a posted scale or 'graticule' which would define both the range and the precision of the 'numeric output' that that instrument would indicate. There would be a minimum temperature, a maximum temperature that could be expected to be measured, and a specific resolution -- the number of decimal places -- to which a reading could be obtained.

For a very good heat-sensor, instrument, etc, there would also be a detailed document defining what sort of accuracy could be expected (as distinct from precision) and how, eventually, the instrument would/could be calibrated to some 'reference standard', so that the obtained measurements in various settings remained semantically meaningful.

In this way, and in this example, the notion of 'reading a number' and having it be 'semantically meaningful', as an actual temperature of some actual thing at some actual moment, is ultimately a comparison, insofar as the 'units of measure' (Fahrenheit, Celsius, Kelvin) are with respect to a known reference standard.
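The calibration step can be sketched in code (the sensor readings here are hypothetical; two-point linear calibration is a standard technique used only for illustration): a raw instrument value becomes a semantically meaningful temperature only by comparison against known reference standards.

```python
def make_calibration(raw_lo: float, ref_lo: float, raw_hi: float, ref_hi: float):
    """Return a conversion function built by comparing two raw readings
    against two known reference-standard values (two-point calibration)."""
    scale = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    def to_reference_units(raw: float) -> float:
        return ref_lo + (raw - raw_lo) * scale
    return to_reference_units

# Hypothetical sensor: reads 102 in an ice bath (0 C), 898 in boiling water (100 C).
to_celsius = make_calibration(102.0, 0.0, 898.0, 100.0)
print(to_celsius(500.0))  # 50.0 -- meaningful only relative to the reference points
```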

Moreover, it can be regarded that the notion of perception -- of a person seeing a number as a measurement -- is a kind of measurement in itself. The notion of seeing can be regarded in the sense that 'taking a photograph' is a measurement of the light received from various positions, directions, etc, in terms of intensity, color, etc. Our biological eyes, brains and nervous system, etc, are keyed to notice contrasts, edges, and shapes, all of which are concepts which are defined in terms of comparisons -- of noticing differences first, and then sameness.

In terms of raw physics, those examples are very high level, involving macroscopic instruments, people, eyes, brains, and somewhat soft semantic concepts involving numbers, etc. The notion of 'interaction' as used in Standard Model particle physics precludes all of that. How would the concept of comparison apply to the notion of 'measurement' in a QM physics context?

Another example that can be considered is the notion of measuring the 'spin' of a given single particle, such as an electron. Even in an interaction situation which is so constrained that only a single bit of information is received in a measurement, the notion of comparison would (must) apply in at least three specific senses: 1; insofar as the notion of a '1' is distinct/different from a '0', and it is only in the relation between these two, as being different, that the notion of a comparison, as a distinction called a 'bit', is applied; 2; insofar as the specific relation of that measured bit of information is treated in comparison to the notions of 'spin up' and 'spin down', ie with respect to some semantic concept defined in relation to some external world, as an embedding, and meaningful within that context; and 3; insofar as there is an inherent difference between the event of 'having made a measurement' and 'having not (yet?) made a measurement', insofar as the 'states' of the system (including the measuring apparatus) are considered/compared in contrasting ways (ie, as under the principle of identity, that which is indistinguishable must be the same).

Any one of these aspects would be sufficient to establish that there is an intrinsic relationship between the concept of measurement, as a kind/example/instance of interaction, and the concept of comparison, also as a kind/example/instance of measurement, interaction, signal process, causal communication of meaningful information, etc.
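The three senses enumerated above can be sketched as a toy model (the encodings and names here are hypothetical illustrations of my own, not part of QM or of the source text):

```python
# Sense 1: a 'bit' is a bit only because its two states are different.
STATES = (0, 1)

# Sense 2: the bit is meaningful only by comparison to an external
# semantic convention (a hypothetical encoding choice).
SEMANTICS = {0: "spin down", 1: "spin up"}

def interpret(result):
    """Sense 3: 'not yet measured' (None) is itself distinguished,
    by comparison, from any actual measurement outcome."""
    if result is None:
        return "no measurement made"
    assert result in STATES  # relies on the 0/1 distinction of sense 1
    return SEMANTICS[result]

print(interpret(None))  # no measurement made
print(interpret(1))     # spin up
```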

This notion of functional concept equivalence, as a role played by a concept within the context of a domain, is simply a recognition that the concept of relationship -- as something within something else, or near something else, or 'like' something else -- is a very basic concept, and that that concept will everywhere occur and operate in functionally equivalent and isomorphic ways.

Insofar as mathematics is the study of pure relationship, considered as abstracted in various ways from all other details, and insofar as physics is the study of causal interactions, which are themselves a kind of (set of) relationship(s) between things and events, then it is not so much of a surprise to recognize that within these two domains, each having a concept in a similar primary role (relation and interaction), those corresponding concepts would operate in similar corresponding ways. Nor would it be unexpected that mathematics would therefore be very useful as a basis for modeling physics.

If we were to add to our aligned consideration a 'natural philosophy' (another domain) to also go with our physics, as a kind of 'meta-physics' -- an examination of the nature and basis of the epistemology and ontology established by the method of scientific inquiry -- then we can similarly recognize that the notion of connecting comparison to measurement is a natural one. Insofar as science uses measurement (an epistemic aspect) to establish a connection between a hypothesis/theory and the real (what is actually true of nature, as ontological aspects), the action of a comparison between theory-predicted results and actual results is absolutely key. Therefore, we may validly regard the notion of comparison, as a kind and example of the general notion of relation (as used in mathematics), as equally fundamental to the notion of measurement, in any completed and inclusive conceptual analysis.

Therefore, if we are wanting to understand the basis and foundation of any one of these domains (mathematics, physics, or ontology/epistemology) in terms of 'what is real?' and 'what can we know?', then we need to understand and examine the intrinsics and characteristics of these 'between' concepts (relation, interaction, comparison, etc) fairly closely and carefully.

Ok, so what are the intrinsics of comparison?

When discussing the concept of comparison in common usage terms (ie, where holding the notions of subjective and objective as implicit), it is apparent that exactly four other concepts are both necessary and sufficient for the formal consideration of the concept of comparison. These concepts are 'sameness', 'difference', 'content', and 'context'.

Any process of comparison involves all four of these terms as intrinsics (or six concepts, if you also include 'subjective' and 'objective', as is often also necessary as part of the basis). Where there is comparison, there must be sameness, difference, content, and context. No comparison can be defined without also implicitly making reference to all of these concepts, always.

Do the concepts of relationship, interaction, measurement, etc, also have similar intrinsics?

Yes. The other 'ultra low level between' concepts can also be considered as having inherent and intrinsic modes, aspects, characteristics, and/or types. For the most part, the notion of 'mode' is used more or less interchangeably with 'type', 'kind', etc, to refer to a specific abstract commonality of aspect in that relationship, interaction, measurement, etc.

For the notion of relationship, the three 'kinds' are 'relations of inclusion', 'relations of proximity', and 'relations of similarity'. These three types/kinds/modes are both necessary and sufficient in considering the domain of relationship.

For the notion of 'measurement', as considered in the domain of physics, the necessary and sufficient aspects are 'mass' (aka 'structure', 'information', or 'pattern', etc), 'space' (of whatever dimensions), 'force', 'time', 'probability', and 'possibility'. Other metrics are generally composites of these.

Even the notion of asking a question -- as a kind of event which can be modeled as a 'measurement interaction process' within the context of a conversation (also a domain unto itself) -- has six necessary and sufficient kinds or types, which can be organized into modes, etc. The basic question types are: 'who?', 'when?', 'where?', 'what?', 'how?', and 'why?'.

There are lots of other examples in the IM, as defined for many other domains.

Are all of these intrinsics in some sort of pattern of relationships of correspondence to one another?

Yes. This is the main topic of the IM, and is addressed under the headings/principles of 'Foundational Triplication' and 'Type Isomorphism'. These principles are further clarified in terms of the 'Modalities' and 'Axioms' of the IM, which are the basis of that work.

However, for now, the one most important consideration for our present conversation regarding the notions of symmetry and continuity, is to consider the intrinsics of comparison. Many of these other relationships can be more easily clarified when starting from there.

Ok, so what is it about the intrinsics of comparison that we need to know?

The main thing is the notion/principle of 'Concept Inseparability'. Basically, the idea is that the intrinsics of comparison occur in a certain pattern. If we consider the meanings of each of the intrinsics of comparison, certain statements of inseparability are apparent. On the basis of that pattern, specific 'must also occur' assertion statements can also be made.

For example, it is clear that the concepts of sameness and difference are inseparable. Whenever one appears, the other is implicit. Where there is sameness, there must also be difference. Where there is difference, there must also be sameness.

This was already observed when we mentioned earlier the relation between 0 and 1 when considering the notion of a 'bit' of information. In order for the bit to be a bit, an implied difference between the 'zero' and the 'one' must be asserted, and further, for that single bit to be meaningful, at least one or the other of these 'states' must be regarded as being in some way 'the same' as some reference concept, such as 'spin up' or 'spin down'. Without there always being both a sameness and a difference, paired in this way, the notion of an actual 'measurement', 'comparison', etc, is simply not possible.

Further, the concepts of content and context are also inseparable. Where there is content, there must also be context. Where there is context there must also be content.

By content and context, are you referring to the kind of figure-ground relationship as seen in art? Ie, as the difference between what is in the painting, and the frame and wall, room, etc, on which it is placed?

Yes. The notions of 'set' as a container, and 'element' as that which is contained, is also an example for context and content, respectively.

As such, the notion of inseparability of the concepts of content and context, and also of the concepts of sameness and difference, is assumed as being fundamental to, and irrevocably inherent in, any and all discussions of the concept of comparison, or of anything that is defined in terms of comparison.

While the concepts of comparison (sameness, difference, content, and context) are inseparable, it is also clear that they are distinct concepts with distinct meanings. For example, the meanings of these terms are not interchangeable in any formal statement without changing or altering the fundamental meaning of that statement.

Therefore, it is also to be understood that similar aspects of distinctness and non-interchangeability will apply similarly to any terms that are defined in terms of these.

Why is the formality important?

It is clear that some of the intrinsic concepts of comparison are applicable to one another. This mutual applicability is part of the setup for defining the concepts of symmetry and continuity, and it will also be needed when we get to identifying the nature of the relationships between these concepts.

Specifically, the concepts of sameness and difference can be applied to the concepts of content and context. Also, we could apply (the concepts of) content and context to (the concepts of) sameness or difference.

In this way, through these mutual applications of the intrinsics, we can define four new concepts. Where there is a sameness of subjective context, the following definitions hold about the objective:

Continuity is a reference to a sameness of content
where there is a sameness of context.

Discontinuity is a reference to a difference of content
where there is a sameness of context.

Symmetry is a reference to a sameness of content
where there is a difference of context.

Asymmetry is a reference to a difference of content
where there is a difference of context.
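These four definitions can be restated compactly as a function of two comparison outcomes (a sketch of my own, directly transcribing the definitions above):

```python
def classify(content_same: bool, context_same: bool) -> str:
    """Map the outcomes of the two comparisons onto the four defined concepts."""
    if context_same:
        return "continuity" if content_same else "discontinuity"
    else:
        return "symmetry" if content_same else "asymmetry"

# Same content observed across different contexts:
print(classify(content_same=True, context_same=False))  # symmetry
```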

That is an unusual way of defining those terms, and somewhat unlike how they are used in a dictionary. Why not just use the dictionary definitions?

The main thing that is of importance is that these terms are defined exclusively and purely on the basis of the intrinsics of comparison, and nothing else at all. Therefore, everywhere the notion of comparison is applicable, these four concepts are also inherently definable. This means that these concepts can always be used analytically, regardless of the domain of comparison being considered.

In other words, they are fully general, and therefore effectively universal, concepts. That makes them very important conceptual tools.

In regards to the dictionary definitions, it must be recognized that there is a distinction between a descriptive definition and a proscriptive definition. The dictionary gives descriptive definitions -- an assessment of the average of how people tend to use terms, in common practice, writing and speaking in all sorts of contexts, scientific and otherwise, every day. This is in contrast to the form needed here, which is proscriptive, as needed so as to provide an analytic basis for examining higher order concepts, proofs, etc.

Why are proscriptive definitions needed?

If we are going to demonstrate a given statement as being somehow a 'fact', then the terminology used to compose that statement needs to be as exact, complete, and well defined as any labeled variable in an equation. This becomes especially important if we want the added facility of symbolically manipulating that equation, combining it with others, etc, so as to identify other, otherwise unknown and previously unspecified relations. For these sorts of operations, only proscriptive definitions are applicable, rather than descriptive ones.

Descriptive definitions, no matter how careful, can say only what something is like, not what that something actually, fully, and completely is. It is the difference between being a story, representative of something else, vs being directly that thing in itself, such as a named variable in a software program.

A definition is only as good as the degree to which it specifies -- separates out of the set of all possible things -- those things which are covered by the definition, and those which are not. As such, we can use the functional definition of a definition to evaluate how good a given definition is, so as to evaluate its suitability for a given purpose (in conversation, communication, etc).

As a partitioning, a descriptive definition is an attempt to draw that boundary along the lines of how people are observed to use the term, and as such, will always be somewhat vague in the details, since such descriptions can only follow observations already made, and such observations are always going to be information limited (finite, and therefore, discrete).

A proscriptive definition, in contrast, is a specification of an exact line position, usually having a fairly simple shape defined in a geometric way. On the scale of the smallest details, the 'resolution' is effectively 'infinite', since the partitioning is defined on the basis of a continuum.

As such, when symbolically manipulating such definitions, as used in statements of relation, etc, any errors in the fine scale position of the partitioning boundary of the set of all possible concepts will be compounded, eventually to the point of making the overall statement useless/meaningless. Descriptive definitions will have this defect, whereas proscriptive definitions will not.

Ok, so I see that there is utility in having proscriptive definitions for certain terms, and I recognize that these terms, symmetry, continuity, etc, would benefit from such treatment. However, how do we know that the proscriptive definitions as given actually have any real or actual correspondence to what those terms mean in common descriptive practice?

For this, it is necessary to show how the two definition types correspond. Usually, if the proscriptive definition is really good, insightful, and addressing of the essence of the 'true' and 'real' meaning of the terms, then only a few examples, metaphors, etc, are needed for the general pattern of applicability to be established and recognized as clear.

The concept of symmetry is about invariance of content with transformations of context. To make this abstract definition a little easier to understand, consider the following metaphors and examples.

Consider a black square drawn on an otherwise blank page. Imagine that the paper is now rotated so that the top becomes the bottom. In this transformation, flipping the square end over end, the context of the square has changed.

Imagine your perspective if you were made small and were standing on the square looking out at the room. In this view, the ceiling becomes the floor and the floor becomes the ceiling. The context of the square has changed, but the square remains visually unchanged. The content remains the same, but the context has changed.

Basically, every geometric example that someone can present as a kind of symmetry is a situation where something changes in the relationship between the figure and the ground, whereas the figure itself remains the same, unchanged, visually indistinguishable, etc.

In the abstract, this notion of symmetry also shows up in the context of physical science, in the form of the notion of lawfulness in general and in terms of the conservation laws in particular.

How is that again? I hear about 'symmetry groups' and other such abstractions when people are talking esoterica about String Theory, yet I gather that you are meaning something more basic.

Yes. If we consider the basis of any scientific theory (a knowledge of this world), we can quickly observe that the notions of 'fundamental laws of the universe' are all derived from, and based upon, the concept of symmetry.

In other words, we can claim and validate the idea that the very notion of lawfulness, in physical science, is in itself an example of the applied concept of symmetry, as it has been formally defined proscriptively here, in terms of the concept of comparison, as inherently always used in the notion of measurement.

For example, consider the scientific experiment where we are measuring the boiling point of water. Assume that this experiment is being performed here in San Diego, California. If we do our work carefully, we will discover that pure water boils at 100 degrees Celsius, given an environment at standard pressure, etc.

Say that we elect to travel, and we visit a university in Paris, France. Assume that we repeat our work using the equipment provided by that university. We will find that the results of this experiment do not depend on whether it is done somewhere in the USA or somewhere in France. In other words, we assume that it is reasonable to expect that we will get the same result for our 'find the boiling point of pure water' experiment regardless of the location in which it was performed.

The content (the result of the experiment, considered as a content, information about a result) is the same even though the experiment is done in different places (a different context). In fact, all instances of the notion of a 'constancy of the laws of physics', the results of measurements of basic physical constants, etc, are all formulated and considered in this way: as a relationship or observation that will remain applicable and 'true', and will hold in the functional and actionable sense of 'can be relied upon', regardless of the position that one might have in the universe, or the time in which one lives, or in which the experiment/test/measurement is being performed. The universality of the 'natural laws of science' is thus an implication of the concept of symmetry -- a kind of invariance under changes of context across all of time and space.

As another, even simpler example, consider putting a liter of water into a closed and sealed bottle on day 1. With the passage of time, on succeeding days, the bottle is still the bottle, and the mass of the material inside -- the water -- will remain the same. In fact, even if we had used other liquids, solids, or gasses, there would in each case persist the same amount of raw material in the bottle -- the same amount of mass (excepting a few exotic cases involving radioactive materials and the like) -- even though the context of time has changed. This symmetry of material content in the context of time is the law of conservation of matter.

As a special case of the above, all of the basic physical laws of conservation of matter and energy are symmetry laws. The very dynamics of the equations that serve as the foundations of science and technology, etc, are all ultimately based on concepts of symmetry. In every single case, whether in applied engineering or in observational science, the notion of symmetry, as combining an unchanging content with a changing context, will also consistently apply.

As such, the notion of symmetry is not just important to the notion of physical science, it is the very essence of what it means to be a science. Without the regularity of consistently repeated and regular observations, the very notion of the scientific method, as an epistemic basis, would be faulted.

Ok, so the notion of symmetry is critically important to the very essence of science. Where does the notion of continuity come in? How is that definition consistent with common usage?

To assert continuity is to state that an infinitesimal change in context will always and necessarily result in an infinitesimal change in content. Continuity is where the content of a change is not (is never) 'drastic' or sudden, given any arbitrarily slight change in the context. It is an assertion that changes in content are only partially sensitive to changes in context.

A basic metaphor and example for continuity is to consider 'graphed functions' in algebraic mathematics. In that context, for example, a "continuous function" is one where the curve is everywhere connected to itself. A discontinuous curve refers to a function that will make a jump from one value to another, with no intermediate steps. Such a curve is disconnected.

One could describe a discontinuous function by saying that there exists an infinitely small region along the X axis (a sameness of context), where the value of the function (the content) changes abruptly. The ratio of change of content over change of context is infinite, indicating a break. Discontinuity is an abrupt shift of content while the context is the same.
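A minimal sketch of this, assuming a simple step function as the discontinuous example (the function names are illustrative, not from the source):

```python
# Illustrative sketch: a step function is discontinuous at x = 0 because an
# arbitrarily small change of context (x) still produces a fixed, non-shrinking
# change of content (f(x)), so the ratio of content change to context change
# grows without bound.

def step(x: float) -> float:
    """A discontinuous 'curve': jumps from 0 to 1 at x = 0."""
    return 0.0 if x < 0 else 1.0

def content_jump(f, x0: float, eps: float) -> float:
    """Change of content across an eps-wide window of context around x0."""
    return abs(f(x0 + eps) - f(x0 - eps))

# As the context window shrinks, the content jump does not shrink:
for eps in (1.0, 0.1, 0.001):
    assert content_jump(step, 0.0, eps) == 1.0

# By contrast, for a continuous function the jump shrinks with the window:
assert content_jump(lambda x: x * x, 1.0, 0.001) < 0.01
```

The assertion pattern is the point: for the step function, the change of content stays fixed however small the change of context, which is exactly the 'abrupt shift of content while the context is the same' described above.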

Insofar as many of the equations defining physical laws in science are defined over the real numbers, which form and are part of the continuum of the real number line, there is an implied sense in which the perfection of these relations is an expression of an implied notion of continuity. Naturally, while it is recognized that the information contained in any specific measurement will be finite, it is also often assumed that, in principle, if the basic law is 'true' and we had perfect absolute input measurement information, the equation(s) would also describe a perfectly definite and defined outcome.

At least until relatively recently, it was generally assumed that the notions of space and time were perfect and pure expressions of continuity. Now we have notions of 'quantum foam' and other such exotica at the absolute scale of the microscopic, but at least the general measure and method of continuity, as used in the abstract, is still correct.

Ok, so you have established that the proscriptive definitions given for symmetry, continuity, etc, are real, and that they actually correspond in a deep and fundamental way with what these terms actually mean, and are used to mean, in common practice. Other than as a semantic clarification, a kind of insight, what other value can those proscriptive definitions actually have?

The deeper significance of these specific constructions and pattern of definitions becomes important when we consider the formation of the tertiary compounds.

Clearly the concepts of symmetry and continuity are not the same, for there are 1) discrete and non-continuous things which are symmetric, and 2) twisted continuous things that are non-symmetric. For this reason, it is valid to consider the formation and inter-applicability of these compound concepts of the intrinsics of comparison, and of their opposites, to one another.

However, when considering the mutual compatibility and applicability of the secondary compounds to form tertiary conceptual compounds, certain conflicts become immediately apparent.

For example, some of these tertiary compound candidates can be immediately rejected, since the duplicate application of the same concept to itself is of no semantic value. In this way four potential candidates are discarded from the list of potentially valid mutual applications.

Another of these conflicts concerns the mutual applicability of the types of comparison to one another in the terms of 'opposite meanings'. Some of the pairs of these concepts are mutually incompatible as they are clearly opposites of one another.

For example, applying the concept of continuity with the concept of discontinuity asserts that for the same context, something is both identically the same as and identically different than itself. Clearly something (in this case content) cannot both be the same, and be different, in the same way at the same time.

In the same way, the formation of a single compound concept out of a 'sameness of content' and a 'difference of content' would also be invalid, since it violates the concept of distinctness (as defined earlier). A compound that assumes that the context is both the same and different also violates distinctness.

For example, it would be problematic to assert that for the same content, its context is both the same and different. Thus, there is the limitation (in the formation of compound concepts of the four types of comparison) that something cannot be both the same and different in the same way at the same time.

Insofar as there are two pairs (of the four concepts which are available) which are opposites to one another, then clearly they will not be mutually compatible or applicable. In this way two additional potential tertiary compound candidates can be dropped from the acceptable list of mutual applications. A few other potential candidates will also be dropped where they violate the requirement of distinctness.

In addition, due to the inseparability and distinctness of the intrinsic concepts of comparison themselves, the formation of compounds using these is subject to certain limits.

In particular, one of these limits is 1) where the concept of content appears, the concept of context must appear also, and 2) where the concept of context appears, the concept of content must also appear. This, however, is not a problem, as the definitions of the terms always ensure that there will be an even matching of a content with a context, and a context with a content, wherever these appear.

However, in a very similar manner, in connection to what was just stated, two more candidates for tertiary compound concepts must also be rejected. This is due to a somewhat more subtle limitation. In the same manner that the concepts of content and context are inseparable, conceptually distinct, and non-interchangeable, the concepts of sameness and difference are also inseparable, distinct, and non-interchangeable. In particular, where the concept of sameness appears, the concept of difference must also appear, and vice versa. The 'number of appearances' of each concept must match the number of appearances of its opposite.

Fortunately, all of these limitations can be accounted for fairly easily by noting that when any one of the concepts in each of the concept axis domains appears, its mate must appear also. All that is needed to determine whether the third order compounds of the intrinsics of comparisons are valid is to check that all of the concepts 'add up' or 'cancel out' under the inseparability relationships. In this way, the four compounds formed by the four types of comparison can be checked rather readily.

In summary form, the limitations just stated can be organized into three basic sets: where [!x] indicates an invalid application to self, [!y] indicates an invalid mutual application of opposites, and where [!z] indicates an invalid assumption of the separability of sameness and difference. As such, the results of the consideration of all candidates for the formation of the third compounds of the intrinsics of comparison may be listed as follows, with 's' and 'd' representing the occurrence pattern and count of the concepts of 'sameness' and 'difference' respectively.

Table of Tertiary Conjunctions:

[!x] continuity and continuity (ss & ss)
[!z] continuity and symmetry (ss & sd)
[!y] continuity and discontinuity (ss & ds)
continuity and asymmetry (ss & dd)
[!x] symmetry and symmetry (sd & sd)
symmetry and discontinuity (sd & ds)
[!y] symmetry and asymmetry (sd & dd)
[!x] discontinuity and discontinuity (ds & ds)
[!z] discontinuity and asymmetry (ds & dd)
[!x] asymmetry and asymmetry (dd & dd)

From the above list, it is clear that there are only two actual combinations which are mutually compatible. These are 1) continuity with asymmetry and 2) symmetry with discontinuity. All other combinations are, for various reasons, incompatible with one another.
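The table above can be reproduced mechanically. The following Python sketch (the 's'/'d' encoding mirrors the table; the function names and rule checks are illustrative, not from the source) applies the three rejection rules and confirms that only the two stated compounds survive:

```python
# Illustrative sketch: each secondary concept as its (content, context)
# pattern -- 's' for sameness, 'd' for difference -- with the three
# rejection rules [!x], [!y], [!z] applied to every unordered pair.
from itertools import combinations_with_replacement

PATTERNS = {
    "continuity":    "ss",   # same content, same context
    "discontinuity": "ds",   # different content, same context
    "symmetry":      "sd",   # same content, different context
    "asymmetry":     "dd",   # different content, different context
}

def classify(a: str, b: str) -> str:
    pa, pb = PATTERNS[a], PATTERNS[b]
    if a == b:
        return "!x"          # application of a concept to itself
    if pa[1] == pb[1] and pa[0] != pb[0]:
        return "!y"          # mutual application of opposites
    chars = pa + pb
    if chars.count("s") != chars.count("d"):
        return "!z"          # sameness/difference appearances do not balance
    return "valid"

valid = [pair for pair in combinations_with_replacement(PATTERNS, 2)
         if classify(*pair) == "valid"]
# valid == [('continuity', 'asymmetry'), ('discontinuity', 'symmetry')]
```

Running the sketch yields exactly the two compatible combinations: continuity with asymmetry, and symmetry with discontinuity.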

Wait -- what -- Incompatible? In what context?

In all contexts -- ie, with respect to the notion of context itself, as established by and within any measurement.

In other words, because the terms used to construct this table were based only and solely on the concept of comparison, and insofar as any and all comparisons are the very basis -- the sole foundation -- on which all domains are founded, then this result is also general to all domains. Even if just given the notion of science, in terms of the scientific method, as based on measurement, grounded in empirical observation, etc, then this result is applicable to any and all actual and real domains, absolutely, totally, and completely, inclusive and including that of the/that/our entire universe, as a special case.

It is actually surprisingly difficult to overstate the appallingly general significance of this result. This remains so even though the proof of it is rather simple. It is therefore given a name: 'The Incommensuration Theorem', or the ICT for short.

Concept inseparability, and the associations as defined, require that an eventity/comparison is either 'continuously asymmetric' OR it is 'discontinuously symmetric'. No matter how much we might want it, an eventity/comparison/measurement (a signal or a 'thing') simply cannot be, in an absolute and final sense, either 'continuously symmetric' or 'discontinuously asymmetric'.

The concepts of symmetry and continuity cannot both be simultaneously and fundamentally applied to any eventity/comparison.

The concepts of asymmetry and discontinuity cannot both be simultaneously and fundamentally applied to any eventity/comparison.

Any absolute application of the concept of comparison must ultimately be continuous and asymmetric, or symmetric and discontinuous.

Those seem like fairly sweeping generalizations. How confident can we be in these ideas?

Very confident, because there really is not that much involved in establishing Incommensuration. Being rather basic, there are not that many points involved, and each of them seems fairly self-evident.

The basic skeleton is as follows:

- 1; the idea that every measurement is a comparison.

- 2; the idea that comparison, as a concept, has the six named necessary and sufficient intrinsics.

- 3; that the definitions of symmetry, continuity, etc, are well mapped to the actual deep usages of those terms.

- 4; that the notions of content/context, sameness/difference, etc, are inherently inseparable.

- 5; that the essences of QM and GR are fundamentally expressed in terms of symmetry, continuity, etc.

If someone were to argue the point, they would need to contradict one of these ideas.

Ok, so we have defined the ICT: Symmetry and Continuity do not go together -- no mixing of water and oil, peanut butter and chocolate! So what do you do with it? How can you apply it?

There is actually a fairly large class of concepts that are affected by this. Only a few of the most important ones will be listed here.

As mentioned, the concepts of continuity and of discontinuity are not absent from formal consideration within the objective physical sciences. QM, for example, considers things in terms of explicit discontinuities -- specific particles in explicit definite quantized states. Moreover, there are, in some interpretations, notions of measurement which are modeled in terms of an inherently non-continuous process sometimes called 'the quantum jump'.

On a somewhat deeper level, there is a parallel between the ICT and the Bell Theorem of physics. In essence, the Bell Theorem states that any physical theory of reality cannot both assert that reality is "lawful" and that reality is "local". According to the Bell Theorem, reality can be either completely lawful and somewhat non-local or it can be completely local, and somewhat non-lawful. The concepts of lawfulness and of locality can be regarded as special cases of the more general concepts of symmetry and continuity (respectively).

The Bell Theorem is a special case of the ICT?

Functionally, Yes. They have completely different methods of derivation, development, etc, yet the same overall result is obtained. They do also share a common aspect in the notion of being inherently based on direct experimental results, in one case as modeled in terms of the statistics of correlated contextual measurements of entangled particles, and in the other in terms of a relation between comparison and measurement, as an instance of causal signaling theory.

In effect, the connection between the ICT and the Bell theorem depends on the specific use of the terms 'lawful' and 'local'. In the same sense that the proscriptive definitions of symmetry, continuity, etc, were shown to be consistent with and wholly inclusive of, the same terms as descriptively defined, the notions of lawful and local are also fully subsumed. To the degree that the terms 'lawful' and 'local' have more formal aspects, and have specific technical meanings, they are not considered as descriptive so much as proscriptive in themselves, thus making the connection between the ICT and the Bell Theorem even stronger and more irreducible.

As such, to establish the ICT as a true generalization of the essence of the Bell Theorem, it is only necessary to show 1; that the notion of 'lawful' cannot not be regarded as an instance of the more general concept of symmetry, and 2; that the notion of 'local' cannot not be considered as an instance and example of the usage and inherent meaning of the concept of continuity.

It is a basic assumption of science that, given similar environmental conditions, the results of a physical procedure performed in one place and time will be the same as the results of the same procedure implemented in other places and times. This constancy of the results of empirical experiments (a sameness of content) performed in different times and places (a difference of context) is an expression of the concept of symmetry. Science assumes that the essential nature of the dynamics of physical process is everywhere the same in the universe.

Science regards the 'laws of physical process' as invariant under transformations of changing times and position. In that the methodology of science itself depends on the notion of repeatable observations, the scientific method itself inherently involves an implicit assumption of symmetry. As such, the mathematical expressions of the laws of nature (as discovered by the methodology of science) are all based on notions of symmetry.

For example, as mentioned earlier, the conservation of matter and energy is a symmetry law. The foundations of the theories of relativity are based on ideas of a sameness of various types of relationship under circumstances of changing position and momentum. The ultimate assertion of symmetry of science is to assert that the same physical laws (as content) apply everywhere (an invariance) in the universe (a context). As such, for science to regard the universe as lawful is to assert a fundamental notion of symmetry (2).

Insofar as science cannot remain science and give up the notion of symmetry, it was natural to expect that the notion of locality was instead sacrificed.

The notion of locality as used within science is essentially an assertion that no physical influence, interaction, or signal can travel faster than the speed of light. To assert locality is to assert that all physical processes involve dynamics which do not have instantaneous transits across space (a jump between arbitrarily separated points in zero time). If an interaction spans any distance, then it must also span some nonzero duration.

The metrics of time and space are contextual metrics, and interaction and substance (physical matter) are regarded as being a 'content' within that context (3). This notion that there is no influence, interaction, or signal (all of which are content) which can instantaneously cross an arbitrary distance of space (a context) is equivalent to the notion of continuity. For any small change (a process, interaction, or signal, as a content) there must be a corresponding change in the context of that interaction (in time and space). The notion of locality, in asserting that there can be no abrupt instantaneous changes, is a special case of the concept of continuity (4).
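As a rough sketch of how locality reads as a continuity constraint (the names are illustrative; the value of C is the standard speed of light in vacuum):

```python
# Illustrative sketch: locality asserts that any change of spatial context is
# accompanied by a nonzero change of temporal context -- content never 'jumps'
# across space in zero time.

C = 299_792_458.0  # speed of light in vacuum, meters per second

def min_transit_time(distance_m: float) -> float:
    """Locality: no influence crosses distance_m in less than distance_m / C."""
    return distance_m / C

# Any nonzero spatial separation implies a nonzero minimum duration:
assert min_transit_time(1.0) > 0.0
```

The constraint is the continuity claim in miniature: a change of spatial context can never be paired with a strictly zero change of temporal context.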

However, the ICT regards symmetry and continuity as mutually inconsistent: they cannot be simultaneously applied to the same immanent modal identity. A theory of reality is a theory of being, and thus must be formulated from an immanent, rather than omniscient, basis. Therefore, the notion that reality is absolutely lawful is inconsistent with the notion that reality is absolutely local. In effect, a conception of reality must either allow 1) multiple mutually inconsistent sets of different physical laws which are applicable in different places and times (failure of lawfulness and symmetry), or 2) inherently non-local interactions and changes which are fundamentally non-deterministic (failure of locality and continuity). No theory of reality can assert both total generality and total determinism.

If the ICT is found to be applicable at or near the basis of all domains, and insofar as there is a strong connection between the domain of mathematics and the domain of physics, does that also mean that the ICT applies somehow within mathematics also? How does the ICT show up in mathematics?

Actually, there is a connection between the ICT and mathematics, in the form of a generalization of the Godel Theorem. Specifically, insofar as the notions of 'Consistency' and 'Completeness' are also functionally subsumed within the notions of symmetry and continuity respectively, the ICT effectively acts as a generalization of Godel.

In considering each and every statement P defined within a formal system, the notion of consistency can be defined as "that there is no statement P which is both true and false". Similarly, the notion of completeness can be defined as "that there is no statement P which is neither true nor false".

To see that the notion of consistency is a special case of the more general concept of symmetry, consider that to assert that a given statement P is consistent is to assert that the content, the truth or falsity of P, does not change with changes of context, the method by which P is considered (a method of derivation using other statements of the same formal system). To assert that a formal system is 'internally consistent' is to assert that, for all statements within that formal system, there is no statement which is true when derived by one method and false when derived using another method. When considering the truth and falsity of a statement as its content, and the method of derivation as context, the notion of consistency is strictly equivalent to the concept of symmetry.

To see that the notion of completeness is a special case of the more general concept of continuity, consider that to assert that a given formal system is complete is to assert that there are no discontinuities of content. An inherent discontinuity is an implied (sharp) boundary between statements which have content, a truth or falsity value, and those which do not have a truth or falsity value because they cannot be proven by any method (using any sequence of other statements in the formal system). To assert completeness of a formal system is to assert that there is a continuity of content; that all expressions within the language of the formal system (the context) have a truth or falsity value (the content).
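As a toy illustration of both mappings (the data layout and names are hypothetical, and no real proof machinery is implied), a formal system can be modeled as a map from each statement to the truth values obtained by different derivation methods:

```python
# Illustrative toy: consistency = the truth value (content) of a statement
# does not vary with the derivation method (context) -- a symmetry;
# completeness = every statement has some truth value -- no gaps of content
# over the space of statements, a continuity.

def is_consistent(system: dict) -> bool:
    """No statement is both true and false across derivation methods."""
    return all(len(set(methods.values())) <= 1 for methods in system.values())

def is_complete(system: dict) -> bool:
    """No statement is neither true nor false (every statement gets a value)."""
    return all(len(methods) > 0 for methods in system.values())

toy = {
    "P1": {"method_a": True, "method_b": True},  # same value either way
    "P2": {"method_a": False},                   # derivable by one method only
    "P3": {},                                    # underivable: a gap in content
}
# toy is consistent (no clash of values) but incomplete (P3 has no value)
```

In this toy, a clash of truth values across methods is a failure of symmetry (inconsistency), and a statement with no value at all is a discontinuity of content (incompleteness).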

Insofar as the ICT asserts that the concepts of symmetry and continuity cannot be simultaneously applied when making comparisons of statements within the language of a formal system, the concepts of absolute consistency and absolute completeness are also mutually incompatible.

As such, similarly to before, the ICT and Godel arrive at the same essential idea about the nature of mathematics via two completely different methods. The Godel Proof basically requires the construction of a mathematical language with a minimum level of sophistication sufficient to describe itself, and then uses that recursive characteristic of self-description to show that there are discontinuities in which statements are, or are not, 'reachable' as proofs, given a contextual requirement of proof consistency. As such, the Godel Proof is fairly involved, using the notion of recursion (ie, a ring implementing an IM Axiom II metaphor process transform), or more generally, a kind of mutual embedding relationship between two domains, in a fairly fundamental way.

In contrast, the ICT depends on a connection between the notion of relationship and the notion of comparison, as being strict functional isomorphs, within their respective domains, so as to establish that anything that is true about comparison will also be true about the nature of relationship, and therefore, as inherent in any (the) study of pure relationship, as the body of knowledge and domain called 'mathematics'.

It seems that the notion of 'domain' is a very general one. You used it to describe some fairly subjective phenomena earlier; things like language, music, and imagination. As such, is it fair to ask if you have insight into the question: how are minds formed out of brains?

It is observed that consciousness -- the unity of self -- is ultimately and necessarily/irretrievably defined in terms of continuity dynamics -- that is what the notion/concept of coherency resolves to. On the other hand, physical material reality (neurons) is defined, again ultimately, in terms of symmetry. Natural law, the equivalence of all instances of each type of subatomic particle, conservation dynamics, etc, are all symmetry manifestations.

Therefore, the hard problem of consciousness is to be understood as nothing more or less than to exactly ask: How can a continuity be formed out of a basis of symmetry? How is it that my first person experience of the world is one of continuity -- ie, as a single unitary self, in all three aspects of time, space, and possibility (ie, as memory, a centric point of view/locus, and the sense that there is only one actual world present), when we also know that the material world -- substance, force, and probability, is unconscious, defined by multiplicity and governed by fixed absolute laws, and that we are composed of multiple components of only those types?

The ICT demonstrates that it is impossible to structure a system such that there is a simultaneous absolute application of both symmetry and continuity: this is what makes the 'hard problem' hard -- ie it is impossible to start with due to the nature of the construction. It is 'hard' the same way the Bell Theorem and the Godel theorems are hard -- they define limits of what can and cannot be done in physics and mathematics collectively. They, along with the ICT, represent theorems of an inability to form a 'totalization' of a domain with mutually contradictory characteristics.

One cannot compose a continuity out of only symmetric components any more than one can abstract a symmetry when given only a continuity. The realm of the continuum and the realm of the countably infinite are distinct -- what points of overlap do exist are without extension. While both symmetry and continuity are absolute endpoint extensions of the concept of comparison (ie, of measurement, of information movements and transactions), neither can be formed directly out of the other.

The difficulties and problems outlined when considering the 'hard problem' are all reflections and manifestations of the underlying ICT principle. Both the notion of the objective and the notion of the subjective are composed and formed out of the real, and neither can be formed from the basis of the other alone.

This is why the concepts of both realism and idealism can be (cannot not be) constructed as an exact parallel of reaction/unconsciousness and prejudice. Philosophers should take note and warning (5).

You mentioned that the ICT prevents 'totalization'. Does that mean that the ICT basically prohibits a Theory of Everything? Ie, as something that would have both perfected symmetry and perfected continuity?

Yes, it does.

The basic manner in which this shows up is in the already observed deep incommensuration between the theories of QM and GR. Despite many attempts by very smart people over nearly a century now, these two great and very well tested models of reality seemingly cannot be reconciled. Somehow, in the very depths of their inherent natures, QM and GR are mutually contradictory, making effectively separate claims about the inherent nature of what is, and is real, in measurement and causal signaling.

At the very essence of GR is that the notion of continuity applies at the scale of the macroscopic. 'Continuity' in this case is equivalent to the idea that there is one single, everywhere simply connected space-time fabric. Take away the notion of locality -- mentioned earlier as being subsumed in the notion of continuity, ie, the network of available paths by which causal signals can propagate, as bounded by the constant C -- and there is not really much left of GR. Moreover, GR allows for the notion of explicit asymmetry, in time and space, insofar as, for example, the event horizon of a black hole is considered to be a true 'one way passage' in regards to the flow of causal signaling.

Alternately, QM asserts that at the scale of the microscopic, quantization, as a form of discontinuity, and therefore of symmetry, is the rule. In QM, the notion of time, interaction, etc, in the deterministic evolution of the wave function, is perfectly symmetric and reversible.
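The reversibility claimed for wave function evolution is a consequence of unitarity: applying the conjugate transpose of the evolution operator exactly undoes it. A minimal sketch of this standard fact, in a toy two-state system (the matrix chosen here, a Hadamard-like unitary, is merely illustrative):

```python
import math

def matvec(m, v):
    """Multiply a 2x2 complex matrix by a 2-vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def dagger(m):
    """Conjugate transpose of a 2x2 complex matrix."""
    return [[m[0][0].conjugate(), m[1][0].conjugate()],
            [m[0][1].conjugate(), m[1][1].conjugate()]]

s = 1 / math.sqrt(2)
U = [[complex(s), complex(s)],      # a simple unitary (Hadamard gate)
     [complex(s), complex(-s)]]

psi = [complex(1), complex(0)]      # initial state
evolved = matvec(U, psi)            # one step of forward evolution
recovered = matvec(dagger(U), evolved)  # U-dagger exactly reverses U

assert all(abs(a - b) < 1e-12 for a, b in zip(recovered, psi))
```

No information is lost at any step: the evolved state determines the initial one exactly, which is the time-symmetry the essay attributes to QM (and which stands in contrast to the one-way causal passages GR permits).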

Therefore, insofar as QM is inherently about symmetry and discontinuity, and insofar as GR is about continuity and asymmetry, each of these theories occupies the opposite configuration of what is allowable within the context of the ICT. QM and GR make inherently and fundamentally opposing assumptions about the nature of the relation of symmetry and continuity, and moreover, also make opposing assumptions about the nature of any possible attempt to reconcile asymmetry and discontinuity. Thus, while each of these basic theories of real measurements, experiments, comparisons, etc, is consistent with the ICT in itself, they are each, by that same theorem, irreconcilably inconsistent with each other.

The only thing that the ICT does in regards to the basic theories of physics is to show, as clearly and as simply as possible, why the attempt to construct a single Grand Unified Theory of reality -- one framework which is both complete (in the sense of allowing a continuity of consistent coverage over the entire universe) and correct (ie, in the sense of being a single consistent set of descriptive laws, everywhere non-contradictory, which is itself a property of symmetry) -- is basically impossible, and should be abandoned.

Whatever the future of Science, and whatever theories and models are developed then, it will always remain the case that there will be both formats of theory: one that works better in the objective sense, when involving assumptions of discontinuity (individual particles, energy states, etc) and symmetry, and one that works better in the subjective sense, when involving the notions of the personally experienced asymmetry in the arrow of time and the continuity in the coherency phenomena that we call consciousness and/or 'mind', intentionality, agency, etc.

[1] The term 'Analytic Metaphysics' rather than 'Immanent Metaphysics' was used here insofar as the emphasis is on the methods used, rather than on the basis from which these ideas come. In effect, the ICT can be considered and presented independently of the IM, even though ultimately it had its origin in the IM.

[2] Any aspects or processes of the physical world which are real and yet which are inherently not repeatable (or observable) would be non-lawful and non-symmetric in time and space, and therefore ultimately outside of the scope of study available to the method of science. To assert that something is real is different from asserting that it exists or that it is objective.

[3] Technically, it is incorrect to identify the concept of interaction directly with content. By the root tautology, the notion of interaction is equivalent to comparison, and therefore both content and context are to be regarded as intrinsic aspects of interaction. Thus, neither content nor context can be identified as interaction and comparison itself. The concept of change (in the sense of physical process), however, can be directly regarded as a content, thus restoring the logic of this essay.

[4] To assert absolute locality is to assume a total continuity of interaction at all scales, down to and including the (microscopic) scale of absolute zero in both distance and duration. It is to require that theories of physical reality are defined in terms of deterministic law rather than in terms of causal law. However, the scientific method can only make observations about causality. It inherently cannot (even in principle) make any direct observation or assertion about continuity or locality. The scientific method cannot make any direct observation about symmetry either, even though it must implicitly assume an inherent symmetry in reality in its practice. No single experiment could ever possibly validate the absolute and universal truth of either continuity (that all interaction and change is local) or symmetry (that reality is lawful).