Tokenization of Reality

“Everything that can be digitized, will be” — a sentence that is justifiably becoming a mantra for many businesses. With all the talk about the digitization of everything comes a new way of handling digitized transactions. Managing them with distributed ledger technology enables a fundamental change in how value is digitally issued, transferred and stored. We talk about tokenization — a process of creating a singular identifier, a token, on a distributed ledger that represents anything of value. In business terms this can be any valuable resource, from financial assets to products and services. But on the premise that everything has a certain value, can everything in our reality be tokenized? Can reality itself be tokenized?
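
To make the definition concrete, here is a minimal sketch, in Python, of what creating such a singular identifier could look like. The names (Token, Ledger, mint) are hypothetical and not tied to any particular blockchain platform; a real distributed ledger would add consensus, replication and cryptographic signatures on top of this.

```python
# Illustrative sketch only: a token as a unique identifier derived from
# what it represents, recorded on an append-only ledger.
import hashlib
import json
import time
from dataclasses import dataclass, field


@dataclass
class Token:
    asset_description: str          # what the token represents (asset, service, identity...)
    owner: str                      # current holder
    created_at: float = field(default_factory=time.time)

    @property
    def token_id(self) -> str:
        # The "singular identifier": a hash derived from the token's content.
        payload = json.dumps(
            {"asset": self.asset_description, "owner": self.owner, "ts": self.created_at},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()


class Ledger:
    """An append-only record of tokens, standing in for a distributed ledger."""

    def __init__(self) -> None:
        self.entries: dict[str, Token] = {}

    def mint(self, asset_description: str, owner: str) -> str:
        token = Token(asset_description, owner)
        self.entries[token.token_id] = token
        return token.token_id


ledger = Ledger()
token_id = ledger.mint("limited-edition artwork #42", owner="alice")
print(token_id)  # the unique identifier representing the asset
```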

Tokenization has enormous potential to disrupt and revolutionize systems, leading to more efficiency and democracy. Yes, any system — even the systems by which we lead our lives: systems of perception, thinking, feeling. It is a new framework for differentiating properties and representing uniqueness.

What is Value

Meaning and value are fundamental to existence and coexistence. Reality expects us to uphold an equilibrium of thoughts and actions based on their meanings and workings. Constructing meaning and value from perspectives and experience rooted in knowledge becomes even more relevant in times of great complexity and variance. It also drives the change and migration of value, its increase and decrease, as well as the controlling mechanisms that analyze the variables behind these fluctuations. In the contemporary world, value is constructed — by circumstances, setting, emerging demands or disappearing directives. But can value be innate?

Technology in the post-modern era substantially contributes to the pace at which knowledge changes. Postmodern philosophy suggests that epistemology and ideology create value, though it is rather knowledge that is constituted by them. It is both the psyche and the context from which we derive meaning (knowledge) and thus instill value. This matters because, in the absence of sensible reasons and transparent arguments for whether or not something is valuable, there is a conflict between “practical reason” and “pure reason”, which appear to be mutually exclusive. It can translate into doubt about whether something holds practical value — as a matter of usefulness in context — or pure value, which should be immune to such legitimations.

If we build this argument on a premise of singularity — meaning that both the contextual and the non-contextual value are relevant and accepted — then everything can be tokenized, because regardless of contextuality, everything has inherent value. If something exists, it is directly implied that it has value; otherwise it would diminish and vanish from existence.

Binary Reality Perceptions

The aforementioned problem of contextuality projects into perceptions of reality. If something exists, there is also an “other” that exists, which describes the circular-linear relationship between an observer and an observation. This proves that everything in nature which we label as “reality” is in a constant flow of circulation along the linear spectrum of occurrences. This circulatory movement is responsible for tokenization, which is inherently required for any observation we make. When we make an observation, the next step is to assign it a value and categorize it. This is how our minds perform tokenization: by constantly applying the decimal norm that “confirms” that something is real from a position of multiples, and from at least one other position, that of an observer. In perceptions applying decimal cognitive templates, it is not possible to observe only half of reality — it always comes as a dual concept.

The flow in the decimal and binary structure of reality fosters a constant negotiation whose basis is the arithmetic number two. This is the basis for any axiom or topology of mathematical space, explaining arithmetic, geometric and algebraic operations. A diagrammatic depiction of this would be a circle spinning two quantities, one a variable and the other a constant. Their projections into the experience of reality would be change, movement, redundancy, ambiguity or any other dualistic construct involving sequential programs of thinking and perceiving — constructs created by interpretations of observations, by sorting out experiences and by assigning a complementary identity to them.

Identity Tokenization

Individual interpretations of observations lead to the creation of our own realities. Technically, when people tokenize their reality — by assigning value to their experiences — they create a balanced opposition to other possible interpretations and assignations. This variety is what makes reality interesting, as there is an infinite number of possible ways to tokenize our experiences. Any tokenization translates into an eventuality and is always half-true, regardless of our perceptions. Practically, in everyday life we can observe that there is always a multiplicity of interpretations of each event, for which there are always at least two tokenized versions of truth (half-truths). This explains why, for those operating on a binary perception, there is always an alternative to every outcome: getting along with someone half the time, because the opposite is possible; understanding something half the time, because the alternative is not understanding it.

Tokenization of the human mind is a dynamic process, repeatedly trying to establish snippets of realities, all the probable interpretations within an experience and everything that appears to be “technically correct” against the accepted version of the experience/outcome — the token. As everything occurs in pairs, as a combination of circularity and linearity — producing circular-linear reality — the infinite circumference of sound (circle) and the infinite diameter of light (line) are the primary synthesis conserving the “zero” and the “one” of the binary system. It is possible to free assigned value from having to be generated exclusively through interpretative context and binary exchange. Identities as concretizations of the value of selves, and object conceptualizations as representations of the value of assets, do not need to be subjected to interpretation, of which the assessment of value is the result. The valuation — or validation — process can instead rely on algorithmic proof to validate a subject, or a subject can claim the ability to validate others by demonstrating such validation (proof of work/proof of stake).
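
As an illustration of validation by algorithmic proof rather than by interpretation, here is a deliberately simplified proof-of-work sketch in Python. It only shows the hash-puzzle idea (find a nonce, then let anyone verify it cheaply); real proof-of-work and proof-of-stake protocols are far more involved.

```python
# Simplified proof-of-work: "algorithmic proof" here means finding a nonce
# such that the hash of (claim + nonce) has a required number of leading
# zero hex digits. Verification is cheap; the search is the costly part.
import hashlib


def prove(claim: str, difficulty: int = 4) -> int:
    """Search for a nonce whose hash has `difficulty` leading zero hex digits."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{claim}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


def verify(claim: str, nonce: int, difficulty: int = 4) -> bool:
    """Anyone can check the proof without redoing the search."""
    digest = hashlib.sha256(f"{claim}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)


nonce = prove("alice values this experience")
print(verify("alice values this experience", nonce))  # True
```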

Tokenizing People

In 2019, the project Nokenchain came up with a “simplified system for tokenizing people and organizing them into a market”. Its authors claim to allow everyone — individuals, businesses, athletes, artists, entrepreneurs, videographers, etc. — to become a token. The concept is based on giving everyone the tools to become a tokenized entity, with or without notoriety, and on creating liquidity. A smart contract references the value offer and the value exchange, allowing everyone to “propose a new talent or a new entity to tokenize”. The proposal needs to be accepted by the community, and the person (token) who discovered the new talent earns a bonus linked to the newly created token.
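
The flow described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual Nokenchain contract: the proposal, the community vote and the discoverer’s bonus are all stand-ins for whatever concrete mechanisms the project uses.

```python
# Hypothetical sketch of the propose / accept / reward flow described above.
from dataclasses import dataclass, field


@dataclass
class Proposal:
    talent: str
    discoverer: str
    votes_for: int = 0
    votes_against: int = 0


@dataclass
class TalentMarket:
    tokens: dict[str, str] = field(default_factory=dict)      # talent -> token id
    balances: dict[str, float] = field(default_factory=dict)  # holder -> bonus tokens
    discovery_bonus: float = 10.0

    def decide(self, proposal: Proposal):
        # Community acceptance: a simple majority stands in for whatever
        # governance rule the real system would use.
        if proposal.votes_for <= proposal.votes_against:
            return None
        token_id = f"TOKEN-{len(self.tokens) + 1}"
        self.tokens[proposal.talent] = token_id
        # The discoverer's bonus is linked to the newly created token.
        self.balances[proposal.discoverer] = (
            self.balances.get(proposal.discoverer, 0.0) + self.discovery_bonus
        )
        return token_id


market = TalentMarket()
p = Proposal(talent="street violinist", discoverer="bob", votes_for=12, votes_against=3)
print(market.decide(p), market.balances)  # TOKEN-1 {'bob': 10.0}
```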

All judgments of feasibility aside, let’s have a look at the very essence of this proposal: it is concerned with decentralizing talents and ideas by creating a worldwide network of discoveries of individuals represented by tokens. Through this process of discovery, the discoverer and the discovered become tokenized and also gain tokens which they can stake. Tokenized entities have an opportunity to increase their value and to tokenize every act — decision-making, promoting, interacting, communicating, recommending, signing, pretty much everything. That is, everything they are or do can be tokenized as well.

So — everyone is their own token, carrying value.

The backbone of current infrastructures does not support such valuations, because people have to evaluate themselves via intermediaries that represent a framework of value. In the matrix of existing contexts and criteria of valuation, there is no transparency about how this process works or whether the criteria are truly relevant, which can result in false valuations. These new, obscure instruments of valuation shed light on the inefficiency of long circuits when it comes to “issuing and confirming value”.

Tokenized Realities

Understanding the web that structures our reality as uniformly formatted deliberately stimulates the individual and collective meaning of the content being structured. Everything exists in connection to everything else — in a network — which signals an inherent interconnectedness, acting as an adjacent territory to chain creation.

When we create a chain, we attempt to compile code that evaluates mathematical expressions of value; as the algorithm runs, it begins to create a sequence of tokens. These tokens are basic units of meaning (value) interpreted in the light of their immediate context. Contextuality in this case does not involve complex binary interpretations; rather, it is the order and arrangement of units that matters for interpreting what a particular token is supposed to mean — by itself and within the collection of token units.
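
This mirrors tokenization in the compiler sense. A minimal, purely illustrative sketch: an expression of “value” is split into a sequence of tokens, and what a token means depends only on its kind and its position among its neighbours.

```python
# Minimal lexer: turn an arithmetic expression into a sequence of tokens,
# the basic units of meaning interpreted by their order and arrangement.
import re

TOKEN_PATTERN = re.compile(r"\s*(?:(\d+)|(.))")


def tokenize(expression: str) -> list:
    """Return (kind, text) pairs for numbers and operators."""
    tokens = []
    for number, operator in TOKEN_PATTERN.findall(expression):
        if number:
            tokens.append(("NUMBER", number))
        elif operator.strip():
            tokens.append(("OPERATOR", operator))
    return tokens


print(tokenize("3 + 4 * 2"))
# [('NUMBER', '3'), ('OPERATOR', '+'), ('NUMBER', '4'), ('OPERATOR', '*'), ('NUMBER', '2')]
# The same token means something different depending on where it sits in the
# sequence: context here is simply order and arrangement.
```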

All of the above illustrates salient qualities of the purpose of tokens, tokenization and the hermeneutic principle. But the proximities of contexts and interpretations are not entirely resolved by this; they must be examined in further relational analysis. This can be done in connection to the metaverse, for there is likely an opportunity to try on, measure and revise otherwise invisible and undetectable characteristics of the tokens that normally don’t withstand experimental scrutiny.