DemocraNet: Scale-free CPUs, HFGW Networks, Associational DBMSs, Iconic Languages, AlwaysOns and Laypeople

[Image: fractal]

I came across this article http://tinyurl.com/58envr on Infosthetics.com regarding a medical iconic language. This led me to think about iconic languages in general.

What would happen if we developed non-text languages where icons were not just “terms” but were used as “definitions” as well?

Consider having:

Iconic vocabularies.
Iconic grammars.
Iconic syntax.
Iconic linguistics.
Iconic dictionaries.
Iconic thesauri.
Iconic wikis.
Iconic semiotics.
Iconic animation.
Iconic context.
Iconic databases.
Iconic functions.
Iconic organization.
Iconic networks.
Iconic events.
Iconic fonts.
Iconic classics.
Iconic metrics.
Iconic audio.
Iconic video.
Iconic mechanio (pressure).
Iconic olfio (smell).
Iconic gustio (taste).
Iconic thermio (heat).
Iconic nocio (pain).
Iconic equilibrio (balance and acceleration).
Iconic proprio (body position).

Such languages already exist; Chinese Hanyu is one example. But what if a new global iconic language were developed?

In my reading I am discovering that even words are treated by our minds iconically, as symbolic clusters. If the first and last letters of a word are correct, the remaining letters can be in any order and we still read it. In fact, we do the same thing with words themselves: we create word clusters and shuffle them around to create sentences. I do not think language follows the formula Chomsky came up with, random sets of words arranged syntactically. Words are symbols, and sentence fragments are symbols that we connect together. We do the same thing with lists, which are basically paragraph fragments. All these fragments are arranged according to the rules of a scale-free network, not a hard-wired linguistic structure. I think that would shake Steven Pinker up.
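As a quick illustration of that scrambled-word effect, here is a minimal Python sketch; the function name and the sample sentence are mine, chosen only for demonstration:

```python
import random

def scramble_interior(word: str) -> str:
    """Shuffle a word's interior letters, keeping the first and last fixed."""
    if len(word) <= 3:
        return word  # nothing to shuffle
    interior = list(word[1:-1])
    random.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

sentence = "reading scrambled words remains surprisingly comfortable"
print(" ".join(scramble_interior(w) for w in sentence.split()))
# e.g. "rdaenig srcbmaled wrods rianems snsgrpiiurly cfatmoborle"
```

Most readers can still parse the output at speed, which is hard to square with strictly letter-by-letter decoding and easier to square with treating whole words as symbolic clusters.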

What is necessary to point out is that literacy and numeracy do not make us any more or less intelligent. Each is a symbolic system like any other that trains us to think in certain ways to process language and quantities. Whatever we do, we are simply learning another, perhaps more efficient, way of processing symbols representative of reality. Plato thought that literacy was dumbing down his students because they did not memorize and meditate on what they learned, choosing to write it down and put it on the shelf instead. Are our children any different if they choose to let computers deal with the mechanical aspects of literacy and numeracy so they can concentrate on higher-order operations? Do we agonize over our children being unable to weave cloth and tailor clothing?

If Marshall McLuhan is right, we are not yet past the point of pumping old media through the new internet media pipe. Text will always be with us, I think, because it is just too darned useful. But we will use it differently as we become able to record, replay, produce, publish, communicate and collaborate using non-textual, non-numeric media and move beyond linear and tabular structures into netular, scale-free networks.

Something that occurred to me about phonetic languages like English and syllabic languages like Arabic, versus iconic languages like Hanyu Chinese: a phonetic or syllabic language enables you to encode or decode words according to their sound, and to store and retrieve them with a simple index. Hanyu, on the other hand, provides no association between symbol and sound. You depend on the person you hear a word from to provide the association, making encoding and decoding author dependent. Iconic storage and retrieval indexes are not always obvious either, although they do exist, based on the subordinate symbols from which words are composed. The internet offers a remedy by enabling the automation of the association between sound, icon and definition.
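To make that automation concrete, here is a toy Python sketch; the data structure and variable names are my own, with a few real Hanyu characters as sample entries:

```python
# A toy sound-icon-definition lexicon. A phonetic script gives you the
# sound-based lookup for free; for an iconic script, software can build
# the same index automatically.
lexicon = {
    "水": {"sound": "shuǐ", "definition": "water"},
    "火": {"sound": "huǒ", "definition": "fire"},
    "山": {"sound": "shān", "definition": "mountain"},
}

# Derived phonetic index: retrieve an icon from its sound, removing the
# dependence on a human speaker to supply the association.
by_sound = {entry["sound"]: icon for icon, entry in lexicon.items()}

print(by_sound["shuǐ"], lexicon["水"]["definition"])  # 水 water
```

Once the association is stored, coding and decoding no longer depend on the author; any of the three facets (icon, sound, definition) can serve as the retrieval key.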

It seems to me that iconic languages as a technology are undergoing a major evolutionary change that could not be achieved without the internet.

Computing is going through an interesting process:

Note: PL means programming language

Nodular Computer: Mainframe: Priesthoods operate
Nodular Network: ARPANET: Priesthoods connect
Nodular Data: Variable: Noun: Priesthoods Query
Nodular Language: Variable PL: Assembler: Priesthoods Manipulate
Nodular Communication: Variable Packet: TCP/IP: Priesthoods Communicate
Nodular Schedule: Sequential Batch

Linear Computer: Minicomputer: Scribes operate
Linear Network: Ethernet: Scribes connect
Linear Data: String DBMS: Verb: Scribes Query
Linear Language: String PL: 3GL: Scribes Manipulate
Linear Communication: String Packet: HTML: Scribes Communicate
Linear Schedule: Multi-Tasking

Tabular Computer: Microcomputer: Educated operate
Tabular Network: Internet: Educated communicate
Tabular Data: Relational DBMS: Noun Set: Educated Query
Tabular Language: Relational PL: SQL: Educated Manipulate
Tabular Communication: Relation Packet: XML: Educated Communicate
Tabular Schedule: Multi-Threading

What is over the horizon and will accompany Iconic Languages is what I call “DemocraNet”:

Netular Computer: Scale-free CPUs: Laypeople operate
Netular Network: High Frequency Gravity Wave Network: Laypeople communicate
Netular Data: Associational DBMS: Verb Set: Laypeople Query
Netular Language: Associational PL: Iconic Language: Laypeople Manipulate
Netular Communication: Association Packet: XMPEGML: Laypeople Communicate
Netular Schedule: AlwaysOn

Scale-free CPUs will be solid-state computers with no moving parts at all: solid-state storage, no fans, no boards, and a network processor.

High Frequency Gravity Wave Networks will make available bandwidth several factors larger than what we have today.

Associational DBMSs will allow us to modify databases on the fly without concerns regarding referential integrity or normalization.
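As a rough illustration of what an associational model makes possible, here is a minimal Python sketch assuming a Sentences-style subject-verb-object association store; the class and method names are mine, not Lazysoft's API:

```python
# A minimal associative store: facts are free-standing triples, so new
# kinds of facts can be added on the fly without a predefined schema,
# foreign keys, or normalization.
class AssociativeStore:
    def __init__(self):
        self.triples = set()  # (subject, verb, object) associations

    def assert_fact(self, subject, verb, obj):
        self.triples.add((subject, verb, obj))

    def query(self, subject=None, verb=None, obj=None):
        """Return every triple matching the bound (non-None) terms."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (verb is None or t[1] == verb)
                and (obj is None or t[2] == obj)]

db = AssociativeStore()
db.assert_fact("Alice", "authored", "Report-7")
db.assert_fact("Report-7", "cites", "Linked")  # new fact type, no schema change
print(db.query(verb="cites"))
```

Note that the query is verb-centric: you ask for a set of relationships rather than a set of rows, which is what the “Verb Set” entry in the table above is pointing at.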

Iconic Language will Internationalize visual communication.

XMPEGML, a new form of markup language for the standardization of iconic language exchange, awaits development.
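Since XMPEGML is still hypothetical, the following Python sketch only imagines what such an exchange document might look like; every element and attribute name below is invented for illustration, and the point is merely that a standard XML toolchain could carry icon, sound and definition together:

```python
# Hypothetical XMPEGML fragment built with Python's standard library.
import xml.etree.ElementTree as ET

msg = ET.Element("xmpegml", version="0.1")          # invented root element
icon = ET.SubElement(msg, "icon", id="water")       # invented icon element
ET.SubElement(icon, "glyph").text = "水"
ET.SubElement(icon, "sound", lang="zh").text = "shuǐ"
ET.SubElement(icon, "definition", lang="en").text = "water"

print(ET.tostring(msg, encoding="unicode"))
# <xmpegml version="0.1"><icon id="water">...</icon></xmpegml>
```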

AlwaysOn would mean that you are always connected to DemocraNet and always processing data.

Everything is in the mix to varying degrees, but each successive community is larger.

Netular Technology versus Pseudo-Netular Technology

[Image: fishingnet]

I have been having a very interesting discussion on LinkedIn.com, having expressed my opinion about current information technology and the netular information technology I would like to see.

The people who have been exchanging their views with me cannot see the forest for the trees.  One is offended that I do not rave about all the social transitions the technologies are offering.  Another spews buzzwords like a chainsaw.  Another assumes my opinion is a product of my impatience for the convergence of the existing technologies.

Einstein once said he would spend a majority of his time defining a problem and a fraction of his time solving it. In information technology, a majority of the time is spent solving and only a fraction is taken to understand. The consequence is that most of the solutions out there are not designed; they are hastily assembled patchworks that, because of the inertia of being first on the field, are only ever replaced by further patches.

Our entire system of networks is built upon a foundation of linear and tabular architecture that is present in our CPUs, memory, storage, data structures, programming languages, organization, locations, events and goals.  In reality we are only dabbling in networks and doing an abysmal job of using them to their full effect.  We don’t understand them.

Marshall McLuhan said that when a new medium is created, the first thing we do is pump old media through it. That is what we are doing now: we are taking every form of old media we have and pushing it through the internet. There is not a single case where we have successfully departed from linear and tabular old media. I have looked at all the current technology, I have used it, I understand its internals, and I stand by what I say.

We need a fundamental change in the way information technology works; otherwise we are going to continue with an undesigned, brute-force attempt to solve our problems without ever understanding them. The outcome will not be progress, but the perpetuation of flat-earth thinking.

Linear and tabular thought are responsible for many of the problems we have in the world. The biggest is the inability to fully appreciate the uniqueness of everything and everyone in this world. The supreme example of this has been the long history of Religion, Genocide, Slavery, Nationalism, Imperialism, Racism, Eugenics, Fascism, Nazism, Communism, Marxism, Capitalism and Socialism. All of them fail us because they depend on linear and tabular models of thought that deny respect for the individuality of all experience. True netular thought has the potential to challenge all of these misconceptions. I think it is appropriate that this transition is on the horizon with the rise of globalism. I doubt it will be a peaceful transition.

Actually, insight into the underlying order of networks has made quite a bit of progress. One of the leaders in this area is Albert-László Barabási, who authored the book “Linked” http://www.nd.edu/~networks/Linked/index.html . Another researcher, Kwabena Boahen, gave a fascinating presentation at TED http://tinyurl.com/6nnkb7 . There is also the work of Simon Williams, who has come up with a new associative database architecture http://www.lazysoft.com as well as a commercial product, Sentences.
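For readers curious what “scale-free” means concretely, here is a minimal Python sketch of the growth-plus-preferential-attachment process Barabási describes in “Linked”; the function name and parameter choices are mine:

```python
import random
from collections import Counter

def barabasi_albert(n: int, m: int = 2) -> list[tuple[int, int]]:
    """Grow a network by preferential attachment: each new node links to
    m existing nodes chosen with probability proportional to degree."""
    edges = [(0, 1)]            # seed network of two connected nodes
    targets = [0, 1]            # node i appears here degree(i) times
    for new in range(2, n):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(random.choice(targets))  # degree-biased pick
        for old in chosen:
            edges.append((new, old))
            targets.extend([new, old])
    return edges

edges = barabasi_albert(1000)
degrees = Counter(node for edge in edges for node in edge)
print(degrees.most_common(5))  # a few hubs dominate: the scale-free signature
```

Run it and the degree counts are wildly unequal: a handful of hubs accumulate most of the links while the vast majority of nodes keep only a few, which is exactly the structure missing from linear and tabular models.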

It is time for everyone to fundamentally change the way they think.