Databases: 50 years of stupidity


Database conventions are not best practices. Database naming conventions are based on random ontological concepts. Ideas about what constitutes an entity are misdirected. Programmers know nothing about what a class or an object is or how to name them. Hierarchical, Relational and Network databases have maintained a persistent and ignorant set of practices that the information technology intelligentsia have followed mindlessly. What we have after 50 years is a brute-force patchwork of bad design practices and mediocre engineering that continues to work within the same set of assumptions. It's a product of the intellectual lethargy that dominates not just the technological world, but the world that uses technology in general. Workers are too busy being inefficient and ineffective to improve their business practices. They jump at silver-bullet solutions that promise results without change.

Database people have never understood data. Programmers have never understood data. They have instead tried to please everybody's ontological misconceptions with grotesque architecture and then shoehorn it all into a physical processor that is about as progressive and efficient as the internal combustion engine. Eco-nut technologists like to use buzzwords like "organic" to describe the chaotic crap they are producing on the web. It isn't organic; it's a massive slum composed of any piece of detritus the occupants can build with, surrounding a district of monolithic towers of gross excess and shameless waste. Google's motto is "Don't be evil." Has any company ever considered having the motto "Be good"? The more I work with corporations, the more I recognize that goodness is discouraged and evil is whatever the corporation says it is. If you work for anyone, you are part of a Milgram experiment, and you are delivering those electric shocks every day under the direction of psychopaths. The merit you get promoted for is based on your willingness to flip those switches more than anyone else. Having a conscience is deemed unprofessional and grounds for termination.

This is the environment within which real innovation has to work.

Hungarian Backwards Notation, a naming convention devised by Charles Simonyi, has been abused and bastardized by programmers and database administrators with no understanding of semantics, which is most of them. Consequently, it has been rejected by a large portion of the IT community. Not even Microsoft knew what it had. I fought with Simonyi's concept for years and applied it successfully in several working applications against massive resistance. The more I worked with it, the more I realized that Object Oriented Programming was based on a completely false ontology. The metaphors were completely wrong. And the Unified Modeling Language entrenched the misconceptions even further. Information technology is spawning increasing complexity without any appreciation for underlying order. The order was datatypes. There are only a handful of Classes, and they are datatypes. The English are backwards, not the Hungarians.
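
To make the "backwards" idea concrete, here is a small sketch of type-first naming, assuming a handful of datatype classes; the scheme and names shown are my own illustration, not Simonyi's original notation:

```python
# A hypothetical illustration only: type-first ("backwards") naming puts
# the datatype class before the qualifier, so values group by their class.
# English word order scatters them: customerName, invoiceDate, orderTotal.

name_customer = "Acme Ltd."     # name:   a string identity
name_contact = "J. Smith"       # names sort and group together
date_invoice = "2009-06-01"     # date:   a point in time
amount_order = "149.95"         # amount: a currency quantity
```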

If the world were looked at as a collection of datatype classes, the entire philosophy of data, programming and systems would have to change. Objects do not have properties; properties have objects. And there are only a handful of properties. Realizing this has changed my perspective on data design forever. Throw away your OOP and your Data Model textbooks. They're crap. Google, Apple and Microsoft are not the future. Einstein had a better grip on reality than Turing ever did. The typical mind, including the IT mind, still thinks elephants are bigger than the moon.
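
A minimal sketch of the inversion, assuming a handful of property classes that own their objects; every name here is hypothetical:

```python
# "Properties have objects": each property holds the objects that carry
# it, rather than each object holding its properties.
class Property:
    def __init__(self, name):
        self.name = name
        self.objects = {}               # object -> value

    def assign(self, obj, value):
        self.objects[obj] = value

    def of(self, obj):
        return self.objects.get(obj)

# The handful of properties; "objects" are just keys the properties share.
name = Property("name")
mass = Property("mass")

name.assign("moon", "Moon")
mass.assign("moon", "7.3e22 kg")
mass.assign("elephant", "6.0e3 kg")     # the moon outweighs the elephant

print(mass.of("moon"))                  # query the property, not the object
```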

Related Links:

PowerPoint Killer


prezi.com has created a non-linear zooming presentation mapper that deserves your immediate attention.

Globish: The New International Language


France has lost the language war. Globish, a new form of pidgin English, is being used within France and by businessmen internationally to communicate. If the trend continues, I am sure this will become a full-blown dialect within one generation.

BBC Article: Globish

David Carson on design, discovery and humor | Video on TED.com

David’s sense of humor regarding the visual world makes this presentation a gem.


Video: six supercars in one day

Ah, why not live a little.


Databases: Structured Associative Model


For years now I have been struggling with Relational DBMS technology and Associative DBMS technology, attempting to get them to do what I want. In my first efforts, Relational models were structurally restrictive, Dimensional models were unable to grow organically, and EAV models were incompatible with relational architecture. I came upon Simon Williams' Associative Model of Data and, although enthralled with its potential, I found it too had limitations. It was semi-structured and allowed for too much flexibility. Twenty-five years in Information Technology had taught me that there was a single standard classification system for setting up databases, not a plethora of ontologies. I was determined to find the theoretical structure and was not concerned with hardware limitations, database architecture, abilities of current query languages or any other constraints.

The Associative Model of Data had made the difference in liberating me from Relational and Dimensional thinking. I at first thought a traditional ERD of the Associative Model of Data would look like the following:

[Figure: first-pass ERD of the Associative Model of Data]

Basically, what you have is a Schema composed of Nodes, with Associations between Nodes through Verbs and Attributions linking Associations to Nodes through Verbs. The range of Node Entities, Verb Entities, Association Entities and Attribution Entities is endless. As well, the population of the Schema has an unlimited dataset of natural key values. I have been challenged by Relational database specialists and SQL experts regarding the viability of this model within current limitations; however, their arguments are irrelevant. What is important is the logical validity of the model, not the physical validity.
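
As a sketch of how these four entity sets relate, here is a minimal Python rendering, assuming the Node-Verb-Node and Association-Verb-Node shapes described above; the class and field names are my own:

```python
# Four entity sets: Nodes, Verbs, Associations and Attributions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    value: str                          # a natural key value

@dataclass(frozen=True)
class Verb:
    value: str

@dataclass(frozen=True)
class Association:                      # Node -- Verb --> Node
    source: Node
    verb: Verb
    target: Node

@dataclass(frozen=True)
class Attribution:                      # Association -- Verb --> Node
    subject: Association
    verb: Verb
    value: Node

# Illustrative population: an association, then an attribution on it.
flight = Association(Node("Flight BA1234"), Verb("arrived at"), Node("Heathrow"))
when = Attribution(flight, Verb("on"), Node("12-Dec-05"))
```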

After receiving the criticism I decided to revisit the model in order to simplify it. I went over Simon Williams' explanations of his model and its application and found I could reduce it to the following:

[Figure: simplified ERD of the Associative Model of Data]

This was profoundly simpler and better reflected the Associative Model of Data's architecture. But even with this simpler architecture I was not satisfied. I felt that the Associative Model, although giving the benefit of explicitly defining the associations, was a tabula rasa. Research has shown that a tabula rasa is contrary to the behavior of the finite physical universe. There is an intermediate state between nature and nurture, and this is what I sought to model.

[Figure: the Zachman Framework]

When I first encountered the Zachman Framework, something about it struck me in a very profound way. I could see there was something fundamental in its description of systems; however, I felt that the metaphors John Zachman used were wrong because they themselves lacked a fundamental simplicity. The consequence was that those who studied under Zachman ultimately could not agree on what he was talking about. Also, the "disciplines" that Zachman's Framework generated were continually reinventing the wheel. Zachman had created a world of vertical and horizontal stovepipes. To further the confusion, Zachman refused to conceive of a methodology based upon his framework. Consequently, there was no way to determine what the priorities were in creating a system. I call this the Zachman Clusterfuck.

Zachman’s work spawned years of work for me.  I could see that systems had a fundamental structure, but I could not agree with Zachman.  Focuses and Perspectives were useless terms.  The construction metaphor was useless.  I read anything I could get my hands on dealing with systems, methodologies, modeling, networks and a broad range of other literature across the disciplines.  Out of this came a set of conclusions:

  1. There was a fundamental set of Noun Entities
  2. There was a fundamental set of Verb Entities
  3. There was a fundamental set of Association Entities
  4. There was a clear order in which the Nouns were addressed
  5. There was a clear order in which the Verbs were executed
  6. The structure was fractal
  7. The content was a scale-free network

I made some attempts at creating the vocabulary and experimented with this new Structured Thinking Language.  However, the real break came when I worked with John Boyd’s OODA Loop:

[Figure: the Boyd pyramid (John Boyd's OODA Loop)]

The OODA Loop revealed a governing structure for the methodology and guided my way into the following hybrid relational/dimensional/associational model I call the Structured Associative Model of Data:

[Figure: the Structured Associative Model of Data]

One of the key things this model demonstrates is the sequence followed by the OODA Loop. Starting from the top, each dimension set spawns the next. Choices are created from the dimensions. There is no centrism to this model, centrism being an inherent flaw in Service Oriented Architecture (SOA), event-based architecture, data-centric architecture, Goal-Directed Design and rule-based systems, among others. The stovepipes of Focuses and Perspectives disappear by reasserting a clear order of priorities and dependencies for achieving success. The model also supports bottom-up inductive as well as top-down deductive sequencing. This makes the system able to reconfigure itself to handle exceptions.
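
A rough sketch of that spawning sequence, assuming the four OODA stages as the governing order and one spawning function per dimension set; all names here are mine, not part of the model itself:

```python
# Each dimension set spawns the next; the same order can be walked
# top-down (deductive) or bottom-up (inductive).
OODA_ORDER = ["Observe", "Orient", "Decide", "Act"]

def run(dimension_sets, seed, top_down=True):
    """dimension_sets maps each stage to a function that, given a prior
    choice, spawns the choices available at that stage."""
    order = OODA_ORDER if top_down else list(reversed(OODA_ORDER))
    choices = [seed]
    for stage in order:
        spawn = dimension_sets[stage]
        choices = [c for prior in choices for c in spawn(prior)]
    return choices

# Illustrative use: each stage spawns the next from what came before.
stages = {s: (lambda prior, s=s: [f"{prior} > {s}"]) for s in OODA_ORDER}
print(run(stages, "situation"))  # ['situation > Observe > Orient > Decide > Act']
```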

Some of the things I have learned in designing this model include the realization that unit defines datatype and that all measures are variable-length character string text. This is because any displayed value is only a symbolic representation of the actual quantity. If operations are to be performed on measures, they are converted to the correct type as part of the operation. I also recognized that Unit was necessary to define the scale and scalability of the system. Further, it became apparent that analog calculations should not be practiced. Every value should be treated as discrete and aggregated.
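
A minimal sketch of "unit defines datatype," assuming measures are held as strings and converted only when an operation demands it; the unit table and class here are my own illustration:

```python
# The unit, not the column, decides the datatype a measure converts to.
UNIT_TYPES = {
    "kg": float,      # continuous quantities convert to float
    "each": int,      # discrete counts convert to int
    "USD": float,     # currency amounts convert to float
}

class Measure:
    def __init__(self, value, unit):
        self.value = str(value)       # always stored as string text
        self.unit = unit

    def typed(self):
        """Convert to the unit's datatype only when an operation needs it."""
        return UNIT_TYPES[self.unit](self.value)

    def __add__(self, other):
        assert self.unit == other.unit, "units must match"
        return Measure(self.typed() + other.typed(), self.unit)

total = Measure("2", "each") + Measure("3", "each")
print(total.value, total.unit)        # "5 each", stored again as a string
```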

Another aspect of this system is the inclusion of currency and amount.  I have been critical of Zachman and academics for their hypocrisy regarding the economics of systems.  All systems have a cost and a benefit and they are measurable in currency.  Contrary to the reasoning of the majority, every decision is ultimately economic.

Tim Brown of IDEO has coined the term "Design Thinking" and has been toying with the concept for some time. Many designers dwell on the two-dimensional concept of divergence and convergence as modes of thought. If we look at my model, divergence is the creation of choice while convergence is the selection of choice. There is no alteration or deletion of choice in my model, as history is preserved.
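
Sketched minimally, with divergence appending choices and convergence selecting among them while history is preserved; the class and names are my own illustration:

```python
# An append-only log: divergence creates choices, convergence selects
# one, and nothing is ever altered or deleted.
class ChoiceLog:
    def __init__(self):
        self.events = []                        # append-only history

    def diverge(self, choice):
        self.events.append(("created", choice))

    def converge(self, choice):
        self.events.append(("selected", choice))

    def selection(self):
        chosen = [c for kind, c in self.events if kind == "selected"]
        return chosen[-1] if chosen else None

log = ChoiceLog()
log.diverge("zooming presentation")             # divergence: create choices
log.diverge("linear slides")
log.converge("zooming presentation")            # convergence: select one
# both choices remain in log.events; history is preserved
```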

Now what you have is an unencumbered framework with a clear methodological sequence.

[Figure: the Cognitary framework]

Welcome to the Cognitary Universe.

Barabasi: Scale-Free Networks

Linked by Albert-Laszlo Barabasi has opened up an incredible range of knowledge regarding the laws of networks. Albert goes far beyond the work of Duncan Watts in Six Degrees to explain the existence of many of the observed properties of complex networks and, consequently, the behavior of complex systems. Random graphs, clustering, power curves, hubs, network growth, preferential attachment, fitness and Bose-Einstein condensates are all introduced to the reader. Ultimately the book is an introduction to the discovery of scale-free networks, and it reveals that all the past models based on random networks are wrong.
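
As a sketch of the preferential-attachment growth the book describes, here is a minimal Barabasi-Albert style generator; the implementation details are my own:

```python
# New nodes link to existing nodes with probability proportional to
# degree, which yields a scale-free network: a few hubs, many small nodes.
import random

def barabasi_albert(n, m=2):
    """Grow a graph to n nodes; each new node attaches up to m edges."""
    targets = list(range(m))            # start from a small seed
    repeated = []                       # each node listed once per degree
    edges = []
    for new in range(m, n):
        for t in set(targets):          # dedupe, so sometimes fewer than m
            edges.append((new, t))
            repeated.extend([new, t])   # both endpoints gain degree
        # sample next targets weighted by degree (preferential attachment)
        targets = [random.choice(repeated) for _ in range(m)]
    return edges

edges = barabasi_albert(1000)
# the resulting degree distribution follows a power law
```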

I highly recommend this book to anyone wanting to learn the current understanding of networks and the implications.