Creative Commons: Proposed Protection Categories

[Image: Creative Commons logo]

Right now, Facebook members are campaigning for Facebook/Creative Commons integration.

I fully support this.

I think Facebook's credibility would go through the roof.

I think Creative Commons would become the de facto standard for content protection.

I think everyone on the web would exercise freedom of expression with more confidence knowing they own their expression.

However, I think Creative Commons should change its conditions to make it more accessible.

I propose the following, based on the International System of Units (a sketch of the categories as a data structure follows the list):

who: anonymous/originator/derivator

what: unit/series/collection

when: once/duration/forever

where: private/group/public

why: loss/balance/profit

how: as-is/constructive/destructive

how much: one/two/many
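As a rough sketch, the seven categories could be encoded as a machine-readable protection descriptor. The Python below is only an illustration of the proposal; none of the names belong to any existing Creative Commons vocabulary.

    from dataclasses import dataclass
    from enum import Enum

    class Who(Enum):
        ANONYMOUS = "anonymous"; ORIGINATOR = "originator"; DERIVATOR = "derivator"

    class What(Enum):
        UNIT = "unit"; SERIES = "series"; COLLECTION = "collection"

    class When(Enum):
        ONCE = "once"; DURATION = "duration"; FOREVER = "forever"

    class Where(Enum):
        PRIVATE = "private"; GROUP = "group"; PUBLIC = "public"

    class Why(Enum):
        LOSS = "loss"; BALANCE = "balance"; PROFIT = "profit"

    class How(Enum):
        AS_IS = "as-is"; CONSTRUCTIVE = "constructive"; DESTRUCTIVE = "destructive"

    class HowMuch(Enum):
        ONE = "one"; TWO = "two"; MANY = "many"

    @dataclass(frozen=True)
    class Protection:
        who: Who
        what: What
        when: When
        where: Where
        why: Why
        how: How
        how_much: HowMuch

    # Example: an originator releases a single work to the public, forever, not for profit.
    licence = Protection(Who.ORIGINATOR, What.UNIT, When.FOREVER,
                         Where.PUBLIC, Why.BALANCE, How.AS_IS, HowMuch.MANY)

With three values on each of the seven axes, the proposal spans 3^7 = 2,187 distinct protections.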

That should satisfy everyone.

I have forwarded this proposal to Creative Commons.

Link:


Databases: Structured Associative Model

[Image: oraclesentences]

For years now I have been struggling with Relational DBMS technology and Associative DBMS technology, attempting to get them to do what I want.  In my first efforts, Relational models were structurally restrictive, Dimensional models were unable to grow organically, and EAV models were incompatible with relational architecture.  I came upon Simon Williams's Associative Model of Data and, although enthralled with its potential, I found it too had limitations.  It was semi-structured and allowed for too much flexibility.  Twenty-five years in Information Technology had taught me that there should be a single standard classification system for setting up databases, not a plethora of ontologies.  I was determined to find the theoretical structure and was not concerned with hardware limitations, database architecture, the abilities of current query languages or any other constraints.

The Associative Model of Data had made the difference in liberating me from Relational and Dimensional thinking.  At first I thought a traditional ERD of the Associative Model of Data would look like the following:

[Image: amdschema (initial ERD of the Associative Model of Data)]

Basically, what you have is a Schema composed of Nodes, where Nodes are associated with Nodes through Verbs and Associations are attributed with Nodes through Verbs.  The range of Node Entities, Verb Entities, Association Entities and Attribution Entities is endless.  As well, the population of the Schema has an unlimited dataset of natural key values.  I have been challenged by Relational database specialists and SQL experts regarding the viability of this model within current limitations; however, their arguments are irrelevant.  What is important is the logical validity of the model, not its physical validity.
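A minimal sketch of that structure, assuming nothing beyond what is described above: Nodes carry natural key values, Verbs name the relationships, and an Association binds a source to a target through a Verb.  Because an Association is itself an entity, it can in turn be the source of further Associations, which is how attributions on associations can be expressed.  The class and method names are illustrative only.

    from dataclasses import dataclass, field
    from typing import Union

    @dataclass(frozen=True)
    class Node:
        key: str                      # a natural key value, e.g. "Acme Ltd"

    @dataclass(frozen=True)
    class Verb:
        name: str                     # e.g. "is based in", "since"

    @dataclass(frozen=True)
    class Association:
        source: "Entity"              # a Node or another Association
        verb: Verb
        target: "Entity"

    Entity = Union[Node, Verb, Association]

    @dataclass
    class Schema:
        associations: list = field(default_factory=list)

        def assert_fact(self, source, verb: Verb, target) -> Association:
            link = Association(source, verb, target)
            self.associations.append(link)
            return link

    # "Acme Ltd is based in Paris", then an attribution on that association.
    schema = Schema()
    based_in = schema.assert_fact(Node("Acme Ltd"), Verb("is based in"), Node("Paris"))
    schema.assert_fact(based_in, Verb("since"), Node("1999"))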

After receiving this criticism I decided to revisit the model in order to simplify it.  I went over Simon Williams's explanations of his model and its application and found I could reduce it to the following:

[Image: amdschema02 (simplified ERD of the Associative Model of Data)]

This was profoundly simpler and better reflected the Associative Model of Data's architecture.  But even with this simpler architecture I was not satisfied.  I felt that the Associative Model, although it gives the benefit of explicitly defining the associations, was a tabula rasa.  Research has shown that tabulae rasae are contrary to the behavior of the finite physical universe; there is an intermediate level between nature and nurture.  And this is what I sought to model.

[Image: zachman (the Zachman Framework)]

When I first encountered the Zachman Framework, something about it struck me in a very profound way.  I could see there was something fundamental in its description of systems; however, I felt that the metaphors John Zachman used were wrong because they themselves lacked a fundamental simplicity.  The consequence was that those who studied under Zachman ultimately could not agree on what he was talking about.  The "disciplines" that Zachman's Framework generated were also continually reinventing the wheel.  Zachman had created a world of vertical and horizontal stovepipes.  To further the confusion, Zachman refused to conceive of a methodology based upon his framework.  Consequently, there was no way to determine what the priorities were in creating a system.  I call this the Zachman Clusterfuck.

Zachman's work spawned years of work for me.  I could see that systems had a fundamental structure, but I could not agree with Zachman.  Focuses and Perspectives were useless terms.  The construction metaphor was useless.  I read anything I could get my hands on dealing with systems, methodologies, modeling, networks and a broad range of other literature across the disciplines.  Out of this came a set of conclusions (a rough sketch follows the list):

  1. There was a fundamental set of Noun Entities
  2. There was a fundamental set of Verb Entities
  3. There was a fundamental set of Association Entities
  4. There was a clear order in which the Nouns were addressed
  5. There was a clear order in which the Verbs were executed
  6. The structure was fractal
  7. The content was a scale-free network
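A speculative sketch of conclusions 1 through 6.  The particular noun and verb sets below are only placeholders, borrowed from the interrogatives and the OODA stages discussed later in this post; the point is the shape: fixed, ordered sets plus a fractal structure in which any Node can contain a sub-Schema of the same form.

    from dataclasses import dataclass, field

    NOUN_ORDER = ["Goal", "Person", "Function", "Form", "Time", "Distance"]  # assumed set
    VERB_ORDER = ["observe", "orient", "decide", "act"]                      # assumed set

    @dataclass
    class Node:
        noun: str                          # drawn from NOUN_ORDER
        key: str
        detail: "Schema | None" = None     # fractal: a Node may contain a sub-Schema

    @dataclass
    class Schema:
        nodes: list = field(default_factory=list)
        associations: list = field(default_factory=list)    # (source, verb, target) triples

        def associate(self, source: Node, verb: str, target: Node) -> None:
            assert verb in VERB_ORDER, "Verbs come from the fundamental set"
            self.associations.append((source, verb, target))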

I made some attempts at creating the vocabulary and experimented with this new Structured Thinking Language.  However, the real break came when I worked with John Boyd’s OODA Loop:

[Image: theboydpyramid (John Boyd's OODA Loop)]

The OODA Loop revealed a governing structure for the methodology and guided my way into the following hybrid relational/dimensional/associational model I call the Structured Associative Model of Data:

[Image: samd (the Structured Associative Model of Data)]

One of the key things this model demonstrates is the sequence followed by the OODA Loop.  Starting from the top, each dimension set spawns the next, and choices are created from the dimensions.  There is no centrism to this model; centrism is an inherent flaw in Service Oriented Architecture (SOA), event-based architecture, data-centric architecture, Goal-Directed Design and rule-based systems, among others.  The stovepipes of Focuses and Perspectives disappear, replaced by a clear order of priorities and dependencies for achieving success.  The model also supports bottom-up inductive as well as top-down deductive sequencing, which makes the system able to reconfigure itself to handle exceptions.
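A minimal sketch of that sequencing, assuming a simple pipeline mechanic: the four OODA stages run in order, each stage's output spawning the input for the next, and the same chain can be walked top-down (deductive) or bottom-up (inductive).  The function and stage names are illustrative only.

    from typing import Callable, Sequence

    Stage = Callable[[object], object]

    def run_sequence(seed: object, stages: Sequence[Stage], top_down: bool = True) -> object:
        """Walk the stages in order; each stage's output spawns the next stage's input."""
        ordered = list(stages) if top_down else list(reversed(stages))
        state = seed
        for stage in ordered:
            state = stage(state)
        return state

    # Trivial stand-ins for Observe, Orient, Decide, Act:
    stages = [
        lambda s: {"observations": s},   # Observe
        lambda s: {"orientation": s},    # Orient
        lambda s: {"decision": s},       # Decide
        lambda s: {"action": s},         # Act
    ]

    designed = run_sequence(["raw event"], stages)              # top-down, deductive
    analysed = run_sequence(designed, stages, top_down=False)   # bottom-up, inductive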

Some of the things I have learned in designing this model include the realization that the unit defines the datatype and that all measures are stored as variable-length character strings.  This is because any displayed value is only a symbolic representation of the actual quantity.  If operations are to be performed on measures, they are converted to the correct type as part of the operation.  I also recognized that Unit was necessary to define the scale and scalability of the system.  Further, it became apparent that analog calculations should not be practiced: every value should be treated as discrete and aggregated.
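A sketch of that measure handling, under the assumptions stated above: every measure is stored as a character string together with its Unit, the Unit decides the datatype, and conversion to a typed value happens only at the moment an operation needs it.  The unit names and conversion rules are illustrative.

    from dataclasses import dataclass
    from decimal import Decimal

    @dataclass(frozen=True)
    class Unit:
        name: str        # e.g. "USD", "metre", "count"
        datatype: type   # the unit decides the datatype

    @dataclass(frozen=True)
    class Measure:
        value: str       # always stored as text; a symbolic representation only
        unit: Unit

        def typed(self):
            """Convert the symbolic string to a concrete value for an operation."""
            return self.unit.datatype(self.value)

    CURRENCY = Unit("USD", Decimal)   # discrete, exact amounts rather than floats

    def add(a: Measure, b: Measure) -> Measure:
        assert a.unit == b.unit, "operations only apply within a single unit"
        return Measure(str(a.typed() + b.typed()), a.unit)

    total = add(Measure("19.95", CURRENCY), Measure("5.00", CURRENCY))   # value "24.95"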

Another aspect of this system is the inclusion of currency and amount.  I have been critical of Zachman and academics for their hypocrisy regarding the economics of systems.  All systems have a cost and a benefit, and both are measurable in currency.  Contrary to the reasoning of the majority, every decision is ultimately economic.

Tim Brown of IDEO coined the term “Design Thinking” and has been toying with the concept for some time.  Many designers dwell on the two-dimensional concept of divergence and convergence as modes of thought.  In my model, divergence is the creation of choice while convergence is the selection of choice.  There is no alteration or deletion of choice in my model, because history is preserved.
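A small sketch of that distinction, assuming an append-only log: divergence appends new choices, convergence records a selection, and nothing is ever altered or deleted, so the history is preserved.  The class is illustrative only.

    from dataclasses import dataclass, field

    @dataclass
    class ChoiceHistory:
        events: list = field(default_factory=list)   # append-only; never altered or deleted

        def diverge(self, option: str) -> None:
            """Divergence: create a new choice."""
            self.events.append(("created", option))

        def converge(self, option: str) -> None:
            """Convergence: select an existing choice; earlier options stay in the log."""
            created = {o for kind, o in self.events if kind == "created"}
            assert option in created, "can only select a choice that was created"
            self.events.append(("selected", option))

    history = ChoiceHistory()
    history.diverge("relational model")
    history.diverge("associative model")
    history.converge("associative model")   # the rejected option remains in the history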

What you now have is an unencumbered framework with a clear methodological sequence.

[Image: czerepakcognitary]

Welcome to the Cognitary Universe.

Jared Diamond: Societal Collapse

[Video: Jared Diamond, “System Collapse” (TED talk); the Vodpod embed is no longer available]

If you listen carefully to what Jared Diamond is saying in the TED video above, he is describing not a five-part but a six-part power curve into a systemic singularity. This has been one of the core themes of discussion on this blog.  We all seem to be too close to our problems to see the commonality.  The interrogatives come into play here:

  1. Goals
  2. People
  3. Functions
  4. Forms
  5. Times
  6. Distances

Times and Distances are the basis on which the higher orders are built.

When we look at the recent economic “crisis,” we see roughly 300 trillion in currency circulating and roughly 1 to 2 trillion shifting suddenly and unexpectedly.  We witnessed a systemic collapse, a singularity, a tipping point, a power curve, an exponential change, a phase transition, or whatever label you prefer.  These have been happening everywhere, in different contexts and orders, in both human and non-human systems, since Time and Distance began.

What Jared Diamond and other alarmists are implying is that human society is now a system approaching its final singularity, in this century, on this planet.  The implication is that today we are experiencing a less-than-one-percent crisis on a power curve into a singularity.  How many more iterations will the global system withstand?  Will humanity make the step into space successfully before we experience a global dark age?  How will the six or more factors in the power curve play out?

The truth, it appears to me, is that power curves, whether they fully play out or not, result in either a systemic climax or an anti-climax, followed by a systemic collapse.  Would it not be better if we experienced a systemic climax that led to us expanding into the solar system?

Systemic collapse seems to be the fashion of this generation.  Every generation looks with fascination at its own youth, maturation, reproduction and acceleration into mortality.  Some die early, some die late, but all die.  It is an irrevocable law of nature.  It is not a question of self-interest; it is a question of how self-interest is defined.

Related Posts:

Beyond the Singularity

Servitas and Libertas


Induce the Past, Deduce the Future (continued)

Inductive (Analysis) Pattern:

[Image: stlinduction.jpg (inductive pattern)]

Deductive (Design) Pattern:

[Image: stldeduction.jpg (deductive pattern)]

This assumes a path from top left, row by row, to bottom right. As you can see, induction (analysis), instead of being relegated to a single phase, is a methodology in its own right.  It should also be recognized that Induction is a bottom-up process, while Deduction is top-down.
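A toy illustration of the two directions over a made-up hierarchy: deduction walks top-down, decomposing each element into its parts, while induction is the reverse traversal, aggregating parts back into the whole.

    # The hierarchy is example data only.
    hierarchy = {
        "system": ["subsystem A", "subsystem B"],
        "subsystem A": ["part A1", "part A2"],
        "subsystem B": ["part B1"],
    }

    def deduce(root: str) -> list:
        """Top-down: design by decomposition, row by row."""
        order, queue = [], [root]
        while queue:
            node = queue.pop(0)
            order.append(node)
            queue.extend(hierarchy.get(node, []))
        return order

    def induce(root: str) -> list:
        """Bottom-up: analysis by aggregation, the reverse traversal."""
        return list(reversed(deduce(root)))

    print(deduce("system"))   # system, subsystems, then parts
    print(induce("system"))   # parts, subsystems, then the system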