Thursday, December 17, 2020

Exaggerated significance and paranoia

   What's to stop an artificial mind from assigning the wrong significance (weighting) to a piece of data, or to a relationship between two pieces of data?

   If the value assigned is lower than warranted, then something is lost. If a larger value is assigned than warranted, then we can have the equivalent of a perception problem in a human mind.

   Larger values can also lead to the equivalent of seeing things that aren't there... paranoia?

   How could an artificial mind manage significance issues the way our minds do?

   Perspective.

   Easy to say, not so easy to quantitatively define.

   But definitions aside... the simplest, most mind-like way to design perspective in (and one already tried and true) is to have a mind made up of two sub-minds.

   Each contributes one half of the mind's total perception, balances the other out, and the pair can be inherently self-correcting.
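   The two-sub-mind idea could be sketched in code. This is only an illustration under my own assumptions, not anything from the post itself: significance is modeled as a number between 0 and 1, each sub-mind rates the same piece of data independently, the combined mind averages the two ratings, and a large gap between them flags a possible mis-weighting for re-examination. The function name, the scale, and the threshold are all hypothetical.

```python
from statistics import mean

def combine_significance(weight_a: float, weight_b: float,
                         disagreement_threshold: float = 0.3):
    """Blend two independently assigned significance weights.

    Each 'sub-mind' rates the same piece of data on a 0..1 scale.
    The combined weight is the average of the two; a gap larger
    than the threshold flags a possible mis-weighting.
    """
    combined = mean([weight_a, weight_b])
    flagged = abs(weight_a - weight_b) > disagreement_threshold
    return combined, flagged

# One sub-mind over-weights a coincidence; the other does not.
combined, flagged = combine_significance(0.9, 0.2)
# combined is pulled back toward the middle (~0.55), and the
# disagreement is flagged for review instead of being acted on.
```

   The point of the sketch is the balancing step: neither sub-mind's exaggerated weighting reaches the whole mind unchecked, which is one crude way a paranoid over-weighting could be caught before it drives behavior.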

