Church of Virus BBS
Topic: Conscious AI with Memory Options We Don't Have  (Read 1416 times)
Walter Watts
Conscious AI with Memory Options We Don't Have
« on: 2007-04-12 14:13:03 »

This was a chat conversation between Sat and me.
Any thoughts?
Just for the record, I'm not even sure I believe real conscious AI is a possibility.
--WW
------------------------------------------------------------------------

<WW> Can you tell me, Sat: what action happens to you every day, one that people routinely implore you to take, but over which you have absolutely no control whatsoever?
<Sat> paying taxes
<WW> hey, that is one. But it's not the one I was thinking of
<WW> give up?
<Sat> sure
<WW> forgetting
<WW> people use that verb in the active sense all the time, but it is impossible for humans to will themselves to forget something
<WW> "Just forget it," they say
<WW> In fact, the very act of trying is counterproductive to achieving the outcome
<Sat> like trying not to think
<WW> I guess those "arts" that promote transcendental states by various forms of meditation can maybe lessen overall awareness temporarily, but they won't clear specific memories from your mind
<WW> you just can't will that to happen
<Sat> they simply change awareness. or I should say...
<Sat> the type I do simply clears internal dialogue.
<Sat> when I am not internally commenting on things I simply do.
<Sat> like being in the sports 'zone'
<WW> That will be a big difference if we ever achieve "conscious" AI.
<WW> We could simply erase memories at will
<Sat> specific memories?
<WW> yes
<WW> which would be beneficial for the victims of horrible things like accidents, abuse, etc., but not so great for the conscious AI that has malice toward you. He could kill you and ALSO simply erase that memory.
<Sat> heh
<Sat> people do something similar
<WW> double-edged sword isn't it?
<Sat> lying to themselves until they believe it
<WW> agreed
<Sat> now myself I have always known my self-deceit is a lie
<Sat> I wonder if that is the same for everyone
<WW> I'm not sure that ever FULLY works though. Don't you think at some level, they know they're lying to themselves?
<WW> Do you think it would be wise to give our future conscious AI that ability?
<Sat> I think the question is will it give that ability to itself
<WW> I guess that IS the question 
<Sat> myself I would not totally erase memories.  I'd stick them in a special place in memory and index them.
<Sat> I'd remember the index
<Sat> but not the memory itself
<Sat> unless I willed myself to access it
<WW> I think our minds do that already, except the full data is automatically retrieved if the index is "touched" by our consciousness
<WW> the relevant index entry that is
* Sat nods
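Sat's scheme of keeping an index but not the memory itself can be sketched as a small data structure: a short label is always visible to introspection, while the full content sits behind a separate, deliberate retrieval step. A minimal illustration in Python; all names here are hypothetical, not from the chat:

```python
class IndexedMemory:
    """Memories filed so that only an index entry routinely surfaces.

    The full content is never deleted; it is just never seen unless
    retrieval is explicitly willed via its index key.
    """

    def __init__(self):
        self._index = {}  # key -> short label (always accessible)
        self._vault = {}  # key -> full memory (accessed only on demand)

    def archive(self, key, label, content):
        """File a memory away: remember the index, hide the content."""
        self._index[key] = label
        self._vault[key] = content

    def recall_index(self):
        """What ordinary introspection sees: labels only, no content."""
        return dict(self._index)

    def will_to_access(self, key):
        """Deliberate retrieval: the only path back to the full memory."""
        return self._vault[key]


mind = IndexedMemory()
mind.archive("e1", "the accident", "full painful details ...")
print(mind.recall_index())        # {'e1': 'the accident'}
print(mind.will_to_access("e1"))  # full painful details ...
```

WW's point maps onto the difference between the two methods: in human minds, touching the index seems to trigger `will_to_access` automatically, whereas here the two steps stay separate.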
Walter Watts
Tulsa Network Solutions, Inc.

No one gets to see the Wizard! Not nobody! Not no how!

David Lucifer
Re:Conscious AI with Memory Options We Don't Have
« Reply #1 on: 2007-04-12 17:01:46 »

When people suggest you "fuggedaboutit" I don't think you should take it literally. They are suggesting that you should act as if you have forgotten.

WW suggests that conscious AIs will be able to erase memories at will. I'm not so sure. Depends on the implementation, doesn't it? What if the AI's memories are holographic in the same sense ours are?

Sat says he has always known his self deceit is a lie. Maybe Sat doesn't realize when his self-deceit is successful.
Walter Watts
Re:Conscious AI with Memory Options We Don't Have
« Reply #2 on: 2007-04-12 19:43:57 »


Quote from: David Lucifer on 2007-04-12 17:01:46   

When people suggest you "fuggedaboutit" I don't think you should take it literally. They are suggesting that you should act as if you have forgotten.

WW suggests that conscious AIs will be able to erase memories at will. I'm not so sure. Depends on the implementation, doesn't it? What if the AI's memories are holographic in the same sense ours are?

Sat says he has always known his self deceit is a lie. Maybe Sat doesn't realize when his self-deceit is successful.



I call acting as if you have forgotten "forgiveness."

Forgetting is forgetting, and human beings cannot consciously will it.

A painful, enjoyable, embarrassing, etc. memory stays there no matter HOW much you act like you've forgotten.

As for conscious AIs being able to erase memories at will depending on the implementation:

Doesn't everything depend on the implementation?

I agree with you that some, or maybe all, people do not realize when their self-deceit is successful.



Walter

Fox
Re:Conscious AI with Memory Options We Don't Have
« Reply #3 on: 2007-04-20 14:07:41 »

Interesting subject; I do, however, think that there would be a difference between something being forgotten and something being erased.

Forgetting would imply that the thing in question (memories, in this case) still actually exists, since forgetting is basically a state of ceasing, or failing, to remember or recall something; being erased would imply that the thing in question is completely gone and altogether impossible to recall. So, basically speaking, 'forgetting' is not a completely irretrievable state of mind, whereas 'erasing' would be.

To my mind, most conscious AI machines would have a hard time 'forgetting' anything, data-wise, unless they were damaged in some way. However, if the design, function and implementation (and perhaps even evolution) of said conscious AIs were set about right, I expect that they would be able to erase data ('memories') at their own conscious choosing. Things like consciousness and reality, I think, would depend very much on the design, state, structure, programming, implementation and surroundings of the being in question. Machines with a conscious ability and functionality to erase data at their own choosing would, I dare say, likely be more advanced than ourselves, since humans have no known biological mechanism to consciously 'erase' memories, though we can 'forget'.

However, I suppose that if a man really wanted to forget something then yes it would be possible, if only for a variable period of time; types of amnesia for instance - which we ourselves can trigger via drug and alcohol abuse, or reduced blood flow to the brain (vascular insufficiency). In Wernicke-Korsakoff syndrome, for example, damage to the memory centers of the brain results from the use of alcohol or malnutrition. So if people are consciously aware of what to do and how to do it then ‘forgetting’ can be self-induced. But that doesn’t necessarily mean that it’s irretrievably and completely gone.

As to whether real conscious AI is a possibility, I would very much consider the answer to be yes. If it's possible for us to evolve from microorganisms and stromatolites, then I don't really see evolution from computers to AI being a problem, especially under the guidance and aegis of another conscious being such as man.

That’s how I see things.
« Last Edit: 2007-04-20 14:16:09 by Nin` »

I've never expected a miracle. I will get things done myself. - Gatsu
Walter Watts
Re:Conscious AI with Memory Options We Don't Have
« Reply #4 on: 2007-04-20 18:51:45 »


Quote from: Nin` on 2007-04-20 14:07:41   
<snip>

Machines with a conscious ability and functionality to erase data at their own choosing, I dare say, would likely be more advanced than ourselves since humans have no known biological mechanism to consciously 'erase' memories, though we can 'forget'. 

<snip>



Yes, we can forget, as in something that happens to us, like "I forgot where I put the car keys."

What we CAN'T do however, is forget, as in something we consciously or actively participate in:

"I'm going to forget where I put the car keys."

Quote from: Nin` on 2007-04-20 14:07:41   
<snip>

"So if people are consciously aware of what to do and how to do it then ‘forgetting’ can be self-induced."

<snip>


I wasn't aware this was possible.


Walter
« Last Edit: 2007-04-20 18:57:50 by Walter Watts »

Fox
Re:Conscious AI with Memory Options We Don't Have
« Reply #5 on: 2007-04-20 22:26:32 »


Quote from: Walter Watts on 2007-04-20 18:51:45   
What we CAN'T do however, is forget, as in something we consciously or actively participate in:

"I'm going to forget where I put the car keys."

In the vast majority of cases, perhaps, but I weyken it would all depend on the person and their individual brain-state. A small minority, for instance, might be cerebrally/mentally impaired or damaged in some way that systematically leads them into a state of forgetfulness beyond their control. Ever seen Memento?

Whether or not we could actually choose to 'forget' is a more interesting question. Mnemonics would generally work against this, but in a sense I guess we could, driving ourselves into a state of amnesia being one way. Of course, I don't think we could actually control which specific memory would be forgotten in such an unlikely situation, only that we would forget 'something', in a general sense, by our own choosing; in this case self-induced amnesia (drug abuse, alcohol abuse, or reduced blood flow to the brain).

Long periods of time or complex tasks can also cause human forgetfulness.

Unlikely situations perhaps, but still possible.

Regards,

Nin`
teh
Re:Conscious AI with Memory Options We Don't Have
« Reply #6 on: 2007-05-10 09:18:31 »

Interesting timing for this article to be published not so long after the subject gets mentioned at the CoV. "I tell ye'! tis a conspeeracee!!", heh. Article found at Ars Technica (first vectored from slashdot.org):

Escaping the data panopticon: Prof says computers must learn to "forget"
By Nate Anderson | Published: May 09, 2007 - 08:52AM CT

The rise of fast processors and cheap storage means that remembering, once incredibly difficult for humans, has become simple. Viktor Mayer-Schönberger, a professor in Harvard's JFK School of Government, argues that this shift has been bad for society, and he calls instead for a new era of "forgetfulness."

Mayer-Schönberger lays out his idea in a faculty research working paper called "Useful Void: The Art of Forgetting in the Age of Ubiquitous Computing," where he describes his plan as reinstating "the default of forgetting our societies have experienced for millennia."

Why would we want our machines to "forget"? Mayer-Schönberger suggests that we are creating a Benthamist panopticon by archiving so many bits of knowledge for so long. The accumulated weight of stored Google searches, thousands of family photographs, millions of books, credit bureau information, air travel reservations, massive government databases, archived e-mail, etc., can actually be a detriment to speech and action, he argues.

"If whatever we do can be held against us years later, if all our impulsive comments are preserved, they can easily be combined into a composite picture of ourselves," he writes in the paper. "Afraid how our words and actions may be perceived years later and taken out of context, the lack of forgetting may prompt us to speak less freely and openly."

In other words, it threatens to make us all politicians.

In contrast to omnibus data protection legislation, Mayer-Schönberger proposes a combination of law and software to ensure that most data is "forgotten" by default. A law would decree that "those who create software that collects and stores data build into their code not only the ability to forget with time, but make such forgetting the default." Essentially, this means that all collected data is tagged with a new piece of metadata that defines when the information should expire.

In practice, this would mean that iTunes could only store buying data for a limited time, a time defined by law. Should customers explicitly want this time extended, that would be fine, but people must be given a choice. Even data created by users (digital pictures, for example) would be tagged by the cameras that create them to expire in a year or two; pictures that people want to keep could simply be given a date 10,000 years in the future.
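The expiration-metadata mechanism the article describes can be sketched in a few lines: every stored record carries an expiry timestamp set at creation, reads drop anything past its date, and long retention has to be an explicit choice. A minimal Python sketch of the idea; the class and key names are my own invention, not from the paper:

```python
import time

DEFAULT_TTL = 60 * 60 * 24 * 365  # hypothetical legal default: one year, in seconds

class ForgetfulStore:
    """Toy key-value store where forgetting is the default."""

    def __init__(self):
        self._items = {}  # key -> (value, expires_at)

    def put(self, key, value, ttl=DEFAULT_TTL):
        """Store a value; it is forgotten after `ttl` seconds unless extended."""
        self._items[key] = (value, time.time() + ttl)

    def get(self, key):
        """Return the value if it has not expired; drop and forget it otherwise."""
        value, expires_at = self._items.get(key, (None, 0.0))
        if time.time() >= expires_at:
            self._items.pop(key, None)  # enforce the default of forgetting
            return None
        return value

store = ForgetfulStore()
store.put("search-log", "cat videos", ttl=0.1)    # short-lived by default
store.put("family-photo", "img_0042", ttl=10**9)  # explicitly kept long-term
time.sleep(0.2)
print(store.get("search-log"))    # None: forgotten by default
print(store.get("family-photo"))  # img_0042: retention was a choice
```

The point of the design is the inversion of today's default: data vanishes unless someone takes a positive step to keep it, rather than persisting unless someone takes a positive step to delete it.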

Mayer-Schönberger wants to help us avoid becoming digital pack rats, and he wants to curtail the amount of time that companies and governments can collate data about users and citizens "just because they can." Whenever there's a real need to do so, data can be retained, but setting the default expiration date forces organizations to decide if they truly do need to retain that much data forever.

It's a "modest" proposal, according to Mayer-Schönberger, but he recognizes that others may see it as "simplistic" or "radical." To those who feel like they are living in a panopticon, it might feel more like a chink in the wall through which fresh air blows.

Walter Watts
Re:Conscious AI with Memory Options We Don't Have
« Reply #7 on: 2007-05-10 23:01:26 »


Quote from: teh on 2007-05-10 09:18:31   


<snip>

"Afraid how our words and actions may be perceived years later and taken out of context, the lack of forgetting may prompt us to speak less freely and openly."

"In other words, it threatens to make us all politicians."

<snip>



ROFL


Walter