Saturday, January 5, 2019

Tegmark: Life 3.0

I'm reading this book for a local discussion group. Overall I’m still hesitant about this sort of AI model. Granted, it likely won’t be like human consciousness, but will AI ever achieve consciousness without some form of information/body interaction? Chapter 2 repeatedly asserts that information is independent of a physical substrate, yet qualifies that a substrate is necessary for it. Tegmark also grants that advances in the tech substrate have indeed advanced our capacities for processing and storing information. So the substrate is not irrelevant. And that the information might be the same despite the substrate suggests there is a lot more to consciousness than just information. The substrate just might matter when it comes to the latter. 

In that light I recommend this research topic at Frontiers in Robotics and AI: Consciousness in Humanoid Robots. All the articles were published this year. I’m particularly interested in “SEAI: Social Emotional Artificial Intelligence Based on Damasio’s Theory of Mind.” The other articles are also relevant to the topic. 

I’m also reminded of what Damasio said in this recent video discussing his latest book, The Strange Order of Things. While Tegmark does cite Damasio in the book, I don’t yet see how he incorporates Damasio’s insight that emotions and feelings are necessary to consciousness into an AI version. Damasio: 

"There are people that are totally convinced that that minds are nothing but computer programs. And that minds, or even whole organisms, are captured by algorithms. Some say we are actually nothing but algorithms. […] The people that say that obviously have no idea what a mind is, given the result that a mind is the interaction of a brain and the body” (4:45). 


Reading Chapter 5 on the various scenarios for superintelligent AI, it occurred to me that one of those options is already in process: the collaborative commons. I did a review of Rifkin’s book The Zero Marginal Cost Society, where he lays out how humanity is already using tech to implement an ethos of shared wealth. Granted, the book doesn’t address Tegmark’s sort of AI, but we could easily include it in our discussion of this emerging societal movement. Maybe even read the entire book and do the next discussion on it? 

Tegmark also discusses the types of super AI in terms of Maslow’s hierarchy of needs. Some forms of SAI want to provide not just for our survival needs but for our more evolved needs like meaning and purpose. This ties into Chapter 7 on goals. Physics’ goal is dissipation; biology’s is replication. The latter maintains and incorporates physics’ fundamental goal by adding an instrumental subgoal. Cultural evolution then further maintains and incorporates the other two with subgoals of its own. Per Damasio, humanity evolved feelings as motivators for this hierarchy of goals. This hierarchy also ties into a developing world model (worldview).


So questions arise about SAI’s goals. Humanity required feelings to motivate this process of evolution and development along the hierarchy of needs. But how do we program feelings into SAI to give it motivation to accompany its learning of better world models? And without those feelings, how can SAI’s goals remain connected to humanity’s?

Which brings us to Chapter 8 on consciousness. Consciousness requires subjective experience, which includes feelings; it’s a matter of qualia. Tegmark approaches how this transfers to AI by basing it on physical emergence: something emerges that has properties not contained in its parts, like consciousness. This, though, complicates the above, in that the so-called subgoals of biology and culture might instead be emergent rather than subservient to physics’ goal of dissipation. Feelings are hierarchically emergent from biology and culture.
He then examines Tononi’s integrated information theory (IIT), combining it with the earlier notion that information is substrate independent to claim that consciousness is too. And yet consciousness and information require a physical substrate to exist in the first place. We’ve explored how human consciousness is embodied, and how that embodiment is included in and affects higher emergent functions like consciousness. It is certainly not independent of this embodied substrate. Information-only theories of consciousness fly off into abstract, disembodied space. Thompson in Waking, Dreaming, Being: “Consciousness isn’t an abstract informational property, such as Giulio Tononi’s ‘integrated information’; it’s a concrete, bioelectrical phenomenon” (343).
He does note that some aspects of our subjective experience are grounded in our evolutionary origins, like our feelings for survival and flourishing, yet AI doesn’t have to have them. However, as noted above, these fundamental drives are necessary for the higher human functions, needs, and motivations that emerge from them. If something similar is not programmed into SAI’s goal hierarchy, then it has no connection to humanity.
