Feb 10, 2016 11:12 PM
Feb 11, 2016 11:15 PM
(This post was last modified: Feb 11, 2016 11:28 PM by C C.)
Quote:Many people think that the way to understand the brain is to understand how the parts work and then how the combinations of them work and so forth. And that’s been successful in physics. But you can’t understand a computer by knowing how the transistors work. So people have it upside down – the way to understand the brain is to understand how thinking works and once you have a theory of that then you can look at that immensely complicated brain and say “Well, I think this area does this and that…” You can’t do it from the bottom up because you don’t know what to look for…

That still doesn't negate the fact that creating the same relational structure in a purely wooden or marble substrate would produce a non-functional computer. A working computer depends upon harnessing and directing the base properties of physics (and those of chemistry at the next level) to have working components, with the attributes of those properties manipulated in the engineering design. The idea that the organization is all-important causally and the substrate is irrelevant is bogus (in fact, it's downright bizarre in any science that supposedly doesn't believe in Platonistic offerings of "form" being able to float independent of "stuff").

A steampunk, mechanical computer or version of the brain would be incredibly large and far beyond turtle-slow, and any possible role of electromagnetic fields in synchronizing and uniting distributed areas for "experiential consciousness" would be eliminated. The thing could calculate arithmetic problems and maybe even guide an outrageously huge robot around in a simple environment, its "walking-stick" sensor tapping against hills, mountains, lake bottoms, gorges, etc. to find its location, but not much more.
Feb 12, 2016 02:42 AM
Indeed, if they set up an artificial brain composed of synthetic neurons with suitable synthetic synapses, then it could learn like biological brains do. Our brains learn now, and not knowing how they work hasn't prevented that through the ages.
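As an illustrative sketch only (nothing from the thread itself, and all names and values here are hypothetical), a single "synthetic neuron" with adjustable synaptic weights can come to compute a function through blind error-driven adjustment, with no prior theory of how thinking works built in:

```python
# Minimal sketch: one "synthetic neuron" learning the logical AND
# function via the classic perceptron rule.

def step(x):
    # threshold activation: fire (1) if input sum is non-negative
    return 1 if x >= 0 else 0

# Training data for logical AND: (inputs, target output)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]   # "synaptic" strengths
bias = 0.0
rate = 0.1             # learning rate

for epoch in range(25):
    for (x1, x2), target in data:
        out = step(weights[0] * x1 + weights[1] * x2 + bias)
        error = target - out
        # adjust each synapse in proportion to its input and the error
        weights[0] += rate * error * x1
        weights[1] += rate * error * x2
        bias += rate * error

results = [step(weights[0] * x1 + weights[1] * x2 + bias)
           for (x1, x2), _ in data]
print(results)  # -> [0, 0, 0, 1]
```

The point of the sketch is only that the learning comes from the local update rule acting on the substrate's adjustable parts, not from any top-down specification of what is being learned.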
Feb 12, 2016 04:07 AM
(This post was last modified: Feb 12, 2016 04:22 AM by Magical Realist.)
Quote:The idea that the organization is all-important causally and the substrate is irrelevant is bogus (in fact, it's downright bizarre in any science that supposedly doesn't believe in Platonistic offerings of "form" being able to float independent of "stuff").

And yet nature is filled with examples of emergent structures such as fields, waves, vortexes, and tangled hierarchies that impose top-down geometrical relations across an array of varying media and substances. Light, matter, and now gravity all set up wave-mechanical relationships that are defined mathematically in spite of being made of different components. Perhaps intelligence is no less a holistic, irreducible property.
Feb 12, 2016 07:30 AM
(This post was last modified: Feb 12, 2016 07:48 AM by C C.)
There is the outrageous[*] possibility of Boltzmann Brains or Boltzmann Computers fizzing into existence as complex organizations from the very start, thus bypassing "parts" and "stuff before" as being responsible over the course of an incremental process. [* At least it's outrageous from the standpoint of what the traditional scientist and philosophical naturalist could tolerate in terms of their earliest presuppositions.]
But evolution and contingent environmental circumstances had to slowly, blindly develop the brain up from the physical properties of the "parts," without having a functional scheme or "conceptual understanding of how thinking works" at the start. Or, if "evolution" is a useful idea abstracted from species diversification over time, it's thereby a fiction as a literal concrete object or power. So the basic components and body which ancestrally preceded the brain organ were the actual, potent inventive agencies, according to whatever their attributes allowed / enabled in mutable response to external influences.

But it's quicker for humans to start with a top-down general interpretation of what an existing organization of microscopic interactions collectively seems to be (like a complex system of intelligence), because starting at the bottom there would be countless other possibilities or "structures doing things in space" that the basic components of biology, chemistry, and physics could become via their dynamic linkages. Plus, the brain was already in existence to be studied, and its intelligence provided the inspiration for AI. That particular result of "what you can do with the Lego bricks of physics / chemistry" was already taken care of.

So from an epistemological standpoint, top-down is the correct approach (and that's probably what Minsky's comments are couched in). But in terms of the origin of an "operational form," what substantively realizes it, and what makes its functioning possible, the precursors of that higher-level entity can't be treated as irrelevant and can't be substituted by just anything while still achieving the same product or equal product performance. It matters what neurons and transistors are, how they work, and what permits them to do their "switching" job.

[Again, some if not all species of functionalism seem to preach this "substrate does not matter" gospel, as exemplified by their attacks on John Searle in the past. The latter's work focused on meaning and consciousness, in addition to just intelligence.]
Feb 12, 2016 08:24 PM
(This post was last modified: Feb 13, 2016 08:04 PM by Magical Realist.)
I have always had a place in my heart for some mode of functionalism, having not read enough on its supposed demise at the hands of Hilary Putnam. My take on it is that when we seek to explain something, we do so in terms of explaining the parts as representing functions or operations in an overall system. With the brain, you have this extremely dynamic interplay of states, ranging from synaptic firings and chemical reactions to frequency-modulated electromagnetic fields, all resulting in awareness, feelings, thoughts, perceptions, memories, body movements, etc. What we hope to understand is the role of all these states in composing the overall phenomenon of conscious awareness. In the end we aspire to have constructed an abstract system of functions generalizable beyond their unique cases of instantiation. There is then a surge toward the purely functional in understanding the brain from the get-go, functions which may or may not be replicable in other media. But at least a level of operationality is grasped, as a sort of universalized model or map that simplifies the whole entangled mess into something logically sequential and spatially structural.
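The functionalist picture of "functions generalizable beyond their unique cases of instantiation" can be sketched in code (a toy illustration with hypothetical names, not anything from the thread): the same abstract role, here a last-in-first-out store, realized in two different underlying substrates that are indistinguishable from the outside.

```python
# Illustrative sketch of "multiple realizability": one abstract
# function (a last-in, first-out store) realized in two substrates.

class ListStack:
    """Stack realized on top of a Python list."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()

class DictStack:
    """Same abstract role, realized on a dict keyed by position."""
    def __init__(self):
        self._slots = {}
        self._top = 0
    def push(self, x):
        self._slots[self._top] = x
        self._top += 1
    def pop(self):
        self._top -= 1
        return self._slots.pop(self._top)

def run(stack):
    # Exercise only the abstract interface: push three, pop three.
    for v in (1, 2, 3):
        stack.push(v)
    return [stack.pop(), stack.pop(), stack.pop()]

print(run(ListStack()), run(DictStack()))  # both yield [3, 2, 1]
```

Whether that kind of interface-level equivalence carries over to consciousness, rather than just to behavior, is of course exactly what the substrate debate above is about.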