
Thoughts on Uploading

  • samrodriques
  • Sep 7
  • 3 min read

I have a background in neuroscience and currently work on AI, so people often ask me for my thoughts on uploading. I thought I would distill them here. These were inspired by a dinner that Fred Ehrsam and Blake Byers hosted a few months ago.


Uploading essentially involves two problems: behavioral cloning (building a computational system that perfectly mimics your behavior) and the consciousness problem (making sure your conscious perception is actually transferred to the machine).


Getting the behavioral cloning problem to 80% seems relatively straightforward, perhaps even easy. Getting beyond 80% will require ensuring that someone’s full episodic memories are transferred over as well, which seems much harder. But there do not appear to be any fundamental obstacles here.


The consciousness problem is more challenging. To appreciate it, suppose for a moment that I had a perfect atomic copying machine. You put in some matter, press a button, and it instantaneously creates a perfect duplicate of that matter down to the level of subatomic particles. Thus, if I put you in the machine and pressed the button, there would be an instantaneous duplicate of you, identical down to the level of subatomic particles, seated perhaps a meter away. The immediate question is: would you see out of two eyes, or four eyes?

Most people respond that you would see out of two eyes. There is no way that the eyes of your duplicate could communicate with the brain of the original.


To extend the thought experiment slightly, imagine that we had a button that could create a perfect duplicate of you in a computer, such that the duplicate had all the same memories, same behavior, same thoughts, and so on, along with two eye-like cameras. And then imagine we press the button. Would you see out of two eyes and two cameras, or just two eyes? Presumably just two eyes. It is reasonable to expect that if someone put a gun to your biological head and pulled the trigger, you would be dead. Your perceptual experience would probably not continue to the silicon substrate.


These thought experiments suggest that consciousness may be somehow bound to the physical composition of matter associated with a brain, which is a problem for uploading. (Incidentally, it is also a problem for teleportation. My default assumption would be that even if we had a teleporter, my subjective experience of entering it and pressing the button might be equivalent to dying.)


(These thought experiments also suggest that there might be some new physics to be discovered, associated with consciousness, although since we have no way to measure it currently, there is no testable hypothesis there. There is some discussion to be had here around craniopagus twins, who have some amount of perception out of two sets of eyes, which suggests that it may be possible to have multiple consciousnesses in one brain; and around split-brain patients, which suggests that it might be possible to split a consciousness in half. These situations seem to have significant implications for the feasibility of uploading, but those implications, if they exist, are currently beyond me.)


So, what can be done? The best proposal I have heard comes from Kevin Esvelt, who essentially proposed a Ship of Theseus model. Especially early in life, the brain can adapt to use some areas for purposes other than their “intended” ones, particularly when there is a defect; for example, the visual cortex can adapt to respond to touch in blind people. Perhaps, if one provided a human with a brain-computer interface and then progressively knocked out parts of the brain, the human could learn to supplement the missing functions with functions provided by the interface. If so, perhaps over time one’s consciousness could effectively “transfer” from the brain to the computer, until the brain was nothing more than a glorified interconnect between the computer and the sensorimotor systems, at which point it could be disconnected and replaced by equivalent robotic sensorimotor systems. (At that point, teleportation would also be easy, simply by switching the feed for the sensorimotor systems to a robot somewhere else, although one would be limited by latency.)


It is currently unclear how to achieve this. However, if it were possible, it would at least give me more confidence that the resulting upload would not be equivalent to death. It might instead be more like severing the hemispheres of a split-brain patient or separating craniopagus twins. It may be possible to do some experiments on this in mice, to measure the extent to which mice can learn to use BCIs to supplement ablated brain functions.


This is all I have for now.

