I think you people are vastly overestimating how much we actually know about the brain or severely underestimating how freaking complex it is.
The “you” reading this right now is a fucking stack of six A4-sized sheets, each a fraction of a millimeter thick, crumpled into something that, to an external observer, looks like an oversized walnut seed, cooled and maintained by a network of 400 miles of capillaries, and isolated from the world by the blood-brain barrier, which can only be described as a fucking miracle.
No. No one is going to be implanting any memories soon.
AI is better at recognizing patterns than we are. The brain may be unfathomable to us, but technology already exists which could recognize the signals in your brain that represent memories and reproduce or alter them.
Neuralink and similar devices are being used right now, today, to record the thoughts of animals. The first Neuralink patient is alive and well, meaning it’s already being used on humans.
Do you really think this technology won’t exist in our lifetime?
Yes, absolutely. What you’re describing is AGI. If an AI could untangle engrams from branched clusters of extremely plastic neurons, it could understand and improve it’s own thinking. It would be self-aware before it could untangle the mess that our brains are. And I don’t see AGI happening with our current material and resource constraints before I die. The gap between seeing brain regions light up and de-novo engram implantation is about as wide as the gap between an LLM and AGI.
It is as you say: the scale doesn’t even exist at this point.
Even the recent fly brain mapping, enhanced with AI, had to take a destructive approach to map a half-milligram brain, and these people are already thinking Matrix Reloaded.
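For a rough sense of the gap (ballpark figures from commonly cited estimates, not from anything in this thread): the fly connectome covers on the order of a hundred thousand neurons and tens of millions of synapses, while the human brain is usually put at around 86 billion neurons and something like a hundred trillion synapses. A back-of-envelope comparison, assuming those round numbers:

```python
# Back-of-envelope comparison of connectome scale: fly vs. human.
# All numbers are rough, commonly cited estimates and could easily be off;
# they are here only to show the size of the gap, not as precise figures.

fly_neurons = 1.4e5      # adult fruit-fly brain, roughly FlyWire-scale
fly_synapses = 5e7       # order-of-magnitude estimate

human_neurons = 8.6e10   # widely cited ~86 billion neurons
human_synapses = 1e14    # often quoted ~100 trillion synapses

print(f"neuron ratio:  ~{human_neurons / fly_neurons:,.0f}x")
print(f"synapse ratio: ~{human_synapses / fly_synapses:,.0f}x")
# Prints roughly ~614,286x and ~2,000,000x - and the fly map still
# required physically slicing the brain apart to image it.
```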
Being 70-80 years old sucks. My condolences. We’ll mess around with AGI when you’re gone, and I’ll think about you.
Haha bro thinks the AGI will not be messing around with him LMAO 🤣
Ants vs pest control kind of thing.
It’s more like amoebas messing with us while we mess with them.
Respectfully, this sounds like opinion and doubt rather than a credible timeline. Other than rattling off industry terms, the only support you’ve given your argument is “I don’t see AGI happening”. You’ve collected an impressive shopping basket of buzzwords but done little to convince me, or the engineers developing this technology, that it won’t be ready within a lifetime. Stay tuned.
Oh, and “its own thinking” not “it’s own thinking”. His, hers, its.
Your extrapolation has about as much support. I don’t really know what bothers you about the vocabulary I used but I can say I don’t play much attention to punctuation marks when inputting text with a swipe keyboard on my phone.
“pay much attention” not “play”. I’d be more careful with that keyboard if I were you. Wouldn’t want to lose any credibility.
I thought I made it clear enough I didn’t give a shit.
But you expect us to care about your opinion? Be correct and be nice, or you won’t get to finish the discussion. It’s like a recipe: you have to do the work to get the product.
If you primarily engage with typos instead of ideas, I don’t particularly consider you worth discussing anything with anyway.
Maybe memories are actually really simple. Like the words on a screen. An arrangement of symbols, then a boatload of meaning and interpretation and rationalization. So all you need to do to make memories is to insert a few words. The brain’s “memory interpreter” does the rest of the work.
For example, we insert the words “brother appears”. Then, for the “new memory”, we reference your memories of your brother: his appearance and the sound of his voice. Then we contrive a narrative explaining why “brother” is at this place and time. Etc. Voila! You now have a memory of your brother standing there saying some stuff.
So making a memory wouldn’t require a grand, delicate manipulation of brainstuff. Just a simple thing.
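As a toy sketch of that idea (every name and data structure below is invented purely to illustrate the comment above; nothing here reflects how the brain actually stores memories), the claim amounts to: implant a couple of cue tokens and let an “interpreter” expand them from existing associations.

```python
# Toy model of the "insert a few cue words, let the brain's interpreter
# do the rest" idea. Everything here is hypothetical and invented for
# illustration; it models the comment above, not actual neuroscience.

ASSOCIATIONS = {
    "brother": ["his face", "the sound of his voice", "his usual jacket"],
    "kitchen": ["morning light", "the smell of coffee"],
}

def interpret(cues):
    """Expand a handful of cue tokens into a confabulated 'memory'."""
    scene = []
    for cue in cues:
        details = ASSOCIATIONS.get(cue, [f"something vaguely like '{cue}'"])
        scene.append(f"{cue}: " + ", ".join(details))
    # The "contrive a narrative" step: stitch the details into one story.
    return "I remember " + "; ".join(scene) + "."

# "Implanting" the memory = inserting two tokens; the interpreter fills in the rest.
print(interpret(["brother", "kitchen"]))
```

The whole disagreement in the replies is over whether biological memory has anything like that clean split between a tiny cue and a generic interpreter.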
“Memory” and “simple” are words you should only see in the same sentence when that sentence is “memory IS NOT simple”.
For fuck’s sake, our body stores memories for preferences in our literal guts.
Memory is a lot of things, but simple isn’t one of them.
Words are simple. But if you consider what they refer to, words are complex. See?