Startup aims to sell a brain implant to improve memory

Well, eventually it MUST be tested in humans, you know. And those suffering from damage to this area might just disagree with how “horrifying” it is.

Get used to this type of thing; implants, whether “upgrades” or simple replacements, ARE going to become more and more common and affordable. Welcome to transhumanism, I guess ^^’ ?

That damn Eschaton is gettin’ pretty immanent…

4 Likes

Finally! A literary reference I got! (as opposed to Don Quixote or something . . . )

1 Like

oh yeah, I’ve always wanted to get a brain implanted!

It’s interesting how much the article refers to the brain as if it’s a computer and behaves like one. While a convenient analogue to a device we understand, we’re learning more and more that this just isn’t the case.

2 Likes

Pardon? The brain IS a computer, quite literally, just an evolved, analog, organic one rather than constructed, digital, and silicon.

2 Likes

Is it, though? Or is a computer just the best way we have to think about it? Just like we used to consider the human body to be a mechanical machine, or prior to that a system of hydraulics, and prior to that clay. We use the closest metaphor we can understand to describe ourselves. But that doesn’t mean it’s accurate.

1 Like

Yes, it is; the brain satisfies every part of the definition. In fact, I challenge you to show how the brain is not a computer.

please choose one of

( ) install service, $99,000
at our medical facilities

(x) self-install kit, $99
includes disinfectant, drill bits, gauze.
warranty void.
If you need help during the installation procedure, dial 1-900-POT-ATOS ($1.99/min)

1 Like

Depends on how you define computer.

But how’s this for starters…

This suggests to me that our “memory” isn’t analogous to computer “memory”, where things are stored and retrieved.

Humans have a bit rot problem?

The problem happens when you try to define “computer.” The article you cite gives a very specific definition of what a computer is, and says “well, the human brain isn’t that.”

Well, of course it isn’t. I could define a pencil as “a cylinder of graphite enclosed within a long piece of wood.” That definition, however, would exclude charcoal pencils, mechanical pencils, colored pencils, and many other things that people would call a “pencil.”

A thing should not be defined based on how it does something, but on what its function is. A computer takes in information, processes it, and outputs the processed information. That is the essence of a “computer,” and that is exactly what our brains do.
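That functional definition (take in information, process it, output it) is broad enough to sketch in a few lines. A toy illustration only, not anyone’s formal definition:

```python
# A "computer" in the purely functional sense: something that takes in
# information, processes it, and outputs the processed information.
def compute(inputs, process):
    """Apply a 'program' (process) to each piece of input information."""
    return [process(x) for x in inputs]

# Example: the "program" here just doubles each input signal.
print(compute([1, 2, 3], lambda x: 2 * x))  # -> [2, 4, 6]
```

By that yardstick, what the processing substrate is made of (silicon, gears, or neurons) doesn’t enter into the definition at all.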

A pencil is a tool for making marks on surfaces. If you want to distinguish it from a pen or marker, it makes those marks by abrading and depositing material from its point onto the surface. You can pick up a pencil and make a more restrictive definition that fits the pencil you’re holding, but the more you make it about the form and not the function (as the article you linked does), the fewer people you’ll have agreeing with you about your definition.

So, no, a brain is not directly analogous to an electronic binary computer. There is, however, a similarity of function, and both systems transmit information using electric signals, so there are necessarily going to be some parts of each that are analogous to the other.

The brain is not an electronic binary computer, but that doesn’t mean, when certain parts of the organic analog brain fail, that those parts can’t be augmented, simulated, or even replaced outright, by electronic circuitry.

8 Likes

It’s likely more a problem that the memory may not be actually “stored”, but rather that the brain subtly alters itself to incorporate some recognition or cognizance of the thing that has happened.

A big difference between a brain and a computer is that the brain is constantly reconfiguring itself and learning, by its nature. If you could have a computer that would re-solder its own leads in real time, you might be able to make a good analogy between a brain and a computer. But with a brain, given identical “inputs”, you will never have identical “outputs”, because the very acts of thinking, observing, responding, feeling, and emoting change the brain.

Thinking back on your memory of something changes the memory of that thing.

Now, people are in the process of making computers that could in theory one day emulate the function of a human brain. But that, I would argue, becomes no longer a computer as we would define it today, but something else that we don’t currently have a name for.

Using computer terms, and applying them to human function, can very easily cause blind spots to develop, where we think our system works like a computer. It’s very deep into our lexicon, but the analogies don’t actually properly describe what’s going on.

The first thing that leaps to my mind is: whoa, if this were to actually work, there might be a possibility of experiencing someone else’s memories, and by extension, their feelings. I somehow doubt different brains would process things identically, so it might be even more far-fetched than the implant itself, but the idea that you could feel even a close approximation of someone else’s experience in your own brain would be quite revolutionary.

See also: Memory Wall by Anthony Doerr

Software can easily modify itself, and even my laptop is hardware-reconfiguring itself all the time (it has an SSD, and it periodically retires sectors and remaps the remaining ones).
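Self-modification in software really is routine. A minimal sketch (a deliberately silly example, not how SSD firmware works): a function that rewrites its own behavior the first time it runs.

```python
# Software modifying itself at runtime: after the first call, greet()
# rebinds its own name to a different implementation.
def greet():
    global greet

    def greet_again():
        return "welcome back"

    greet = greet_again  # the program rewrites its own behavior
    return "hello, first-timer"

print(greet())  # -> hello, first-timer
print(greet())  # -> welcome back
```

Given identical inputs (none, here), you get different outputs on successive calls, because the act of running the program changed the program.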

People have been making “computers can’t think because they can’t do this” style arguments for decades. (When I was in college, the big claim was that computers would never be able to beat a grandmaster in chess. Needless to say, that was a long time ago!) Every time a computer manages to do the thing, naysayers move the goalposts. After 40 years of computer progress, “no true Scotsman” gets a little old.

1 Like

Nope! That’s why they’re doing tests, duh.

Eager as I am to stick wires in my brain and become superhuman (amazingly this is not sarcasm), I am curious about how narrow the use case is here. I don’t seem to have too much trouble forming memories, but retrieving them is an issue. Would this help there, or no?

And I suppose I’d probably wait 'til the first few rounds of test subjects don’t die or go insane, jeez, fine.

1 Like

Not necessary. A single neuron has a very limited range of responses, so a meatbrain must be physically reconfigured to substantially change its function. A properly-designed computer can emulate a huge range of different systems in software; adapting outputs in response to changing data is a fundamental characteristic of software.

No True Scotsman. Anything Turing-complete is a computer, any computer can be programmed to solve any mathematically-expressible problem if there is sufficient time and storage available, and any problem involving the motion of matter and charge can be expressed mathematically. There can be no doubt that it is physically possible for a sufficiently powerful computer to fully emulate a human brain. Whether we’ll ever manage to build that computer, whether we will ever understand the brain well enough to write the emulator, and whether it can run in realtime is open for debate. But if a time traveler handed you the program today, you could run it on any sufficiently powerful cluster of modern computers.
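The universality claim can be made concrete with a toy interpreter: to a Turing-machine simulator, the machine being run is just data (a rule table), so one program can emulate any machine you feed it. A sketch, obviously nowhere near a brain emulator:

```python
def run_tm(rules, tape, state="start", pos=0, max_steps=10_000):
    """Minimal Turing-machine interpreter. The machine being emulated is
    just data (the rules dict), which is the heart of universality."""
    cells = dict(enumerate(tape))  # sparse tape; blank cells read as '_'
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += {"L": -1, "R": 1}[move]
    return "".join(cells[i] for i in sorted(cells))

# A tiny machine that flips every bit, then halts at the first blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "L", "halt"),
}
print(run_tm(flipper, "1011"))  # -> 0100_
```

Swap in a different rule table and the same interpreter runs a different machine; scale the idea up (enormously) and you get the “hand me the program and I’ll run it on a cluster” argument.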

1 Like

That’s not at all what I’m getting at. I have no doubt that computers can now or will some day be able to emulate many if not all of the functions of the brain. Probably outperform the brain at it in many regards.

What I’m getting at is that they have different “works.” By ascribing to a brain all of the capabilities, quirks, and paradigms of a computer, you begin to draw a quite inaccurate conception of how the BRAIN works and what the BRAIN can do. Brain memory does not behave like computer memory. Brain learning does not behave like computer learning.

Perhaps, some day, if computers have advanced to a level that they can emulate everything the brain does, in the way the brain does it, THEN we can use computers in order to serve as an accurate metaphor for the brain.

But there are a lot of things a computer CAN do that a brain CANNOT. A brain cannot respond identically to repeated input. A brain cannot act as if some piece of input never came into its purview. A brain isn’t neatly separated into a CPU and separate areas that are specifically engineered for singular purposes. A brain cannot smell a dead rat without triggering memories of the first dead rat it ever smelled.

Many of the things that are fundamental about how a brain works, learns, and so on, are optional for a computer. You could have it learn or not. Self modify or not. Make connections between movement and sensation or not. But for a brain, these are just “how it works”, and it’s a brain because it does these things.

Thus, there are many computerlike things about brains, and many brainlike things about computers. But the analogy is flawed, and understanding that the analogy is flawed allows for a more nuanced discussion about the nature of the brain.

You’re arguing that it’s wrong to think of a living brain as equivalent to a computer, and that’s true. But it has no bearing on whether or not a computer can emulate a brain.

Well, I do have a little exposure to the science of brain operation, mainly through osmosis from being married to a research neuroscientist who studies the electrochemistry of the brain. My experience is that while such experts might find fault with any given project in AI, few would deny that the brain is a computer that is not fundamentally different from those we’ve had in hardware for decades and on paper since the 1930s. At least in academics, the only people who seriously argue otherwise tend to be far removed from the actual scientific study of such things.

I agree. I think the problem comes in trying to interface one type with another type, which is what they are doing here. I just wonder how much success they’ll have with interfaces that rely on electrical signals. There are other types of signals, i.e., chemical, that are part of the organic computer that are not used in the interfaces I’ve heard of. I think we’ll need interfaces that also use neurotransmitters (and there are at least a hundred of them). But I’m just speculating.