
If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it? Something non-invasive, like a computer that images your brain while you sleep.

If we are machines, then it seems reasonable that within a few millennia we might have this capability. I'm trying to think up some scenarios where it might be reasonable to turn over your brain to a business.

The ability to become "immortal" would probably be enough for most people. Is your disembodied mind still you? Do you feel an attachment to the idea of preserving it? What about modifying it?

Where it gets really strange is if you think of your wife or husband making a backup of their brain, then running a simulator on that backup. I.e. giving it "life" by letting the simulated neurons fire. Does that count as thinking? What if you can see those thoughts? What if you can communicate? Would you love them just as much as you love your SO?

Probably useless questions, but I can't help but wonder.

(I've expanded my comment since it was posted; the "no"'s are in response to the first question.)

Also, https://www.youtube.com/watch?v=IFe9wiDfb0E



I think we should really fix society before we develop that kind of technology. We need a socio-cultural breakthrough of the same magnitude as the technological breakthrough. Something on par with the development of monotheism, the Enlightenment, capitalism, or democracy. A "communistic" utopia (for lack of a better word) without the flaws of its historical implementations. Then the question is moot. There is no private business, or indeed any other entity, government or private, that you would make such a deal with. There is no second party that wants to withhold technology, or steal your thoughts. Rather it will be "From each according to his ability, to each according to his needs": you get the augmentation for free, and you contribute what you feel comfortable sharing back to society.

Often, utopia is portrayed as the result of a technological breakthrough, sometimes as "if we could read each other's thoughts there would be no misunderstanding and conflict". I believe it is the other way around: utopia is a precondition for singularity-type technology. At least, if we want it to be beneficial and not hellish.

Brain implants are easy, communism is going to be hard.


I agree. Once we develop the ability to upload and modify brains, it will be easy to morph them. We can extract the primal desire to defend yourself, to become angry, to want to own property. These emotions will be reserved for the founders.

It will take time to get there. Many will resist, so we have to keep our goal a secret. But people are gullible. With our morphogenic brain technology, we can give people experiences they've only dreamed of. Think of it. The high of heroin, with none of the downsides. The ability to know instantly when any of your loved ones are in distress. The ability to shape yourself and your children into any form you desire.

If we appeal to base emotions, if we appeal to their need to control, to own, to shape, then we can implement our plan.


I hope you're being 100% sarcastic.

If you're not, I must ask: is your name Andrew Ryan perchance?


> Many will resist, so we have to keep our goal a secret.

This is not democracy. Every dictatorship begins with "I know what is good for you better than you do".

You should watch the movie "Equilibrium" with Christian Bale. I don't think you really want that; it only sounds good in theory.


Well, it was firmly tongue in cheek. :) I've been thinking of trying to write some short stories pulling from various themes in technology. Most ideas have been done to death, but with a bit of skill it might be possible to write something worth reading.

A tangent, but: I've been wondering how a novelist builds their skill. They're not born with it. One idea would be to watch a movie and write it like a book, transcribing it scene for scene. I'm not sure whether that'd help, though.


No ideas about the second, but an aside on the first.

'Most ideas have been done to death' is a central reason I have mostly stopped reading novels. Ah, it's one of those; he will do this. Ah yes, I see the twist; it's one of those.


I highly empathize with this. These days I'm just interested in imagination. I don't care if a book or movie is good; I will most likely never consume it. I'm looking up synopses and scanning them for innovative ideas.


Re tangent: Stephen King's "On Writing" or Ray Bradbury's "Zen in the Art of Writing".

Write 1000 words a day, be crap for decades, and eventually you get skilled.


The other, and in my opinion much more likely, future is that developing this kind of technology will cause us to "fix" our society, or rather that society will grow to support the new technology. It is unlikely that private business will relinquish its control willingly; we must aim for a future that makes private business irrelevant (massive automation removing the need for humans to "pay" for anything).


I'm curious why you worry about control by business, but not control by the government?


My government, in theory, derives every last bit of its power from my consent and the consent of my peers; and insofar as practice diverges from that, it can be corrected, or at least discussed. Even foreign governments at least have that relation with someone, even if they don't have it with me. With private enterprises, there isn't even anything to correct or discuss from their perspective. They're "pure tyrannies", as Chomsky put it, while governments are theoretically democratic. The magic free market turning selfish bastards fighting each other instead of entropy into some useful force doesn't seem to be enough just by itself.

Though now that I think of it, if people got their shit together in the ways required to kick their own governments in the dick where required, that would also translate to paying attention to how we vote with our wallets (it would also mean any corp that tries just one line of cutesy marketing-speak sinking to the bottom before you can say iceberg, which would improve the world by orders of magnitude within one week). So in the end maybe "capitalism or communism", or anything of that nature, matters much less than the individuals do... after all, you can't polish turds, and you don't need to array gold nicely for it to shine.


The rookie cop his first day on the job has more power over you than Bill Gates does.


When someone misses a train because of some forced Windows update, they just call the police and they force Bill Gates to fix that right up.

How about this instead: just imagine I'm so dumb that you have to spell your point out fully. I don't get what you're trying to say. What rookie cop? A specific one, or "the" rookie cop, every single one? And who is Bill Gates? If it's not worth spelling out for you, it's not worth speculating about for me.


After I pointed out I don't believe in the dichotomy, the reply is something that apparently is supposed to be an argument for it. But you don't have a fixed amount of resources you spend on either being afraid of the government or corporations, and any amount of reclaiming your dignity as a critically thinking and responsibly acting human person will help with both "threats".

To not even acknowledge that, and go "but government", or "but corporations", really means at this point I need you to rephrase my whole comment in your own words to believe you did more than scan it. (That goes for the person downvoting me asking you to elaborate, too. If it's soooo obvious to you, make with the goods.)


I don't really see those as different entities. Governments are just businesses with a different name. I pay them for a service.


> communism is going to be hard.

Not if everyone has brain implants!


Touché. That's why I'm not getting a brain implant :-D


"if we could read each others thoughts there would be no misunderstanding and conflict"

For that to be true you have to assume that everyone has enough empathy to come to agreement. It isn't really much different from the way things are now (without being able to read each other's thoughts).


You've got it backwards. Our brains are what is holding society back.

> We need a socio-cultural breakthrough of the same magnitude as the technological breakthrough

That's what transhumanism is. It's a restructuring of the human system around symbiosis with AI.


"It's a restructuring of the human system around symbiosis with AI".

This sounds more like: to have a socio-cultural breakthrough, you want a technological entity (some AI, or whatever it turns out to be in the future) to control and direct human socio-cultural thought and action. Transhumanism seems to be more oriented towards equipping humans to interface (through some HCI, human-computer interface) with computers built around an AI, plus other human-built accessories (at least for now), to enhance their mental abilities and make them more powerful humans. One question is: how much do we actually know about the power of our brains and abilities when properly used, rather than enhanced with computers? Yes, there are great use cases for HCI. But giving control to an AI to make your decisions, or to control humans in the name of a technological breakthrough, is absurd.


> you want a technological entity (some AI, or whatever it turns out to be in the future) to control and direct human socio-cultural thought and action.

Yes exactly!

> Transhumanism seems to be more oriented towards equipping humans to interface (through some HCI, human-computer interface) with computers built around an AI, plus other human-built accessories (at least for now), to enhance their mental abilities and make them more powerful humans.

It's currently sold that way, yes, but I don't see it that way. I envision it more like what I've been told and read about the "Borg" from Star Trek (admittedly I've never seen it).

> But giving control to an AI to make your decisions, or to control humans in the name of a technological breakthrough, is absurd.

As absurd now as walking around on the moon was in the 1st Century.


"As absurd now as walking around on the moon was in the 1st Century."

Yes, true (one wonders why we haven't gone back to the moon since we landed there half a century ago). It is going to be absurd until we have an AI that is as capable (or even half as capable) as a human, and we can debug that AI, or maybe the AI debugs itself reflexively. Until then, I would say it's absurd to think that an AI could take control of a human and guide the human in its actions. Until then, I will keep my brain intact and keep away from any AI implants, or any kind of implant altogether.


I don't think we need superhuman AGI to start having AI guide human actions. We already let AI guide us in daily life all the time: navigation, purchasing recommendations, etc.

It's not a question of IF we will have AI direct our behavior, it's just a question of scale.


I understand, we use AI in one form or another in our daily lives, but for very specific and specialized narrow cases. As you mentioned, it's the scale that matters, which our brains handle seamlessly; and we, as the generation building AI, are nowhere near making it perform at that scale except on very specialized tasks. It will naturally improve over time, but AI attaining AGI status and taking control over socio-economic or socio-cultural change is a long shot, unless we humans destroy ourselves first by destroying life on Earth.


We are somewhat off topic, but: that sounds a bit like the church in the Middle Ages saying "your sins are what is holding society back, that is why you can't have nice things". Or like many utopians, who wanted to create a "better man". Or their detractors, who said utopia doesn't work because people are flawed.

I don't want to change people, I want to change society.


> I don't want to change people, I want to change society.

Somehow you think those are different things and I'm not sure why. Society only changes when enough people individually change. You can force change through government or coercion, but it's temporary and unsustainable.

> utopia doesn't work because people are flawed.

Right, I'd agree with that.


What I mean is, I don't want a Utopia for some kind of Adornoian "liberated human" or socialist übermensch; I want a Utopia in which I myself can live, despite my flawed socialisation and biology. I think we can refactor society, instead of reforming or revolutionizing it. Keep the same people; don't coerce them into anything. Don't wait for future technological improvements, but reap the benefits of centuries of previous improvements. We have been technically able to satisfy the needs of everybody from a purely material point of view for decades now. Now we need to fix the allocation of these goods, and the crisis of political representation (or the lack thereof for many people).


I personally think that is impossible. It would be like training a rat to sing opera. We don't fundamentally have the hardware that would allow it.

> We have been technically able to satisfy the needs of everybody from a purely material point of view for decades now.

"Earth provides enough to satisfy every man's need but not every man's greed" - Gandhi


Mass starvation and large-scale 'cleansings' will probably be a hard sell if the only upside is "Everyone can read your mind".


I don't think any technology should be put on hold because society isn't perfect, especially when said technology has the potential to improve society immensely.


> If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it?

"And, my students, if we equate both expressions, then, ... ahem, by the way, do you know how easy it is to get a brain extension from Acme Corp?"


Honestly, I think it's more worth focusing on the near-term ethical problems of such technology than the far-off ones. So let's say that this project does pan out, and three years down the road we determine that a programmer with one of these neural interfaces installed can open 20 terminal windows at the same time. With a special set of emacs/vim bindings they can bang out code 100x faster than regular programmers. So now companies are sponsoring their programmers to get the surgery.

Sounds like a pretty good deal, right? Well, in three to five years' time, surgery probably won't change much. Installing such devices is still going to be pretty risky. And even then, having the hardware in introduces its own medical risks. Our super-programmers can't ever go swimming because of the risk that the fiber-optic and power jack passing through their skull gets infected. This is a problem current medical implants have [0], so probably one we will still have in a couple of years' time.

There are a whole host of other problems with this too. E.g., what do you do when the hardware in your head is obsolete? Can the company that paid to put it in fire you because you aren't keeping up, in part because you have obsolete hardware in your head?

[0] http://amputeeimplantdevices.com/what-are-the-risks/


Deaf people today face a similar choice: if you get a cochlear implant, you don't control the software (AFAIK). This is a significant part of why I still have no cochlear implant. If we want freedom of thought in the glorious transhuman future, well, that could evolve out of what's happening today, and we ought to value human autonomy more in today's systems.

DRM is another such arena: as I said in another thread the other day, do you want it enforced by your brain implant? No? Then think about how you want to allow the issue to get framed.

Ditto for compelled decryption.


The question you should ask is would Feynman have agreed to it?

Because that's the person who will have to put up with it. You can only ask such a question if you're willing to give the future recipient of the gift the option to back out and be returned to your present state. And if that new individual isn't 'you' in a legal sense, you might find that contract null and void anyway.

> Where it gets really strange is if you think of your wife or husband making a backup of their brain, then running a simulator on that backup.

There is a Black Mirror episode around that called 'Be right back'.


>There is a Black Mirror episode around that called 'Be right back'.

Well, they were approximations based on people's social media presence, and the episode's conflict mainly came from the imperfection of the approximation. I think you get a very different set of cultural conflicts if the simulation is of an accurate copy of the mind (issues more like in the book Permutation City).


Makes me think of the original Star Trek, which I have been rewatching recently. In the first episode [0], another race is capable of causing hallucinations and making humans experience anything they desire. Later on [1], you find out that the captain (not Kirk, in the very first episode) is found by Kirk and can now only say yes or no through a version of Stephen Hawking's chair. Spock, knowing that his brain is still fully operational, decides out of loyalty to Captain Pike to bring him back to this alien race so he can enjoy living in this fake world rather than remain in the handicapped state he is in. The episode closes with Captain Kirk showing wonder at the opportunity provided by this race.

I suspect that, given the opportunity, many would take this. (Without knowing any of the consequences, I would tentatively take it in that situation.)

[0] https://en.wikipedia.org/wiki/The_Cage_(Star_Trek:_The_Origi...

[1] https://en.wikipedia.org/wiki/The_Menagerie_(Star_Trek:_The_...


The Gentle Seduction is a short story that you might find interesting.

http://www.skyhunter.com/marcs/GentleSeduction.html


What troubles me is not the permission to read, but to write. While potentially useful for a host of medical applications, this is a scientifically plausible proposal to create technology that will literally have the ability to control minds. While you might be comfortable with sharing intellectual property with a business entity, giving that same entity the ability to adjust internal motivation, or alter sensory input is a dangerous Faustian bargain. How much would you trade for free will?


> What troubles me is not the permission to read, but to write.

Argh, I can just imagine having extremely realistic advertisements beamed directly into my mind, in exchange for something like immediate translation.

It's gonna be really weird when those perfume ads start before Christmas.


> If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it? Something non-invasive, like a computer that images your brain while you sleep.

No. If you were offered wings so you could fly, but in exchange you must spend the rest of your life in a cage, would you take it?


Honestly it would depend on the size of the cage.


2x2x2 meter cage.


It further depends on the amenities on offer, and the bird's size.

Hell, folks (including me) want to live on a Mars base, where "living in a cage" is probably an apt description of life for at least the first couple generations.

I'd sure as hell hand over a copy of my brain to a corporation in exchange for the ability to back up my brain and functionally live forever.


You missed the part I was replying to; when I replied, it was the only content of the parent post. My analogy is about a deal where you as a human get wings (the ability to fly), but have to spend the rest of your life in a cage.


2x2x2 AU would still be too small.


Would it then become important for that business to give these minds an environment that would not make them feel trapped and caged? Wouldn't the quality of results be hampered if lots of suffering were to occur? Could the mind at some point become useless, since it would be drowned in depressive, cyclical, and often illogical thought? That, too, would create a research point, but it would be pretty inhumane, and I would hope an ethics oversight panel would not allow such states to exist.

So many questions surround this... If the body is dead, but the brain is still capable of reasoned thought and, to that end, communication, who decides its fate? Does a mind deteriorate once there are no outside stimuli from the senses?

Not that I would agree to be in this state, but man there would be huge fields of research to do.


There's a Black Mirror episode about this, but saying which episode it is would be a big spoiler. Just watch the whole series if you haven't.


> Would it then become important for that business to give these minds an environment that would not make them feel trapped and caged?

Sounds like a more reasonable basis for the genesis of the Matrix than what was portrayed in the movie. :-)


A more interesting question is: what if you wake up one day as one of 10 copies of yourself?

Would you today be worried about preserving the life of all 10 clones (with memories)?

We are all afraid to die - but what if tomorrow there were clones with the memories you have today?

The question is basically posed here:

http://existentialcomics.com/comic/1


From the moment those copies exist, I'm not them and they're not me. We do not share subjective experiences. If you think we're equivalent and expendable, you'd best murder me in secret and pretend it never happened. You know, like incinerating me inside a teleporter or someth... oh crap, we're going there.

Yeah, I have a few issues with that comic.

> The man was not a murderer.

This is one of a few logical leaps I can't accept. His past selves are already "dead". His future selves do not yet exist. If, for example, all my ancestors are dead and my offspring not yet conceived, how does my suicide harm them in any way? None will so much as feel the grief of losing me, but I'm supposed to feel guilty over "murdering" them? (Let's not follow up on what this would mean for topics such as abortion!)

If future and past selves are of such importance that we not murder them, how can present copies be valued any less? If destroying the original is not necessary, then that must surely be murder as well. It's worse, actually, because then it's not even hypothetical; "The Machine" would be intentionally and unnecessarily ending a life.

Why do I possess the subjective experience of me, not you, and not some other me separated by time or space? Barring notions of a soul, either those other instances are not real in some sense or else there's something phenomenologically special about each, which I can only attribute to being uniquely present at a particular point in space-time. If the teleported me is a clone, their existence ought to have nothing to do with me, thus I should not be murdered for their sake. However, if the teleported me instead can only exist as a function of my past-self ceasing to be, then that's little different than how I am from any moment to the next.

TL;DR: If it works like the movie version of "The Prestige" where one copy is murdered purely for convenience, I'm not stepping into that teleporter.


The movie The Prestige deals with the concept of self copies too.

https://en.wikipedia.org/wiki/The_Prestige_(film)


I really wonder how valuable augmented intelligence would be when AI will be so much smarter.

If it would be the same as physical strength today, maybe most people won't bother with it too much, and will instead focus on other things, maybe things more core to feeling well.

One of those things could be the experiences meditation practices talk about, enlightenment etc., and it seems that we won't need neuron-level sensing for that, but something more at the affordable fMRI level, or maybe EEG plus source localization, and those seem likely to arrive well before intelligence augmentation.

So I wonder if we'll ever see intelligence augmentation.


We basically already have intelligence augmentation; it's just in the form of computers and smartphones.

Interestingly, despite widespread adoption of both, there are still some people "smarter" (i.e. better at certain tasks) than others, because they can better apply the tools available.


All extrapolated technological symbolism that, when experienced as reality, will fail to stand up to the actual non-symbolic experience of existence.


What happens when someone copies a simulation of you and tortures it digitally to get all your passwords and information?


Just make sure not to forget a very important phrase: ignorance is bliss.


This this this... being intelligent will not make you happy.

I wouldn't take hyperintelligence for free.


People already essentially turn their brains over to businesses today, without such grand promises of intelligence or immortality.

It's actually maybe even a pretty accurate description of religion.


Religion does make a grand promise of immortality.


To be fair, only some do. Others just assume endless reincarnation is a thing and promise escape. Whether you see that as a real distinction is something else entirely.


Which is what I said.


Yes, I would download all the knowledge I could into offline storage and only connect for updates.


Reminds me too much of Black Mirror



