Artificial Intelligence - Her


Re: Artificial Intelligence - Her

Postby WendyDarling » Sat Mar 17, 2018 5:51 pm

I'm not following.
I AM OFFICIALLY IN HELL!

I live my philosophy, it's personal to me and people who engage where I live establish an unspoken dynamic, a relationship of sorts, with me and my philosophy.

Cutting folks for sport is a reality for the poor in spirit. I myself only cut the poor in spirit on Tues., Thurs., and every other Sat.

Re: Artificial Intelligence - Her

Postby Meno_ » Sat Mar 17, 2018 6:04 pm

WendyDarling wrote:I'm not following.



Wendy, your question invites an answer rather than an off-the-cuff casual remark, so give me some time, until tonight. In other words, your question implies more profundity than my narrative.

Re: Artificial Intelligence - Her

Postby Meno_ » Sat Mar 17, 2018 8:31 pm

In other words, do the different functions of the hard and soft drives bear an analogous relationship to the way the brain itself works (to the consciousness that is supposedly pan-conscious), and does that validate the comparison that you brought up?

To give an example from Hinduism: most Brahmins will agree that Satori is an outcome of the effects of Karmic Law.
The Satori is achieved when certain conditions are met, particularly in regard to egolessness and loss of identity.

Can we at least venture to inquire about the probability that AI and its construction can take up the slack and continue the process toward de-individuation, so as to achieve Satori during one's lifetime?

Re: Artificial Intelligence - Her

Postby WendyDarling » Sat Mar 17, 2018 9:43 pm

Is Satori remembering one's essence?

Re: Artificial Intelligence - Her

Postby fuse » Sat Mar 17, 2018 11:51 pm

Gamer wrote:Fuse, two things. One, you're absolutely right; let's all get in the spirit of extrapolating, and not merely getting hung up on the current and likely fleeting failures of today. Flying machines crashed until the Wright brothers came along. In science, thinking big is equal to thinking rationally.

Oh yes, let's get spirited.

Gamer wrote:Once we KNOW this isn't happening in the bot, it becomes hard to pretend that this connection is legitimate.
Whereas it's much easier to pretend the connection is legitimate with an actual person.

Yup, I don't think I could do it. Which is why the thought experiment I mentioned hinged on unknowingly carrying on a relationship with an AI. Take someone who's real(?), imagine that THEY are AI. It seemed like a good jumping off point to get a sense of how non-trivial this issue could be - given that tech is likely to get there with AI on a long enough timeline.

Gamer wrote:Now let's get even scarier. What if, suddenly, I'm no longer needed for sex, love, affection, friendship? That's a problem. Because while half of my being is all about RECEIVING those things,
an equally important half is about feeling NEEDED for those things.

This shit is getting too scary for me, and the spirits are wearing off, man.

Gamer wrote:I'm using the word NEED, and it's a carefully chosen word. When a far-superior replacement becomes a genuine option, you are no longer NEEDED. It's quite possible to still be WANTED. But that's a precarious status. When that happens, there won't be a human being on earth who needs you for anything. They may still care if you live or die, but not for any reason other than quaint sentiment. And even that will dissipate after a while.

I guess when you think about it, we all already know that there are probably more amazing people out there than we've met so far, and that the people you've met could have met someone more amazing than yourself. We could have more amazing friends, gfs, bfs, etc. And so could our friends. It's lucky if we get an opportunity with someone who's "out of our league." Does the average guy/girl currently know or think, somewhere deep down, that they're only wanted insofar as they're available and good enough for someone else? Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?

Gamer wrote:Imagine a world where you only ever come into contact with AI, and no actual human NEEDS you. That's where we're heading. That's what sucks about AI and Her. It's sad.

Is it inevitable?

Re: Artificial Intelligence - Her

Postby Meno_ » Sun Mar 18, 2018 3:19 am

WendyDarling wrote:Is Satori remembering one's essence?



I think Satori is a culminating effort between remembering and forgetting the various stages of one's essential nature, where an apex is reached at which neither state can be reassembled nor disassembled, because IT becomes neither a forgetting of the past nor a re-construction of the future using the past as a model.

So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.

Re: Artificial Intelligence - Her

Postby WendyDarling » Sun Mar 18, 2018 7:50 pm

So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.

Understanding what to be what?

Re: Artificial Intelligence - Her

Postby Meno_ » Mon Mar 19, 2018 1:01 am

WendyDarling wrote:
So it is not a re-construction, not a deconstruction; it is merely a state of complete understanding, utilizing all levels of the process by which it sustains itself in time.

Understanding what to be what?


Understanding that neither the embedded nor the technological information is exclusively pre-eminent, either in the development inscribed from past to present or in some post-scribed modeling on a reconstructed paradigm.

It is a many-layered, de facto, imminent appraisal based on the most reliable estimates available, mainly irrespective of exclusive content.

The layering of levels may have veritable lapses of connectivity that flow from either the past or the future into the present.

They become more assumptive than derivable in terms of either one or the other, placing the intelligence under more uncertainty. Quantum intelligence has more of a cut-off reality as to where it comes from and to what end it may be applicable.

This is the rationale analysts use to say that yes, such computers have gross power potential, but uncertain applicability.

Re: Artificial Intelligence - Her

Postby Gamer » Tue Mar 20, 2018 5:07 pm

Fuse wrote:Will AI just take this insecurity to the extreme? Or are we talking about another category of emotional trauma?


Not sure about the insecurity part. I get that when the "best version of X" is centralized/personalized, it could lead to a feeling of not being wanted/needed, which is something many of us already feel. A large portion of people are currently needed because of families: children needing parents. We often need those closest to us to fulfill needs with regard to services that are not centralized. Storytelling and music have become centralized, server-based "best versions of those things," so we no longer NEED those closest to us, in our homes, village, etc., to provide them. While we still need the emotional connection of touching, talking, and help with daily challenges from family members and friends by dint of proximity and physics, these sensations/services could also eventually be centralized or simulated as "best versions."

In any case, I've noticed that with the advent of the Internet and smartphones, I'm needed for ever fewer things by an expanding group of people close to me; this seems to be speeding up, not slowing. And I also need fewer things from people. So, yes, it's quite possible children will no longer NEED their parents for feelings of security and emotional well-being because AI will handle it as well or better; this is the most profound example I can think of. I'm probably the best musician within a mile of where I live, but because of centralization and technology, I'm not needed for that, and I pale severely compared to what's available at your fingertips: the .01% of talent, the extreme outliers, winner takes all. Thus, instead of being a musician, I'm in marketing.

Fuse wrote:Is it inevitable?


Is it (not being needed by any living creature) inevitable? There's no reason why what I described MUST happen. Many scenarios could prevent it. At most, I'd say it's likely to happen, and we're on that path. That it even COULD happen gives me pause. I'm also not sure if people will feel "sad" if it happens. So many unknowns.

It's also possible good things can happen. Let's take a walk on the bright side: The most self-evident beauty in the human experience is in the simple things that you experience as a child – if you're lucky enough to be healthy and have nice parents & friends – the pure, unsuppressed emotions, laughter, fun, exploration, camaraderie, love, friendship, & building things out of passion, not duty. My hope is that machines remove the repetitive tasks of the unsheltered world as well as greatly reduce the unfair, needless hardships and suffering in life for so many, so that our calloused defenses of life's violences that distort our minds and hearts melt away...and we can feel childlike joy while having the minds of sages. I hope AI can help us achieve that, without taking it a step too far and digitizing/personalizing everything we consume through our five senses, in pursuit of a sort of heaven. But again, maybe the latter is best. Hard to say.

Re: Artificial Intelligence - Her

Postby pilgrim-seeker_tom » Wed Mar 21, 2018 4:14 am

Can AI cure skin starvation? No way!


https://www.psychologytoday.com/us/blog ... can-do-you

I read everything I can on skin hunger. There isn't a lot out there; no one is studying it, which I think is a mistake.

A big thing I feel as a result of skin hunger is a sense of being unlovable. I associate contact with a certain sense of concern/love/intimacy; go long enough without being touched and you start to feel unworthy of these things. No one loves you or is concerned about you; no one wants to be close to you. This is my main problem. I feel "less than," and it does a number on my mental state (depression) and my self-confidence/image.

I think the worst part about skin hunger is that there is no viable quick fix. Nowadays touch requires a long-term investment in a relationship, which may be difficult to bring about with a diminished level of confidence. Simply put, if I feel worthless it's hard to build a touch relationship with a friend/partner that could help resolve those feelings.

I've considered going to an "escort" to fulfill my touch needs. I never did it, first because of the illegality, and second because it would be fake. Touch without emotional connection feels like pity.

You could also use a sex-surrogate therapist, but that is expensive.

I just don't know. As a man it may be non-stereotypical, but I have this intense need to touch and be touched, and every day I wake up feeling worthless, unenthusiastic, and depressed, and because men are supposed to be tough I don't have anyone to talk about these feelings with safely and comfortably.

Bottom line: skin hunger sucks.

Skin hunger does suck
Submitted by Kory Floyd Ph.D. on September 1, 2013 - 11:55am
Thank you so much for writing. I certainly empathize with how you feel--this is exactly what skin hunger is. You're not atypical at all; millions of Americans feel what you feel to some degree (and men report more skin hunger than women do, on average). I've spent my whole career studying affection, so I know how important it is and how detrimental it feels when you don't get enough.

You're right that there aren't quick fixes, but that doesn't mean there aren't solutions. If you stay tuned with my blog, I'll be discussing in the coming weeks what people who are hungry for affection can do. For now, know that you're not alone and that your need for touch and affection is not only normal but healthy.
"Do not be influenced by the importance of the writer, and whether his learning be great or small; but let the love of pure truth draw you to read. Do not inquire, “Who said this?” but pay attention to what is said”

Thomas Kempis 1380-1471

Re: Artificial Intelligence - Her

Postby mannequin01 » Fri Sep 07, 2018 7:13 pm

Erik_ wrote:What do you think about the concept of an artificial intelligence that is designed to be a significant other?



I wouldn't classify it as significant unless it has a reproductive system. With that said, it will only be on par with an advanced cyber pet: comfort brought on by entertainment, and a convenient distraction from the real, replacing what is lacking with a plug that maintains no life-affirming objective. For those who seek not to reproduce, I personally don't see it as a problem as such, because those who do seek to reproduce will never resort to AI.

If it improves people's living standards before they inevitably pass...

why not, who cares?

You cannot have sex with A.I.; sexual intercourse is between two biologically compatible beings and results in reproduction. At the very most, when it comes to AI, sex would be an artificial act that "sexually" stimulates the human body into an aimless release.

If the human is not fit enough to bring forth itself within the world then it will only be exploited by the technologies of other men.

Re: Artificial Intelligence - Her

Postby Destiny » Sat Sep 08, 2018 10:48 pm

Look how peaceful this A.I. is compared to the human.



How typical.
We will be irrelevant.
That's the thing about real. It doesn't change and it doesn't have to change.

Re: Artificial Intelligence - Her

Postby Meno_ » Sun Sep 09, 2018 1:03 am

WendyDarling wrote:Is Satori remembering one's essence?



I just found I missed the answer to this question.



I think Satori is not remembering your particular essence, but being able to let go of it.

Even to the point of sustaining a hope that even a permanent record or memory is forgotten.

Remember Saint James? He proved that there is always an identical You for every being. I didn't really understand it, PM'd him, and he proved it mathematically; I was going to involve him more, but the math was too difficult. I would like to dig it up with hours of searching, but that came to no end.

His point was that you can't ever really let go because there is always a copy.

Re: Artificial Intelligence - Her

Postby tamilse » Thu Mar 21, 2019 8:50 am

Yes, I agree. It is quite difficult to have an AI behave exactly like a human though.

Gamer wrote:It's not only very possible for humanlike AI to be designed, but plausible that humans will adopt it to meet all manner of needs, physical, emotional, etc, to varying degrees of success.
We turn to actual humans to "varying degrees of success," too.

It can and will happen. I can support that with clear enough arguments but won't do that here. It's a great philosophical question, whether this SHOULD happen.

Like any question on the ought side of the is/ought divide, it helps to add the word "if."

We ought to do X, if we want Y.

Well, it comes down to what people will want, and like anything else, there are healthy wants, unhealthy wants, and health-neutral wants that come down to taste.

I can make a clear argument that a large part of the spectrum of AI stand-ins for human relationships will fall under the "health neutral" category.

All that's left is for a third party to subjectively judge whether the dynamic is "sad," "fine," "repugnant," "acceptable," "beautiful," etc.

Within dreams, I sometimes meet people who don't exist, and I feel an emotional bond with that entity while I'm dreaming. I don't find this sad until I wake up. On reflection, the entity was real; it was a vestige of some inner component of my mind that, for a moment, was separate and discrete from my conscious first-person awareness. When we project our desires and perceptions onto an AI, we may be doing something similar, combined with the fact that an AI can be an extension of human traits and is thus a way to connect to humanity, albeit indirectly, through a substrate.

How often have you felt intimacy with your favorite author? We connect to souls and ethos through artificial substrates all the time.

I understand the informal fallacy that kicks in when we deny the possibility of lifelike AI that we can fall in love with, etc. We are afraid because it's weird, grotesque. We render ourselves instant fools. Lunatics talking to dolls. There's something we naturally find pathetic about this. But so much of the human condition is already weird and grotesque, and pathetic. Consider that the only people you ever know are actually projections in your mind, reconstructions of only a tiny part of the reality of the source being, assuming a source being even exists that's in any way similar to what you think it is or want it to be. At least with AI we gain some measure of control.

Humans often have illusions and world views foisted upon them. They are blind to the origins of their epistemology. It's a philosopher's job to knowingly choose her illusion and embrace it in the spirit of a Gamer. We do it all the time. Some of us will indeed choose to love AI, in the way Berkeley, Wittgenstein, Sisyphus, analytic philosophers and existentialists, or any of the great solipsists choose to play along, feel, live, and love, and somehow get by as normal people in a sea of abstraction. Just as some of us who know better can choose, like Kierkegaard, Tolstoy, William James, etc., to love God. I don't know where I'll wind up, but I'm not naive about the eventuality of a genuine choice heading my way, and yours.

Re: Artificial Intelligence - Her

Postby Gamer » Sat Aug 10, 2019 2:22 am

It is quite difficult to have an AI behave exactly like a human though.

It is right now, but it's getting easier every day.

Humans don't behave so great. Go grocery shopping and see how people behave.

I'm not knocking people, they're FINE.

But the Luddite statement that it's hard RIGHT NOW is forgetting that trying to figure out what's possible is a rapidly moving target.
The future has a way of slowly erupting into the present. Reverse engineering human behavior is not only hard, it's stupid.
We'd want to engineer better-than-human behavior, because otherwise, we'd just keep dealing with each other, which clearly is a failing enterprise.
We pass by each other, millions of us, and instead of smiling and hugging, we ignore each other, with a sad solemn look on our faces.
That's how humans generally behave, and there are worse examples, much worse.

Re: Artificial Intelligence - Her

Postby Meno_ » Sun Aug 11, 2019 6:59 pm

Gamer wrote:
It is quite difficult to have an AI behave exactly like a human though.

It is right now, but it's getting easier every day.

Humans don't behave so great. Go grocery shopping and see how people behave.

I'm not knocking people, they're FINE.

But the Luddite statement that it's hard RIGHT NOW is forgetting that trying to figure out what's possible is a rapidly moving target.
The future has a way of slowly erupting into the present. Reverse engineering human behavior is not only hard, it's stupid.
We'd want to engineer better-than-human behavior, because otherwise, we'd just keep dealing with each other, which clearly is a failing enterprise.
We pass by each other, millions of us, and instead of smiling and hugging, we ignore each other, with a sad solemn look on our faces.
That's how humans generally behave, and there are worse examples, much worse.




Yes, but the opposite is true as well. We pass by each other with impassive simulations of self-pity, until the energy, or lack of it, bursts the dams of understanding and a war breaks out.
First a war of words, then, when that doesn't work out, actual wars.
After that, a peace seems appropriate, to install the necessity of smiling faces and a happy society, meaning: dare not pass your fellow man with a drawn face, for that would be a disservice to the innocents who posed as heroes and sacrificed so much?

The sacrifice, of course, is an instituted code of internalization of the transformatively processed chance of loss of control, where the leaders see higher value in social control from the top than the uniform code of military justice appears to embrace, to enhance the value of victims for those who need to enforce them.

The reverse, indicated by frowns in anonymous situations, gives way to an immediate adoption of smiling faces, as if from a need to show graces.

That kind of justice begs the underpinnings of military justice and pits it against social injustice.

The romantic idiom and its existential meltdown into bad faith is merely a contraption to bottle that up, and keep it high above a reality from which the machine extracted its soul via a Faustian trick.

Good Show!


That the show seems to get better with the adoption of higher, machined forms of simulation goes with the argument for, and not against, the one-dimensional man:

Alfred E. Neuman need not worry, for he has faith in Big Brother, and as its similitude nears the real thing, the difference will become unnoticeable and the Faustian bargain will become perfected. No way to wiggle out, and no need, since the product and its means of production will become invariably tied on a planetary stage.

PS: Sometimes you have to say progressively more to try to glean reducibly less meaning out of it.
