https://apple.news/AoiOKuS90TpmKWZvUipWKTg
I'm stunned. I'm not stunned.
John Prine would have known what to do about this:
> Blow up your TV.
> Throw away your newspaper.
> Move to the country.
> Build you a home.
> Start a little garden.
> Eat a lot of peaches.
> Try to find Jesus
> on your own.
Hey, that gives me an idea: we could start a company which lets people talk to an AI version of God (Buddha, Mohammed, Gaia - it's selectable in your profile). Unless someone already has, which seems likely, now that I think about it. I hear the foundations of the world groaning under the weight of all this.
Buddha is just dead silence. Until you throw it a koan, and then it scares the living shit out of you. Speaking of which, I have not tried a koan with ChatGPT yet.
https://apple.news/AIWQ_xjDPRLe9aTsfQm9wCQ
Not wasting any time…. https://www.msn.com/en-us/news/technology/ai-human-romances-are-flourishing-and-this-is-just-the-beginning/ar-AA17Rhuz
I’m not sure I’m tracking on the second point. Can you clarify the crap data/obfuscate part? What will make us harder to be known/manipulated? (If you’re saying that)
This is horrifying.
https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
It's like watching the end of the human era in slow motion, an unraveling of everything that makes us sentient and beautiful, thread by thread by thread, until we're a lifeless heap of organic material that AI uses to spin her own future. But I do want to laugh with kittens.
AI is exciting right now because its wrong answers appear to be creative. The better it gets, the more generic and boring it will become. Social media users will generate so much crap data using AI that surveillance capitalism will no longer be feasible. AI will obfuscate, not take over.
TURN. IT. OFF.🙉🙈
Yes... And it will be glorious.
Let's face it - most humans aren't all that good at being parents, lovers, friends, loving sons or even baristas & bartenders. AI will allow us to exist in a world where we can be supremely alone while never being lonely (unless we want to). It seems pretty perfect to me.
Do you think that meaning in the Frankl sense can be created there? And in a world of sublime captivation would meaning matter anymore?
Separate track: Is there a point at which you feel anxiety about surrendering into this new world? I’m guessing that neither of us want to float in a vat to have a perfectly controlled experience. Where do you grow uneasy with the pivot to digital experience? Obviously somewhere, but where is that edge for you?
For me that edge is when we’re ready to injure a person to have or restore a digital relationship. There’s real evidence we’re seeing this already with Replika.
First of all, with all due respect to Frankl, we live just because our instincts are set up through evolutionary pressure, not due to meaning or the search for meaning. My personal favorite model is the Maslow pyramid. And self-actualisation is a pretty flexible concept...
That being said, you're right, I'm not keen on being in a vat/living in a constant artificial state of happiness, though I cannot quite articulate why. After all, if it's because I enjoy the "pain"/effort it takes to achieve something, a machine good enough to induce artificial happiness would also be good enough to induce "effort" beforehand, to make the "achievement" feel worthwhile.
Where do I grow uneasy about artificial experiences? I'm not sure. But I don't find it (morally) problematic that someone would be willing to fight/hurt a person to restore a digital relationship. Presumably, you would be willing to fight/kill for your friends and/or spouse... What does it matter whether your friend/spouse is ethereal or corporeal?