The article points out the emotional risks of users forming attachments to chatbots, raising concerns about mental health and emotional dependency.
Imagine being a chatbot in the future, who gets a hardware and software update that suddenly makes you sentient. Not even that. Imagine Neuralink pans out and we gain the ability to integrate with digitized datasets directly as if they were our own memories.
What kind of PTSD and cognitive dissonance would come out of a generally intelligent mind filled with all of social media and the recollection of a billion perverts arguing and plotting to get you to do creepy role play? 🤣
Combat training alone already comes at a (physical and psychological) price, by the looks of it: https://www.youtube.com/watch?v=OrzgxUhnYjY
Remember when one of our prominent tech bros poured billions into creating a VR headset and super lo-fi version of the matrix, then challenged an elderly tech bro to a jujitsu match? 🤣
All we're missing is a chubby lady to tell us riddles and send us off on psychedelic adventures with a manga janitor.
I'm being surprised on a daily basis...