
If AGI’s role is to handle the cognitive heavy lifting, then the true value of human thought might lie in what transcends cognition—those elusive qualities that AI can’t replicate. Intuition, creativity, and emotional experience aren’t governed by facts or logic; they’re deeply tied to our personal and subjective experience. These may represent the essence of human consciousness—qualities that live on the other side of cognition’s asymptote, untouchable by AI.
Research shows that artistic and scientific creativity may rely on different brain networks. Artistic creativity, tied to emotional and intuitive processing, contrasts with the logical reasoning used in scientific creativity. If AI assumes more of our problem-solving and data analysis, we might see a shift—a neuroplasticity-type modification in how our brains work—focusing more on creativity, intuition, and emotional experiences. In this new reality, our deeper human qualities could come to the forefront, redefining how we experience the world.
I am not sure how much I agree with the writer, but as I have only cognitive ability and scientific creativity, and much less artistic creativity, intuition, or emotional experience, I am likely not in a position to have an informed opinion.
The artistic people among us (@plebpoet, @dillon,... tag others you can think of if you think it's worth it), do you believe AGI will be able to do what you're doing one day? Or do you think your humanity is what gives you your creativity?
I think the distance from LLMs to AGI is much, much larger than the distance from early chat bots to LLMs. Like, we jumped from the floor onto the sofa; now we imagine jumping to the moon.
Sam Altman has economic motives for grossly misrepresenting the proximity of AGI. Of course AGI would change the world. So would cold fusion or time travel.
I'm more worried about large crowds of people taking LLM output as truth, or crashing the NASDAQ when the parlor tricks fail to astound anymore.