I agree with your view of AI, Kathleen: it's like an amazing paint mixing machine. I don't really understand the leap our reptile brains are making to sentience, war, and annihilation.
Humans play God in so many ways, but we've always managed to outlive it. Maybe I could stand to have more of a sense of urgency, but if a biological weapon takes us out — fair.
Kathleen, good points all, but there's a MUCH bigger point. For the first time in human history, we are developing technologies that, with one mistake or deliberate action, can completely destroy civilization and in some cases, our entire species. AI may gain that type of control, but not for a while. Other technologies include gene editing (Covid was just a minor example), nuclear weapons of course, quantum computing, EMPs, increasing human/machine integration, and others. As the severity of errors and unintended consequences from these technologies dramatically increases, the impacts and potential for complete societal failure increase. We are like 8 YOs, evolutionarily, who are given guns without any training.
I'm definitely not trying to downplay the danger of it all (because it definitely exists), but I do think that the human race is a bit like cockroaches — we're going to outlive just about anything.
My guess is that, like with every technological revolution, we're going to evolve around it. (I also think that we've had this same distressed reaction to all of them, but we're still here.) It might take some of us out of the equation in the process, but I suspect we'll learn to coexist with it.
I mean… there’s probably some evolutionary advantage to our ability to peek around the corner and catastrophize that helps us outlive these things. 😂
AI does not bother me as much as man-made viruses.
Nice turn of phrase with the "Doppler effect". Well done!
Stahhhp it! ☺️