AI is already crazy now, I don’t think it’s going to get any dumber.
I think society will become split into two camps:
those who use AI / don’t mind it, and those who HATE it.
The ones who use it will probably find themselves doing a lot of super cool things, probably with not much effort… (Until Skynet comes around '>w>)
Those who don’t use it… Well, I mean we already have people like that in the world, so I guess there’s no need to predict anything. Either way they’ll both find themselves in some flavour of shit one day.
What I’m not looking forward to is AI learning how hardcore kicks work…
highly doubt that, at least on a majority level. some people might try it. most will fail/die. succumbing is much easier. hardly anyone alive nowadays is equipped to actually live fully off grid and succeed.
I think a lot of things are going to change in the near future. Right now, apart from chatbots and content creation, AI (LLMs) is being integrated into a lot of products to do semantic search (RAG-based, due to context window limitations), summarization, content classification and the like. I believe that in the future all of this will be integrated into the OS.
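For anyone wondering what “RAG-based semantic search” actually means in practice, here’s a rough toy sketch. The bag-of-words “embedding” is just a stand-in for a real embedding model, and the documents are made up, but the retrieve-then-prompt flow is the core idea:

```python
from collections import Counter
import math

def embed(text):
    # Toy "embedding": bag-of-words counts. Real systems use a neural
    # embedding model, but the retrieval logic is the same shape.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Made-up document store for illustration.
docs = [
    "How to reset your password in the settings menu",
    "Quarterly sales figures for the hardware division",
    "Troubleshooting audio driver crashes on startup",
]

def retrieve(query, k=2):
    # Rank documents by similarity to the query, keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

# Only the retrieved chunks get stuffed into the prompt, which is why
# RAG sidesteps the context window limit: the model sees k documents,
# not the whole corpus.
context = retrieve("my password won't reset")
prompt = "Answer using only this context:\n" + "\n".join(context)
```

A production version would swap in real embeddings and a vector database, but the point is that “search” here is nearest-neighbour lookup in embedding space, with the hits pasted into the LLM’s prompt.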
The other thing that is coming is better and better integration into IDEs (for coding), with things like GitHub Copilot getting way better.
I think we will also see more and better diffusion models that will do things like music. Which I don’t like at all, but it’s coming, so I think that apart from the big celebrity or influencer-adjacent artists, we will see a lot more people just doing music for community and connecting with other people. My take is that when most music is generated by AI, people will start making music just to connect with others.
I say this phrase way too much, but I’ll say it again: I admire your optimism.
I’m not the biggest Venus Theory / Benn Jordan fan (although my sources tell me they’re actually two, separate entities with their own thoughts and feelings), but I found it to be interesting that he (the first one) recently made a video about this.
Looks like you guys got a lot of predictions right and it has only been a year; where does that leave us now?
i think the closer we get to the singularity, the more we will be able to solve the problems we are dealing with now fairly easily… corrupt governments, corporatocracy, oligarchy, distribution of wealth and power. that will all go away once we all become something between The Borg, Gattaca, and the UFOP. We’ll try not to, but we will. Humanity will become much more hive mind. Intellectual property will not exist. Money will become worthless. Bad eggs in the batch will very quickly be trained how to not be shit heads.
Our viewpoints will shift rather quickly. I think in that future humans will have a very good, maybe even regulated/encouraged, exploration of our humanity, but I still believe in many ways that we also risk becoming Skynet ourselves. We are getting closer and closer in tech to fusing biology with technology, and the deeper we look into space the more humans are wanting to get all up in it.
frankly i don’t know if the above will necessarily be a “good” thing.
what could also happen is the wrong dude has the keys to the nukes and man made cataclysm hits the reset button in the most fucked up ways. the Earth always heals itself no matter how we abuse it.
3rd option: asteroid. man, some days… i’m like… where u at asteroid?!?! all this living long and only kinda prospering bullshit got me like come on, just a little tap on the earth’s crust end it all.
i’m kidding. mostly. talking out of my ass for sure
@wayne - Your post made me think of Star Trek. If I remember the initial premise correctly, the general betterment of humanity that leads to Earth becoming peaceful globally and founding the Federation of Planets happens after WW3 or something hugely destructive like it. The survivors realized they were done with the old system and there had to be something better than total annihilation…
All in all, I agree with that prediction… I don’t see how humanity will ever improve significantly (and sustain social and worldwide peace) without getting to the brink of extinction at its own hands first.
Regarding the utopian visions about AI, this might be the outcome in the long run. In the short run, though, please remember that these technologies are being developed by private corporations that are investing hundreds of billions of dollars, and they are legally obligated to their shareholders to turn a profit.
I saw a video a couple of months ago (that unfortunately I couldn’t find a link to) of Sam Altman saying something like “if you asked specialists 10 years ago which jobs were most at risk of being done by AI, they would have said repetitive manual work first, then cognitive work, and creative work last. but now we think it’s going to be the exact opposite”. This is one of the most depressing things I’ve ever heard.