My friend sent me a transcript of some conference about AI that took place here in Romania recently. The main theme was, would the recent innovations in tech and artificial intelligence bring about a new Dark Ages or a new Enlightenment? It's precisely the same redundant, self-defeating crap that most of the "intellectual strata" of today are on about, naturally likening themselves to the illustrious minds of the Enlightenment, rather than the sad peasants of the Dark Ages.
The professor giving the talk outlined three ways to change our own personal thinking and approach to learning that would help us face the AI takeover with more grace or something.
Take charge of our own curriculum. He explained, as if it were some great big revelation, that we spend our defining years being led by the nose by self-interested third parties in the form of school and university. Absurdly, he didn't suggest we remove said third parties, just that we somehow become more self-taught and aware.
Be more aware that we're ever learning. Place a greater value on introspection, since our brain is a sponge that's ever-moving. And in this day and age, everything we come across can be a potential teacher.
Finally, the man suggested being more proactive. Taking on new roles so as to learn new behaviors and attitudes.
Privately, I was all for the changes this man suggested, yet I felt a little miffed. These were all conclusions I personally reached ten years ago. They were conclusions that the people I've known all these years on the fringe of the educational community, the ones who favor self-led unschooling and homeschooling practices, have also known for years.
And I'm sure we're not the only ones. As I pointed out to my friend, it seems to me the real innovators caught on to these sane educational practices ten, twenty, thirty years ago. They're often the ones in charge of the big tech companies. Leading the AI. Now it's just trickling down to the masses, which takes away from the valiant, hopeful tone of the professor's lecture. Rather, it's a little depressing that we're only catching on now. Seems tardy, lazy, and all in all not very helpful.
Always, it is the early adopters who will reap the real rewards, not the latecomers to the party who are only folded in once everyone else has grown bored with these "innovative ideas". Personally, I'm not overly interested in what any academic has to say about the manner in which I (or anyone else) should learn, since academia is a thing of the past. It is, quite clearly, on its way out, despite some interesting efforts to revive it and keep it afloat.
Academia killed itself, both through its bizarre political involvement in the West and through the sudden influx of information. I'm sorry to say it, in part because I too once nurtured the romanticized dream of a university of enlightenment, idea-sharing, and mutual growth. But the moment the Internet first sprouted, academia started a long and difficult illness.
I'm even sorrier to say that many of us missed the starting ramp. Through lazy, complacent thinking, too much of our education (as a society) has relied on old, unreliable systems. There were people who jumped off the ramp early on, but we mislabelled them as fringe and told them they would amount to nothing. Now, decades later, the stragglers are catching up, but the fringe early adopters have not, in the interim, been idle. They've run far ahead, farther than we could ever dream of catching up.
Another thing that bugged me about the talk (as with many other talks in today's space) was the intense, jarring insistence on what I need to change about my life. As ever, the focus seems to be on small, personal changes, which also strikes me as self-defeating.
Always, in this neverending discourse of doom and gloom, we're told how we need to change our small, personal existence on this planet to save Ukraine, fight climate change, combat the AI takeover. All the while, the real players aren't changing jack.
I don't think turning off my lights will much affect Putin's welfare. Certainly not as much as Biden's ceasing to funnel weapons into Ukraine would.
Would my turning off the light, or using less plastic have an impact on our ecosystem? I don't know. I mean, I try to do my bit, but then again, it would be a lot more impactful and a lot less jarring if these dictums of how to behave came accompanied by global efforts from big corporate conglomerates.
Finally, all these "helpful" suggestions as to how I can adapt to the changing AI world. Learn more skills. Learn better. Become more aware. Pay attention. Broaden your mind.
Rich coming from a system (and yes, universities and uni professors are very much a part of the toxic system) that's spent so many resources on keeping us down, with our minds closed.
It's always about how I can adapt, but never about how Elon Musk or Bill Gates or the Google guy whose name I never bother Googling could change. How they could maybe stop. How we could actually ensure AI doesn't come back to bite us in the ass, and take over so many people's jobs.
Because (I've grown tired of saying it) while I may be capable of broadening my own mind and learning better, taking on new roles, and directing my own education, not everyone is.
What about those people? Fuck 'em?
I'd agree, but that's not in keeping with our faux-humanitarian "we're all in it together" approach. We seem to be on this global play-pretend that everything will turn out okay with a smile and a gung-ho attitude. Just like we were in 2020. Alas, things didn't turn out okay then. People died. Freedoms were given away. Children were forever scarred.
If the text seems a little impassioned, it's because it is. That is where a cheer and a gung-ho attitude will get us: with people dead and kids scarred. So every time I hear someone getting peppy and talking about how we can "adapt" to the brave new world, as breezy as if we were summer-accessorizing, I get a little sick to my stomach.
You should too.