
I'm not ashamed; I'll be the first person to admit that I really don't use AI much, if at all. At least not knowingly. It seems pretty much every SaaS (software as a service) company is integrating "AI" into its products these days. I see it as more of a marketing gimmick, since "AI" is one of the big buzzwords right now, but I think eventually there could be some benefits.
As a former English Literature major and someone who has been writing for quite some time now, I don't really feel like I need AI. In fact, I wrote a post a while ago about a letter I drafted for my wife to give her employer. She was sure I had used AI to write it because of how good it was. Nope, all me baby...
I was working out this morning, trying to think about what I was going to write about today, when for some reason my mind started drifting towards AI. That's just how my mind works; it's a literal maze in there, but the connections generally make sense to me. Occasionally @mrsbozz will ask me to retrace how I got from one topic to another, and she is always amazed at how quickly I can do it and at the path my mind took to get from point "A" to point "B".
But I digress...

So as I said, this morning I was thinking about AI and how some people think we are getting pretty close to the point of "singularity". There's another term I have heard people use, but I can't seem to recall it right now. Basically, though, it's the point where artificial intelligence surpasses human intelligence. Of course, if you browse some of the comments online (especially when it comes to politics lately), you might guess we have already passed that point. It seems like a pretty low bar that is getting lower by the minute...
More specifically, I was thinking about the scientific applications of AI. We have already seen AI used to decipher scrolls and ancient texts that were long unreadable. What happens when this gets applied to the hard sciences?
As I said, this morning I was foreseeing a time when AI starts to answer questions in physics and chemistry that we potentially aren't ready for. Beyond that, I feel like once that first law of physics gets blown out of the water by some new concept AI discovered, there is going to be a sort of domino effect.

@tarazkp has written in the past about how he feels AI can be quite detrimental to society. While I will be the first to pull out a calculator when the occasion calls for it, I also think there is a critical hazard in getting the answers without putting in the work.
We've all seen the movies where AI tries to take over the world. What if it doesn't happen like that? What if the world ends because we were screwing around with stuff we didn't really understand to begin with? Sure, once the puzzle is solved, the pieces might fall into place for one or two high-functioning people, but will that be enough?
Of course, the opposing point could be made that AI is just a tool used to come up with those answers. I think that's where the big question comes in: is AI a tool, or is it cheating? Is there even a difference?
As I said, I have a feeling that once that first block gets knocked out of place, the whole Jenga tower is going to come tumbling down. I can't tell you what it will be, either. Cold fusion? Faster-than-light travel? Teleportation? Something we haven't even considered yet? Which again begs the question: are we ready for the responsibility of managing something we can't even wrap our heads around right now?
Trust me, I know I'm not the one who is going to figure it all out. I just hope that when something like this does happen, at least one person actually understands the "why" and the "how". Yes, there is a certain bliss in using things without understanding the machinations behind the scenes, but where does that leave us if we are all ignorant?
I'm not sure I am okay with where this is heading.
My Sports Account - @bozz.sports
