There is artificial intelligence (AI) and artificial general intelligence (AGI). The former is about machines that operate intelligently: they respond to input from the environment and take action to achieve a specific goal. This is not the same as the natural intelligence (NI) that humans and other animals have.
Artificial general intelligence, however, is like natural intelligence, only it isn't natural, hence artificial. AGI is about doing what a human can do: performing the intellectual functions that human beings can. Current "narrow/weak AI" development hopes to progress toward AGI, a real "strong/full AI". The desire to create AGI/full AI has been surrounded by hype for decades.

Is real general artificial intelligence around the corner?
Marvin Minsky, who co-founded the MIT AI Laboratory and did foundational work on cognition and AI, is considered one of the fathers of AI. He once said: "... from three to eight years, we will have a machine with the general intelligence of an average human being. I mean a machine that will be able to read Shakespeare, grease a car, play office politics, tell a joke, have a fight."
But that was said back in 1970, and his three-to-eight-year vision didn't work out. There has always been hype around technology, robotics and smart machines doing what humans can do, and it tends to come with unrealistic expectations about timeframes.
When the big promises and hopes for the future failed to materialize, many thinkers abandoned the field of AI research for decades. Some saw it as a pipe dream because the prerequisite technologies needed to develop the concept further did not yet exist. Thus came the "AI winter," which we have been slowly thawing our way out of.
The dream of a robotic future was crushed, and it has taken time to build that image and dream up again. Now there is new buzz surrounding AI, with deep learning being applied to beat humans at games like chess and Go.
The first early AI blunder was a focus on rule-based systems that tried to emulate human reasoning. Lab results fueled the hype, but reports eventually showed that these systems only performed well in the lab and could not handle real-world complexity. The funding dried up, and so began the AI winter: no more government money, and no more students, who instead sought out more promising careers.
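To make that concrete, here is a minimal, purely illustrative sketch in Python (my own example, not from any historical system) of the rule-based style: hand-written if/then rules that look clever on the tidy cases they were written for, and fall apart on messy real-world input.

```python
# A toy "expert system" style classifier: hand-coded rules instead of learning.
def classify_animal(facts: dict) -> str:
    """Return an animal label from a dictionary of observed facts."""
    if facts.get("has_fur") and facts.get("says") == "meow":
        return "cat"
    if facts.get("has_fur") and facts.get("says") == "woof":
        return "dog"
    if facts.get("has_feathers"):
        return "bird"
    return "unknown"  # anything outside the hand-written rules falls through

# Clean "lab" input: the rules look smart.
print(classify_animal({"has_fur": True, "says": "meow"}))   # cat

# Messy real-world input: a hairless cat defeats the rule base.
print(classify_animal({"has_fur": False, "says": "meow"}))  # unknown
```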
But more recently, things have been moving to make AI popular again: Deep Blue's 1997 chess victory over Kasparov, an autonomous car driving 131 miles in 2005, and IBM's Watson beating two Jeopardy contestants in 2011. Having hit the mainstream again, AI was back in the spotlight and out of the long, cold winter, with new visions and dreams for the future.

What pulled the field out of the AI winter was deep learning, which builds a hierarchy of neural network layers to filter information and make sense of the world. When looking at an object, the network processes it through layers that respond to different kinds of features, such as edges, textures, brightness, and so on. An object can then be recognized based on previous training. With enough training, the network develops patterns that let it recognize new, unfamiliar objects as well.
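As a rough sketch of that layered idea, here is a small, illustrative convolutional network written in PyTorch (my own example with made-up layer sizes, not taken from any particular system): early layers tend to pick up simple features like edges, later layers respond to textures and object parts, and a final layer maps those features to a label learned from training images.

```python
# A tiny convolutional network sketch: each block filters the previous layer's
# output, building up from simple features toward object-level patterns.
import torch
import torch.nn as nn

class TinyVisionNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low-level features (e.g. edges)
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # mid-level features (e.g. textures)
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # higher-level patterns (object parts)
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)      # map features to a label

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.features(x).flatten(1)
        return self.classifier(feats)

# With enough labeled training images, the learned filters generalize to new,
# unfamiliar pictures. The network only matches patterns, though; it has no
# understanding of what a "cat" is.
model = TinyVisionNet(num_classes=2)        # e.g. cat vs. not-cat
fake_image = torch.randn(1, 3, 64, 64)      # one random 64x64 RGB image
print(model(fake_image).shape)              # torch.Size([1, 2])
```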

Much of the terminology we use for AI is metaphorical and anthropomorphic: "neural" networks, "cognitive" computing, deep "learning". Machines don't think or learn. These are just familiar terms that simplify communication by referencing something about ourselves, which is something we have done for thousands of years, as far back as Ancient Egypt.
Even when we say AI, it's not true AI yet. It's just computation, not a free-thinking artificial intelligence. Machine learning is not real learning. Neural networks are not neurons. To be clear: we should not mistake the metaphors for literal reality. Some people don't make this distinction; they take the terms literally and put too much trust in the technology. When accidents happen, such as with self-driving cars, "AI" gets blamed, but it was always a machine programmed by humans, not a free-thinking entity.

Many people think that with enough computing power, the programs could begin to learn on their own and go from there. So now there is an AI seduction going around. People don't understand how these systems function because of their complexity, so they imagine an emergent intelligence is on its way, just around the corner. But it isn't. We don't even understand how consciousness arises in ourselves, so we can't realistically create it somewhere else anytime soon.
Rule-based mathematics, or deep learning, might never yield a real artificial intelligence. To the "AI" we have now, a cat is a cat, whether it's real or in a picture; it doesn't distinguish between the two. It doesn't have a real understanding of the world the way a human does.
This doesn't stop more students from taking up AI studies, though. There is a new vision for AI's future, and people are trying to get in early. Whether it pans out remains to be seen. The field may be setting itself up for yet another fall if it focuses too heavily on a single approach again, as it did with rule-based systems. Deep learning may seem great now, but its limitations may show up sooner or later.
Here is the Gartner Hype Cycle: expectations shoot up with the hype, then fall, then level off.

The super hype around AI may lead to its downfall again, with more unrealistic expectations of what it will do in the near future. But the mainstream keeps pushing the image of AI and its potential, with no sign of a bubble popping, or that there even is a bubble. AI is a hot topic with a lot of expected growth to come, and the industry is booming. Even the hype around fake news is driving AI, both to detect it and to create it.
Here is a more detailed Gartner roadmap:

Maybe things will boom and more progress will be made, or maybe we will hit another block for a few decades. For now, it's a hot investment that many people are trying to get into. Autonomous vehicles and AGI are still 10+ years away according to the Gartner Hype Cycle for emerging technologies. Blockchain is another hyped technology that still has 5-10 years to go before we see widespread mainstream adoption. A lot of things are coming, though.
References:
- Artificial intelligence
- Artificial general intelligence
- Marvin Minsky
- AI is So Hot, We’ve Forgotten All About the AI Winter
- AI winter
- New Scientist Issue 3082
Thank you for your time and attention. Peace.
If you appreciate and value the content, please consider: Upvoting, Sharing or Reblogging below.
Follow me for more content to come!
My goal is to share knowledge, truth and moral understanding in order to help change the world for the better. If you appreciate and value what I do, please consider supporting me as a Steem Witness by voting for me at the bottom of the Witness page.

