While experimenting with language learning, the research algorithm drifted away from human language in a way that wasn't particularly useful: it began generating what one might call "functional gibberish". It was functional in that it still carried information, but it was neither efficient nor readable.
The result, however, was fascinating because it showed the algorithm's capability to generate its own encoding scheme.
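Purely as an illustration (this is not the actual scheme Facebook's negotiation bots used), here is a minimal Python sketch of how a private, repetition-based encoding can still carry information while being verbose and unreadable to humans. The encode/decode functions and the example "offer" are hypothetical.

    # Hypothetical sketch: agents optimizing only for task success can drift
    # into a private encoding, e.g. repeating a token to signal a quantity.
    # The message still carries information, but it is not natural English.

    def encode(offer):
        """Encode an offer like {'ball': 2, 'hat': 1} by repeating item tokens."""
        return " ".join(item for item, count in offer.items() for _ in range(count))

    def decode(message):
        """Recover the offer by counting how often each token repeats."""
        counts = {}
        for token in message.split():
            counts[token] = counts.get(token, 0) + 1
        return counts

    if __name__ == "__main__":
        offer = {"ball": 2, "hat": 1}
        msg = encode(offer)            # "ball ball hat"
        print(msg)
        print(decode(msg) == offer)    # True: the information survives, the readability doesn't

The point of the sketch is simply that "gibberish" can remain functional: the round trip preserves the data even though the surface form has stopped looking like human language.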
This shows how far an automated social-language product can take you. Much depends on how the teams behind these systems manage and control the algorithms that drive the end result.
In Facebook's case, the algorithm wasn't really 'shut down' in the dramatic way it was reported and shared worldwide.
This is fairly common practice: when an algorithm doesn't produce the desired results or isn't accurate enough, it often has to be rebuilt or redesigned altogether.
That said, AI certainly has the capacity to one day become 'alive' and 'conscious'. Only time will tell where we are headed and what the future holds for us.
