One of the most common misconceptions about computers is that they process a sort of stream of 0's and 1's. In reality, these are switch encodings that get chunked together to specify instructions to be performed. There's no fundamental difference between 0's and 1's encoding computer instructions and ink dots or pixels encoding characters for humans to read. If you zoom in far enough on anything, it looks incomprehensible.
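The point above can be sketched in a few lines of Python (a hypothetical illustration, not from the original comment): the same bytes are "just 0's and 1's" at the bottom, and only an interpretation — text, number, instruction — gives them meaning.

```python
# Two bytes of "switch states"; their meaning depends entirely on interpretation.
data = bytes([0x48, 0x69])  # bit patterns: 01001000 01101001

# Interpreted as ASCII text, the bits spell something humans can read.
as_text = data.decode("ascii")           # "Hi"

# Interpreted as an unsigned integer, the very same bits are a number.
as_number = int.from_bytes(data, "big")  # 18537

# Zoomed all the way in, it is only 0's and 1's either way.
as_bits = " ".join(f"{b:08b}" for b in data)

print(as_text, as_number, as_bits)
```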
Have no fear of Skynet robots taking over anytime soon. Artificial intelligence hasn't the faintest chance against natural stupidity. That is practically unlimited, and we've got it all. Yay, humans!
Not until AI works the same way as human intelligence. And since we have no clue how that works, it might take a while.