Our focus on automating simple tasks is ever increasing, but how much will that focus actually produce, and could increased automation actually be counterproductive?
Let me give you an example to better depict what I'm talking about. Say, for instance, that in the far future every vehicle is controlled by an intelligent computer that coordinates with every other vehicle and with the road itself to get you to your destination. This boasts numerous positives: no more accidents, no more traffic, much faster travel times, less human stress, more accountability for being late, etc. However, it raises a problem: without accidents and traffic, there's no need for insurance companies, and far less need for mechanics and patrol/detail police. That eliminates a vast number of jobs and pulls a vast amount of spending out of the market, which would not be good.
So, my random thought from the cellar is, what if our technological progression is already, or will have to be, halted because of the necessary welfare of human society?
In other words, what if we hit a wall in our advancement that we can't pass, because if we did, we couldn't accommodate everyone?
-
Edited by Woupsea: 12/31/2015 3:33:13 AM
We'll probably continue past the wall if it means saving money in the long run. People act on self-interest, and I don't think sympathy for the poor mechanic you'll be putting out of business will do anything to change that. This reminds me of the conspiracy theory that doctors have the ability to cure most diseases but choose not to in order to maintain their job security.