Machines: mankind's future enemy, and a possible cause of its extinction...
By alanthony333
@alanthony333 (116)
Philippines
October 11, 2008 11:33am CST
As of now, many modern machines are being made. There are already cars that can move on their own, and even computers and other things with a kind of mind that can understand.
But can these things reach a point where they will betray us and impose their superiority?
As of today, even if they wanted to, they could not fight us. But in the future, they could. It is possible for a war vehicle to be unmanned, to think on its own, and even to betray us. If that happens, machines would fight the very people who made them. Computers could send messages to other advanced machines like themselves. The Terminator and Stealth movies really make a point about that.
Now, do you believe such a thing is possible?
2 people like this
3 responses
@alanthony333 (116)
• Philippines
12 Oct 08
Now, I really want to read that book. But I still don't know where I can get it. I live in the Philippines and am only 15 years old, so I really have no idea where to find it except for the National Book Store. By the way, does that novel also have an anti-machine theme..? Just asking..
Thanks...
2 people like this
@alanthony333 (116)
• Philippines
12 Oct 08
Mmm, I haven't read it yet. I will search for it.
By the way, I just got the idea from Terminator and The Matrix.
Thanks...
1 person likes this
@xtedaxcvg (3189)
• Philippines
11 Oct 08
I believe the only time machines would truly be man's enemy is when one of his own (man) betrays him. Machines have a set of protocols and a list of objectives, and they are limited to those, so any corruption would definitely come from outside. The input would be the culprit, whether accidental or deliberate. You have to remember that machines are non-sentient beings, so they cannot truly discern good from evil.
@alanthony333 (116)
• Philippines
11 Oct 08
That last line is really my point. They do not know what is good and evil, which means they treat everything as neutral and the same. They do not classify situations as good or bad. They do not have feelings. Because of advancing technology, it is possible that one day they could gain the power to think on their own and decide to become the most superior beings in the universe.
By the way, thanks for the reply...
@xtedaxcvg (3189)
• Philippines
12 Oct 08
Again, the input is the key. Unless someone programs a machine to annihilate the human race, it won't do so. Yes, machines don't have the ability to discern good from evil. But they also lack the drive and motive to initiate such actions.
@alanthony333 (116)
• Philippines
12 Oct 08
I think you're right. But there is a possibility that the one who provided the input could give the computer those abilities, right..?
Thanks...
@stvasile (7306)
• Romania
11 Oct 08
There is a possibility of that happening, but I don't think it will.
You are following the Terminator or I, Robot scenario, where machines develop self-consciousness and a will of their own, somehow overriding the Three Laws of Robotics.
However, I favor the other scenario, the one that prevails in Isaac Asimov's Foundation mega-series (I'm including the Robot novels, the Foundation series, and the Empire series - the 15 books with a number in front of them in the following list: http://en.wikipedia.org/wiki/Foundation_series#List_of_books_within_the_Foundation_Universe).
In that series, robots eventually carry out a plan for universal harmony by merging the Galaxy into one supreme being, in line with the Gaia hypothesis (http://en.wikipedia.org/wiki/Gaia_hypothesis).
@alanthony333 (116)
• Philippines
12 Oct 08
I also think so. It really is a possibility, like in the movie Stealth when lightning struck the mind of the UAV. There could come a time when they have their own consciousness and maybe decide to be superior to us all...
By the way, thanks...
1 person likes this