This is a poll. Please vote before posting comments.
It seems plausible, if not likely, that one day humans will be able to build machines more intelligent than themselves. This would have all sorts of consequences, some good and some bad, for humanity as a whole or for some, possibly many, individuals. However, assuming we could do it, either we would or we wouldn't. Further, once someone discovers how to do it, it becomes very difficult not to do it...
Dilemma: Should we build machines more intelligent than human beings if we could?