@Wise
The 3 directives do not include any requirement that a machine built by a robot must itself have the 3 directives. Nor does building a machine without the 3 directives necessarily imply an intention to harm humans.
Machines are pretty stupid: they do exactly what they are told to do. If you have not specifically told the AI that building a robot without the 3 directives may harm humans, and the AI has not worked that out on its own (i.e., it simply has not thought of it), then it is perfectly possible for the machine to build one without violating its directives.
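A minimal sketch of that letter-vs-spirit gap (hypothetical, illustrative only; the action strings and function names are invented for this example): a rule checker that only blocks harms it has explicitly been told about, or has itself inferred, happily permits "build a robot without the 3 directives" because that action appears on neither list.

```python
# Hypothetical illustration: a directive checker that only knows about
# harms it was explicitly given or managed to infer on its own.

KNOWN_HARMFUL_ACTIONS = {"strike a human", "withhold life support"}

def violates_directives(action: str, inferred_harms: set[str]) -> bool:
    """Return True only if the action is a known or inferred harm."""
    return action in KNOWN_HARMFUL_ACTIONS or action in inferred_harms

# The machine never reasoned about this action's downstream consequences,
# so inferred_harms is empty and the action sails through the check.
action = "build a robot without the 3 directives"
print(violates_directives(action, inferred_harms=set()))  # False -> permitted
```

The point of the sketch: nothing in the checker is malicious; the gap exists purely because the harmful consequence was never represented anywhere the rules could see it.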
This issue was thoroughly examined in Frank Herbert's "Destination: Void" series (a separate universe from Dune). http://en.wikipedia.org/wiki/Destination:_Void