Research Watch

NEURAL NETWORK QUINE

Two researchers seek to create a self-replicating neural network

ISAAC GODFRIED

 

Oscar Chang and Hod Lipson, researchers at Columbia University, set out to create a self-replicating neural network, as described in their paper, “Neural Network Quine.” But no, their ambitious effort did not end with the men shouting “it’s alive!” to the stormy heavens.

Machine learning’s neural networks were inspired by the synapses of the brain, part of technology’s long-standing effort to mimic biological systems. The ultimate mimicry, of course, would be self-replicating systems that take on a life of their own, challenging the meaning of ‘life’ itself. 

Alas, the results were more of a fizzle than a bang. Chang and Lipson begin their paper by surveying the idea of self-replication in computing, embodied in the quine, a computer program that reproduces its own source code, and the link between it and biological replication by DNA, all very sexy ideas and the reason this paper got so much media attention. But the authors fail to add much to that lofty tradition. They don’t provide any state-of-the-art (or even high-scoring) experimental results or show any concrete way in which their approach would be useful. Nor do they offer any compelling examples of the overall utility of the quine.
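
For readers who have never seen one, here is a classic quine in Python; run it and it prints exactly its own source:

    s = 's = %r\nprint(s %% s)'
    print(s % s)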

In fact, an earlier version of the paper was rejected by the workshop track at the International Conference on Learning Representations. Several of the ICLR reviewers noted that the paper is actually quite similar to some of Jürgen Schmidhuber’s work on “self-referential neural networks” in the early 1990s. Many ‘new’ concepts in machine learning are similar to work done several decades ago, when scientists lacked the computational power to run experiments.

 
 

THE NETWORK REPLICATES ITSELF BY LEARNING TO OUTPUT ITS OWN WEIGHTS

 
 

Chang and Lipson set out to develop a method in which “the network replicates itself by learning to output its own weights.” This isn’t exactly self-replication, but it is an interesting exercise nonetheless. They propose viewing “a neural network as a differentiable computer program composed of a sequence of tensor operations.”
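
Concretely, the scheme looks something like the following minimal PyTorch sketch. This is an illustration, not the authors’ code: the paper encodes each weight coordinate as a one-hot vector passed through a fixed random projection, and the random coords matrix below simply stands in for that encoding.

    import torch
    import torch.nn as nn

    # Toy quine: a network that, fed an encoding of one of its own weight
    # coordinates, predicts the value stored at that coordinate.
    net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))

    # Flatten the network's current weights into one vector (detached, so
    # at each step the loss pushes predictions toward the current weights).
    weights = torch.cat([p.detach().flatten() for p in net.parameters()])

    # Fixed random stand-in for the paper's projected one-hot coordinate
    # encoding (an assumption, for illustration only).
    coords = torch.randn(len(weights), 32)

    pred = net(coords).squeeze(1)              # one prediction per weight
    loss_sr = ((pred - weights) ** 2).sum()    # self-replication loss
    loss_sr.backward()                         # trainable by ordinary SGD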

They include an auxiliary task of classifying MNIST images (a very common, albeit ‘toy’, task that involves accurately classifying handwritten digits). When they find that the classification makes the self-replication more difficult, they rather grandly claim that this is a Darwinian effect analogous to the trade-off between survival and reproduction in animals: as life becomes more difficult, fertility declines.

Much of the paper delves into the challenge of making a neural network self-replicating while preserving its ability to complete the auxiliary task. This involves modifying the loss function and adapting stochastic gradient descent to be more efficient. The vanilla quine and the auxiliary quine are mostly the same, apart from a few minor tweaks to account for the additional task. The authors also describe several non-gradient-descent methods of training the quine, including hill climbing and a method they call “regeneration,” in which they replace “the current set of parameters with the weight predictions made by the quine.”
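
Regeneration, in particular, is simple to sketch. Reusing the toy quine above (again an illustration under the same assumptions, not the paper’s exact procedure), one regeneration step asks the network for its predictions of all of its own weights and then overwrites the real parameters with them:

    import torch

    def regenerate(net, coords):
        # Overwrite the network's actual parameters with its own
        # predictions of them (one "regeneration" step).
        with torch.no_grad():
            pred = net(coords).squeeze(1)  # the quine's guess at each weight
            offset = 0
            for p in net.parameters():
                n = p.numel()
                p.copy_(pred[offset:offset + n].view_as(p))
                offset += n

    regenerate(net, coords)  # net and coords from the sketch above

Whether repeated regeneration stays close to the original network depends, of course, on how accurate the quine’s self-predictions are.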

Finally, the authors describe their experiments with the quine. In the end they conclude that their network performs well at self-replication, but there is room for further improvement. On the auxiliary task, the auxiliary quine achieves 90% accuracy on MNIST, compared to 96% for a standard non-replicating neural network. Those results are neither surprising nor insightful about the broader applicability of quines, however.

Still, the idea is interesting and could provide a foundation for future work on a universal replicating neural network that could serve more practical purposes – repairing damaged computer code, for example – though there is no suggestion of how that would work.

 
