The problem is to teach a computer to do addition. The computer knows the natural numbers: it "knows" that after 1 comes 2, after 2 comes 3, and so on. With that data, the computer can easily produce the successor of any number.

Next, the computer has the knowledge that x+0=x and x+(y+1)=(x+1)+y. These axioms let the computer perform addition. For example, to add 5 and 3, the computer does the following: 5+3 = 5+(2+1) = (5+1)+2 = 6+2 = 6+(1+1) = (6+1)+1 = 7+1 = 7+(0+1) = (7+1)+0 = 8.

But adding numbers this way is too slow: it takes a number of steps proportional to one of the operands. The problem is to develop a program that can improve this method of addition using the rules of mathematics and logic. The goal is that addition should run in O(log N) time rather than O(N) time, where N is the magnitude of the numbers being added.
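One standard way to reach O(log N) is to switch representations: write each number in binary, so a number of magnitude N has only about log2(N) digits, and add digit by digit with a carry. This sketch assumes numbers are stored as lists of bits, least significant bit first; it illustrates the target complexity, not how a program would discover this representation on its own:

```python
def to_bits(n):
    """Binary encoding of a nonnegative integer, least significant bit first."""
    bits = []
    while n:
        bits.append(n % 2)
        n //= 2
    return bits or [0]

def from_bits(bits):
    """Read a bit list back as an ordinary integer."""
    return sum(bit << i for i, bit in enumerate(bits))

def add_binary(a, b):
    """Ripple-carry addition: one step per digit, i.e. O(log N) steps."""
    result = []
    carry = 0
    for i in range(max(len(a), len(b))):
        bit_a = a[i] if i < len(a) else 0
        bit_b = b[i] if i < len(b) else 0
        total = bit_a + bit_b + carry
        result.append(total % 2)   # digit of the sum in this position
        carry = total // 2         # carry into the next position
    if carry:
        result.append(carry)
    return result
```

Here `add_binary(to_bits(5), to_bits(3))` performs only as many loop iterations as there are bits, so adding two million-scale numbers takes about 20 steps instead of a million.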

Does such a program have any scientific value? Does any program exist that can do this kind of thing?