XOR problem solvable with 2x2x1 neural network without bias?
Original title:

Is a neural network with 2 input nodes, 2 hidden nodes and an output node supposed to be able to solve the XOR problem, provided there is no bias? Or can it get stuck?

Best answer

Leave the bias in. It doesn't see the values of your inputs.

In terms of a one-to-one analogy, I like to think of the bias as the offsetting c-value in the straight line equation: y = mx + c; it adds an independent degree of freedom to your system that is not influenced by the inputs to your network.
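As a rough sketch of that analogy (the step activation, the weights, and the helper name below are my own illustration, not from the answer): without a bias, the decision boundary is pinned through the origin, just as y = mx must pass through (0, 0).

    import numpy as np

    def neuron(x, w, b=0.0):
        # Weighted sum plus an optional bias, then a step activation.
        # b plays the role of c in y = mx + c: with b == 0 the decision
        # boundary w . x = 0 is forced through the origin.
        return 1 if np.dot(w, x) + b > 0 else 0

    w = np.array([1.0, 1.0])
    print(neuron(np.array([0.0, 0.0]), w, b=0.0))  # 0: no bias, boundary pinned to the origin
    print(neuron(np.array([0.0, 0.0]), w, b=0.5))  # 1: the bias shifts the boundary independently of the inputs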

Other answers

If I remember correctly, it's not possible to solve XOR without a bias.

I have built a neural network without bias, and a 2x2x1 architecture solves XOR in 280 epochs. I'm new to this, so I didn't know either way, but it works, so it is possible.
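For anyone who wants to try this themselves, below is a minimal sketch of that setup: a 2x2x1 sigmoid network with no bias terms, trained by plain batch gradient descent. The learning rate, epoch count, and random initialisation are my assumptions, not the original poster's; depending on the random start it may converge or get stuck, which is exactly what the question asks about. (With a sigmoid, the hidden units output 0.5 for the input [0,0], so the output layer still receives a nonzero signal even without a bias.)

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR training set
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # 2x2x1 network with no bias terms anywhere
    W1 = rng.normal(size=(2, 2))  # input -> hidden
    W2 = rng.normal(size=(2, 1))  # hidden -> output

    lr = 1.0
    for epoch in range(5000):
        # forward pass
        h = sigmoid(X @ W1)    # hidden activations, shape (4, 2)
        out = sigmoid(h @ W2)  # network outputs, shape (4, 1)

        # backward pass: squared-error loss, sigmoid derivative s * (1 - s)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_h

    print(out.round(2).ravel())  # close to [0, 1, 1, 0] on a successful run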


Yes, you can, if you use an activation function like ReLU (f(x) = max(0, x)).

Example weights for such a network are:

Layer1: [[-1, 1], [1, -1]]
Layer2: [[1], [1]]

For the first (hidden) layer:

  • If the input is [0,0], both nodes have an activation of 0: ReLU(-1*0 + 1*0) = 0, ReLU(1*0 + -1*0) = 0
  • If the input is [1,0], one node has an activation of 0, ReLU(-1*1 + 1*0) = 0, and the other an activation of 1, ReLU(1*1 + -1*0) = 1
  • If the input is [0,1], one node has an activation of 1, ReLU(-1*0 + 1*1) = 1, and the other an activation of 0, ReLU(1*0 + -1*1) = 0
  • If the input is [1,1], both nodes have an activation of 0: ReLU(-1*1 + 1*1) = 0, ReLU(1*1 + -1*1) = 0

For the second (output) layer: since the weights are [[1], [1]] (and ReLU leaves no negative activations in the previous layer), the layer simply sums the activations of layer 1; the sketch after this list checks the full forward pass.

  • If the input is [0,0], the sum of activations in the previous layer is 0
  • If the input is [1,0], the sum of activations in the previous layer is 1
  • If the input is [0,1], the sum of activations in the previous layer is 1
  • If the input is [1,1], the sum of activations in the previous layer is 0
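The arithmetic above is easy to check end to end; here is a short NumPy version of the same forward pass (laying the inputs out as row vectors is my choice, the weights are the ones given above):

    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    W1 = np.array([[-1, 1],
                   [1, -1]])  # Layer1: column j holds the weights into hidden node j
    W2 = np.array([[1],
                   [1]])      # Layer2: the output just sums the hidden activations

    for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
        h = relu(np.array(x) @ W1)  # hidden activations
        out = h @ W2                # plain summation at the output
        print(x, "->", int(out[0]))
    # prints 0, 1, 1, 0 for the four inputs: exactly XOR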

While this method coincidentally works in the example above, it is limited to using the label zero (0) for the False cases of the XOR problem. In a bias-free ReLU network, the input [0,0] produces a pre-activation of 0 in every node, so the output for [0,0] is always 0 no matter what the weights are. If, for example, we used ones for False cases and twos for True cases, this approach would no longer work.




