Suppose you have a neural network with linear activation functions. That is, for each unit the output is some constant c times the weighted sum of the inputs.

a. Assume the network has one hidden layer. For a given assignment of the weights w, write down equations for the values of the units in the output layer as a function of w and the input layer x, without any explicit mention of the outputs of the hidden layer. Show that there is a network with no hidden units that computes the same function.

b. Repeat the calculation in part (a), but this time for a network with any number of hidden layers.
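The collapse asked for in part (a) can be checked numerically: with linear activations, composing two layers is just a matrix product, so a single weight matrix reproduces the two-layer output exactly. Below is a minimal sketch using NumPy; the names `W1`, `W2`, the layer sizes, and the activation constant `c` are illustrative assumptions, not part of the exercise statement.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 2.0  # linear activation: g(z) = c * z (assumed constant)

# Hypothetical one-hidden-layer network: 3 inputs, 4 hidden units, 2 outputs
W1 = rng.standard_normal((4, 3))  # input -> hidden weights
W2 = rng.standard_normal((2, 4))  # hidden -> output weights
x = rng.standard_normal(3)

# Forward pass through the two-layer network with linear activations
hidden = c * (W1 @ x)
out_two_layer = c * (W2 @ hidden)

# Equivalent network with no hidden units: a single weight matrix
W = (c ** 2) * (W2 @ W1)
out_one_layer = W @ x

assert np.allclose(out_two_layer, out_one_layer)
```

The same argument iterates for part (b): each additional linear layer multiplies in one more weight matrix and one more factor of c, so any depth collapses to a single matrix.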