How do you create a softmax activation function, applied across columns, in PyTorch's nn module?
a) nn.Softmax(dim=0)
b) torch.softmax(axis=1)
c) nn.Activation('softmax')
d) torch.nn.Activation('softmax')
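Of the listed options, only `nn.Softmax` is an actual PyTorch class (`nn.Activation` does not exist, and `torch.softmax` is a function that requires an input tensor). A minimal sketch checking that `nn.Softmax(dim=0)` normalizes down each column, so every column of the output sums to 1:

```python
import torch
import torch.nn as nn

# Softmax along dimension 0 (the rows), i.e. applied down each column
softmax = nn.Softmax(dim=0)

x = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0]])
y = softmax(x)

# Each column of the result sums to 1
print(y.sum(dim=0))
```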