The Anatomy of a Siamese Network

A siamese neural network starts with two identical processing branches that share weights -- that is, the corresponding weights on each side start out identical and receive the same updates as the network is trained. The representations produced at the outputs of the branches can be compared directly, or can flow into further processing before an output is generated. A consequence of the shared weights (together with a symmetric comparison step, such as a distance between the two representations) is that the output is symmetric with respect to the ordering of the inputs -- i.e. in the figure below, if "input A" were to flow into "branch B" (and "input B" into "branch A"), the output would be unchanged.

Figure illustrating the architecture of a siamese network.
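To make the weight sharing concrete, here is a minimal sketch of such a network in PyTorch (an assumed choice of framework; the names `SiameseNetwork`, `in_features`, and `embedding_dim`, the layer sizes, and the Euclidean-distance comparison are illustrative rather than taken from the figure). There is a single branch module, and calling it on both inputs is exactly what "shared weights" means in practice: one set of parameters, one set of gradient updates.

```python
import torch
import torch.nn as nn


class SiameseNetwork(nn.Module):
    """Two inputs pass through the same branch (one set of weights);
    the resulting representations are compared with a symmetric distance."""

    def __init__(self, in_features: int = 128, embedding_dim: int = 32):
        super().__init__()
        # One branch only: applying it to both inputs is what weight sharing
        # means -- both sides are literally the same parameters, so they
        # receive identical updates during training.
        self.branch = nn.Sequential(
            nn.Linear(in_features, 64),
            nn.ReLU(),
            nn.Linear(64, embedding_dim),
        )

    def forward(self, input_a: torch.Tensor, input_b: torch.Tensor) -> torch.Tensor:
        emb_a = self.branch(input_a)  # representation of input A
        emb_b = self.branch(input_b)  # representation of input B
        # Euclidean distance between the two representations; because the
        # distance is symmetric, swapping the inputs leaves the output unchanged.
        return torch.linalg.norm(emb_a - emb_b, dim=-1)


# Quick check of the input-order symmetry described above.
net = SiameseNetwork()
a, b = torch.randn(4, 128), torch.randn(4, 128)
assert torch.allclose(net(a, b), net(b, a))
```

The same pattern carries over to convolutional or other branch architectures: as long as both inputs are routed through the one shared module and the comparison is symmetric, the input-order symmetry described above still holds.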