To answer some of your questions:
- For a circular convolution (the operation we usually use for binding), the resulting vector will be dissimilar to both of the bound vectors.
- For a sum, the resulting vector will be somewhat similar to all of its constituents (how similar depends on whether the vector is normalized after summing).
- The dot product is more of a comparison operation. It tells you how similar two vectors are. Because the result is a single scalar, you cannot really compare that number to either of the original vectors.
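The three points above can be demonstrated with a minimal NumPy sketch (plain NumPy rather than Nengo; the helper names are my own, and `cconv` implements circular convolution via the FFT):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512  # SPA vectors are typically high-dimensional

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

def cconv(a, b):
    """Circular convolution (binding) computed via the FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

a = unit(rng.standard_normal(d))
b = unit(rng.standard_normal(d))

bound = cconv(a, b)   # binding: dissimilar to both inputs
summed = unit(a + b)  # superposition: similar to both inputs

# Dot products as similarity measures; values near 0 mean dissimilar.
print(np.dot(bound, a), np.dot(bound, b))    # both near 0
print(np.dot(summed, a), np.dot(summed, b))  # both near 1/sqrt(2)
```

With random, nearly orthogonal unit vectors, the bound vector's similarity to each input is close to 0, while the normalized sum has similarity around 0.7 to each constituent.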
In theory, I believe (though I'm not entirely sure) that no information is lost, because the vector components are real numbers with infinite precision. (There are, however, some specific vectors that act as an absorbing element and would destroy all information, like multiplying by 0.)
In practice, however, neurons can represent the vectors only with a limited precision. Thus, in a real system information will be lost. But this is not necessarily a problem as long as enough information is retained to recover the original vectors with a cleanup memory.
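A minimal sketch of such a cleanup memory (plain NumPy, not Nengo's implementation; the vocabulary and noise model here are my own assumptions): compare the noisy vector against every known vector and return the best match.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_items = 512, 26

# A vocabulary of random unit vectors (almost orthogonal in high dimensions).
vocab = rng.standard_normal((n_items, d))
vocab /= np.linalg.norm(vocab, axis=1, keepdims=True)

def cleanup(noisy, vocab):
    """Cleanup memory: return the vocabulary vector most similar to the input."""
    return vocab[np.argmax(vocab @ noisy)]

original = vocab[3]
noisy = original + 0.1 * rng.standard_normal(d)  # simulate limited precision
recovered = cleanup(noisy, vocab)
print(np.allclose(recovered, original))  # True: the exact original is recovered
```

As long as the noise from limited precision stays small relative to the separation between vocabulary vectors, the cleanup recovers the stored vector exactly.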
I assume you mean assign a number to each letter and encode that number as a binary vector? In that case many vectors are already highly similar as measured by the dot product. Adding those vectors together will give a vector similar to the summands, but it will probably also be similar to other vectors in this representation.
In the SPA we usually use vectors that are almost orthogonal. To encode the 26 letters, you could use 26-dimensional vectors with a single dimension set to 1 for each letter. In that case each pair of vectors is dissimilar (orthogonal, in fact) and the sum is similar to all of its constituents. Note that we don't need perfect orthogonality for this to remain approximately true, which allows us to fit more almost orthogonal vectors into the space than there are dimensions.
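Both halves of this point can be checked directly (a NumPy sketch; the letter assignments are just for illustration):

```python
import numpy as np

# One-hot "letter" vectors: 26 dimensions, one per letter.
letters = np.eye(26)
c, a, t = letters[2], letters[0], letters[19]

word = c + a + t  # superposition of the three letters

print(word @ c, word @ a, word @ t)  # 1.0 each: similar to every constituent
print(word @ letters[1])             # 0.0: dissimilar to letters not in the sum
print(c @ a)                         # 0.0: distinct letters are orthogonal

# With high-dimensional random vectors we can fit many almost-orthogonal
# vectors into the space -- more than there are dimensions:
rng = np.random.default_rng(3)
vecs = rng.standard_normal((1000, 512))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
sims = vecs @ vecs.T
np.fill_diagonal(sims, 0.0)
print(np.abs(sims).max())  # well below 1: 1000 nearly orthogonal vectors in 512 dims
```

The one-hot case gives exact orthogonality; the random high-dimensional case trades a little similarity (small but nonzero dot products) for far more vectors than dimensions.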
Given two arbitrary vectors P1 and P2, and assuming that P1 is not the absorbing element, there should be a transformation from P1 to P2, I believe. Whether that transformation also brings you from P2 to P3 depends on how P1, P2, and P3 are constructed.
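For circular convolution, such a transformation can be constructed explicitly by dividing in the Fourier domain (a NumPy sketch under that assumption; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 64

def cconv(a, b):
    """Circular convolution via the FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

p1 = rng.standard_normal(d)
p2 = rng.standard_normal(d)

# Transformation t with t (*) p1 = p2, built by division in the Fourier
# domain. This fails only if some Fourier coefficient of p1 is zero,
# which is the absorbing-element case mentioned above.
t = np.fft.irfft(np.fft.rfft(p2) / np.fft.rfft(p1), n=d)

print(np.allclose(cconv(t, p1), p2))  # True: t maps p1 to p2

# Applying t to p2 gives some vector t (*) p2; it equals a given p3 only
# if p1, p2, p3 were constructed as such a chain in the first place.
p3 = cconv(t, p2)
```

So the transformation exists whenever P1 has no zero Fourier coefficients, but it chains from P2 to P3 only if P3 was built as the next step of that same chain.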