From a practical point of view this allows less restrictive moment conditions on the underlying random variables, and one can use distance functions other than the Euclidean distance, e.g. the Minkowski distance. Most importantly, it serves as the basic building block for distance multivariance, a quantity to measure and estimate the dependence of multiple random vectors, which is introduced in the follow-up paper [Distance Multivariance: New dependence measures for random vectors (submitted). Revised version of arXiv: 1711.07775v1] to the present article.
The effect that weighted summands have on each other in approximations of $S={w_{1}}{S_{1}}+{w_{2}}{S_{2}}+\cdots +{w_{N}}{S_{N}}$ is investigated. Here, the ${S_{i}}$'s are sums of integer-valued random variables, and the ${w_{i}}$'s denote weights, $i=1,\dots ,N$. Two cases are considered: the general case of independent random variables, where closeness is ensured by the matching of factorial moments, and the case when each ${S_{i}}$ has the Markov binomial distribution. The Kolmogorov metric is used to estimate the accuracy of approximation.
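As an illustration of the Kolmogorov metric mentioned above, the following sketch computes $\sup_{x}|F(x)-G(x)|$ for two integer-valued distributions. The choice of a binomial distribution and its Poisson approximation is purely illustrative and is not the setting of the paper; all function names here are hypothetical.

```python
import math

def pmf_binomial(n, p):
    """Probability mass function of Binomial(n, p) on {0, ..., n}."""
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def pmf_poisson(lam, kmax):
    """Probability mass function of Poisson(lam), truncated at kmax."""
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(kmax + 1)]

def kolmogorov_distance(p, q):
    """Kolmogorov metric sup_x |F(x) - G(x)| for two pmfs on {0, 1, 2, ...}."""
    m = max(len(p), len(q))
    p = p + [0.0] * (m - len(p))  # pad the shorter support with zeros
    q = q + [0.0] * (m - len(q))
    fp = fq = d = 0.0
    for a, b in zip(p, q):       # accumulate both CDFs and track the max gap
        fp += a
        fq += b
        d = max(d, abs(fp - fq))
    return d

# Small success probability: the Poisson law with lam = n*p is a good fit,
# so the Kolmogorov distance should be small (classically of order n*p^2).
n, p = 50, 0.04
d = kolmogorov_distance(pmf_binomial(n, p), pmf_poisson(n * p, n))
print(d)
```

The same routine applies to any pair of distributions on the nonnegative integers, so it could equally be used to compare the distribution of a weighted sum $S$ with its approximating law once both pmfs are available.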