In the partial-diffusion least mean square (PDLMS) scheme, each node shares only a portion of its intermediate estimate vector with its neighbors at each iteration. In this paper, in addition to studying the general PDLMS scheme, we examine how noisy links degrade network performance during the exchange of these weight estimates. We analyze the steady-state mean-square deviation (MSD) and derive a theoretical expression for it, and we establish the mean and mean-square convergence conditions of the PDLMS algorithm in the presence of noisy links. Our analysis reveals that, unlike PDLMS over ideal links, the steady-state network MSD does not improve as the number of entries communicated at each iteration increases. Strictly speaking, the noisy links introduce an additional term into the MSD expression that complicates the derivation and has a noticeable effect on the overall performance; this term destroys the trade-off between communication cost and estimation performance that holds under ideal links. Simulation results illustrate the effect of noisy links on the PDLMS algorithm and show close agreement with the theoretical findings.
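To make the setting concrete, a minimal sketch of one PDLMS iteration over noisy links is given below in an adapt-then-combine form. The notation is an assumption of this sketch rather than taken from the paper: $\mu_k$ is the step size, $d_k(i)$ and $u_{k,i}$ the measurement and regression row vector at node $k$, $a_{lk}$ the combination weights over the neighborhood $\mathcal{N}_k$, $T_{l,i}$ a diagonal entry-selection matrix with ones at the positions node $l$ transmits at iteration $i$, and $v_{lk,i}$ the additive noise on the link from node $l$ to node $k$ (with $v_{kk,i}=0$, since a node's own estimate is not exchanged over a link). The exact recursion analyzed in the paper may differ in details.

\begin{align*}
\psi_{k,i} &= w_{k,i-1} + \mu_k\, u_{k,i}^{*}\bigl(d_k(i) - u_{k,i}\, w_{k,i-1}\bigr)
&& \text{(adaptation)}\\
w_{k,i} &= \sum_{l \in \mathcal{N}_k} a_{lk}\Bigl[\, T_{l,i}\bigl(\psi_{l,i} + v_{lk,i}\bigr) + \bigl(I_M - T_{l,i}\bigr)\psi_{k,i} \Bigr]
&& \text{(combination over noisy links)}
\end{align*}

In the combination step, the entries of $\psi_{l,i}$ that are not transmitted are replaced by node $k$'s own corresponding entries, while the transmitted entries arrive corrupted by the link noise $v_{lk,i}$; this perturbation is the source of the additional MSD term discussed above.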