On the subject of splitters, as a general rule each 2-way split cuts signal power by about 3.5 dB, a bit more than half (a true half would be 3 dB; the extra is insertion loss). When people add to a coax network, it's common to split a line that's already been split, then split it again. Those losses add, so with four 2-way splitters in series the two leads off the last splitter are both down about -14 dB while the first split is only -3.5 dB. You can, and often will, get different performance at different end points, though that also depends on each TV's tuner sensitivity and error correction (assuming digital).
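To make the math above concrete, here's a quick sketch of the loss at each tap of a daisy chain. It assumes a flat 3.5 dB per 2-way split (a typical figure; real splitters vary) and ignores cable loss:

```python
# Cumulative loss in a daisy-chained run of 2-way splitters.
# Assumes ~3.5 dB insertion loss per split; cable loss ignored.

SPLIT_LOSS_DB = 3.5  # typical per-split loss for a 2-way splitter

def daisy_chain_losses(num_splitters: int) -> list[float]:
    """Loss in dB at the tap of each splitter down the chain.

    Splitter k sits behind k splits, so its legs are down k * 3.5 dB.
    Both leads off the last splitter share the deepest loss.
    """
    return [round(k * SPLIT_LOSS_DB, 1) for k in range(1, num_splitters + 1)]

print(daisy_chain_losses(4))  # [3.5, 7.0, 10.5, 14.0]
```

So the fourth splitter's leads are at -14 dB, matching the chain described above.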
The better way is to use a 4- or 8-way splitter with home runs to each TV, which shares the pain evenly across all ports and eliminates the losses associated with each extra connection (the splits are internal). Big splitters are also often (but not always) amplified, more to offset the splitting itself than long cable runs, since a clean run of good-quality cable doesn't lose that much signal (a pretty fuzzy statement, admittedly).
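For comparison, here's the ideal power-division loss of a single N-way splitter, every port sharing it equally. This is just the textbook 10·log10(N) figure; real parts spec higher (an 8-way is typically around 10.5-11.5 dB once excess loss is included):

```python
import math

def ideal_split_loss_db(ports: int) -> float:
    """Ideal power-division loss of an N-way splitter, in dB.

    Ignores excess (insertion) loss, so real hardware will be a
    dB or two worse; every port shares the same figure.
    """
    return 10 * math.log10(ports)

# An 8-way ideally costs ~9 dB at every port, versus -14 dB at the
# far end of a four-deep chain of 2-way splitters.
print(round(ideal_split_loss_db(8), 1))  # 9.0
```

Even with real-world excess loss, every port of the 8-way beats the worst port of the daisy chain, and an amplified unit can make up the difference entirely.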
When we first moved in, my cable company replaced the mess of splitters with a single amplified 2-way splitter: one leg to the modem*, the other to an 8-way splitter that serves most of the rest of the house (I eventually had to add another 2-way off the 8-way). And just so you know, there are amplified splitters that split off a dedicated modem port at the first internal 2-way split, then divide the remaining signal among the TV ports.
*Cable guy said they always split off for the modem first, so it gets the cleanest signal they can reasonably provide, then split the rest to the TVs and cable boxes where applicable. That was almost 20 years ago, and I have no idea what's considered best practice these days. And now every TV in my house that has cable has a box, since it's all digital and encrypted here.