It didn't specify whether "fresh water" meant non-salt water from a natural source or treated drinkable water. I'd need to see the rainfall averages and drought susceptibility for the region the data center is in to know how much of a problem this actually is. Data center locations are often chosen based on cooling costs, availability of greener electricity, water, etc. The blanket statement that "only 3% of the world's water is fresh water" is pointless, alarmist language, because a good portion of the world is desert or arid: some places have very little water, others have an over-abundance, and location matters. Most of the problems with access to clean, drinkable water have more to do with overpopulating a region beyond what is sustainable or practical because of the nice weather (LA, for example). Putting a data center in Illinois has very little to do with water access in southern CA, where movies like Chinatown were depicting water access as a huge problem as far back as the 70s, because the region gets little rainfall, sits next to a desert, and isn't a good place to put a large populace without running into exactly that problem.
If they're building the data center in a desert or a drought-susceptible region where fresh water usage is already way past its limits, fine. But if the data center is in the Upper Midwest or parts of the Pacific NW, the water consumed there isn't going to have any impact on the areas that actually have a consumption problem.
I couldn't find on this page a way to quantify the breakdown between water lost on site (to evaporation, or degraded from potable to grey water) and water lost elsewhere, either in electricity production or earlier in semiconductor manufacturing. My impression, though, is that on-site loss is quite small compared to the indirect losses.
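To make that impression concrete, here's a minimal back-of-envelope sketch in Python. Every number in it (the 20 MW IT load, the PUE of 1.2, the on-site WUE of 1.8 L/kWh, the 3 L/kWh consumed by generation) is an assumption I've plugged in for illustration, since the page doesn't give a breakdown; swap in real figures for a specific facility and grid mix.

```python
# Back-of-envelope comparison of on-site evaporative water loss vs. the
# indirect water consumed generating the electricity a data center draws.
# All constants below are illustrative assumptions, not measured data.

IT_LOAD_MW = 20              # assumed IT load of the facility
HOURS_PER_YEAR = 8760
PUE = 1.2                    # assumed power usage effectiveness
WUE_ONSITE_L_PER_KWH = 1.8   # assumed on-site water use per kWh of IT energy
GRID_WATER_L_PER_KWH = 3.0   # assumed water consumed per kWh generated
                             # (varies hugely with the grid's generation mix)

it_energy_kwh = IT_LOAD_MW * 1000 * HOURS_PER_YEAR
total_energy_kwh = it_energy_kwh * PUE  # total draw including cooling overhead

onsite_water_l = it_energy_kwh * WUE_ONSITE_L_PER_KWH
indirect_water_l = total_energy_kwh * GRID_WATER_L_PER_KWH

print(f"On-site (cooling) water:     {onsite_water_l / 1e6:,.0f} ML/year")
print(f"Indirect (generation) water: {indirect_water_l / 1e6:,.0f} ML/year")
print(f"Indirect / on-site ratio:    {indirect_water_l / onsite_water_l:.1f}x")
```

With these made-up numbers the indirect share already comes out roughly twice the on-site share, and a water-heavy grid mix or the upstream chip fab water would push the ratio higher still.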