YUP, and even if they build them, our electrical infrastructure is so under-prepared to power them that the Chinese are going to slaughter us on costs. Besides, every few months new consumer-level local AIs are released that are so good that the average person, should they feel the need for a datacenter AI, could just run one on their home computer.
I've tested Gemma4 on my local machine. It's good: 10+ tokens per second on a 26-billion-parameter MoE model, with text, voice, video, and image input, running on a video card with 8 GB of VRAM. I seriously don't know what the future is for big datacenter AI when local models are like this.
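For anyone curious how that "tokens per second" figure is usually measured: it's just generated tokens divided by wall-clock time. A minimal sketch, assuming a placeholder `generate_fn` standing in for whatever local runner you use (an Ollama or llama.cpp binding, for example; the function name here is hypothetical, not a real API):

```python
import time

def tokens_per_second(token_count: int, elapsed_s: float) -> float:
    """The throughput metric people quote for local LLMs:
    generated tokens divided by wall-clock seconds."""
    return token_count / elapsed_s

def measure(generate_fn, prompt: str) -> float:
    """Time one generation call and return its throughput.
    generate_fn is a stand-in for any local model runner that
    returns the list of generated tokens."""
    start = time.perf_counter()
    tokens = generate_fn(prompt)
    elapsed = time.perf_counter() - start
    return tokens_per_second(len(tokens), elapsed)

# Example: 300 tokens generated in 25 seconds works out to 12 tok/s,
# in the same ballpark as the 10+ tok/s figure above.
print(tokens_per_second(300, 25.0))
```

Note that prompt processing (prefill) is usually much faster than generation, so honest benchmarks time only the generated tokens.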