When you're training an AI, that "mere data" adds up. Random error averages out, getting closer to zero with every data point. Systematic error leaks information about the system that keeps making the error.
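To make the statistics concrete, here's a minimal sketch (the numbers and variable names are illustrative, not from the post) of why averaging over more data drives random error toward zero while a systematic bias survives no matter how many data points you add:

```python
import random

true_value = 10.0
bias = 0.5            # hypothetical systematic error
noise_scale = 2.0     # hypothetical random error magnitude

for n in [10, 1_000, 100_000]:
    samples = [true_value + bias + random.gauss(0, noise_scale) for _ in range(n)]
    estimate = sum(samples) / n
    # The random noise shrinks roughly as 1/sqrt(n); the 0.5 bias never goes away.
    print(f"n={n:>7}: estimate = {estimate:.3f}, residual error = {estimate - true_value:+.3f}")
```

And because the residual bias is stable rather than noise, it's informative: it points back at whatever produced it.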
A Harry Potter book doesn't ruin an AI's world model by contaminating reality with fantasy. It gives the model valuable data points on human culture and imagination and fiction tropes and commercially successful creative works. All of which is part of the broader "reality" the AI is trying to grasp the shape of as it learns from the vast unstructured dataset.