This was a perennial problem in advertising even before the algorithmic age. Changes in behavior over time will change the data and, thus, the outcome. But the pace of change will be slow, too slow for many. Consciously overriding how one uses the data and model outputs can provide an important check on potentially discriminatory impacts without altering the problematic data sets themselves.

The problem here is much bigger than embedded bias (which is, of course, bad). The problem lies in assuming that past behavior always and everywhere indicates future actions. This assumption leaves no room for individual choices, for individuals who proactively seek out atypical opportunities, for individuals who do not fall into the middle of the distribution. The assumption that average past behavior determines future outcomes is too deterministic with or without the bias element, especially when it relies on data from prior eras, when consumers made decisions without realizing that their choices were being watched, cataloged, and computed. Social media changes behavior both because more options are available and because people are more self-conscious of having an audience.

People are not merely statistics. They are not merely passive recipients of marketing collateral. The bias issue is real, but it also sheds light on a much bigger problem with how behavioral data is used.