> I recall early facial recognition models having a pretty hard time with dark-skinned faces
The app that Customs and Border Protection recently mandated for asylum seekers still has that problem; it's not a mere historical footnote of early efforts.
> In fact I remember the ChatGPT team putting tons of engineering into supervised learning to not let it fall victim to regurgitating these biases.
And I remember seeing demonstrations (here, on HN) within the last week of how ChatGPT still falls victim to regurgitating those biases. (And then it will lie about its ability to ensure that it won't do so in the future.)