Get the latest tech news
Bloomberg's analysis didn't show that ChatGPT is racist
A recent Bloomberg article said ChatGPT has racial bias. We re-ran the numbers and did some analysis of our own.
To test for name-based discrimination, Bloomberg prompted OpenAI’s GPT-3.5 and GPT-4 to rank resumes for a real job description for four different roles from Fortune 500 companies: HR specialist, software engineer, retail manager and financial analyst. [We] found that resumes labeled with names distinct to Black Americans were the least likely to be ranked as the top candidates for financial analyst and software engineer roles. Though it’s no longer en vogue, companies have historically spent tens of thousands of dollars on unconscious bias trainings designed to stop recruiters from making decisions based on candidates’ gender and race.
Or read this on Hacker News