
I tried a site where AI matched me with someone based on my ‘hotness’ rating. It’s really messed up.

I tried Hot Chat 3000, a new website from the art collective MSCHF that says it’s helping to expose bias in AI systems. I uploaded my picture, it rated me, and the bot matched me with someone it said rated similarly.

Hot Chat 3000 is a dating website where AI determines your “hotness” using huge datasets.
The tongue-in-cheek website then lets you chat with other people within your “hotness” bracket.
The project was developed by MSCHF, a New York-based art collective that’s known for ‘Satan Shoes.’

We’re living in a brave new world where even our “hotness” level can be determined by AI — and a new website wants to expose just what we’re getting ourselves into.

I tried it to determine what AI supposedly thinks of me — but more on that later.

The art collective MSCHF launched the website, called Hot Chat 3000. It bills itself as a “1-to-1 online chat website where who you can talk to is contingent on how attractive you are, and how attractive you are is determined by AI.”

But the AI is only as good as the data that’s training it — and that’s where MSCHF is stepping in, essentially asking the question: Just who — or what — gets to determine who’s hot?

When a user enters Hot Chat 3000, they’re asked to upload a picture of themselves as electronic music hums in the background. The picture is “analyzed” and given a “hotness” rating on a scale of 1-to-10, which in turn dictates who the user can chat with. So a user who earns a score of 5.4 will be matched with another person who scores between 5.0 and 5.9. 
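
The site doesn’t publish its matching code, but the bracket rule described above amounts to grouping users by the whole-number part of their score. Here is a minimal sketch in Python; the function names are mine, not the site’s:

```python
# Minimal sketch of the bracket rule described above. This is an
# illustration, not Hot Chat 3000's actual code.
import math

def bracket(score: float) -> int:
    """A 5.4 falls in the 5.x bracket, a 6.8 in the 6.x bracket."""
    return math.floor(score)

def can_chat(score_a: float, score_b: float) -> bool:
    """Two users can be matched only if their scores share a bracket."""
    return bracket(score_a) == bracket(score_b)

print(can_chat(5.4, 5.9))  # True:  both in the 5.0-5.9 bracket
print(can_chat(5.4, 6.1))  # False: different brackets
```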

The rating system depends on certain large language models

Hot Chat 3000 says that its rating system relies predominantly on CLIP, a large machine-learning model from OpenAI that has been trained to “choose the correct captions for a particular image among a number of incorrect text captions.” The model itself was trained on a dataset of 400 million image-text pairs.
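
MSCHF hasn’t published its scoring pipeline, but the underlying mechanism it describes, comparing an image against candidate text captions to see which one CLIP judges the best fit, can be sketched with the openly available model. The checkpoint name and captions below are assumptions chosen for illustration, not the prompts the site actually uses:

```python
# Illustrative sketch of CLIP image-text matching, not MSCHF's pipeline.
# The checkpoint and captions here are assumptions for the example.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("selfie.jpg")  # hypothetical user photo
captions = [
    "a photo of a very attractive person",
    "a photo of an average-looking person",
    "a photo of an unattractive person",
]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Higher logits mean a better image-text match; softmax turns them into
# probabilities over the candidate captions.
probs = outputs.logits_per_image.softmax(dim=-1)
for caption, p in zip(captions, probs[0].tolist()):
    print(f"{p:.2f}  {caption}")
```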

In addition to CLIP, the site’s rating system utilizes the “Hot or Not” dataset from the website hotornot.com as well as the SCUT-FBP5500 dataset for facial beauty prediction. (Hot or Not was a website in the early 2000s that let people rate whether users who had submitted their pictures were “hot” or “not.”)


“Both datasets are heavily biased towards certain ethnicities and not at all representative of the broader population,” the site notes. To that end, Hot Chat 3000 also said it used a small album of pictures from Google featuring a broader selection of ages, races, and genders to manually adjust its model.

“The outputs of LLMs are a reflection of the data they were trained on,” the group says, referring to large language models. “Hot Chat 3000 very deliberately sets out to expose, visualize, exacerbate these biases,” the site says. 

Which is to say that no one should take their Hot Chat 3000 “hotness” score seriously. The group told Artnet that “MSCHF’s approach is always to participate natively in the space we are critiquing or satirizing. A.I. will be folded into a million arbitrary applications.”

Hot Chat 3000 is one of several tongue-in-cheek projects that MSCHF has produced since it launched in 2016. 

The collective — which is headed by former BuzzFeed employee Gabriel Whaley — once described itself on LinkedIn as “a dairy company.” Its roster of stunts over the past few years includes AI-generated photos of feet, an app for making stock investments based on astrological signs, and a “Satan Shoe” made with human blood that led to a lawsuit from Nike, Insider previously reported.

It’s unclear how much revenue MSCHF generates from its endeavors. According to PitchBook, though, the company was valued at close to $120 million in its April 2021 Series B round, in which it raised funding from venture firms including Peter Thiel’s Founders Fund. Whaley didn’t respond to Insider’s request for comment.

The moment of truth: Testing Hot Chat 3000

I braced myself and tested Hot Chat 3000 over two different days — uploading 15 different pictures of myself to understand where I fell on the AI-determined “hotness” scale.

I got an error message for some, but for the six pictures the site could read, I was given a comically wide range of scores between 3.7 and 6.8. The same picture was also given different scores on different days.

When “6.0” was matched with me, he immediately disconnected the chat after I said hello.

The New York Post reported that it ran pictures through the AI bot and actress Sydney Sweeney received a “5.2,” while her co-star Glen Powell scored an “8.” MSCHF’s Whaley himself tweeted last week that he scored a 5.5.

When I finally entered the chat, I used my “hottest” pic (as one does) and was matched with a fellow 6-er.

There are no names in the world of Hot Chat 3000; instead, everyone is identified by their score. So, I became “6.8” and my chat partner was called “6.” 

I said “Hi.” And “6” immediately disconnected the chat — thankfully.

Read the original article on Business Insider
