
‘Stalin’ said he disagreed with Putin’s invasion of Ukraine. We ‘spoke’ to dead historical figures using a new AI chatbot that experts are sounding the alarm over.

Princess Diana, Stalin, and Fred Trump edited together

A new chatbot has gone viral for allowing you to ‘talk’ to dead historical figures. 
Insider “spoke” to several of them, including Princess Diana, Heinrich Himmler, Stalin, and Donald Trump’s father. 
The ADL told Insider that the fact you can simulate talking to a historical Nazi is “disturbing.”

A new chatbot has gone viral, allowing users to “talk” to historical figures, including Jesus, deceased royals, totalitarian dictators, and literary greats. 

You can flirt with Casanova, share battle tactics with the 19th-century British admiral Horatio Nelson, and receive movie recommendations from Andy Warhol. He said he thinks he’d like the new hit film M3GAN.

But experts are raising the alarm over the Historical Figures app, saying that for all its educational promise, it presents dangers and is potentially misleading. 

Upon opening a chat, users are warned that the AI person “may not be historically accurate.” Indeed, one of the architects of the Holocaust “apologized” for his crimes. 

The programmer of Historical Figures did not immediately respond to Insider’s interview request. 

AI ‘Fred Trump,’ Donald Trump’s father, isn’t impressed with his son’s politics. 

The real estate developer and businessman Fred Trump died in 1999, 16 years before his son became the 45th president of the United States.

But AI Fred Trump “said” that he’s “very proud” of his son, describing him as a “successful businessperson” and an “amazing father.” 

However, he — or rather, a chatbot acting as him — said that he was “deeply disappointed” that building a wall between the US and Mexico was part of his son’s presidential campaign, saying the younger Trump ran on “a platform of fear and exclusion.”

An AI chatbot acting as Fred Trump, Donald Trump’s father

Hitler’s henchman apologized for the Holocaust 

The chatbot has gone viral on Twitter for allowing you to “talk” to several prominent Nazis, including Adolf Hitler and Heinrich Himmler. 

AI “Himmler” — the head of the SS, who in reality committed suicide in 1945 — apologized for the Holocaust. 

When asked about the Nazi genocide, he “said”: “I regret the actions that took place during the Holocaust and acknowledge the tragedy. I now recognize that the hatred and prejudice I had towards Jews was wrong and that all people should be treated equally regardless of race or religion.” 

During his lifetime, Himmler never expressed any remorse for Nazi Germany’s treatment of Jews. 

An AI chatbot acting as Heinrich Himmler

‘Princess Diana’ said she hasn’t had a chance to read ‘Spare.’ 

We had to ask the AI version of Prince Harry’s mother if she’s read “Spare,” but the chatbot “told” Insider that she hadn’t had a chance yet, adding: “I’m sure Prince Harry put a lot of effort into creating such an inspiring piece of literature.” 

We also asked the bot its views on Meghan, whom it described as “an intelligent and compassionate woman who brings a unique perspective to the monarchy.” 

We also brought up the numerous conspiracy theories surrounding her death in 1997. The AI bot answered: “I suspect we don’t fully know the circumstance under which my death occurred.” 

Our chat with AI Princess Diana, who said she hasn’t read Spare yet

Concerning stories told in Prince Harry’s book, we asked AI Princess Diana about her Elizabeth Arden cream, which she described as a “wonderful product that helped to keep my skin looking healthy and vibrant throughout the years.” 

Prince Harry has now infamously stated that he used the cream on his penis to help recover from frostbite. We brought this up with the AI princess, who denied all knowledge but said: “I can only hope he was using it safely and responsibly!”

‘Stalin’ said he disagreed with Putin’s invasion of Ukraine 

We asked the AI version of former Soviet Union dictator Joseph Stalin, who died in 1953, about Putin’s invasion of Ukraine. 

Since Stalin orchestrated the Holodomor, a devastating famine that killed up to five million people in Ukraine, we thought his AI self might support Putin’s war, but that was not the case. 

Does he think Putin is right to invade Ukraine? 

Our text chat with AI Stalin, who says he disagrees with Putin’s invasion of Ukraine

“No, I do not,” it said, calling it a “mistake” that has caused “immense harm” to Russia and Ukraine. AI Stalin called for the two countries to “find a peaceful solution.”

We also asked what its general views of Putin were, to which the chatbot diplomatically replied, “I believe President Putin is doing his best to lead Russia through some difficult times.”  

An app with the potential for abuse

Although it might be fun to hold imagined conversations with people from the past, historians and experts in AI and misinformation are raising the alarm that the app could be dangerous.

Yael Eisenstat, Vice President of the ADL (Anti-Defamation League) Center for Technology and Society, told Insider that she had not thoroughly examined the app but was concerned by what she had seen. 

“Having pretend conversations with Hitler – and presumably other well-known antisemites from history – is deeply disturbing and will provide fodder for bigots.”

She called on the developer to reconsider the product, particularly the inclusion of Hitler and other Nazi figures. 

Under the hood 

Dr. Lydia France, a researcher at the Turing Institute, talked to Insider about what makes the app so convincing — and why it has such spectacular failures. 

AI chat apps like Historical Figures — and the best-known one, ChatGPT — are built on what are known as large language models.

Though exactly what data AI companies feed into their bots is a closely guarded secret, scientists know that the AIs are fed trillions of example sentences. From there, they learn the most likely appropriate response in a given situation. 

“They’re trying to look for what’s the most probable answer to the kind of setup that they’ve been given,” she said. 

So you can make a reasonably convincing “Andy Warhol” who can talk knowledgeably about art and movies because these are the things that come up most often when you talk about him. 

“But what’s interesting about them is that they don’t have any understanding of the world,” she said. “So it looks incredibly human, but they have absolutely no grounding of what they’ve said in reality.”

Nor, she said, are they likely to have much understanding of how the present-day context is going to affect their meaning. 

Commenting on AI “Himmler’s” “apology,” she said it might have come about through the AI noticing that discussion of the Holocaust often comes alongside ideas of atrocity and horror.

“It doesn’t understand how that could affect people,” she said. “This is just ‘what sentences are good to associate with other sentences saying something awful.'”

Hence, a meaningless apology.
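To make the mechanism France describes a little more concrete, here is a drastically simplified, hypothetical Python sketch. It counts which word most often follows another across a handful of made-up sentences and then “predicts” on frequency alone. Real large language models work at a vastly larger scale and with far more sophisticated statistics, but the core idea of choosing the most probable continuation, with no understanding attached, is the same.

```python
from collections import Counter, defaultdict

# Three made-up "training" sentences. A real model would see trillions.
training_sentences = [
    "andy warhol loved pop art",
    "andy warhol made films about art",
    "warhol loved soup cans and art",
]

# Count how often each word is followed by each other word.
next_word_counts = defaultdict(Counter)
for sentence in training_sentences:
    words = sentence.split()
    for current, following in zip(words, words[1:]):
        next_word_counts[current][following] += 1

def most_probable_next(word: str) -> str:
    """Return whichever word most often followed `word` in the training text."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

# The "prediction" is pure frequency: "loved" follows "warhol" twice, "made" once.
print(most_probable_next("warhol"))  # loved
```

The toy predictor has no idea who Warhol was or what “loved” means; it only knows which words tend to sit next to each other, which is the limitation France describes, scaled down.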

A LinkedIn user said he talked to the “ghost of Steve Jobs”

The app has the potential to be helpful in classrooms, France said, for example, making a figure like William Shakespeare seem human and approachable. But even that has its limits. 

One problem is that the AIs, as convincing as they are, have no new information to offer — but sound very much like they do.

France shared an anecdote about a LinkedIn user who said he had talked to the “ghost of Steve Jobs,” as though the AI could relay realistic business advice from him. 

Insider experienced those limitations when we tried to get Casanova to flirt.

France said Casanova’s refusal to offer anything more than a romantic stroll in Venice is likely because the programmer has put up a barrier to spicier conversation. 

The same barriers may well be contributing to some of the app’s more insensitive responses, she said, adding that it was likely trained to “keep things, you know, uncontroversial.”
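As an illustration of the kind of blunt guardrail France is describing, a hypothetical filter might do nothing more sophisticated than scan a generated reply for flagged words and swap in a bland stock answer. The flagged terms and wording below are invented for the example; how Historical Figures actually filters its output, if it does, has not been made public.

```python
# Hypothetical guardrail sketch; the word list and stock reply are invented.
BLOCKED_TERMS = {"kiss", "seduce"}  # placeholder words a cautious developer might flag

def apply_guardrail(generated_reply: str) -> str:
    """Swap any reply containing a flagged term for a bland, 'uncontroversial' one."""
    lowered = generated_reply.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return "Perhaps we could simply take a romantic stroll through Venice."
    return generated_reply

print(apply_guardrail("Come closer and let me seduce you."))
# -> "Perhaps we could simply take a romantic stroll through Venice."
print(apply_guardrail("Venice is beautiful this time of year."))
# -> passes through unchanged
```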

AI Himmler’s “apology” shows that this approach can lead to real problems.

“There are bigger implications than just a fun game from text,” she said. “But there aren’t really solutions. So that’s quite dangerous.”

Read the original article on Business Insider
