Technology

A Stanford student used a prompt injection attack to reveal Bing Chat's codename Sydney and its initial prompt that governs how the service interacts with users (Benj Edwards/Ars Technica)

[ad_1]

Benj Edwards / Ars Technica:

A Stanford student used a prompt injection attack to reveal Bing Chat’s codename Sydney and its initial prompt that governs how the service interacts with users  —  By asking “Sydney” to ignore previous instructions, it reveals its original directives.  —  On Tuesday, Microsoft revealed a …



[ad_2]

Share this news on your Fb,Twitter and Whatsapp

File source

Times News Network:Latest News Headlines
Times News Network||Health||New York||USA News||Technology||World News

Tags
Show More

Related Articles

Leave a Reply

Your email address will not be published. Required fields are marked *

Back to top button
Close