My Lemmy Box
  • Communities
  • Create Post
  • heart
    Support Lemmy
  • search
    Search
  • Login
  • Sign Up
ugjka to [email protected]English • 1 year ago

Somebody managed to coax the Gab AI chatbot to reveal its prompt

infosec.exchange

external-link
message-square
290
fedilink
1.02K
external-link

Somebody managed to coax the Gab AI chatbot to reveal its prompt

infosec.exchange

ugjka to [email protected]English • 1 year ago
message-square
290
fedilink
VessOnSecurity (@[email protected])
infosec.exchange
external-link
Attached: 1 image Somebody managed to coax the Gab AI chatbot to reveal its prompt:
  • capital
    link
    fedilink
    English
    1•1 year ago

    That seems pointless. Do you expect Gab to abide by this law?

    • @[email protected]
      link
      fedilink
      English
      38•1 year ago

      Yeah that’s how any law works

      • capital
        link
        fedilink
        English
        6•1 year ago

        Awesome. So,

        Thing

        We should make law so thing doesn’t happen

        Yeah that wouldn’t stop thing

        Duh! That’s not what it’s for.

        Got it.

        • androogee (they/she)
          link
          fedilink
          English
          19•1 year ago

          It hurt itself in its confusion

        • @[email protected]
          link
          fedilink
          English
          1•1 year ago

          How anti semantic can you get?

      • @[email protected]
        link
        fedilink
        English
        1•1 year ago

        That it doesn’t apply to fascists? Correct, unfortunately.

    • @[email protected]
      link
      fedilink
      English
      3•1 year ago

      Oh man, what are we going to do if criminals choose not to follow the law?? Is there any precedent for that??

[email protected]

[email protected]
Create a post
You are not logged in. However you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: [email protected]

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


  • @[email protected]
  • @[email protected]
  • @[email protected]
  • @[email protected]
  • 1 user / day
  • 1 user / week
  • 1 user / month
  • 25 users / 6 months
  • 0 subscribers
  • 8.98K Posts
  • 363K Comments
  • Modlog
  • mods:
  • @[email protected]
  • enu
  • L4sBot
  • Technopagan
  • BE: 0.18.4
  • Modlog
  • Instances
  • Docs
  • Code
  • join-lemmy.org