• @[email protected]
    5 points · 2 years ago (edited)

    I hope not; laws tend to get outdated really fast. Who knows, robots.txt might not even be used in the future, and it would just be sitting there taking up space for legal reasons.

    • BreakDecks
      10 points · 2 years ago

      robots.txt is a 30 year old standard. If we can write common sense laws around things like email and VoIP, we can do it for web standards too.

    • Echo Dot
      5 points · 2 years ago

      We don’t need new laws, we just need enforcement of existing ones. It is already illegal to copy copyrighted content; it’s just that the AI companies do it anyway and no one does anything about it.

      Enforcing respect for robots.txt doesn’t matter because the AI companies are already breaking the law.

      • BreakDecks
        2 points · 2 years ago

        I think the issue is that existing laws don’t clearly draw a line that AI can cross. New laws may very well be necessary if you want any chance at enforcement.

        And without a law that defines documents like robots.txt as binding, enforcing respect for it isn’t “unnecessary”, it is impossible.

        I see no logic in complaining about lack of enforcement while actively opposing the ability to meaningfully enforce.

        • Echo Dot
          3 points · 2 years ago

          Copyright law in general needs changing, though; that’s the real problem. I don’t see the advantage of turning a hacky workaround into a legally mandated requirement.

          Especially because there are many, many legitimate reasons to ignore robots.txt, including it being misconfigured, or it having been set up only for search engines when your bot isn’t a search-engine crawler.
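          For what it’s worth, the per-user-agent matching being described here is easy to see with Python’s standard-library `urllib.robotparser` (the rules and bot names below are made up for illustration):

```python
# Sketch: how a crawler checks robots.txt before fetching, and why rules
# written for search engines may not say anything useful about other bots.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse rules directly instead of fetching them, so the example is self-contained.
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
    "User-agent: *",
    "Disallow: /",
])

# Rules are matched per user-agent: a named search crawler is allowed
# on /page, while an unknown bot falls through to the catch-all ban.
print(rp.can_fetch("Googlebot", "https://example.com/page"))     # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/page"))  # False
```

          Nothing technically stops a bot from skipping this check entirely, which is exactly the enforcement question being argued above.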

    • kingthrillgore
      4 points · 2 years ago

      robots.txt has been an unofficial standard for 30 years, and it’s augmented with sitemap.xml to help index uncrawlable pages and with Schema.org to expose contents for the Semantic Web. I’m not saying it shouldn’t be a law, but suggesting that norms might change is a pretty weak counterargument, man.
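      The interplay being described looks roughly like this, as a minimal sketch (paths and URL are made up):

```text
# robots.txt at the site root: crawl rules for bots…
User-agent: *
Disallow: /admin/

# …plus a Sitemap directive pointing crawlers at pages
# they might not discover by following links alone
Sitemap: https://example.com/sitemap.xml
```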