• Avid Amoeba
    8 months ago

    Why do I have the feeling this isn’t going to be a repeat of the standards-driven, co-operative development built on open source software infrastructure that we saw in the decade and a half after the dotcom bubble… I suspect it will look more like the pre-mass-computing world of AT&T, GE and IBM.

    • andyburke
      8 months ago

      There are a lot of open source LLMs being developed, including ones you can run at home on your own data.

        • @[email protected]
          8 months ago

          What would be the threshold for them to “take off”? They’re all already out, so aren’t we already there?

              • @[email protected]
                8 months ago

                I tried the LLaMA model for text, and another one meant for images; I can’t quite remember the name, but it was one of the main ones.

                Are they any good now? Running an LLM locally actually sounds mildly useful.

                • just another dev
                  8 months ago

                  The Mixtral models are pretty good, although they require a LOT of memory to run at a decent pace.
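
                  For anyone curious, a quantized build is the usual way to try this at home. A minimal sketch using llama-cpp-python, assuming you’ve already downloaded a GGUF quantization of Mixtral-8x7B (the file name and settings here are illustrative, not a recipe):

                  ```python
                  # Minimal local inference with llama-cpp-python (pip install llama-cpp-python).
                  # The model path is hypothetical; download a GGUF quantization of Mixtral first.
                  from llama_cpp import Llama

                  llm = Llama(
                      model_path="./mixtral-8x7b-instruct.Q4_K_M.gguf",  # hypothetical local file
                      n_ctx=4096,       # context window size
                      n_gpu_layers=-1,  # offload all layers to the GPU if it fits; 0 = CPU only
                  )

                  out = llm("Explain mixture-of-experts in one sentence.", max_tokens=128)
                  print(out["choices"][0]["text"])
                  ```

                  Even at 4-bit quantization, Mixtral-8x7B needs on the order of 25-30 GB of memory, which is where the complaint above comes from.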

                  • @[email protected]
                      8 months ago

                      Honestly, speed is something I don’t care too much about with models. Even things like ChatGPT will be slower than Google for most queries, and if a task is complex enough to be a good use case for an LLM, inference speed is unlikely to be the primary bottleneck.

                      My gf’s private chat bot right now is Mistral 7B with a custom finetune, and it directs some queries to ChatGPT if I ask (I got free tokens way back; might as well burn through them).

                    How much of an improvement is Mixtral over Mistral in practice?
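
                      A routing setup like the one described above can be quite small. A rough sketch, assuming a local GGUF finetune served through llama-cpp-python and the official OpenAI client for the ChatGPT leg; the file path, trigger prefix, and routing rule are all assumptions for illustration:

                      ```python
                      # Hypothetical router: answer locally by default, forward to ChatGPT on request.
                      from llama_cpp import Llama
                      from openai import OpenAI

                      local = Llama(model_path="./mistral-7b-finetune.Q5_K_M.gguf", n_ctx=4096)  # hypothetical path
                      remote = OpenAI()  # reads OPENAI_API_KEY from the environment

                      def ask(prompt: str) -> str:
                          # Forward to ChatGPT only when explicitly asked, e.g. "gpt: <question>".
                          if prompt.lower().startswith("gpt:"):
                              resp = remote.chat.completions.create(
                                  model="gpt-3.5-turbo",
                                  messages=[{"role": "user", "content": prompt[4:].strip()}],
                              )
                              return resp.choices[0].message.content
                          out = local(prompt, max_tokens=256)
                          return out["choices"][0]["text"]
                      ```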