• Kongar@lemmy.dbzer0.com · 11 hours ago

    I think local compute will kill these huge data centers for AI. It’s amazing what you can do with free tools like Ollama or RAG agents like n8n, even on a business laptop with only 16 GB of RAM. If you’ve got a 4090 at home in your gaming PC and some big RAM sticks - well, you’d be surprised at what some models can do (and how quickly they can respond).
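
    For example, just a sketch - the model name is only an example, and anything in the few-GB range fits easily in 16 GB of RAM:

    # assumes Ollama is installed and its local server is running
    ollama pull llama3.2:3b    # small quantized model, around 2 GB
    ollama run llama3.2:3b "Summarize the pros and cons of running LLMs locally."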

    You all know how the internet works - in a short time someone’s going to put together a free tool that’s as easy as “click this button to install” and it’ll do 80% of what ChatGPT can do. That’s probably enough for the average user - for free.

    So how are they going to recoup all these billions spent on data centers if people’s personal computers can mostly do the same thing? How do they monetize your information and sell you ads if it’s all done locally? Go download one and ask it some questions - sure, it’s not perfect, but it’s surprisingly good for something hosted locally.

    I think the people spending these billions are starting to realize that… Meanwhile, I think this keeps video card prices high, unfortunately…

    • Bob Robertson IX@lemmy.world · 9 hours ago

      Unfortunately, I think most businesses will still prefer to have their AI solution hosted by a company like OpenAI rather than maintain their own. There’s still going to be a need for these large data centers, but I do hope most people realize that hosting your own LLM isn’t that difficult, and it doesn’t cost you your privacy.

      • Kongar@lemmy.dbzer0.com · 8 hours ago

        The cost is insane though. I think there’s a disconnect between what they want and what they can afford. I think it’s roughly a 10x adder per user license to go from regular Office 365 to a Copilot-enabled account. I know my company wants it hosted in the cloud - but we aren’t going to pay the going rates. It’s insane.

        Meh, we’ll see. But I do wonder what happens when these models get packaged up as an easy-to-install program.

        • Bob Robertson IX@lemmy.world · 4 hours ago

          Anyone running a newer MacBook Pro can install Ollama and run it with just a few commands:

          brew install ollama          # install the Ollama CLI and server via Homebrew

          brew services start ollama   # start the local server (or run `ollama serve` in another terminal)

          ollama run deepseek-r1:14b   # downloads the model on first run, then opens an interactive chat

          Then you can use it at the terminal, but it also has API access, so with a couple more commands you can put a web front end on it, or with a bit more effort you can add it to a new or existing app/service/system.
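
          For example, here’s a rough sketch of hitting that API with curl, assuming the server is on its default port (11434) and you’ve pulled the model above:

          # ask the locally hosted model a question over HTTP; "stream": false returns a single JSON response
          curl http://localhost:11434/api/generate -d '{
            "model": "deepseek-r1:14b",
            "prompt": "Explain retrieval-augmented generation in two sentences.",
            "stream": false
          }'

          Any app that can make an HTTP request can use it the same way, and the web front ends you can bolt on are basically just wrapping that same API.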