• themurphy@lemmy.ml · 2 months ago

        I think the guy above is just mad he can’t figure out how to use it. Always easier to be mad at the tool.

        • skuzz@discuss.tchncs.de · 2 months ago

          GPT is selectively useful. It’s also, as of the last few weeks, dumb as a bag of bricks. Dumber than usual. 4 and 4o are messed up, and 4 mini is an idiot. Not sure how they broke them, but it started roughly around the time of the assassination attempt. Not sure if it was a national security request or mere coincidence, but the timing lines up just the same.

          I’m even seeing 4o make comically dumb and stubborn programming mistakes lately, like:

          GPT: “I totally escaped that character”

          Me: “no, it’s the same as your previous response.”

          GPT: “Oh, sorry, here is the corrected code.” (replies with the same code again)

          I canceled my sub.

          • theshatterstone54@feddit.uk · 2 months ago

            replies with the same code again

            And that’s exactly why I’ve already given up on AI before even really getting into it. The only thing I use it for is getting a basic skeleton for a simple script, with the intention of turning it into a real script myself. It’s also pretty good at generating grep, sed and awk commands and one-liners (or at least it was when I last tried it) and, sometimes, at spotting mistakes in them.
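
            To illustrate, the kind of basic skeleton meant here is roughly an argument-parsing stub you then flesh out yourself. A minimal sketch in Python (hypothetical, not actual model output):

            ```python
            # Rough sketch of the kind of script skeleton an LLM is decent at producing:
            # argument parsing plus a main() stub you then turn into the real script yourself.
            import argparse

            def main() -> None:
                parser = argparse.ArgumentParser(description="Describe what the script does.")
                parser.add_argument("input", help="path to the input file")
                parser.add_argument("--verbose", action="store_true", help="print extra detail")
                args = parser.parse_args()

                # TODO: replace with the actual logic
                if args.verbose:
                    print(f"Processing {args.input}...")

            if __name__ == "__main__":
                main()
            ```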

        • funkless_eck@sh.itjust.works · 2 months ago

          Different guy here. It seemed to be fairly useful for software engineers solving quick issues where the answer isn’t immediately obvious, but it’s terrible at most other jobs.

          And part of why it’s bad is that you have to type what you want into a text box and read it back (unless you build your own custom API integration, which, it goes without saying, is also a terrible way to access a product for 99% of people).
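
          A custom API integration here would be something like calling the chat completions endpoint directly instead of going through the text box. A minimal sketch in Python (assumes an OPENAI_API_KEY environment variable; the model name and prompt are placeholders):

          ```python
          # Minimal sketch of a "custom API integration": send one prompt to the
          # OpenAI chat completions endpoint and print the reply.
          # Assumes OPENAI_API_KEY is set; model and prompt are placeholders.
          import os
          import requests

          def ask(prompt: str, model: str = "gpt-4o") -> str:
              resp = requests.post(
                  "https://api.openai.com/v1/chat/completions",
                  headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
                  json={"model": model, "messages": [{"role": "user", "content": prompt}]},
                  timeout=60,
              )
              resp.raise_for_status()
              return resp.json()["choices"][0]["message"]["content"]

          if __name__ == "__main__":
              print(ask("Summarize this meeting transcript in three bullet points."))
          ```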

          Another part of why it’s bad is that you’re sharing proprietary information with a stranger who is definitely cataloging and profiling it.

          Very few people interact with language in a way that is bidirectionally friendly with AI, and AI just isn’t very good at writing. It’s very good at creating strings of words that make sense and fit a theme, but most of what makes “very good” writing isn’t just basic competence in the language.

    • MajorHavoc@programming.dev · 2 months ago (edited)

      Yeah. It’s a legitimate business, where the funders at the top of the pyramid are paid by those that join at the bottom!

      • Manmoth@lemmy.ml · 2 months ago

        It’s a brand new, highly competitive technology, and ChatGPT has first-mover status with a trailer load of capital behind it. They are going to burn a lot of resources right now to innovate quickly, reduce latency, etc. If they reach a successful product-market fit, getting costs down will eventually be critical to it actually being a viable product. I imagine they will pipe this back into ChatGPT for some sort of AI-driven scaling solution for their infrastructure.

        TL;DR - It’s kind of like how a car uses most of its resources going from 0-60, and then efficiencies kick in at highway speeds.

        Regardless, I don’t think they will have to worry about being profitable for a while. With the competition heating up, I don’t think there is any way they don’t secure another round of funding.

        • WalnutLum@lemmy.ml · 2 months ago

          Facebook is trying to burn the forest around OpenAI and other closed models by removing the market for models as standalone products, releasing its own freely to the community. A lot of money is already pivoting away towards companies trying to find products that use the AI rather than the AI itself. Unless OpenAI pivots to something more substantial than just providing multimodal prompt completion, they’re gonna find themselves without a lot of runway left.

        • technocrit@lemmy.dbzer0.com · 2 months ago

          TL;DR - It’s kind of like how a car

          Yes. It’s an inefficient and unsustainable con that’s literally destroying the planet.

        • flappy@lemm.ee · 2 months ago

          If they run out of money (unlikely), they still have a recent history with Microsoft.

    • AlexWIWA@lemmy.ml · 2 months ago

      Sounds like we’re going to get some killer deals on used hardware in a year or so.

  • coffee_with_cream@sh.itjust.works · 2 months ago

    For anyone doing a serious project, it’s much more cost effective to rent a node and run your own models on it. You can spin them up and down as needed, cache often-used queries, etc.
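
    As a sketch of the caching idea, assuming a self-hosted, OpenAI-compatible inference server (the URL and model name below are placeholders):

    ```python
    # Sketch: memoize often-used queries against a self-hosted model so repeated
    # prompts never touch the rented node. Assumes an OpenAI-compatible server
    # (e.g. vLLM or llama.cpp) at localhost:8000; URL and model name are placeholders.
    from functools import lru_cache
    import requests

    API_URL = "http://localhost:8000/v1/chat/completions"

    @lru_cache(maxsize=1024)  # identical prompts are served from the cache
    def query(prompt: str, model: str = "my-local-model") -> str:
        resp = requests.post(
            API_URL,
            json={
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0,  # deterministic output makes caching meaningful
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    ```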

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 2 months ago

      For sure, and in a lot of use cases you don’t even need a really big model. There are a few niche scenarios where you require a large context that’s not practical to run on your own infrastructure, but in most cases I agree.

  • Aurenkin@sh.itjust.works · 2 months ago

    Last time a batch of these popped up, they were saying OpenAI would be bankrupt in 2024, so I guess they’ve made it to 2025 now. I wonder if we’ll see similar articles again next year.