• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • One of the first things they teach you in Experimental Physics is that you can’t derive a curve from just 2 data points.

    You can just as easily fit an exponential growth curve to 2 points like that (one 20% above the other) as you can a sinusoidal curve, a linear one, an inverse square curve (which actually grows to a peak and then eventually goes down again), or any of the many curves where growth has ever-diminishing returns and can’t go beyond a certain point (literally “with a limit”).

    I think the point that many are making is that LLM growth in precision is the latter kind of curve: growing, but ever more slowly, and tending to a limit which is much less than 100%. It might even be more like the inverse square one (in that it might actually go down) if the output of LLM models ends up polluting the training sets of later models, which is a real risk.

    Your showing that there was some growth between two versions of GPT (so, 2 data points, a before and an after) doesn’t disprove this hypothesis. It doesn’t prove it either: as I said, 2 data points aren’t enough to derive a curve (see the sketch at the end of this comment).

    If you do look at the past growth of precision for LLMs, whilst improvement is still happening, the rate of improvement has been going down, which does support the idea that there is a limit to how good they can get.
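
    As a concrete illustration (with made-up numbers, not real benchmark figures), take the two points (0, 1) and (1, 1.2), i.e. the second one 20% above the first. All three of the following curves pass exactly through both points, yet behave completely differently afterwards:

    ```latex
    % Three curves through (0, 1) and (1, 1.2) -- made-up numbers, purely illustrative
    y = 1.2^{x}                    % unbounded exponential growth
    y = 1 + 0.2\,x                 % linear growth
    y = 1.3 - 0.3 \cdot 3^{-x}     % saturating growth ("with a limit"), tending to 1.3
    ```

    At x = 5 these give roughly 2.49, 2.0 and 1.299 respectively: the same two data points, wildly different long-term behaviour.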




  • Sounds like a critical race condition or a bad memory access (the latter only in languages with pointers).

    Since it’s HTTP(S), and judging by the average level of multi-threading experience I’ve seen even in developers doing work that naturally involves multiple threads (such as serving multiple simultaneous network clients), my bet is the former.

    PS: Yeah, I know it’s a joke, but I made the serious point anyway because it might be useful for somebody; a small sketch of the race-condition case follows.
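
    To make that concrete, here’s a minimal, self-contained sketch of a race condition (in C with pthreads, nothing to do with the actual code being joked about): two threads increment a shared counter with no synchronisation, and updates get lost only some of the time, which is exactly why this class of bug looks like it fails at random.

    ```c
    /* Build with: gcc -pthread race.c -o race */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERATIONS 1000000

    static long counter = 0;            /* shared state, nothing protecting it */

    static void *worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < ITERATIONS; i++)
            counter++;                  /* read-modify-write: not atomic */
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;

        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);

        /* Expected 2000000; typically prints a smaller number that changes run to run. */
        printf("counter = %ld (expected %d)\n", counter, 2 * ITERATIONS);
        return 0;
    }
    ```

    Guarding the counter with a mutex (or making it atomic) makes the result deterministic again, which is the usual fix once the race has actually been found.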





  • If there’s one thing the last couple of decades have taught me, it’s that there is no such thing as a brand you can trust forever: even privately owned family brands sometimes get bought out by some conglomerate or taken public, followed by enshittification as the new management tries to squeeze all the value they can out of the brand.

    You’re better off not going by brand and instead researching every big-ticket purchase you want to make: if you’re going to spend $1000, it’s probably worth a couple of hours of your time looking into it beforehand, unless your hourly rate is pretty high.


  • Quis custodiet ipsos custodes? (Who watches the watchmen?)

    One should be even more skeptical of, and demanding of proof from, wannabe trust-gatekeepers of the entire Internet than one already should be of individual news media outlets: the former set themselves up as supervisors of trust in the latter, yet have even less proven trustworthiness.

    So it’s curious that the [email protected] mods keep pushing for people reading posts in that community to use, as their trust gatekeeper, this specific self-anointed one which has repeatedly shown itself to be biased (quite a lot to the Right of the political spectrum, and pro-Zionist).

    I keep downvoting it because such action reeks of manipulation and is exactly the kind of thing that State Actors and Political Actors would do to shape opinions in this day and age, when people can read articles from anywhere in the World.



  • Yeah.

    Any good software developer is going to account for, and even test, all the weird situations they can think of… and not the ones they cannot think of, since they aren’t even aware those are a possibility (if they were, they would account for and test them too).

    Which is why you want somebody with a different mindset to independently come up with their own situations.

    It’s not a value judgment on the quality of the developer, it’s just accounting, at the software development process level, for the fact that humans are not all-knowing, not even devs ;)


  • I’ve actually worked with a genuine UX/UI designer (not a mere Graphics Designer but their version of a Senior Developer-Designer/Technical-Architect).

    Let’s just say most developers aren’t at all good at user interface design.

    I would even go as far as saying most Graphics Designers aren’t all that good at user interface design.

    Certainly that explains a lot of the shit user interface design out there, just as the “quality” of most common Frameworks and Libraries (such as those from the likes of Google) can be explained by them not actually having people with real-world Technical Architect, or even Senior Designer-Developer, experience overseeing the design of Frameworks and Libraries meant for 3rd party use.


  • I actually have a ton of professional Java experience and have done a lot of microcontroller stuff of late (mainly for fun), and if you approach software for ARM Cortex-M microcontrollers the Java way you’re going to end up with overengineered bloatware.

    It’s not a Java-only thing, however: most of those chips have too little memory and too few processing resources to design the whole program in a pure OO way, plus you’re pretty much coding directly at the low level (with at most a thin Hardware Abstraction Layer between your code and direct register manipulation, roughly as in the sketch below), so having only ever used high-level OO languages isn’t really good preparation for it. That applies not just to people with only Java experience, but also to those whose entire experience is with things like C#/.NET or the smartphone frameworks and languages (Objective-C, Kotlin, Swift).
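
    For anyone who hasn’t touched that level before, this is roughly what “direct register manipulation with at most a thin HAL” means in practice. It’s a minimal sketch only: the register addresses, offsets and pin number below are made up for illustration, since the real ones come from the specific chip’s reference manual or the vendor’s device headers.

    ```c
    /* Bare-metal style sketch: toggling a GPIO pin by writing to memory-mapped
     * registers. Addresses, offsets and bit positions are ILLUSTRATIVE ONLY. */
    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO registers (a thin "HAL": just named pointers). */
    #define GPIO_BASE   0x48000000u                                   /* made-up base address */
    #define GPIO_MODER  (*(volatile uint32_t *)(GPIO_BASE + 0x00))    /* pin mode register    */
    #define GPIO_ODR    (*(volatile uint32_t *)(GPIO_BASE + 0x14))    /* output data register */

    #define LED_PIN     5u                                             /* made-up pin number   */

    static void led_init(void)
    {
        /* Configure the pin as a general-purpose output (2 mode bits per pin). */
        GPIO_MODER &= ~(0x3u << (LED_PIN * 2));
        GPIO_MODER |=  (0x1u << (LED_PIN * 2));
    }

    static void led_toggle(void)
    {
        GPIO_ODR ^= (1u << LED_PIN);   /* flip the output bit directly */
    }

    int main(void)
    {
        led_init();
        for (;;) {
            led_toggle();
            for (volatile uint32_t i = 0; i < 100000; i++)
                ;                      /* crude busy-wait delay: no OS, no threads */
        }
    }
    ```

    No objects, no heap, no runtime: just named memory addresses and bit operations, which is precisely the part that a purely high-level OO background doesn’t prepare you for.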