• ChaoticNeutralCzech@feddit.de
    10 months ago

    Who could have guessed? Not every problem requires humanoid robot arms like this 🦾. The most efficient machines in factories are far simpler and specialized for their tasks.

  • AutoTL;DR@lemmings.world
    10 months ago

    This is the best summary I could come up with:

    Remember a year ago, back in November before we knew about ChatGPT, when machine learning was all about building models to solve a single task, like loan approvals or fraud protection?

    Jon Turow, a partner at investment firm Madrona, who formerly spent almost a decade at AWS, says the industry has been talking about emerging capabilities in large language models like reasoning and out-of-domain robustness.

    “When you’re looking at an aggregate level in a company, when there are hundreds of machine learning models being trained separately, that doesn’t make any sense,” Deo said.

    For Amazon, SageMaker, the company’s machine learning operations platform, remains a key product, one aimed at data scientists rather than at developers, the audience Bedrock targets.

    It would be foolhardy to give that up, and just because LLMs are the flavor of the moment doesn’t mean the technology that came before won’t remain relevant for some time to come.

    It’s worth noting that Amazon did announce upgrades to SageMaker this week, aimed squarely at managing large language models.

    The original article contains 753 words, the summary contains 178 words. Saved 76%. I’m a bot and I’m open source!