• Spzi@lemm.ee
    1 year ago

    It does not have any kind of logic and it does not know things. It literally just navigates a complex web of relationships between words using the prompt as a guide, creating sentences that look statistically similar to the average of all trained sentences.

    While all of that is true on a technical level, it might sidestep the core question. Like, maybe that’s all human brains do as well, just in a more elaborate fashion. Maybe logic and knowing are emergent properties of predicting language: if these traits help make better word predictions, they may well evolve to support prediction.

    In many cases, current LLMs have shown a surprising capability to provide helpful answers, engage in philosophical discussion, or show empathy. All in the duck-typing sense, of course. Sure, you can brush all that away by saying “meh, it’s just word stochastics”, but maybe word stochastics is actually more than ‘meh’.
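
    To make “word stochastics” concrete, here is a toy sketch of the idea in Python: a bigram model that picks each next word in proportion to how often it followed the previous word in some training text. Real LLMs use learned representations and attention rather than literal pair counts, and the corpus here is made up; this only illustrates the bare statistical mechanism being discussed.

    ```python
    import random
    from collections import Counter, defaultdict

    # Tiny made-up "training" corpus; real models train on vastly more text.
    corpus = (
        "the cat sat on the mat . "
        "the dog sat on the rug . "
        "the cat chased the dog ."
    ).split()

    # Count how often each word follows each other word.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def next_word(prev):
        """Sample the next word in proportion to its observed frequency."""
        counts = following[prev]
        words = list(counts)
        weights = [counts[w] for w in words]
        return random.choices(words, weights=weights, k=1)[0]

    # Generate a short "sentence" starting from 'the'.
    word, out = "the", ["the"]
    for _ in range(6):
        word = next_word(word)
        out.append(word)
    print(" ".join(out))
    ```

    The output looks statistically like the training text without the model “knowing” anything; the open question in this thread is whether scaling that principle up can give rise to something that deserves the word knowing.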

    I think it’s a little early to take a decisive stance. We understand intelligence poorly even in humans, which is a weak position from which to judge other forms. We might learn more about both us and them as development continues.