• gerryflap@feddit.nl

    I’m not a hundred percent sure, but afaik it has to do with how random the output of the GPT model is. At temperature 0 it always picks the most probable next continuation of the text according to its own prediction. The higher the temperature, the more chance less probable outputs have of getting picked. So it’s still most likely to pick 42, but as the temperature increases you see the chance of (according to the model) less likely numbers go up.

    This is how temperature works in the softmax function, which is often used in deep learning: the logits get divided by the temperature before the softmax turns them into probabilities, so a higher temperature flattens the distribution.
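
    For anyone curious, here’s a minimal sketch of temperature-scaled softmax sampling. The logits and the token labels are made up purely for illustration, not taken from any real model:

    ```python
    import numpy as np

    # Hypothetical logits a model might assign to a few candidate numbers
    # (made-up values, e.g. for the tokens "42", "7", "69", "1").
    logits = np.array([5.0, 2.0, 1.0, 0.5])

    def sample_with_temperature(logits, temperature, rng=np.random.default_rng()):
        """Sample an index from temperature-scaled softmax probabilities."""
        if temperature == 0:
            # Temperature 0 degenerates to greedy decoding: always pick the argmax.
            return int(np.argmax(logits))
        scaled = logits / temperature                   # divide logits by the temperature
        scaled -= scaled.max()                          # shift for numerical stability
        probs = np.exp(scaled) / np.exp(scaled).sum()   # softmax
        return int(rng.choice(len(logits), p=probs))

    # Low temperature: almost always index 0 ("42").
    # High temperature: the distribution flattens and the other numbers show up more often.
    for t in (0.2, 1.0, 2.0):
        samples = [sample_with_temperature(logits, t) for _ in range(1000)]
        print(t, np.bincount(samples, minlength=len(logits)) / 1000)
    ```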