• vivendi@programming.dev · 8 months ago

    I will cite the scientific article later when I find it, but essentially you’re wrong.

      • vivendi@programming.dev · 8 months ago

        According to https://arxiv.org/abs/2405.21015

        The absolute most monstrous, energy-guzzling model tested needed 10 MW of power to train.

        Most models need less than that, and non-frontier models can even be trained on gaming hardware with comparatively little energy consumption.

        That paper, by the way, reports a 2.4x year-over-year increase in model training compute. But it doesn't mention DeepSeek, which rocked the western AI world with its comparatively small training cost (2.7 M GPU-hours in total).
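        To put that GPU-hours figure in perspective, here's a rough conversion to energy. The per-GPU power draw is my own assumption (~400 W average), not a number from the post or the paper:

```python
# Back-of-envelope conversion of training GPU-hours to energy.
# GPU_POWER_KW is an assumed average draw, not a measured figure.

GPU_HOURS = 2_700_000    # total training GPU-hours quoted above
GPU_POWER_KW = 0.4       # assumed average draw per GPU (~400 W)

# kWh -> MWh
energy_mwh = GPU_HOURS * GPU_POWER_KW / 1000
print(f"training energy ≈ {energy_mwh:.0f} MWh")
```

Under those assumptions the whole training run lands around 1,000 MWh, which is why it read as cheap next to frontier-scale runs.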

        Some companies offset their model training environmental damage with renewables and whatever bullshit, so the actual daily usage cost matters more than the huge one-time cost at the start. (Drop by drop is an ocean formed - Persian proverb)
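        The "drop by drop" point can be sketched numerically. Every number below is an illustrative assumption (training duration, query volume, energy per query), not data from the cited paper; the point is only that daily inference energy overtakes a one-time training cost surprisingly fast:

```python
# Illustrative comparison of one-time training energy vs cumulative
# inference energy. All inputs except the 10 MW figure are assumptions.

TRAIN_POWER_MW = 10           # peak training power quoted above
TRAIN_DAYS = 30               # assumed training duration
train_energy_mwh = TRAIN_POWER_MW * 24 * TRAIN_DAYS   # MWh total

QUERIES_PER_DAY = 100_000_000  # assumed daily query volume
WH_PER_QUERY = 3               # assumed energy per query (watt-hours)
daily_inference_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1_000_000

# Days until cumulative inference energy exceeds the training run
breakeven_days = train_energy_mwh / daily_inference_mwh
print(f"training: {train_energy_mwh:.0f} MWh, "
      f"inference: {daily_inference_mwh:.0f} MWh/day, "
      f"break-even after {breakeven_days:.0f} days")
```

With these made-up but plausible inputs, inference passes the entire training bill in under a month, so ongoing usage dominates the lifetime footprint.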