• 4 Posts
  • 1.09K Comments
Joined 2 years ago
Cake day: March 22nd, 2024

  • AFAIK part of the problem is also that Chinese companies have their own tool chains, and are releasing high-quality, truly open source AI solutions.

    One interesting thing about the Chinese “AI Tigers” is the lack of Tech Bro evangelism.

    They see their models as tools, not black-box magic oracles or human replacements, and they train, structure, and productize them accordingly.

    But with AI you can use whatever tool is best value, and switch to the competition whenever you want.

    Big Tech is making this really hard, though.

    In the business world, there’s a lot of paranoia about using Chinese LLM weights. Which is totally bogus, but also understandably hard to explain.

    And OpenAI and such are working overtime to lock customers in. See: iOS being ChatGPT-only; no “pick your own API.” Or Disney using Sora when they should really be rolling their own finetune.

  • Of all the criticisms leveled at AI, this is the one that’s massively overstated.

    On my PC, the task energy of a casual diffusion run (say a dozen-plus images in a few batches) on a Flux-tier model is about 300 W × 240 seconds.

    That’s 72 kilojoules, or about 20 watt-hours.

    …That’s less than microwaving leftovers, or a few folks browsing this Lemmy thread on laptops.
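    The watts-times-seconds arithmetic can be sketched in a few lines of Python. The GPU figures are the ones from this comment; the microwave and laptop wattages are rough assumptions for illustration, not measurements:

    ```python
    def task_energy_kj(watts: float, seconds: float) -> float:
        """Energy in kilojoules for a device drawing `watts` for `seconds`."""
        return watts * seconds / 1000.0

    # Local diffusion run: ~300 W GPU for ~240 s
    diffusion_kj = task_energy_kj(300, 240)          # 72 kJ

    # Microwaving leftovers: ~1100 W for ~3 minutes (assumed wattage)
    microwave_kj = task_energy_kj(1100, 3 * 60)      # 198 kJ

    # Four laptops browsing a thread for 10 minutes at ~30 W each (assumed)
    laptops_kj = 4 * task_energy_kj(30, 10 * 60)     # 72 kJ

    print(f"diffusion: {diffusion_kj:.0f} kJ, "
          f"microwave: {microwave_kj:.0f} kJ, laptops: {laptops_kj:.0f} kJ")
    ```

    Even with generous numbers, the one-off generation run lands in the same ballpark as everyday household tasks.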

    And cloud models like Nano Banana are more efficient still, batching generations heavily across wider, more modern hardware and newer architectures than my 3090 from 2020.

    Look. There are a million reasons corporate AI is crap.

    But its power consumption is a meme perpetuated by tech bros who want to convince the world that scaling infinitely is the only way to advance AI. That’s a lie told to attract money, and it’s not the direction research is actually headed.

    Yes, they’re building too many data centers, and yes, some in awful places, but that’s part of the con. They don’t really need them, and generating a few images isn’t burning away anyone’s water.