Technology fan, Linux user, gamer, 3D animation hobbyist

Also at:

linuxfan@tube.tchncs.de

linuxfan@cheeseburger.social

  • 0 Posts
  • 25 Comments
Joined 1 year ago
Cake day: July 24th, 2023


  • At least the article points out that this is a Wall Street valuation, meaning it’s meaningless in reality: the company doesn’t have that much money, nor is it actually worth that much. In reality, Nvidia’s tangible book value (plant, equipment, inventory, and other physical assets) is $37,436,000,000.

    $37,436,000,000 / 29,600 employees = $1,264,729.73 per employee

    Which isn’t bad considering the median salary at Nvidia is $266,939 (up 17% from last year).



  • Probably better to ask on !localllama@sh.itjust.works. Ollama should be able to give you a decent LLM, and RAG (Retrieval Augmented Generation) will let it reference your dataset.

    The only issue is that you asked for a smart model, which usually means a larger one, and the RAG portion consumes even more memory on top of that - possibly more than a typical laptop can handle. Smaller models also have a higher tendency to hallucinate, i.e. produce incorrect answers.

    Short answer - yes, you can do it. It’s just a matter of how much RAM you have available and how long you’re willing to wait for an answer.
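
    If it helps, here’s a rough sketch of the Ollama + RAG flow in Python, talking to a local Ollama server over its HTTP API. The model names, chunks and question are just placeholders, and a real setup would use a proper vector store instead of a list:

    ```python
    # Minimal RAG sketch against a local Ollama server (assumes `ollama serve`
    # is running and the example models have already been pulled).
    import requests

    OLLAMA = "http://localhost:11434"

    def embed(text: str) -> list[float]:
        # /api/embeddings returns {"embedding": [...]} for a single prompt.
        r = requests.post(f"{OLLAMA}/api/embeddings",
                          json={"model": "nomic-embed-text", "prompt": text})
        return r.json()["embedding"]

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        return dot / ((sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5))

    # "Index" the dataset: one embedding per chunk. A real setup would split
    # documents into chunks and keep the vectors in a small database.
    chunks = ["First note from my dataset...", "Second note...", "Third note..."]
    index = [(chunk, embed(chunk)) for chunk in chunks]

    # Retrieve the most relevant chunks for a question, then hand them to the LLM.
    question = "What do my notes say about X?"
    q_vec = embed(question)
    top = sorted(index, key=lambda item: cosine(q_vec, item[1]), reverse=True)[:2]
    context = "\n\n".join(chunk for chunk, _ in top)

    r = requests.post(f"{OLLAMA}/api/generate",
                      json={"model": "llama3",
                            "prompt": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
                            "stream": False})
    print(r.json()["response"])
    ```

    The bigger the model and the more chunks you stuff into the prompt, the more RAM it needs and the slower the answer - which is exactly the laptop trade-off above.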



  • Yep.

    “In 1978, the Cray 1 supercomputer cost $7 Million, weighed 10,500 pounds and had a 115 kilowatt power supply. It was, by far, the fastest computer in the world. The Raspberry Pi costs around $70 (CPU board, case, power supply, SD card), weighs a few ounces, uses a 5 watt power supply and is more than 4.5 times faster than the Cray 1”

    Raspberry Pi ARM CPUs - the quote above was for the 2012 Pi 1. By 2020, the Pi 400’s average Livermore Loops, Linpack and Whetstone MFLOPS results were 78.8, 49.5 and 95.5 times faster than the Cray 1, respectively.




  • It’s probably a pain to set up in Windows. In Linux it just works; there’s nothing to set up. I’m using it right now.

    OP really should have mentioned their OS.

    Edit: Actually, never mind both my posts. I know DRI_PRIME works by using my APU for regular desktop activity and switching to the discrete GPU whenever a game is running (rough sketch below). But I don’t know if it’s possible to make it use the dGPU all the time.

    Even if it did, it would only work inside the OS, so if you had to boot into the BIOS for anything, you wouldn’t have a display. So for all intents and purposes, it wouldn’t really work.
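
    For reference, this is roughly how the per-app offload gets selected at launch time. A tiny Python sketch - the glxinfo check needs mesa-utils installed, and you’d swap in the game’s binary instead:

    ```python
    # Rough sketch: launch a program with the discrete GPU selected via DRI_PRIME.
    import os
    import subprocess

    # On Mesa/Linux, DRI_PRIME picks the render-offload GPU per process:
    # unset/0 -> default GPU (the APU/iGPU), 1 -> first offload (discrete) GPU.
    env = dict(os.environ, DRI_PRIME="1")

    # Ask the driver which renderer that environment actually selects.
    out = subprocess.run(["glxinfo", "-B"], env=env, capture_output=True, text=True)
    for line in out.stdout.splitlines():
        if "OpenGL renderer" in line:
            print(line.strip())
    ```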


  • OpticalMoose@discuss.tchncs.de to Selfhosted@lemmy.world · Hardware question · 8 months ago

    I just did a quick Bing Chat search (“does DRI_PRIME work on systems without a CPU with integrated graphics?”) and it says it will work. I can’t check for you because my CPUs all have integrated graphics.

    I CAN tell you that some motherboards support it (my ASUS does) and some don’t (my MSI doesn’t).

    BTW, I’m talking about Linux. If you’re using Windows, there’s a whole series of hoops you have to jump through. LTT did a video a while back.