MuchPineapples

  • 0 Posts
  • 27 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • So it will be locally hosted on the phone? I seriously doubt it will be very useful in offline-only mode. Even relatively small language models (7B or 13B) struggle on desktop PCs if you don’t have a high-end graphics card with 12+ GB of VRAM. Analysis can be relatively fast, but generation will be terribly slow, especially for images.

    Edit: After some reading, the Snapdragon Gen 3 has some impressive specs, but can someone explain how a phone can generate AI content quickly while a PC needs, say, 24 GB of VRAM? I get that the phone has an AI-specialized chip, but you still need to load the model into memory.
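    A rough way to see where the memory goes is to compute the footprint of the weights alone at different precisions. The sketch below is a hypothetical back-of-the-envelope helper, not anything from the comment; it ignores KV cache and activation overhead, which add more on top. Quantizing from 16-bit floats down to ~4-bit integers is the usual trick that lets a 7B model fit in phone-class memory.

    ```python
    # Estimate GiB needed just to hold the weights of an LLM.
    # params_billion: parameter count in billions (e.g. 7 for a 7B model)
    # bytes_per_param: 2.0 for fp16, 1.0 for int8, 0.5 for int4
    def model_memory_gib(params_billion: float, bytes_per_param: float) -> float:
        return params_billion * 1e9 * bytes_per_param / (1024 ** 3)

    for label, bytes_per in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
        print(f"7B @ {label}: {model_memory_gib(7, bytes_per):.1f} GiB")
    # fp16 comes out around 13 GiB; int4 around 3.3 GiB,
    # which is why quantized 7B models can run in phone RAM.
    ```
    
    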