Instead of using Character.AI, which will send all my private conversations to governments, I found this solution. Any thoughts on this? 😅

  • fishynoob@infosec.pub · 2 days ago

    I see. Thanks for the note. I think diminishing returns set in very quickly beyond 48GB of VRAM, so I'll likely stick to that limit. I wouldn't want to use models hosted in the cloud, so that's out of the question.
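
    For rough sizing, weight memory is roughly parameter count × bytes per weight, plus some headroom for the KV cache and runtime. A quick back-of-the-envelope sketch of what fits under 48GB (the flat 20% overhead and the model sizes are assumptions, not benchmarks):

    ```python
    # Back-of-the-envelope VRAM estimate for running a model locally.
    # Assumption: weights dominate; KV cache and runtime overhead folded into a flat ~20%.

    def estimate_vram_gb(params_billion: float, bits_per_weight: float, overhead: float = 0.2) -> float:
        """Approximate VRAM (GiB) needed to load a model at a given quantization."""
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * (1 + overhead) / 1024**3

    # Illustrative sizes: a 70B model at 4-bit fits under 48 GiB; at 8-bit it does not.
    for params, bits in [(13, 4), (34, 4), (70, 4), (70, 8)]:
        print(f"{params}B @ {bits}-bit ~ {estimate_vram_gb(params, bits):.1f} GiB")
    ```

    By this rough math, 48GB covers most models up to ~70B at 4-bit quantization, and going past that mostly buys higher precision or bigger context rather than a clearly better model, which is where the diminishing returns come from.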