From c9e7aa0aa9e3cda831a6452ba9fb2d6428fc5125 Mon Sep 17 00:00:00 2001
From: osmarks
Date: Mon, 29 Dec 2025 17:21:56 +0000
Subject: [PATCH] =?UTF-8?q?Edit=20=E2=80=98autogollark=E2=80=99?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 autogollark.myco | 1 +
 1 file changed, 1 insertion(+)

diff --git a/autogollark.myco b/autogollark.myco
index 5a1a921..825dea6 100644
--- a/autogollark.myco
+++ b/autogollark.myco
@@ -48,6 +48,7 @@ Autogollark currently comprises the dataset, the search API server and the [[htt
 * Context length issues, and subquadratic models are sort of bad, though maybe we can "upcycle" a midsized model to RWKV. This exists somewhere. Not sure of efficiency. Inference code will be awful. }
 * Train on e.g. Discord Unveiled (local copy available).
+* Is vision practical? Probably not without a pretrained model for it, and those don't seem to be base models. Perhaps CLIP embeddings are good enough but I wouldn't count on it.
 == Versions