[kwlug-disc] Have you played with locally installable and usable AI -> give a talk
kwlug at c.nixc.us
Tue Apr 9 16:36:46 EDT 2024
I've got a copy of Ollama running on my gaming PC if you want to coordinate access to play with that. I've also been meddling with something I found called OpenInterpreter. Hooking an AI directly into your computer as a system operator is probably a terrible idea, so I installed it into a virtual desktop inside a Docker container, accessible from a browser tab.
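For anyone who wants to try the same kind of sandboxing, here is a minimal sketch of the container side of that setup (assuming the official `ollama/ollama` image and Ollama's default API port; adapt as needed):

```yaml
# Hypothetical docker-compose.yml: run Ollama in a container so an
# experiment like OpenInterpreter never gets direct access to the host.
services:
  ollama:
    image: ollama/ollama            # official Ollama image on Docker Hub
    ports:
      - "11434:11434"               # Ollama's default REST API port
    volumes:
      - ollama-models:/root/.ollama # persist downloaded models across runs
volumes:
  ollama-models:
```

A client on the host (or in another container) can then talk to the model over http://localhost:11434 while the model process stays isolated from the host filesystem.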
March 30, 2024 at 11:46 PM, "Paul Nijjar via kwlug-disc - kwlug-disc at kwlug.org" <ra+ufldbbsvjexizxayvqisniizm at simplelogin.co> wrote:
>
> Another message that got rejected, but shouldn't have?
>
> ----- Forwarded message from Alex Korobkin <korobkin at gmail.com> -----
>
> Date: Sat, 30 Mar 2024 22:57:29 -0400
>
> From: Alex Korobkin <korobkin at gmail.com>
>
> To: KWLUG discussion <kwlug-disc at kwlug.org>
>
> Subject: Re: [kwlug-disc] Have you played with locally installable and usable AI -> give a talk
>
> I didn't do anything beyond this single article, but the idea is neat:
> you download a single 4 GB file, make it executable, and run it against
> images or text to generate your AI output, entirely locally.
>
> In this article they explain how to use it to rename photos according
> to their content.
>
> https://hackaday.com/2023/12/29/using-local-ai-on-the-command-line-to-rename-images-and-more/
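The article's approach can be sketched roughly like this; note that `describe_image` is a stand-in for whatever actually invokes the local model (e.g. shelling out to a LLaVA llamafile), not a real API:

```python
# Sketch of the image-renaming idea from the Hackaday article: feed each
# photo to a local multimodal model and rename the file from the caption.
import re
from pathlib import Path

def caption_to_filename(caption: str, suffix: str) -> str:
    """Turn a free-text caption into a safe, lowercase filename."""
    slug = re.sub(r"[^a-z0-9]+", "_", caption.lower()).strip("_")
    return slug[:60] + suffix

def rename_photos(folder: str, describe_image) -> None:
    """Rename every .jpg in folder using captions from a local model."""
    for photo in Path(folder).glob("*.jpg"):
        new_name = caption_to_filename(describe_image(photo), photo.suffix)
        photo.rename(photo.with_name(new_name))
```

The sanitizing step matters more than it looks: model captions come back with punctuation and arbitrary length, neither of which you want in a filename.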
>
> On Sat, Mar 30, 2024 at 5:39 PM Mikalai Birukou <mb at 3nsoft.com> wrote:
>
> >
> > Probably will at some point. I entered this contest, and I'm waiting to
> >
> > hear back if I'm getting cloud access or hardware:
> >
> > https://www.hackster.io/contests/amd2023#challengeNav
> >
> > This is my proposal:
> >
> > https://www.hackster.io/contests/amd2023/hardware_applications/16336
> >
> > The contest is still open if you want to participate, but the
> > hardware requests have closed.
> >
> > You can download most of the Open Source models here:
> >
> > https://huggingface.co/models
> >
> > Good article on how to get started:
> >
> > https://www.philschmid.de/fine-tune-llms-in-2024-with-trl
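The guide linked above fine-tunes with low-rank adapters (LoRA). Setting the libraries aside, the core trick is small enough to sketch in plain Python (a toy illustration only, not the trl/peft API):

```python
# Toy LoRA sketch: instead of updating a frozen weight matrix W (d x d),
# train two small matrices B (d x r) and A (r x d) and use W + B @ A.
# For d = 1024 and r = 8 that is 16,384 trainable numbers vs 1,048,576.

def matmul(X, Y):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(W, B, A, x):
    """Compute (W + B @ A) @ x for one input vector x."""
    delta = matmul(B, A)  # the low-rank update
    W_eff = [[w + d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]
    return [sum(w * xi for w, xi in zip(row, x)) for row in W_eff]

# Example: 2x2 frozen weights with a rank-1 adapter.
W = [[1, 0], [0, 1]]  # frozen pretrained weights (identity here)
B = [[1], [2]]        # trainable d x r column
A = [[3, 4]]          # trainable r x d row
print(lora_forward(W, B, A, [1, 1]))  # -> [8, 15]
```

This is why LoRA fits on consumer hardware: only B and A get gradients, while the big pretrained W stays frozen on disk.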
> >
> > The following heading in this LLM fine-tuning page makes me laugh and
> > cry simultaneously, `with(xz)`:
> >
> > """
> > 2. Setup development environment
> > """
> >
> > > I am listening to the podcast
> > > https://oxide.computer/podcasts/oxide-and-friends/1692510 about Large
> > > Language Models, and an idea came up: if any of you have tried a
> > > locally usable LLM, trained one, or tinkered with one, give a
> > > "let me show you" kind of talk. Seriously.
> > >
> >
>
> --
>
> Alex.
>
> ----- End forwarded message -----
>
> --
>
> Events: https://feeds.off-topic.kwlug.org/
>
> Housing: https://unionsd.coop/
>
> Blog: http://pnijjar.freeshell.org/
>
> _______________________________________________
>
> kwlug-disc mailing list
>
> To unsubscribe, send an email to kwlug-disc-leave at kwlug.org
>
> with the subject "unsubscribe", or email
>
> kwlug-disc-owner at kwlug.org to contact a human being.
>