Diffstat
| -rw-r--r-- | 2025/info/private-ai-after.md | 1054 |
1 files changed, 861 insertions, 193 deletions
diff --git a/2025/info/private-ai-after.md b/2025/info/private-ai-after.md index 760e20c9..f1a8149a 100644 --- a/2025/info/private-ai-after.md +++ b/2025/info/private-ai-after.md @@ -3,10 +3,11 @@ <div class="transcript transcript-mainVideo"><a name="private-ai-mainVideo-transcript"></a><h1>Transcript (unedited)</h1> -[[!template text="""Hey, everybody. Welcome from frigid Omaha, Nebraska.""" start="00:00:00.000" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Introduction""" start="00:00:00.000" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Hey, everybody. Welcome from frigid Omaha, Nebraska.""" start="00:00:00.000" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'm just going to kick off my talk here,""" start="00:00:04.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and we'll see how it all goes. Thanks for attending.""" start="00:00:06.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So the slides will be available on my site, growthy.us,""" start="00:00:23.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""So the slides will be available on my site, https://grothe.us,""" start="00:00:23.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""in the presentation section tonight or tomorrow.""" start="00:00:26.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""This is a quick intro to one way to do private AI in Emacs.""" start="00:00:29.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There are a lot of other ways to do it.""" start="00:00:33.100" video="mainVideo-private-ai" id="subtitle"]] @@ -16,16 +17,18 @@ [[!template text="""and how to give it a spin.""" start="00:00:42.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Really hope some of you give it a shot""" start="00:00:43.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and learn something along the way.""" start="00:00:45.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So the overview of the talk.""" start="00:00:48.180" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Overview of talk""" start="00:00:48.180" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So the overview of the talk""" start="00:00:48.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""broke down these basic bullet points of why private AI,""" start="00:00:50.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what do I need to do private AI, Emacs and private AI,""" start="00:00:54.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""pieces for an AI Emacs solution,""" start="00:00:58.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a demo of a minimal viable product, and the summary.""" start="00:01:02.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Why private AI? This is pretty simple.""" start="00:01:08.060" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Why private AI?""" start="00:01:08.060" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Why private AI? 
This is pretty simple.""" start="00:01:08.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Just read the terms and conditions""" start="00:01:10.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""for any AI system you're currently using.""" start="00:01:12.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you're using the free tiers, your queries,""" start="00:01:14.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""code uploaded information""" start="00:01:17.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""code, uploaded information""" start="00:01:17.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""is being used to train the models.""" start="00:01:18.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""In some cases, you are giving the company""" start="00:01:20.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a perpetual license to your data.""" start="00:01:22.940" video="mainVideo-private-ai" id="subtitle"]] @@ -49,14 +52,14 @@ [[!template text="""because people are using AI.""" start="00:02:05.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The problem with that is now""" start="00:02:07.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""there's less data going to Stack Overflow""" start="00:02:08.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""for the AI to get. vicious cycle,""" start="00:02:10.380" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""for the AI to get. Vicious cycle,""" start="00:02:10.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""especially when you start looking at""" start="00:02:12.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""newer language like Ruby and stuff like that.""" start="00:02:14.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So it comes down to being an interesting time.""" start="00:02:16.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Another reason why to go private AI is your costs are going to vary.""" start="00:02:21.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Right now, these services are being heavily subsidized.""" start="00:02:24.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you're paying Claude $20 a month,""" start="00:02:27.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""it is not costing Claude, those guys $20 a month""" start="00:02:29.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""it is not costing Claude, those guys, $20 a month""" start="00:02:29.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to host all the infrastructure""" start="00:02:32.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to build all these data centers.""" start="00:02:34.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""They are severely subsidizing that""" start="00:02:35.620" video="mainVideo-private-ai" id="subtitle"]] @@ -76,7 +79,8 @@ [[!template text="""a lot of people are using public AI right now""" start="00:03:07.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""are going to have no option but to move to private AI""" start="00:03:10.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""or give up on AI overall.""" 
start="00:03:11.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""What do you need to be able to do private AI?""" start="00:03:16.020" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""What do I need for private AI?""" start="00:03:16.020" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""What do you need to be able to do private AI?""" start="00:03:16.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you're going to run your own AI,""" start="00:03:18.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you're going to need a system with either some cores,""" start="00:03:21.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a graphics processor unit,""" start="00:03:23.580" video="mainVideo-private-ai" id="subtitle"]] @@ -108,7 +112,7 @@ [[!template text="""to be able to have the co-pilot badge on it.""" start="00:04:41.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And Raspberry Pi's new AI top is about 18 teraflops""" start="00:04:43.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and is $70 on top of the cost of Raspberry Pi 5.""" start="00:04:48.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Keep in mind Raspberry recently""" start="00:04:51.220" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Keep in mind, Raspberry recently""" start="00:04:51.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""raised the cost of their Pi 5s because of RAM pricing,""" start="00:04:56.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which is going to be affecting""" start="00:04:59.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a lot of these types of solutions in the near future.""" start="00:05:00.380" video="mainVideo-private-ai" id="subtitle"]] @@ -117,31 +121,34 @@ [[!template text="""That's what it really comes down to.""" start="00:05:06.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""A lot of people are going to have PCs on their desks.""" start="00:05:08.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""They're going to run a decent private AI""" start="00:05:11.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""without much issue. So for Emacs and private AI,""" start="00:05:13.460" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""without much issue.""" start="00:05:13.460" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Emacs and private AI""" start="00:05:16.348" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So for Emacs and private AI,""" start="00:05:16.348" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""there's a couple popular solutions.""" start="00:05:18.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Gptel, which is the one we're going to talk about.""" start="00:05:20.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's a simple interface. 
It's a minimal interface.""" start="00:05:22.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It integrates easily into your workflow.""" start="00:05:24.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's just, quite honestly, chef's kiss,""" start="00:05:26.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""just a beautifully well-done piece of software.""" start="00:05:29.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""OllamaBuddy has more features,""" start="00:05:31.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Ollama Buddy has more features,""" start="00:05:31.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a menu interface, has quick access""" start="00:05:33.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""for things like code refactoring,""" start="00:05:36.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""text-free formatting, et cetera.""" start="00:05:37.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""This is the one that you spend a little more time with,""" start="00:05:38.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but you also get a little bit more back from it.""" start="00:05:41.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Elama is another one, has some really good features to it,""" start="00:05:43.940" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Ellama is another one, has some really good features to it,""" start="00:05:43.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""more different capabilities,""" start="00:05:49.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but it's a different set of rules and capabilities to it.""" start="00:05:51.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Itermac, which is programming with your AI and Emacs.""" start="00:05:54.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Aidermac, which is programming with your AI and Emacs.""" start="00:05:54.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The closest thing I can come up""" start="00:05:59.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""to comparing this to is Cursor, except it's an Emacs.""" start="00:06:01.220" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""to comparing this to is Cursor, except it's in Emacs.""" start="00:06:01.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's really quite well done.""" start="00:06:04.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""These are all really quite well done.""" start="00:06:05.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There's a bunch of other projects out there.""" start="00:06:07.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you go out to GitHub, type Emacs AI,""" start="00:06:08.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you'll find a lot of different options.""" start="00:06:10.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what is a minimal viable product that can be done?""" start="00:06:13.220" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Pieces for an AI Emacs solution""" start="00:06:13.220" 
video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So what is a minimal viable product that can be done?""" start="00:06:13.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""A minimal viable product to show what an AI Emacs solution is""" start="00:06:18.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""can be done with only needing two pieces of software.""" start="00:06:23.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Llamafile, this is an amazing piece of software.""" start="00:06:27.180" video="mainVideo-private-ai" id="subtitle"]] @@ -155,9 +162,9 @@ [[!template text="""while it runs on a bunch of different systems.""" start="00:06:46.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And Gptel, which is an easy plug-in for Emacs,""" start="00:06:48.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which we talked about in the last slide a bit.""" start="00:06:51.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So setting up the LLM, you have to just go out""" start="00:06:54.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and just hit the a page for it""" start="00:07:00.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and go out and do a wget of it.""" start="00:07:01.700" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""So setting up the LLM, you have to just go out""" start="00:06:56.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and just hit a page for it""" start="00:07:00.180" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and go out and do a wget of it.""" start="00:07:03.543" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That's all it takes there.""" start="00:07:05.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Chmodding it so you can actually execute the executable.""" start="00:07:07.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And then just go ahead and actually running it.""" start="00:07:10.260" video="mainVideo-private-ai" id="subtitle"]] @@ -165,20 +172,21 @@ [[!template text="""I've already downloaded it because I don't want to wait.""" start="00:07:16.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And let's just take a look at it.""" start="00:07:18.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I've actually downloaded several of them,""" start="00:07:21.260" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but let's go ahead and just run lava 3.2b""" start="00:07:22.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""but let's go ahead and just run llama 3.2-1b""" start="00:07:22.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""with the 3 billion instructions. 
And that's it firing up.""" start="00:07:25.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it is nice enough to actually be listening in port 8080,""" start="00:07:31.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which we'll need in a minute.""" start="00:07:33.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So once you do that, you have to install gptel and emacs.""" start="00:07:35.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That's as simple as firing up emacs,""" start="00:07:43.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""doing the meta x install package,""" start="00:07:45.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and then just typing gptel""" start="00:07:48.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""doing the M-x install-package,""" start="00:07:45.660" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and then just typing gptel,""" start="00:07:48.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if you have your repository set up right,""" start="00:07:49.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which hopefully you do.""" start="00:07:51.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And then you just go ahead and have it.""" start="00:07:52.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""You also have to set up a config file.""" start="00:07:54.500" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Config file""" start="00:07:56.340" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""You also have to set up a config file.""" start="00:07:56.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Here's my example config file as it currently set up,""" start="00:07:58.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""requiring ensuring Gptel is loaded,""" start="00:08:01.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""requiring, ensuring Gptel is loaded,""" start="00:08:01.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""defining the Llamafile backend.""" start="00:08:04.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can put multiple backends into it,""" start="00:08:05.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but I just have the one defined on this example.""" start="00:08:07.780" video="mainVideo-private-ai" id="subtitle"]] @@ -198,7 +206,8 @@ [[!template text="""we can actually name those models by their domain,""" start="00:08:45.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which is really kind of cool.""" start="00:08:47.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""But, uh, that's all that takes.""" start="00:08:48.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So let's go ahead and go to a quick test of it.""" start="00:08:52.100" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Demo: Who was David Bowie?""" start="00:08:52.100" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So let's go ahead and go to a quick test of it.""" start="00:08:52.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Oops. 
Alt-X, gptel. And we're going to just choose""" start="00:09:03.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""the default buffer to make things easier.""" start="00:09:11.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Going to resize it up a bit.""" start="00:09:12.500" video="mainVideo-private-ai" id="subtitle"]] @@ -209,8 +218,8 @@ [[!template text="""This is one that some engines do well on, other ones don't.""" start="00:09:28.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And we can just do, we can either do""" start="00:09:31.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""the alt X and send the gptel-send,""" start="00:09:33.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""or we can just do control C and hit enter.""" start="00:09:36.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""We'll just do control C and enter.""" start="00:09:37.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""or we can just do C-c and hit enter.""" start="00:09:36.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""We'll just do C-c and enter.""" start="00:09:37.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And now it's going ahead and hitting our local AI system""" start="00:09:39.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""running on port 8080. And that looks pretty good,""" start="00:09:43.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but let's go ahead and say, hey, it's set to terse mode right now.""" start="00:09:46.660" video="mainVideo-private-ai" id="subtitle"]] @@ -219,7 +228,8 @@ [[!template text="""of the majority of, uh, about David Bowie's life""" start="00:10:05.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and other information about him.""" start="00:10:08.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So very, very happy with that.""" start="00:10:10.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""One thing to keep in mind is you look at things""" start="00:10:21.700" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Hallucinations""" start="00:10:21.700" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""One thing to keep in mind is you look at things""" start="00:10:21.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""when you're looking for hallucinations,""" start="00:10:23.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""how accurate AI is, how it's compressed""" start="00:10:24.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""is it will tend to screw up on things like""" start="00:10:26.900" video="mainVideo-private-ai" id="subtitle"]] @@ -227,7 +237,8 @@ [[!template text="""Let me see if it gets to that real quick.""" start="00:10:30.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Is it not actually on this one?""" start="00:10:32.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Alright, so that's the first question I always ask one.""" start="00:10:39.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The next one is what are sea monkeys?""" start="00:10:42.180" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" 
text="""Next question: What are sea monkeys?""" start="00:10:42.180" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""The next one is what are sea monkeys?""" start="00:10:42.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It gives you an idea of the breadth of the system.""" start="00:10:44.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's querying right now. Pulls it back correctly. Yes.""" start="00:10:48.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it's smart enough to actually detect David Bowie""" start="00:11:10.620" video="mainVideo-private-ai" id="subtitle"]] @@ -237,12 +248,13 @@ [[!template text="""and that which is very cool feature.""" start="00:11:18.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I did not see that coming.""" start="00:11:20.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Here's one that some people say is a really good one""" start="00:11:21.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""to ask ours in strawberry.""" start="00:11:24.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""All right, now she's going off the reservation.""" start="00:11:25.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""to ask. Rs in "strawberry."""" start="00:11:24.140" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""All right, now she's going off the reservation.""" start="00:11:42.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""She's going in a different direction.""" start="00:11:46.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Let me go ahead and reopen that again,""" start="00:11:48.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because it's went down a bad hole there for a second.""" start="00:11:49.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Let me ask it to do write hello world in Emacs list.""" start="00:11:52.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because it went down a bad hole there for a second.""" start="00:11:49.980" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Writing Hello World in Emacs Lisp""" start="00:11:57.180" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Let me ask it to write hello world in Emacs Lisp.""" start="00:11:57.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Yep, that works. 
So the point being here,""" start="00:11:58.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that was like two minutes of setup.""" start="00:12:10.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And now we have a small AI embedded inside the system.""" start="00:12:14.940" video="mainVideo-private-ai" id="subtitle"]] @@ -250,7 +262,8 @@ [[!template text="""And it's just running locally on the system.""" start="00:12:20.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""We also have the default system here as well.""" start="00:12:22.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So not that bad.""" start="00:12:25.260" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""That's a basic solution, that's a basic setup""" start="00:12:32.580" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Pieces for a better solution""" start="00:12:32.580" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""That's a basic solution, that's a basic setup""" start="00:12:32.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that will get you to the point where you can go like,""" start="00:12:35.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""it's a party trick, but it's a very cool party trick.""" start="00:12:37.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The way that Gptel works is it puts it into buffers,""" start="00:12:39.860" video="mainVideo-private-ai" id="subtitle"]] @@ -262,7 +275,7 @@ [[!template text="""for things that are really cool for that.""" start="00:12:53.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""But if you want a better solution,""" start="00:12:55.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I recommend Ollama or LM Studio.""" start="00:12:57.100" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""They're both more capable than llama file.""" start="00:12:59.940" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""They're both more capable than Llamafile.""" start="00:12:59.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""They can accept a lot of different models.""" start="00:13:01.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can do things like RAG.""" start="00:13:03.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can do loading of things onto the GPU more explicitly.""" start="00:13:05.740" video="mainVideo-private-ai" id="subtitle"]] @@ -271,19 +284,20 @@ [[!template text="""it will let you put your data into the system""" start="00:13:13.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""so you can start uploading your code, your information,""" start="00:13:15.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and actually being able to do analysis of it.""" start="00:13:17.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""OpenWebUI provides more capabilities.""" start="00:13:20.140" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Open WebUI provides more capabilities.""" start="00:13:20.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It provides an interface that's similar""" start="00:13:23.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to what you're used to 
seeing""" start="00:13:24.860" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""for chat, GPT, and the other systems.""" start="00:13:25.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""for ChatGPT and the other systems.""" start="00:13:25.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's really quite well done.""" start="00:13:28.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And once again, gptel, I have to mention that""" start="00:13:29.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because that's the one I really kind of like.""" start="00:13:32.540" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And OlamaBuddy is also another really nice one.""" start="00:13:34.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what about the licensing of these models?""" start="00:13:36.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""And Ollama Buddy is also another really nice one.""" start="00:13:34.780" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""What about the license?""" start="00:13:36.900" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So what about the licensing of these models?""" start="00:13:36.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Since I'm going out pulling down""" start="00:13:41.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a model and doing this stuff.""" start="00:13:42.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Let's take a look at a couple of highlights""" start="00:13:43.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""from the MetaLlama 3 community license scale.""" start="00:13:46.580" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""from the Meta Llama 3 community license scale.""" start="00:13:46.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If your service exceeds 700 million monthly users,""" start="00:13:49.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you need additional licensing.""" start="00:13:52.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Probably not going to be a problem for most of us.""" start="00:13:54.100" video="mainVideo-private-ai" id="subtitle"]] @@ -300,7 +314,7 @@ [[!template text="""And you can distribute the model with derivatives.""" start="00:14:20.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And there are some very cool ones out there.""" start="00:14:22.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There's people who've done things""" start="00:14:24.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""to try and make the llama bee less, what's the phrase,""" start="00:14:25.260" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""to try and make the Llama be less, what's the phrase,""" start="00:14:25.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""ethical if you're doing penetration testing research""" start="00:14:29.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and stuff like that.""" start="00:14:31.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It has some very nice value there.""" start="00:14:32.620" video="mainVideo-private-ai" id="subtitle"]] 
@@ -310,10 +324,11 @@ [[!template text="""It's designed to keep it to research and development.""" start="00:14:42.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can't use it commercially.""" start="00:14:45.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So it's designed to clearly delineate""" start="00:14:46.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""between research and development""" start="00:14:50.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""between research and development""" start="00:14:51.793" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and somebody trying to actually build""" start="00:14:52.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""something on top of it.""" start="00:14:54.260" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And another question I get asked is,""" start="00:14:55.380" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Are there open source data model options?""" start="00:14:56.580" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""And another question I get asked is,""" start="00:14:56.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""are there open source data model options?""" start="00:14:57.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Yeah, but most of them are small or specialized currently.""" start="00:14:59.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""MoMo is a whole family of them,""" start="00:15:02.820" video="mainVideo-private-ai" id="subtitle"]] @@ -321,12 +336,13 @@ [[!template text="""but it's very cool to see where it's going.""" start="00:15:07.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it's another thing that's just going forward.""" start="00:15:09.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's under the MIT license.""" start="00:15:11.340" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Some things to know to help you""" start="00:15:13.380" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Things to know""" start="00:15:14.520" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Some things to know to help you""" start="00:15:14.520" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""have a better experience with this.""" start="00:15:15.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Get a Llama and OpenWebUI working by themselves,""" start="00:15:17.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Get ollama and Open WebUI working by themselves,""" start="00:15:17.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""then set up your config file.""" start="00:15:21.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I was fighting both at the same time,""" start="00:15:22.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and it turned out I had a problem with my LLAMA.""" start="00:15:24.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and it turned out I had a problem with my ollama.""" start="00:15:24.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I had a conflict, so that was what my problem is.""" 
start="00:15:26.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Llamafile, gptel is a great way to start experimenting""" start="00:15:28.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""just to get you an idea of how it works""" start="00:15:32.820" video="mainVideo-private-ai" id="subtitle"]] @@ -334,27 +350,27 @@ [[!template text="""RAG loading documents into it is really easy with open web UI.""" start="00:15:36.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can create models, you can put things like""" start="00:15:40.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""help desk developers and stuff like that, breaking it out.""" start="00:15:43.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The Hacker News has a how to build a $300 AI computer.""" start="00:15:46.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""The Hacker Noon has a how to build a $300 AI computer.""" start="00:15:46.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""This is for March 2024,""" start="00:15:51.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but it still has a lot of great information""" start="00:15:52.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""on how to benchmark the environments,""" start="00:15:55.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what some values are like the Ryzen 5700U""" start="00:15:56.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""inside my Acer Aspire,""" start="00:16:01.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that's where I got the idea doing that.""" start="00:16:02.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Make sure you do the RockM stuff correctly""" start="00:16:04.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Make sure you do the ROCm stuff correctly""" start="00:16:04.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to get the GUI extensions. 
But it's just really good stuff.""" start="00:16:06.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You don't need a great GPU or CPU to get started.""" start="00:16:09.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Smaller models like Tiny Llama""" start="00:16:13.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Smaller models like tinyllama""" start="00:16:13.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""can run on very small systems.""" start="00:16:14.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It gets you the ability to start playing with it""" start="00:16:16.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and start experimenting and figure out if that's for you""" start="00:16:18.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It gets you the ability to start playing with it""" start="00:16:16.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and start experimenting and figure out if that's for you""" start="00:16:19.043" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and to move forward with it.""" start="00:16:21.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The AMD Ryzen AI Max 395 plus is a mini PC""" start="00:16:23.380" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""The AMD Ryzen AI Max+ 395 is a mini PC""" start="00:16:23.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""makes it really nice dedicated host.""" start="00:16:29.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""You used to be able to buy these for about $1200 now""" start="00:16:31.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""with the RAM price increase,""" start="00:16:34.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""you want to get 120 gig when you're pushing two brands so.""" start="00:16:35.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It gets a little tighter.""" start="00:16:38.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""You used to be able to buy these for about $1200.""" start="00:16:31.180" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Now with the RAM price increase,""" start="00:16:34.079" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""you want to get 120 gig when you're pushing two brands,""" start="00:16:35.580" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""so it gets a little tighter.""" start="00:16:38.459" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Macs work remarkably well with AI.""" start="00:16:40.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""My MacBook Air was one of my go-tos for a while,""" start="00:16:44.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but once I started doing anything AI,""" start="00:16:47.660" video="mainVideo-private-ai" id="subtitle"]] @@ -367,10 +383,10 @@ [[!template text="""but still you're going to be pushing against that.""" start="00:17:00.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So Mac Minis and the Mac Ultras and stuff like that""" start="00:17:02.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""tend to work really well for that.""" start="00:17:04.940" video="mainVideo-private-ai" id="subtitle"]] 
-[[!template text="""Alex Ziskin on YouTube has a channel.""" start="00:17:06.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Alex Ziskind on YouTube has a channel.""" start="00:17:06.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""He does a lot of AI performance benchmarking,""" start="00:17:09.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""like I load a 70 billion parameter model""" start="00:17:11.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""on this mini PC and stuff like that.""" start="00:17:14.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""like "I load a 70 billion parameter model""" start="00:17:11.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""on this mini PC" and stuff like that.""" start="00:17:14.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's a lot of fun and interesting stuff there.""" start="00:17:16.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it's influencing my decision""" start="00:17:19.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to buy my next AI style PC.""" start="00:17:21.220" video="mainVideo-private-ai" id="subtitle"]] @@ -379,12 +395,12 @@ [[!template text="""it sounds like a really cool idea.""" start="00:17:29.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It gives you capabilities to start training stuff""" start="00:17:31.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that you couldn't do with like the big ones.""" start="00:17:34.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Even with in terms of fine tuning and stuff,""" start="00:17:35.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Even with in terms of fine-tuning and stuff,""" start="00:17:35.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""it's remarkable to see where that space is coming along""" start="00:17:38.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""in the next year or so.""" start="00:17:40.540" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Hugging Face Co has pointers to tons of AI models.""" start="00:17:41.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""HuggingFace.co has pointers to tons of AI models.""" start="00:17:41.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You'll find the one that works for you, hopefully there.""" start="00:17:46.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""If you're doing cybersecurity,""" start="00:17:49.260" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""If you're doing cybersecurity,""" start="00:17:48.418" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""there's a whole bunch out there for that,""" start="00:17:50.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that have certain training on it, information.""" start="00:17:52.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's really good.""" start="00:17:54.620" video="mainVideo-private-ai" id="subtitle"]] @@ -394,7 +410,7 @@ [[!template text="""Don't be using it for court cases like some people have""" start="00:18:05.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and run into those problems. 
So, That is my talk.""" start="00:18:08.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""What I would like you to get out of that is,""" start="00:18:14.540" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""if you haven't tried it, give GPTEL and LlamaFile a shot.""" start="00:18:17.220" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""if you haven't tried it, give Gptel and LlamaFile a shot.""" start="00:18:17.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Fire up a little small AI instance,""" start="00:18:21.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""play around with a little bit inside your Emacs,""" start="00:18:23.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and see if it makes your life better. Hopefully it will.""" start="00:18:27.340" video="mainVideo-private-ai" id="subtitle"]] @@ -403,13 +419,13 @@ [[!template text="""And the links are at the end of the talk, if you have any questions.""" start="00:18:34.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Let me see if we got anything you want, Pat. You do.""" start="00:18:38.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You've got a few questions.""" start="00:18:42.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Hey, this is Corwin. Thank you so much. Thank you, Aaron.""" start="00:18:43.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Hey, this is Corwin. Thank you so much. Thank you, Aaron.""" start="00:18:43.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""What an awesome talk this was, actually.""" start="00:18:48.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you don't have a camera,""" start="00:18:50.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I can get away with not having one too.""" start="00:18:52.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I've got, I'll turn the camera on.""" start="00:18:54.340" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay. All right. I'll turn mine back on. Here I come.""" start="00:18:56.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, so there are a few questions,""" start="00:19:01.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: I've got, I'll turn the camera on.""" start="00:18:54.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay. All right. I'll turn mine back on. 
Here I come.""" start="00:18:56.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Yeah, so there are a few questions,""" start="00:18:59.834" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but first let me say thank you""" start="00:19:03.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""for a really captivating talk.""" start="00:19:04.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I think a lot of people will be empowered from this""" start="00:19:06.340" video="mainVideo-private-ai" id="subtitle"]] @@ -420,53 +436,55 @@ [[!template text="""So just thinking about how we can""" start="00:19:26.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""put infrastructure we have at home to use""" start="00:19:28.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and get more done with less.""" start="00:19:32.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, the data center impact's interesting""" start="00:19:34.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yeah, the data center impact's interesting""" start="00:19:34.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because there was a study a while ago.""" start="00:19:37.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Someone said every time you do a Gemini query,""" start="00:19:39.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""it's like boiling a cup of water.""" start="00:19:42.100" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, I've heard that one too. So do you want to, you know,""" start="00:19:45.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Yeah, I've heard that one too. So do you want to, you know,""" start="00:19:45.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I don't know how much direction you want.""" start="00:19:48.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'd be very happy to read out the questions for you.""" start="00:19:51.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, that would be great.""" start="00:19:53.860" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yeah, that would be great.""" start="00:19:53.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'm having trouble getting to that tab.""" start="00:19:55.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay, I'm there, so I'll put it into our chat too,""" start="00:19:57.620" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay, I'm there, so I'll put it into our chat too,""" start="00:19:57.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""so you can follow along if you'd like.""" start="00:20:02.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The first question was, why is the David Bowie question""" start="00:20:07.420" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Why is the David Bowie question a good one for testing a model? e.g. 
does it fail in interesting ways?""" start="00:20:07.420" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: The first question was, why is the David Bowie question""" start="00:20:07.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""a good one to start with?""" start="00:20:11.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Does it have interesting failure conditions""" start="00:20:12.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""or what made you choose that?""" start="00:20:14.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""First off, huge fan of David Bowie.""" start="00:20:17.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: First off, huge fan of David Bowie.""" start="00:20:16.640" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""But I came down to it really taught me a few things""" start="00:20:21.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""about how old the models work""" start="00:20:24.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""about how the models work""" start="00:20:24.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""in terms of things like how many kids he had,""" start="00:20:26.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because deep seek, which is a very popular Chinese model""" start="00:20:28.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because Deepseek, which is a very popular Chinese model""" start="00:20:28.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that a lot of people are using now,""" start="00:20:31.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""misidentifies him having three daughters,""" start="00:20:33.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and he has like one son and one, one, I think,""" start="00:20:35.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""two sons and a daughter or something like that.""" start="00:20:38.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""so there's differences on that and it just goes over""" start="00:20:40.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""so there's differences on that, and it just goes over...""" start="00:20:40.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""there's a whole lot of stuff""" start="00:20:43.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because his story spans like 60 years""" start="00:20:45.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""so it gives a good good feedback""" start="00:20:47.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""that's the real main reason I asked that question""" start="00:20:49.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because I just needed one that sea monkeys I just picked""" start="00:20:51.540" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because it was obscure and just always have right""" start="00:20:53.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I used to have it right hello world and forth""" start="00:20:56.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because I thought was an interesting one as well so""" start="00:20:58.940" 
video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because his story spans like 60 years,""" start="00:20:45.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""so it gives good feedback.""" start="00:20:47.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""That's the real main reason I asked that question""" start="00:20:49.660" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because I just needed one... That sea monkeys, I just picked""" start="00:20:51.540" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because it was obscure, and just always have, write,""" start="00:20:53.700" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I used to have it write hello world in forth""" start="00:20:56.580" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because I thought was an interesting one as well.""" start="00:20:58.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's just picking random ones like that.""" start="00:21:01.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""One question asked, sorry, a lot of models is,""" start="00:21:03.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""One question I ask a lot of models is,""" start="00:21:03.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what is the closest star to the Earth?""" start="00:21:06.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Because most of them will say Alpha Centauri""" start="00:21:09.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""or Proxima Centauri and not the sun.""" start="00:21:12.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And I have a whole nother talk""" start="00:21:13.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""And I have a whole 'nother talk""" start="00:21:13.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""where I just argue with the LLM""" start="00:21:15.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""trying to say, hey, the sun is a star.""" start="00:21:17.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And he just wouldn't accept it, so. What?""" start="00:21:20.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Oh, I can hear that.""" start="00:21:26.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what specific tasks do you like to use your local AI?""" start="00:21:28.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I like to load a lot of my code into""" start="00:21:34.380" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Oh, I can... 
You're there.""" start="00:21:26.580" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: What specific tasks do you use local AI for?""" start="00:21:30.740" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: So what specific tasks do you like to use your local AI?""" start="00:21:30.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: I like to load a lot of my code into""" start="00:21:34.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and actually have it do analysis of it.""" start="00:21:37.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I was actually going through some code""" start="00:21:39.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I was actually going through some code""" start="00:21:39.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I have for some pen testing, and I was having it modified""" start="00:21:42.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to update it for the newer version,""" start="00:21:45.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because I hate to say this,""" start="00:21:47.260" video="mainVideo-private-ai" id="subtitle"]] @@ -480,56 +498,59 @@ [[!template text="""if you're doing cyber security researching.""" start="00:22:03.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and you have your white papers""" start="00:22:04.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and stuff like that and stuff in there.""" start="00:22:06.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I've got a lot of that loaded into RAG""" start="00:22:10.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""in one model on my OpenWebUI system.""" start="00:22:13.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Neat. Have you used have you used""" start="00:22:15.660" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I've got a lot of that loaded into RAG""" start="00:22:08.418" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""in one model on my Open WebUI system.""" start="00:22:10.626" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Have you used any small domain-specific LLMs? What are the kinds of tasks they specialize in, and how do I find and use them?""" start="00:22:16.880" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Neat. Have you used have you used""" start="00:22:16.880" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""any small domain specific LLMs? 
What kind of tasks?""" start="00:22:21.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If so, what kind of tasks that they specialize in?""" start="00:22:25.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And you know, how?""" start="00:22:30.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Not to be honest, but there are some out there like once again,""" start="00:22:32.140" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Not to be honest, but there are some out there like once again,""" start="00:22:32.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""for cybersecurity and stuff like that,""" start="00:22:34.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that I really need to dig into that's on my to do list.""" start="00:22:36.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I've got a couple weeks off at the end of the year.""" start="00:22:39.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And that's a big part of my plan for that.""" start="00:22:41.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Are the various models updated pretty regularly?""" start="00:22:43.780" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Are the various models updated regularly? Can you add your own data to pre-built models?""" start="00:22:46.540" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Are the various models updated pretty regularly?""" start="00:22:46.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Can you add your own data to the pre-built models?""" start="00:22:49.380" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yes. The models are updated pretty reasonably.""" start="00:22:52.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yes. 
The models are updated pretty reasonably.""" start="00:22:52.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can add data to a model in a couple of different ways.""" start="00:22:56.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can do something called fine-tuning,""" start="00:22:59.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which requires a really nice GPU and a lot of CPU time.""" start="00:23:01.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Probably not going to do that.""" start="00:23:03.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You can do retrieval augmentation generation,""" start="00:23:05.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""which is you load your data on top of the system""" start="00:23:07.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and puts inside a database""" start="00:23:09.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and put inside a database,""" start="00:23:09.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and you can actually scan that and stuff.""" start="00:23:11.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I have another talk where I go through""" start="00:23:12.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and I start asking questions about,""" start="00:23:14.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I load the talk into the engine""" start="00:23:16.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and I ask questions against that.""" start="00:23:18.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I would have one more time would have done that""" start="00:23:20.100" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but it comes down to how many That's that's rag rag""" start="00:23:22.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""is pretty easy to do through open web UI or LM studio""" start="00:23:26.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's a great way you just like point a folder""" start="00:23:29.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""point it to a folder and it just sucks all that state into""" start="00:23:31.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and it'll hit that data first""" start="00:23:34.100" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""you have like helpdesk and stuff and""" start="00:23:35.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The other options there's vector databases,""" start="00:23:36.860" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""which is like if you use PostgreSQL.""" start="00:23:39.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It has a PG vector I can do a lot of that stuff.""" start="00:23:41.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""If I would have had time, I would have done that,""" start="00:23:20.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""but it comes down to how many... 
That's RAG.""" start="00:23:22.180" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""RAG is pretty easy to do through Open WebUI or LM studio.""" start="00:23:25.797" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It's a great way, you just, like,""" start="00:23:29.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""point it to a folder and it just sucks all that state into...""" start="00:23:31.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and it'll hit that data first.""" start="00:23:34.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""You have like helpdesk and stuff and...""" start="00:23:35.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""The other options: there's vector databases,""" start="00:23:36.860" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""which is, like, if you use PostgreSQL,""" start="00:23:39.620" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""it has a pg vector that can do a lot of that stuff.""" start="00:23:41.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I've not dug into that yet,""" start="00:23:43.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but that is also on that to-do list""" start="00:23:44.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I've got a lot of stuff planned for Cool.""" start="00:23:46.100" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what are your experience with rags?""" start="00:23:48.460" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I've got a lot of stuff planned for...""" start="00:23:46.100" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: What is your experience with RAG? Are you using them and how have they helped?""" start="00:23:48.056" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Cool. So what are your experience with RAGs?""" start="00:23:48.056" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I don't even know what that means.""" start="00:23:51.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Do you know what that means?""" start="00:23:54.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Do you remember this question again?""" start="00:23:57.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""What is your experience with RAGs? RAGs is great.""" start="00:23:59.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""That's Retrieval Augmentation Generation.""" start="00:24:03.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""What is your experience with RAGs?""" start="00:23:59.620" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: RAGs is great. 
That's Retrieval Augmentation Generation.""" start="00:24:03.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That loads your data first, and it hits yours,""" start="00:24:07.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and it'll actually cite it and stuff.""" start="00:24:09.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There's a guy who wrote a RAG in 100 lines of Python,""" start="00:24:11.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and it's an impressive piece of software.""" start="00:24:14.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I think if you hit one of my site,""" start="00:24:16.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I think if you hit one of my sites,""" start="00:24:16.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I've got a private AI talk where I actually refer to that.""" start="00:24:18.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""But retrieval augmentation, it's easy, it's fast,""" start="00:24:22.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""it puts your data into the system,""" start="00:24:25.220" video="mainVideo-private-ai" id="subtitle"]] @@ -537,10 +558,11 @@ [[!template text="""That's one of the great things about AI,""" start="00:24:31.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""especially private AI,""" start="00:24:32.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""is you can do whatever you want to with it""" start="00:24:33.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and build up with it as you get more experience.""" start="00:24:37.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Any thoughts on running things""" start="00:24:43.180" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and build up with it as you get more experience.""" start="00:24:35.626" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Thoughts on running things on AWS/digital ocean instances, etc?""" start="00:24:38.834" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Any thoughts on running things""" start="00:24:38.834" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""on AWS, DigitalOcean, and so on?""" start="00:24:44.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""AWS is not bad.""" start="00:24:49.180" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: AWS is not bad.""" start="00:24:49.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The DigitalOcean, they have some of their GPUs.""" start="00:24:50.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I still don't like having the data""" start="00:24:52.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""leave my house, to be honest, or at work,""" start="00:24:54.380" video="mainVideo-private-ai" id="subtitle"]] @@ -557,14 +579,15 @@ [[!template text="""usually a certain number of stuff.""" start="00:25:20.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And Google's also has it,""" start="00:25:21.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but I still tend to keep more stuff on local PCs,""" start="00:25:23.180" 
video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because I just paranoid that way. Gotcha.""" start="00:25:26.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""What has your experience been using AI?""" start="00:25:33.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because I'm just paranoid that way.""" start="00:25:26.740" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: What has your experience been using AI for cyber security applications? What do you usually use it for?""" start="00:25:31.078" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Gotcha. What has your experience been using AI?""" start="00:25:31.078" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Do you want to get into that, using AI for cybersecurity?""" start="00:25:35.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You might have already touched on this.""" start="00:25:40.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, really, for cybersecurity,""" start="00:25:42.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yeah, really, for cybersecurity,""" start="00:25:42.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what I've had to do is I've dumped logs""" start="00:25:44.380" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""to have a due correlation.""" start="00:25:46.260" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Keep in mind, the size of that LLAMA file we were using""" start="00:25:47.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""to have it do correlation.""" start="00:25:46.260" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Keep in mind, the size of that Llama file we were using""" start="00:25:47.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""for figuring out David Bowie, writing the hello world,""" start="00:25:49.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""all that stuff, is like six gig.""" start="00:25:52.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""How does it get the entire world in six gig?""" start="00:25:54.180" video="mainVideo-private-ai" id="subtitle"]] @@ -578,20 +601,21 @@ [[!template text="""But I want to work on something to do that more locally""" start="00:26:12.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and be able to actually drive this stuff over that.""" start="00:26:15.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That's one also on the long-term goals.""" start="00:26:19.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So we got any other questions or?""" start="00:26:21.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: So we got any other questions or?""" start="00:26:24.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Those are the questions that I see.""" start="00:26:26.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I want to just read out a couple of comments""" start="00:26:29.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that I saw in IRC though.""" start="00:26:31.180" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Jay Rutabaga says, it went very well""" 
start="00:26:33.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""jrootabaga says, it went very well""" start="00:26:33.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""from an audience perspective.""" start="00:26:36.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And G Gundam says, respect your commitment to privacy.""" start="00:26:39.260" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""And GGundam says, respect your commitment to privacy.""" start="00:26:39.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And then somebody is telling us""" start="00:26:43.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""we might have skipped a question.""" start="00:26:45.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So I'm just going to run back to my list.""" start="00:26:46.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Updated regularly experience.""" start="00:26:50.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I just didn't type in the answer here's""" start="00:26:52.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and there's a couple more questions coming in so""" start="00:26:57.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Is there a disparity where you go to paid models""" start="00:26:59.660" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Is there a disparity where you go to paid models becouse they are better and what problems would those be?""" start="00:26:59.660" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Is there a disparity where you go to paid models""" start="00:26:59.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because they are better and what problems?""" start="00:27:04.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You know what would drive you to? That's a good question.""" start="00:27:08.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Paid models, I don't mind them. 
I think they're good,""" start="00:27:14.020" video="mainVideo-private-ai" id="subtitle"]] @@ -614,25 +638,25 @@ [[!template text="""But, uh, there's, there's a lot of money""" start="00:27:49.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""going into these AIs and stuff,""" start="00:27:52.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but in terms of the ability to get a decent one,""" start="00:27:53.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""like the llama, llama three, two,""" start="00:27:56.220" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""like the llama, llama 3.2,""" start="00:27:56.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and load your data into it, you can be pretty competitive.""" start="00:27:57.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""You're not going to get all the benefits,""" start="00:28:01.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but you have more control over it.""" start="00:28:04.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So it's, it's a, this and that it's a,""" start="00:28:07.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""it's a balancing act.""" start="00:28:11.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay, and I think I see a couple more questions coming in.""" start="00:28:13.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""What is the largest parameter size for local models""" start="00:28:15.540" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""You're not going to get all the benefits,""" start="00:28:01.240" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""but you have more control over it.""" start="00:28:02.793" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""So it's a balancing act.""" start="00:28:04.334" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay, and I think I see a couple more questions coming in.""" start="00:28:11.001" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: What's the largest (in parameter size) local model you've been able to successfully run locally, and do you run into issues with limited context window size?""" start="00:28:14.126" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""What is the largest parameter size for local models""" start="00:28:14.126" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that you've been able to successfully run locally""" start="00:28:19.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and do run into issues with limited context window size?""" start="00:28:22.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""The top eight models will tend to have a larger ceiling.""" start="00:28:26.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yes, yes, yes, yes, yes.""" start="00:28:29.660" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and do you run into issues with limited context window size?""" start="00:28:22.460" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""The top paid models will tend to have a larger ceiling.""" start="00:28:26.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yes, yes, yes, yes, yes.""" 
start="00:28:29.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""By default, the context size is I think 1024.""" start="00:28:32.860" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""But I've upped it to 8192 on the on this box, the Pangolin""" start="00:28:37.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""because it seems to be some reason""" start="00:28:44.620" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""it's just a very working quite well.""" start="00:28:46.940" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""But the largest ones I've loaded have been in""" start="00:28:49.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""the have not been that huge.""" start="00:28:52.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I've loaded this the last biggest one I've done.""" start="00:28:54.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""But I've upped it to 8192 on this box, the Pangolin,""" start="00:28:37.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""because it seems to be, for some reason,""" start="00:28:41.161" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""it's just a very... working quite well.""" start="00:28:43.543" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""But the largest ones I've loaded have been in the...""" start="00:28:45.209" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""have not been that huge.""" start="00:28:49.751" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I've loaded this... the last biggest one I've done...""" start="00:28:51.334" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That's the reason why I'm planning""" start="00:28:55.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""on breaking down and buying a Ryzen.""" start="00:28:57.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Actually, I'm going to buy""" start="00:29:01.340" video="mainVideo-private-ai" id="subtitle"]] @@ -645,7 +669,7 @@ [[!template text="""but I mostly stick with the smaller size models""" start="00:29:17.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and the ones that are more quantitized""" start="00:29:20.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because it just tends to work better for me.""" start="00:29:22.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""We've still got over 10 minutes before we're cutting away,""" start="00:29:26.620" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: We've still got over 10 minutes before we're cutting away,""" start="00:29:26.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but I'm just anticipating""" start="00:29:29.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that we're going to be going strong at the 10 minute mark.""" start="00:29:30.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So I'm just, just letting, you know,""" start="00:29:32.860" video="mainVideo-private-ai" id="subtitle"]] @@ -655,9 +679,10 @@ [[!template text="""even if we aren't able to stay with it all.""" start="00:29:44.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Okay. 
And we've got 10 minutes""" start="00:29:47.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""where we're still going to stay live.""" start="00:29:49.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So next question coming in, I see, are there free as in freedom,""" start="00:29:52.380" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Are there "Free" as in FSF/open source issues with the data?""" start="00:29:52.380" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So next question coming in, I see, are there free as in freedom,""" start="00:29:52.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""free as in FSF issues with the data?""" start="00:30:00.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yes, where's the data coming from is a huge question with AI.""" start="00:30:05.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yes, where's the data coming from is a huge question with AI.""" start="00:30:05.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's astonishing you can ask questions""" start="00:30:11.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to models that you don't know where it's coming from.""" start="00:30:13.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""That is gonna be one of the big issues long-term.""" start="00:30:16.900" video="mainVideo-private-ai" id="subtitle"]] @@ -666,24 +691,26 @@ [[!template text="""but it's, I mean, if you look at, God,""" start="00:30:22.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I can't remember who it was.""" start="00:30:25.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Somebody was actually out torrenting books""" start="00:30:27.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""just to be able to build into their AI system.""" start="00:30:28.660" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""just to be able to build it into their AI system.""" start="00:30:28.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I think it might've been Meta.""" start="00:30:30.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So there's a lot of that going on.""" start="00:30:32.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The open source of the stuff is going to be tough.""" start="00:30:34.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There's going to be there's some models""" start="00:30:38.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""like the mobile guys have got their own license,""" start="00:30:39.460" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but where they're getting their data from,""" start="00:30:41.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I'm not sure on so that that's a huge question.""" start="00:30:42.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""That's a that's a talk in itself.""" start="00:30:45.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""But yeah, but you if you train on your RAG and your data,""" start="00:30:47.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I'm not sure, so that's a huge question.""" start="00:30:42.740" 
video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""That's a talk in itself.""" start="00:30:45.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""But yeah, if you train on your RAG and your data,""" start="00:30:47.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you know what it's come, you know,""" start="00:30:51.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you have a license that""" start="00:30:53.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but the other stuff is just""" start="00:30:54.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""more lines of supplement""" start="00:30:55.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""if you're using a smaller model,""" start="00:30:56.740" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but the comment online, I see a couple of them.""" start="00:31:01.380" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""if you're using a smaller model.""" start="00:30:56.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: The comments online, I see a couple of them.""" start="00:31:01.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'll read them out in order here. Really interesting stuff.""" start="00:31:05.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Thank you for your talk. Given that large AI companies""" start="00:31:08.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Thank you for your talk.""" start="00:31:08.340" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Given that large AI companies are openly stealing IP and copyright, thereby eroding the authority of such law (and eroding truth itself as well), can you see a future where IP & copyright flaw become untenable and what sort of onwards effect might that have?""" start="00:31:09.557" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Given that large AI companies""" start="00:31:09.557" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""are openly stealing intellectual property and copyright""" start="00:31:11.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and therefore eroding the authority of such laws""" start="00:31:14.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and maybe obscuring the truth itself,""" start="00:31:18.940" video="mainVideo-private-ai" id="subtitle"]] @@ -702,21 +729,22 @@ [[!template text="""my personal opinion, and I'm not a lawyer,""" start="00:31:53.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and I do not have money.""" start="00:31:56.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So don't sue me, is there's going to be""" start="00:31:57.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""the current administration tends is very AI pro AI.""" start="00:31:58.860" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""the current administration tends is very AI, pro AI.""" start="00:31:58.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And there's very a great deal of lobbying by those groups.""" start="00:32:02.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it's on both sides.""" start="00:32:05.500" 
video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And it's going to be, it's gonna be interesting to see""" start="00:32:07.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what happens to copyright the next 510 years.""" start="00:32:09.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I just don't know how it keeps up""" start="00:32:11.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""without there being some adjustments and stuff.""" start="00:32:13.340" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay, and then another comment I saw,""" start="00:32:16.060" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Comment: File size is not going to be the bottleneck, your RAM is.""" start="00:32:18.060" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Okay, and then another comment I saw,""" start="00:32:18.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""file size is not going to be a bottleneck.""" start="00:32:20.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""RAM is. You'll need 16 gigabytes of RAM""" start="00:32:23.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to run the smallest local models""" start="00:32:25.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and 512 gigabytes of RAM to run the larger ones.""" start="00:32:28.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""You'll need a GPU with that much memory""" start="00:32:31.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""if you want it to run quickly. Yeah. Oh no.""" start="00:32:35.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It also depends upon how your memory is laid out.""" start="00:32:39.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""if you want it to run quickly.""" start="00:32:35.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yeah. Oh no. It also depends upon how your memory is laid out.""" start="00:32:38.319" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Like example being the Ultra i285H""" start="00:32:41.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I plan to buy, that has 96 gig of memory.""" start="00:32:45.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's unified between the GPU and the CPU share it,""" start="00:32:47.900" video="mainVideo-private-ai" id="subtitle"]] @@ -725,7 +753,7 @@ [[!template text="""but you're able to load more of it into memory.""" start="00:32:55.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So it's able to do some additional stuff with it""" start="00:32:57.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""as opposed to come off disk.""" start="00:32:59.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's all balancing act. If you hit Zyskin's website,""" start="00:33:00.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It's all balancing act. 
If you hit Ziskind's website,""" start="00:33:00.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that guy's done some great work on it.""" start="00:33:03.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'm trying to figure out how big a model you can do,""" start="00:33:05.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what you can do with it.""" start="00:33:07.500" video="mainVideo-private-ai" id="subtitle"]] @@ -738,7 +766,7 @@ [[!template text="""So it's a learning process.""" start="00:33:24.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""But if you want to, Network Chuck had a great video""" start="00:33:26.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""talking about building his own system""" start="00:33:29.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""with a couple really powerful NVIDIA cards""" start="00:33:30.940" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""with a couple really powerful Nvidia cards""" start="00:33:30.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and stuff like that in it.""" start="00:33:34.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And just actually setting up on his system as a node""" start="00:33:35.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and using a web UI on it. So there's a lot of stuff there,""" start="00:33:38.860" video="mainVideo-private-ai" id="subtitle"]] @@ -746,10 +774,10 @@ [[!template text="""which models you want to use,""" start="00:33:43.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""how much information you need,""" start="00:33:44.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but it's part of the learning.""" start="00:33:46.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And you can run models, even as a Raspberry PI fives,""" start="00:33:48.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""And you can run models, even on Raspberry Pi 5s,""" start="00:33:49.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if you want to, they'll run slow.""" start="00:33:52.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Don't get me wrong, but they're possible.""" start="00:33:54.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay, and I think there's other questions coming in too,""" start="00:33:56.460" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay, and I think there's other questions coming in too,""" start="00:33:59.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""so I'll just bam for another second.""" start="00:34:02.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""We've got about five minutes before we'll,""" start="00:34:04.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""before we'll be cutting over,""" start="00:34:06.300" video="mainVideo-private-ai" id="subtitle"]] @@ -757,51 +785,55 @@ [[!template text="""how much I appreciate your talk.""" start="00:34:13.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""This is another one that I'm going to""" start="00:34:14.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""have to study after the conference.""" start="00:34:15.980" video="mainVideo-private-ai" id="subtitle"]] 
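To make the hardware discussion above a bit more concrete for Emacs users: the minimal setup Aaron closes with pairs gptel with a llamafile server, and under the hood that is just an HTTP request to an OpenAI-compatible endpoint on localhost. The sketch below is an editor-added illustration, not code from the talk; it assumes llamafile's default address of http://localhost:8080, and the `my/...` names are invented for this example.

```elisp
;; Illustrative sketch only (not from the talk): ask a question of a
;; local model over the OpenAI-compatible chat API.  llamafile serves
;; this on http://localhost:8080 by default; Ollama exposes a similar
;; endpoint on http://localhost:11434/v1.  All `my/' names are
;; invented for this example.
(require 'url)
(require 'url-http)
(require 'json)

(defvar my/local-llm-endpoint "http://localhost:8080/v1/chat/completions"
  "Chat endpoint of the local model server (assumed llamafile default).")

(defun my/local-llm-ask (prompt)
  "Send PROMPT to the local model and show the reply in a buffer."
  (interactive "sAsk local model: ")
  (let* ((url-request-method "POST")
         (url-request-extra-headers '(("Content-Type" . "application/json")))
         (url-request-data
          (encode-coding-string
           (json-encode
            (list (cons 'model "local")   ; llamafile ignores the model name
                  (cons 'messages
                        (vector (list (cons 'role "user")
                                      (cons 'content prompt))))))
           'utf-8))
         (response (url-retrieve-synchronously my/local-llm-endpoint t t 120)))
    (unless response
      (error "No response from %s" my/local-llm-endpoint))
    (with-current-buffer response
      ;; Skip the HTTP headers, decode the body, and pull out
      ;; choices[0].message.content from the JSON reply.
      (goto-char url-http-end-of-headers)
      (let* ((body (json-read-from-string
                    (decode-coding-string
                     (buffer-substring-no-properties (point) (point-max))
                     'utf-8)))
             (choice (elt (alist-get 'choices body) 0))
             (text (alist-get 'content (alist-get 'message choice))))
        (with-current-buffer (get-buffer-create "*local-llm*")
          (erase-buffer)
          (insert text)
          (display-buffer (current-buffer)))))))
```

In practice gptel (or Open WebUI, LM Studio, and similar front ends) wraps this same request/response cycle; the snippet is only meant to show how little machinery sits between an Emacs buffer and a locally hosted model.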
-[[!template text="""We greatly appreciate, all of us appreciate""" start="00:34:18.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: We greatly appreciate, all of us appreciate""" start="00:34:18.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you guys putting on the conference.""" start="00:34:21.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's a great conference. It's well done.""" start="00:34:22.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's an honor to be on the stage""" start="00:34:26.300" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: It's an honor to be on the stage""" start="00:34:26.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""with the brains of the project, which is you.""" start="00:34:28.020" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what else we got? Question wise.""" start="00:34:30.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Okay, so just scanning here.""" start="00:34:34.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Have you used local models capable of tool calling?""" start="00:34:39.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I'm, I'm scared of agentic.""" start="00:34:50.700" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I, I am, I'm going to be a slow adopter of that.""" start="00:34:54.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: So what else we got? Question wise.""" start="00:34:33.125" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay, so just scanning here.""" start="00:34:34.700" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Have you used local models capable of tool-calling?""" start="00:34:46.900" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""Have you used local models capable of tool calling?""" start="00:34:46.900" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I'm scared of agentic.""" start="00:34:50.700" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I'm going to be a slow adopter of that.""" start="00:34:54.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I want to do it, but I just don't have the, uh,""" start="00:34:58.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""four decimal fortitude right now to do it.""" start="00:35:02.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I, I, I've had to give me the commands,""" start="00:35:04.340" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I've had to give me the commands,""" start="00:35:04.340" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but I still run the commands by hand.""" start="00:35:07.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'm looking into it and it's on once again,""" start="00:35:08.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""it's on that list, but I just, that's a big step for me.""" start="00:35:10.540" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So. Awesome. All right.""" start="00:35:14.140" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: So. Awesome. 
All right.""" start="00:35:20.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Well, maybe it's, let me just scroll through""" start="00:35:23.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because we might have missed one question. Oh, I see.""" start="00:35:27.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Here was the piggyback question.""" start="00:35:31.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Now I see the question that I missed.""" start="00:35:36.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So this was piggybacking on the question""" start="00:35:38.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""about model updates and adding data.""" start="00:35:41.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""And will models reach out to the web""" start="00:35:44.860" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Will the models reach out to the web if they need to for more info?""" start="00:35:44.860" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""And will models reach out to the web""" start="00:35:44.860" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if they need more info?""" start="00:35:46.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Or have you worked with any models that work that way?""" start="00:35:47.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""No, I've not seen any models to do that""" start="00:35:51.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: No, I've not seen any models to do that""" start="00:35:52.480" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""There's there was like a group""" start="00:35:55.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""working on something like a package updater""" start="00:35:57.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""that would do different diffs on it,""" start="00:35:59.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but it's so Models change so much""" start="00:36:02.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""even who make minor changes and fine-tuning.""" start="00:36:03.940" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's hard just to update them in place""" start="00:36:05.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""but it's so... Models change so much,""" start="00:36:02.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""even who make minor changes and fine-tuning,""" start="00:36:03.940" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It's hard just to update them in place.""" start="00:36:05.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So I haven't seen one, but that doesn't mean""" start="00:36:07.660" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""they're not out there. I'm curious topic though Awesome""" start="00:36:10.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""they're not out there. 
Curious topic though.""" start="00:36:10.100" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Awesome.""" start="00:36:15.714" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Well, it's probably pretty good timing.""" start="00:36:16.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Let me just scroll and make sure.""" start="00:36:19.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""And of course, before I can say that,""" start="00:36:21.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""there's one more question. So let's go ahead and have that.""" start="00:36:23.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I want to make sure while we're still live, though,""" start="00:36:25.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I give you a chance to offer any closing thoughts.""" start="00:36:28.300" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So what scares you most about the agentic tools?""" start="00:36:31.300" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: What scares you most about agentic tools? How would you think about putting a sandbox around it if you adopt an agentic workflow?""" start="00:36:31.300" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""So what scares you most about the agentic tools?""" start="00:36:31.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""How would you think about putting a sandbox around that""" start="00:36:35.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if you did adopt an agentic workflow?""" start="00:36:38.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""That is a great question.""" start="00:36:42.140" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: That is a great question.""" start="00:36:41.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""In terms of that, I would just control""" start="00:36:42.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what it's able to talk to, what machines,""" start="00:36:45.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I would actually have it be air gap.""" start="00:36:48.100" video="mainVideo-private-ai" id="subtitle"]] @@ -809,7 +841,7 @@ [[!template text="""and we spend a lot of time dealing with air gap systems,""" start="00:36:52.100" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""because that's just kind of the way it works out for us.""" start="00:36:53.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So agentic, it's just going to take a while to get trust.""" start="00:36:55.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""I want to want to see more stuff happening.""" start="00:36:58.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""I want to see more stuff happening.""" start="00:36:58.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Humans screw up stuff enough.""" start="00:37:01.060" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""The last thing we need is to multiply that by 1000.""" start="00:37:02.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So in terms of that, I would be restricting what it can do.""" start="00:37:04.820" 
video="mainVideo-private-ai" id="subtitle"]] @@ -820,57 +852,693 @@ [[!template text="""I would do those kind of things,""" start="00:37:17.380" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but it's going to be, it's happening.""" start="00:37:18.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's just, I'm going to be one of the laggards on that one.""" start="00:37:20.860" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So airgab, jail, extremely locked down environments,""" start="00:37:25.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""So air gap, jail, extremely locked down environments,""" start="00:37:25.820" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""like we're talking about separate physicals, not Docker.""" start="00:37:29.260" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, hopefully. Right, fair.""" start="00:37:34.900" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""So tool calling can be read-only,""" start="00:37:37.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Yeah, hopefully.""" start="00:37:34.900" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Q: Tool calling can be read-only, such as giving models the ability to search the web before answersing your question. (No write access or execute access) I'm interested to know if local models are any good at calling tools, though.""" start="00:37:36.578" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""[Corwin]: Right, fair. So tool calling can be read-only,""" start="00:37:36.578" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""such as giving models the ability to search the web""" start="00:37:39.900" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""before answering your question,""" start="00:37:42.540" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""you know, write access, execute access.""" start="00:37:43.980" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I'm interested to know if local models""" start="00:37:46.220" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""are any good at that.""" start="00:37:49.220" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yes, local models can do a lot of that stuff.""" start="00:37:51.420" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yes, local models can do a lot of that stuff.""" start="00:37:51.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's their capabilities.""" start="00:37:55.580" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""If you load LM studio, you can do a lot of wonderful stuff""" start="00:37:56.820" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""with that or with open web UI with a llama.""" start="00:37:59.020" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""with that or with Open Web UI with ollama.""" start="00:37:59.020" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""It's a lot of capabilities. 
It's amazing.""" start="00:38:02.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Open web UI is actually what a lot of companies are using now""" start="00:38:05.740" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Open Web UI is actually what a lot of companies are using now""" start="00:38:05.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""to put their data behind that.""" start="00:38:08.140" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""They're curated data and stuff like that. So works well.""" start="00:38:10.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I can confirm that from my own professional experience.""" start="00:38:12.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Excellent. Okay, well, our timing should be just perfect""" start="00:38:15.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Excellent.""" start="00:38:15.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Okay, well, our timing should be just perfect""" start="00:38:16.916" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if you want to give us like a 30-second, 45-second wrap-up.""" start="00:38:19.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Aaron, let me squeeze in mine.""" start="00:38:22.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Thank you again so much for preparing this talk""" start="00:38:24.420" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and for entertaining all of our questions.""" start="00:38:26.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Yeah, let me just thank you guys for the conference again.""" start="00:38:30.500" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Yeah, let me just thank you guys for the conference again.""" start="00:38:30.500" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""This is a great one. I've enjoyed a lot of it.""" start="00:38:33.300" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""I've only had a couple of talks so far,""" start="00:38:35.180" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""but I'm looking forward to hitting the ones after this and tomorrow.""" start="00:38:37.340" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""But the AI stuff is coming. Get on board.""" start="00:38:41.660" video="mainVideo-private-ai" id="subtitle"]] + +<div class="transcript-heading">[[!template new="1" text="""Wrapping up""" start="00:38:41.660" video="mainVideo-private-ai" id="subtitle"]]</div>[[!template text="""But the AI stuff is coming. Get on board.""" start="00:38:41.660" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Definitely recommend it. 
If you want to just try it out""" start="00:38:44.740" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and get a little taste of it,""" start="00:38:46.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""what my minimal viable product""" start="00:38:48.420" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""with just LlamaFile and GPTEL""" start="00:38:49.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""with just Llamafile and gptel""" start="00:38:49.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""will get you to the point where you start figuring out.""" start="00:38:51.620" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Gptel is an amazing thing. It just gets out of your way,""" start="00:38:53.140" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""but it works solo with Emacs. Design because it takes""" start="00:38:55.580" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""doesn't take your hands off the keyboard.""" start="00:39:00.460" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's just another buffer""" start="00:39:01.700" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""but it works so well with Emacs's design because""" start="00:38:55.580" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""it doesn't take your hands off the keyboard.""" start="00:39:00.460" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It's just another buffer,""" start="00:39:01.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and you just put information in there.""" start="00:39:02.500" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""It's quite quite a wonderful It's a wonderful time.""" start="00:39:04.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Let's put that way That's all I got Thank you""" start="00:39:06.980" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""so much for once again, and we're we're just cut away.""" start="00:39:10.820" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""It's quite a wonderful time.""" start="00:39:04.060" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""Let's put that way. 
That's all I got.""" start="00:39:06.980" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Corwin]: Thank you so much for once again, and we've just cut away.""" start="00:39:10.502" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""So I'll stop the recording""" start="00:39:14.340" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""and you're on your own recognizance""" start="00:39:15.780" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Well, I'm gonna punch out""" start="00:39:18.260" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""and you're on your own recognizance.""" start="00:39:15.780" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""[Aaron]: Well, I'm gonna punch out""" start="00:39:18.260" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""if anybody has any questions or anything""" start="00:39:19.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""my email address is ajgrothe@yahoo.com or at gmail and""" start="00:39:21.060" video="mainVideo-private-ai" id="subtitle"]] -[[!template text="""Thank you all for attending""" start="00:39:24.700" video="mainVideo-private-ai" id="subtitle"]] +[[!template text="""thank you all for attending,""" start="00:39:24.700" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""and thanks again for the conference""" start="00:39:26.780" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Okay, I'm gonna go ahead and end the room there, thank you.""" start="00:39:29.940" video="mainVideo-private-ai" id="subtitle"]] [[!template text="""Excellent, thanks, bye.""" start="00:39:32.580" video="mainVideo-private-ai" id="subtitle"]] +</div><div class="transcript transcript-qanda"><a name="private-ai-qanda-transcript"></a><h1>Q&A transcript (unedited)</h1> + +[[!template text="""Hey, everybody. Welcome from frigid Omaha, Nebraska. I'm""" start="00:00:26.592" video="qanda-private-ai" id="subtitle"]] +[[!template text="""just going to kick off my talk here, and we'll see how it all""" start="00:00:30.007" video="qanda-private-ai" id="subtitle"]] +[[!template text="""goes. Thanks for attending.""" start="00:00:30.007" video="qanda-private-ai" id="subtitle"]] +[[!template text="""So the slides will be available on my site, growthy.us, in""" start="00:00:49.947" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the presentation section tonight or tomorrow. This is a""" start="00:00:49.947" video="qanda-private-ai" id="subtitle"]] +[[!template text="""quick intro to one way to do private AI in Emacs. There are a""" start="00:00:55.997" video="qanda-private-ai" id="subtitle"]] +[[!template text="""lot of other ways to do it. This one is really just more or less""" start="00:00:59.162" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the easiest way to do it. It's a minimal viable product to get""" start="00:01:01.446" video="qanda-private-ai" id="subtitle"]] +[[!template text="""you an idea of how to get started with it and how to give it a""" start="00:01:05.192" video="qanda-private-ai" id="subtitle"]] +[[!template text="""spin. Really hope some of you give it a shot and learn""" start="00:01:05.192" video="qanda-private-ai" id="subtitle"]] +[[!template text="""something along the way. So the overview of the talk. 
broke""" start="00:01:09.940" video="qanda-private-ai" id="subtitle"]] +[[!template text="""down these basic bullet points of why private AI, what do I""" start="00:01:16.289" video="qanda-private-ai" id="subtitle"]] +[[!template text="""need to do private AI, Emacs and private AI, pieces for an AI""" start="00:01:16.289" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Emacs solution, a demo of a minimal viable product, and the""" start="00:01:16.289" video="qanda-private-ai" id="subtitle"]] +[[!template text="""summary. Why private AI? This is pretty simple. Just read""" start="00:01:16.289" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the terms and conditions for any AI system you're currently""" start="00:01:36.866" video="qanda-private-ai" id="subtitle"]] +[[!template text="""using. If you're using the free tiers, your queries, code""" start="00:01:36.866" video="qanda-private-ai" id="subtitle"]] +[[!template text="""uploaded information is being used to train the models. In""" start="00:01:40.951" video="qanda-private-ai" id="subtitle"]] +[[!template text="""some cases, you are giving the company a perpetual license""" start="00:01:46.819" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to your data. You have no control over this, except for not""" start="00:01:46.819" video="qanda-private-ai" id="subtitle"]] +[[!template text="""using the engine. And keep in mind, the terms are changing""" start="00:01:51.505" video="qanda-private-ai" id="subtitle"]] +[[!template text="""all the time on that, and they're not normally changing for""" start="00:01:55.430" video="qanda-private-ai" id="subtitle"]] +[[!template text="""our benefit. So that's not necessarily a good thing. If""" start="00:01:55.430" video="qanda-private-ai" id="subtitle"]] +[[!template text="""you're using the paid tiers, you may be able to opt out of the""" start="00:02:04.298" video="qanda-private-ai" id="subtitle"]] +[[!template text="""data collection. But keep in mind, this can change, or they""" start="00:02:04.298" video="qanda-private-ai" id="subtitle"]] +[[!template text="""may start charging for that option. Every AI company wants""" start="00:02:09.496" video="qanda-private-ai" id="subtitle"]] +[[!template text="""more and more data. They need more and more data to train""" start="00:02:14.821" video="qanda-private-ai" id="subtitle"]] +[[!template text="""their models. It is just the way it is. They need more and more""" start="00:02:17.344" video="qanda-private-ai" id="subtitle"]] +[[!template text="""information to get it more and more accurate to keep it up to""" start="00:02:22.689" video="qanda-private-ai" id="subtitle"]] +[[!template text="""date. There's been a story about Stack Overflow. It has like""" start="00:02:22.689" video="qanda-private-ai" id="subtitle"]] +[[!template text="""half the number of queries they had a year ago because people""" start="00:02:29.396" video="qanda-private-ai" id="subtitle"]] +[[!template text="""are using AI. The problem with that is now there's less data""" start="00:02:29.396" video="qanda-private-ai" id="subtitle"]] +[[!template text="""going to Stack Overflow for the AI to get. vicious cycle,""" start="00:02:33.500" video="qanda-private-ai" id="subtitle"]] +[[!template text="""especially when you start looking at newer language like""" start="00:02:38.926" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Ruby and stuff like that. 
So it comes down to being an""" start="00:02:38.926" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""interesting time. Another reason why to go private AI is""" start="00:02:42.732" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""your costs are going to vary. Right now, these services are""" start="00:02:46.718" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""being heavily subsidized. If you're paying Claude $20 a""" start="00:02:50.824" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""month, it is not costing Claude, those guys, $20 a month to""" start="00:02:53.067" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""host all the infrastructure to build all these data""" start="00:02:53.067" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""centers. They are severely subsidizing that at a very much a""" start="00:02:53.067" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""loss right now. When they start charging the real costs plus""" start="00:03:02.241" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""a profit, it's going to change. Right now, I use a bunch of""" start="00:03:07.327" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""different services. I've played with Grok and a bunch of""" start="00:03:11.591" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""other ones. But Grok right now is like $30 a month for a""" start="00:03:14.114" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""regular Super Grok. When they start charging the real cost""" start="00:03:16.696" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""of that, it's going to go from $30 to something a great deal""" start="00:03:20.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""more, perhaps, I think, $100 or $200 or whatever really""" start="00:03:20.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""turns out to be the cost when you figure everything into it.""" start="00:03:20.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""When you start adding that cost into that, a lot of people are""" start="00:03:32.032" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""using public AI right now are going to have no option but to""" start="00:03:32.032" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""move to private AI or give up on AI overall.""" start="00:03:32.032" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""What do you need to be able to do private AI? If you're going to""" start="00:03:42.275" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""run your own AI, you're going to need a system with either""" start="00:03:45.768" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""some cores, a graphics processor unit, or a neural""" start="00:03:45.768" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""processing unit, a GPU or an NPU. I currently have four""" start="00:03:45.768" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""systems I'm experimenting with and playing around with on a""" start="00:03:54.519" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""daily basis. I have a System76 Pangolin AMD Ryzen 7 7840U""" start="00:03:54.519" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""with a Radeon 780M integrated graphics card. 
It's got 32""" start="00:03:59.145" video="qanda-private-ai" id="subtitle"]] +[[!template text="""gigs of RAM. It's a beautiful piece of hardware. I really do""" start="00:04:07.176" video="qanda-private-ai" id="subtitle"]] +[[!template text="""like it. I have my main workstation, it's an HP Z620 with dual""" start="00:04:10.139" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Intel Xeons with four NVIDIA K2200 graphics cards in it. Why""" start="00:04:11.401" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the four NVIDIA K2200 graphics card on it? Because I could""" start="00:04:19.757" video="qanda-private-ai" id="subtitle"]] +[[!template text="""buy four of them on eBay for $100 and it was still supported by""" start="00:04:22.742" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the NVIDIA drivers for Debian. So that's why that is. A""" start="00:04:22.742" video="qanda-private-ai" id="subtitle"]] +[[!template text="""MacBook Air with an M1 processor, a very nice piece of kit I""" start="00:04:30.959" video="qanda-private-ai" id="subtitle"]] +[[!template text="""picked up a couple years ago, very cheap, but it runs AI""" start="00:04:30.959" video="qanda-private-ai" id="subtitle"]] +[[!template text="""surprisingly well, and an Acer Aspire 1 with an AMD Ryzen""" start="00:04:30.959" video="qanda-private-ai" id="subtitle"]] +[[!template text="""5700H in it. This was my old laptop. It was a sturdy beast. It""" start="00:04:30.959" video="qanda-private-ai" id="subtitle"]] +[[!template text="""was able to do enough AI to do demos and stuff, and I liked it""" start="00:04:48.104" video="qanda-private-ai" id="subtitle"]] +[[!template text="""quite a bit for that. I'm using the Pangolin for this""" start="00:04:48.104" video="qanda-private-ai" id="subtitle"]] +[[!template text="""demonstration because it's just better. Apple's M4 chip""" start="00:04:52.611" video="qanda-private-ai" id="subtitle"]] +[[!template text="""has 38 teraflops of MPU performance. The Microsoft""" start="00:04:58.887" video="qanda-private-ai" id="subtitle"]] +[[!template text="""co-pilots are now requiring 45 teraflops of MPU to be able to""" start="00:05:03.933" video="qanda-private-ai" id="subtitle"]] +[[!template text="""have the co-pilot badge on it. And Raspberry Pi's new AI top""" start="00:05:03.933" video="qanda-private-ai" id="subtitle"]] +[[!template text="""is about 18 teraflops and is $70 on top of the cost of""" start="00:05:11.161" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Raspberry Pi 5. Keep in mind Raspberry recently raised the""" start="00:05:11.161" video="qanda-private-ai" id="subtitle"]] +[[!template text="""cost of their Pi 5s because of RAM pricing, which is going to""" start="00:05:18.009" video="qanda-private-ai" id="subtitle"]] +[[!template text="""be affecting a lot of these types of solutions in the near""" start="00:05:18.009" video="qanda-private-ai" id="subtitle"]] +[[!template text="""future. But there's going to be a lot of local power""" start="00:05:18.009" video="qanda-private-ai" id="subtitle"]] +[[!template text="""available in the future. That's what it really comes down""" start="00:05:29.178" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to. 
A lot of people are going to have PCs on their desks.""" start="00:05:32.969" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""They're going to run a decent private AI without much issue.""" start="00:05:37.362" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""So for Emacs and private AI, there's a couple popular""" start="00:05:42.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""solutions. GPTEL, which is the one we're going to talk""" start="00:05:42.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""about. It's a simple interface. It's a minimal interface.""" start="00:05:46.473" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""It integrates easily into your workflow. It's just, quite""" start="00:05:50.959" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""honestly, chef's kiss, just a beautifully well-done piece""" start="00:05:53.021" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""of software. Ollama Buddy has more features, a menu""" start="00:05:53.021" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""interface, has quick access for things like code""" start="00:05:58.048" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""refactoring, text-free formatting, et cetera. This is the""" start="00:05:58.048" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""one that you spend a little more time with, but you also get a""" start="00:06:05.337" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""little bit more back from it. Ellama is another one, has some""" start="00:06:05.337" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""really good features to it, more different capabilities,""" start="00:06:10.403" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""but it's a different set of rules and capabilities to it.""" start="00:06:10.403" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Itermac, which is programming with your AI in Emacs. The""" start="00:06:21.595" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""closest thing I can come up to comparing this to is Cursor,""" start="00:06:26.020" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""except it's in Emacs. It's really quite well done. These are""" start="00:06:26.020" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""all really quite well done. There's a bunch of other""" start="00:06:32.007" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""projects out there. If you go out to GitHub, type Emacs AI,""" start="00:06:33.188" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""you'll find a lot of different options. So what is a minimal""" start="00:06:34.550" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""viable product that can be done? A minimal viable product to""" start="00:06:41.555" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""show what an AI Emacs solution is can be done with only needing""" start="00:06:45.081" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""two pieces of software. LLAMA file, this is an amazing piece""" start="00:06:45.081" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""of software. This is a whole LLM contained in one file. 
And""" start="00:06:52.575" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the same file runs on Mac OS X, Linux, Windows, and the BSDs.""" start="00:06:59.667" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""It's a wonderful piece of kit based on these people who""" start="00:07:06.158" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""created this thing called Cosmopolitan that lets you""" start="00:07:06.158" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""create an executable that runs on a bunch of different""" start="00:07:06.158" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""systems. And GPTEL, which is an easy plug-in for Emacs,""" start="00:07:06.158" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""which we talked about in the last slide a bit. So setting up""" start="00:07:15.375" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the LLM, you have to just go out and just hit the page for it""" start="00:07:22.509" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""and go out and do a wget of it. That's all it takes there.""" start="00:07:28.585" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Chmodding it so you can actually execute the executable.""" start="00:07:33.552" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""And then just go ahead and actually running it. And let's go""" start="00:07:36.876" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""ahead and do that. I've already downloaded it because I""" start="00:07:41.743" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""don't want to wait. And let's just take a look at it. I've""" start="00:07:43.144" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""actually downloaded several of them, but let's go ahead and""" start="00:07:47.550" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just run Llama 3.2b with the 3 billion instructions. And""" start="00:07:47.550" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that's it firing up. And it is nice enough to actually be""" start="00:07:55.771" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""listening on port 8080, which we'll need in a minute.""" start="00:07:57.473" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""So once you do that, you have to install gptel in Emacs.""" start="00:08:05.764" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""That's as simple as firing up Emacs, doing the M-x install""" start="00:08:09.849" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""package, and then just typing gptel if you have your""" start="00:08:09.849" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""repository set up right, which hopefully you do. And then""" start="00:08:09.849" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""you just go ahead and have it. You also have to set up a config""" start="00:08:19.141" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""file. Here's my example config file as it's currently set up,""" start="00:08:22.450" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""requiring ensuring GPTEL is loaded, defining the LLAMA""" start="00:08:24.333" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""file backend. 
You can put multiple backends into it, but I""" start="00:08:24.333" video="qanda-private-ai" id="subtitle"]] +[[!template text="""just have the one defined on this example. But it's pretty""" start="00:08:32.284" video="qanda-private-ai" id="subtitle"]] +[[!template text="""straightforward. LLAMA local file, name for it, stream,""" start="00:08:36.610" video="qanda-private-ai" id="subtitle"]] +[[!template text="""protocol HTTP. If you have HTTPS set up, that's obviously""" start="00:08:38.032" video="qanda-private-ai" id="subtitle"]] +[[!template text="""preferable, but a lot of people don't for their home labs.""" start="00:08:43.882" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Host is just 127.0.0.1 port 8080. Keep in mind, some of the""" start="00:08:49.253" video="qanda-private-ai" id="subtitle"]] +[[!template text="""AIs run on a different port, so you may be 8081 if you're""" start="00:08:53.000" video="qanda-private-ai" id="subtitle"]] +[[!template text="""running OpenWebView at the same time. The key, we don't need""" start="00:08:53.000" video="qanda-private-ai" id="subtitle"]] +[[!template text="""an API key because it's a local server. And the models just,""" start="00:09:00.295" video="qanda-private-ai" id="subtitle"]] +[[!template text="""uh, we can put multiple models on there if we want to. So if we""" start="00:09:03.541" video="qanda-private-ai" id="subtitle"]] +[[!template text="""create one with additional stuff or like rag and stuff like""" start="00:09:07.525" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that, we can actually name those models by their domain,""" start="00:09:07.525" video="qanda-private-ai" id="subtitle"]] +[[!template text="""which is really kind of cool. But, uh, that's all that takes.""" start="00:09:07.525" video="qanda-private-ai" id="subtitle"]] +[[!template text="""So let's go ahead and go to a quick test of it.""" start="00:09:19.198" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Oops. Alt-X, GPTEL. And we're going to just choose the""" start="00:09:30.531" video="qanda-private-ai" id="subtitle"]] +[[!template text="""default buffer to make things easier. Going to resize it up a""" start="00:09:36.171" video="qanda-private-ai" id="subtitle"]] +[[!template text="""bit. And usually the go-to question I go to is, who was David""" start="00:09:40.202" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Bowie? This one is actually a question that's turned out to""" start="00:09:42.108" video="qanda-private-ai" id="subtitle"]] +[[!template text="""be really good for figuring out whether or not AI is""" start="00:09:49.627" video="qanda-private-ai" id="subtitle"]] +[[!template text="""complete. This is one that some engines do well on, other""" start="00:09:49.627" video="qanda-private-ai" id="subtitle"]] +[[!template text="""ones don't. And we can just do, we can either do the alt X and""" start="00:09:54.453" video="qanda-private-ai" id="subtitle"]] +[[!template text="""send the GPTEL send, or we can just do control C and hit enter.""" start="00:09:57.416" video="qanda-private-ai" id="subtitle"]] +[[!template text="""We'll just do control C and enter. And now it's going ahead""" start="00:10:04.084" video="qanda-private-ai" id="subtitle"]] +[[!template text="""and hitting our local AI system running on port 8080. 
And""" start="00:10:06.326" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that looks pretty good, but let's go ahead and say, hey, it's""" start="00:10:11.472" video="qanda-private-ai" id="subtitle"]] +[[!template text="""set to terse mode right now. Please expand upon this.""" start="00:10:11.472" video="qanda-private-ai" id="subtitle"]] +[[!template text="""And there we go. We're getting a full description of the""" start="00:10:29.182" video="qanda-private-ai" id="subtitle"]] +[[!template text="""majority of, uh, about David Bowie's life and other""" start="00:10:29.923" video="qanda-private-ai" id="subtitle"]] +[[!template text="""information about him. So very, very happy with that.""" start="00:10:29.923" video="qanda-private-ai" id="subtitle"]] +[[!template text="""One thing to keep in mind is you look at things when you're""" start="00:10:47.946" video="qanda-private-ai" id="subtitle"]] +[[!template text="""looking for hallucinations, how accurate AI is, how it's""" start="00:10:47.946" video="qanda-private-ai" id="subtitle"]] +[[!template text="""compressed is it will tend to screw up on things like how many""" start="00:10:47.946" video="qanda-private-ai" id="subtitle"]] +[[!template text="""children he had and stuff like that. Let me see if it gets to""" start="00:10:47.946" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that real quick.""" start="00:10:57.257" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Is it not actually on this one? Alright, so that's the first""" start="00:11:04.349" video="qanda-private-ai" id="subtitle"]] +[[!template text="""question I always ask one. The next one is what are sea""" start="00:11:06.552" video="qanda-private-ai" id="subtitle"]] +[[!template text="""monkeys? It gives you an idea of the breadth of the system.""" start="00:11:08.355" video="qanda-private-ai" id="subtitle"]] +[[!template text="""It's querying right now. Pulls it back correctly.""" start="00:11:19.011" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Yes. And it's smart enough to actually detect David Bowie""" start="00:11:35.197" video="qanda-private-ai" id="subtitle"]] +[[!template text="""even referenced see monkeys in the song sea of love, which""" start="00:11:36.380" video="qanda-private-ai" id="subtitle"]] +[[!template text="""came at hit single. So it's actually keeping the context""" start="00:11:36.380" video="qanda-private-ai" id="subtitle"]] +[[!template text="""alive and that which is very cool feature. I did not see that""" start="00:11:42.654" video="qanda-private-ai" id="subtitle"]] +[[!template text="""coming. Here's one that some people say is a really good one""" start="00:11:46.482" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to ask ours in strawberry.""" start="00:11:48.206" video="qanda-private-ai" id="subtitle"]] +[[!template text="""All right, now she's going off the reservation. She's going""" start="00:12:09.571" video="qanda-private-ai" id="subtitle"]] +[[!template text="""in a different direction. Let me go ahead and reopen that""" start="00:12:12.376" video="qanda-private-ai" id="subtitle"]] +[[!template text="""again, because it's went down a bad hole there for a second.""" start="00:12:14.640" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Let me ask it to do write hello world in Emacs list.""" start="00:12:23.615" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Yep, that works. 
So the point being here, that was like two""" start="00:12:36.166" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""minutes of setup. And now we have a small AI embedded inside""" start="00:12:37.989" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the system. So that gives you an idea just how easy it can be.""" start="00:12:41.695" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""And it's just running locally on the system. We also have the""" start="00:12:46.883" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""default system here as well. So not that bad.""" start="00:12:48.466" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""That's a basic solution, that's a basic setup that will get""" start="00:12:58.289" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""you to the point where you can go like, it's a party trick, but""" start="00:12:58.289" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""it's a very cool party trick. The way that GPTEL works is it""" start="00:12:58.289" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""puts it into buffers, it doesn't interfere with your flow""" start="00:13:06.422" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that much, it's just an additional window you can pop open to""" start="00:13:06.422" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""ask questions and get information for, dump code into it and""" start="00:13:06.422" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""have it refactored. GPTEL has a lot of additional options""" start="00:13:06.422" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""for things that are really cool for that. But if you want a""" start="00:13:17.639" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""better solution, I recommend Ollama or LM Studio. They're""" start="00:13:21.886" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""both more capable than Llamafile. They can accept a lot of""" start="00:13:26.052" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""different models. You can do things like RAG. You can do""" start="00:13:28.355" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""loading of things onto the GPU more explicitly. It can speed""" start="00:13:32.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""stuff up. One of the things about the retrieval""" start="00:13:35.444" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""augmentation is it will let you put your data into the system""" start="00:13:36.686" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""so you can start uploading your code, your information, and""" start="00:13:36.686" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""actually being able to do analysis of it. OpenWebUI""" start="00:13:36.686" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""provides more capabilities. It provides an interface""" start="00:13:46.518" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that's similar to what you're used to seeing for ChatGPT""" start="00:13:49.562" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""and the other systems. It's really quite well done. 
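If you do step up from a llamafile to Ollama as suggested here, gptel ships a matching constructor. A minimal sketch, assuming Ollama is serving on its default port and a model has already been pulled; the model name is a placeholder:

```elisp
;; Register a local Ollama backend alongside (or instead of) the llamafile one.
(gptel-make-ollama "Ollama"
  :host "localhost:11434"         ; Ollama's default port
  :stream t
  :models '(llama3.2:latest))     ; placeholder: whatever `ollama pull` fetched
```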
And once""" start="00:13:49.562" video="qanda-private-ai" id="subtitle"]] +[[!template text="""again, GPTEL, I have to mention that because that's the one I""" start="00:13:56.149" video="qanda-private-ai" id="subtitle"]] +[[!template text="""really kind of like. And OlamaBuddy is also another really""" start="00:13:56.149" video="qanda-private-ai" id="subtitle"]] +[[!template text="""nice one. So what about the licensing of these models? Since""" start="00:14:00.454" video="qanda-private-ai" id="subtitle"]] +[[!template text="""I'm going out pulling down a model and doing this stuff.""" start="00:14:07.142" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Let's take a look at a couple of highlights from the""" start="00:14:11.307" video="qanda-private-ai" id="subtitle"]] +[[!template text="""MetaLlama 3 community license scale. If your service""" start="00:14:11.307" video="qanda-private-ai" id="subtitle"]] +[[!template text="""exceeds 700 million monthly users, you need additional""" start="00:14:15.576" video="qanda-private-ai" id="subtitle"]] +[[!template text="""licensing. Probably not going to be a problem for most of us.""" start="00:14:15.576" video="qanda-private-ai" id="subtitle"]] +[[!template text="""There's a competition restriction. You can't use this""" start="00:14:22.972" video="qanda-private-ai" id="subtitle"]] +[[!template text="""model to enhance competing models. And there's some""" start="00:14:24.576" video="qanda-private-ai" id="subtitle"]] +[[!template text="""limitations on using the meta trademarks. Not that big a""" start="00:14:27.622" video="qanda-private-ai" id="subtitle"]] +[[!template text="""deal. And the other ones are it's a permissive one designed""" start="00:14:30.629" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to encourage innovation, open development, commercial""" start="00:14:32.854" video="qanda-private-ai" id="subtitle"]] +[[!template text="""use is allowed, but there are some restrictions on it. Yeah,""" start="00:14:32.854" video="qanda-private-ai" id="subtitle"]] +[[!template text="""you can modify the model, but you have to rely on the license""" start="00:14:42.172" video="qanda-private-ai" id="subtitle"]] +[[!template text="""terms. And you can distribute the model with derivatives.""" start="00:14:42.172" video="qanda-private-ai" id="subtitle"]] +[[!template text="""And there are some very cool ones out there. There's people""" start="00:14:48.542" video="qanda-private-ai" id="subtitle"]] +[[!template text="""who've done things to try and make the llama bee less, what's""" start="00:14:50.164" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the phrase, ethical if you're doing penetration testing""" start="00:14:50.164" video="qanda-private-ai" id="subtitle"]] +[[!template text="""research and stuff like that. It has some very nice value""" start="00:14:50.164" video="qanda-private-ai" id="subtitle"]] +[[!template text="""there. Keep in mind licenses also vary depending on the""" start="00:14:58.517" video="qanda-private-ai" id="subtitle"]] +[[!template text="""model you're using. Mistral AI has the non-production""" start="00:15:01.021" video="qanda-private-ai" id="subtitle"]] +[[!template text="""license. It's designed to keep it to research and""" start="00:15:06.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""development. You can't use it commercially. 
So it's""" start="00:15:08.895" video="qanda-private-ai" id="subtitle"]] +[[!template text="""designed to clearly delineate between research and""" start="00:15:13.423" video="qanda-private-ai" id="subtitle"]] +[[!template text="""development and somebody trying to actually build""" start="00:15:13.423" video="qanda-private-ai" id="subtitle"]] +[[!template text="""something on top of it. And another question I get asked is,""" start="00:15:13.423" video="qanda-private-ai" id="subtitle"]] +[[!template text="""are there open source data model options? Yeah, but most of""" start="00:15:22.739" video="qanda-private-ai" id="subtitle"]] +[[!template text="""them are small or specialized currently. MoMo is a whole""" start="00:15:26.426" video="qanda-private-ai" id="subtitle"]] +[[!template text="""family of them, but there tend to be more specialized, but""" start="00:15:29.532" video="qanda-private-ai" id="subtitle"]] +[[!template text="""it's very cool to see where it's going. And it's another""" start="00:15:29.532" video="qanda-private-ai" id="subtitle"]] +[[!template text="""thing that's just going forward. It's under the MIT""" start="00:15:35.824" video="qanda-private-ai" id="subtitle"]] +[[!template text="""license. Some things to know to help you have a better""" start="00:15:37.548" video="qanda-private-ai" id="subtitle"]] +[[!template text="""experience with this. Get a LLAMA and OpenWebUI working by""" start="00:15:40.576" video="qanda-private-ai" id="subtitle"]] +[[!template text="""themselves, then set up your config file. I was fighting""" start="00:15:44.764" video="qanda-private-ai" id="subtitle"]] +[[!template text="""both at the same time, and it turned out I had a problem with my""" start="00:15:49.272" video="qanda-private-ai" id="subtitle"]] +[[!template text="""LLAMA. I had a conflict, so that was what my problem is. LLAMA""" start="00:15:49.272" video="qanda-private-ai" id="subtitle"]] +[[!template text="""file, GPTEL is a great way to start experimenting just to get""" start="00:15:55.725" video="qanda-private-ai" id="subtitle"]] +[[!template text="""you an idea of how it works and figure out how the interfaces""" start="00:15:55.725" video="qanda-private-ai" id="subtitle"]] +[[!template text="""work. Tremendous. RAG loading documents into it is really""" start="00:15:55.725" video="qanda-private-ai" id="subtitle"]] +[[!template text="""easy with open web UI. You can create models, you can put""" start="00:16:03.459" video="qanda-private-ai" id="subtitle"]] +[[!template text="""things like help desk developers and stuff like that,""" start="00:16:06.723" video="qanda-private-ai" id="subtitle"]] +[[!template text="""breaking it out. The Hacker News has a how to build a $300 AI""" start="00:16:06.723" video="qanda-private-ai" id="subtitle"]] +[[!template text="""computer. This is for March 2024, but it still has a lot of""" start="00:16:13.513" video="qanda-private-ai" id="subtitle"]] +[[!template text="""great information on how to benchmark the environments,""" start="00:16:17.199" video="qanda-private-ai" id="subtitle"]] +[[!template text="""what some values are like the Ryzen 5700U inside my Acer""" start="00:16:17.199" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Aspire, that's where I got the idea doing that. Make sure you""" start="00:16:27.674" video="qanda-private-ai" id="subtitle"]] +[[!template text="""do the RockM stuff correctly to get the GUI extensions. 
But""" start="00:16:30.399" video="qanda-private-ai" id="subtitle"]] +[[!template text="""it's just really good stuff. You don't need a great GPU or CPU""" start="00:16:34.886" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to get started. Smaller models like Tiny Llama can run on""" start="00:16:36.870" video="qanda-private-ai" id="subtitle"]] +[[!template text="""very small systems. It gets you the ability to start playing""" start="00:16:39.334" video="qanda-private-ai" id="subtitle"]] +[[!template text="""with it and start experimenting and figure out if that's for""" start="00:16:43.521" video="qanda-private-ai" id="subtitle"]] +[[!template text="""you and to move forward with it. The AMD Ryzen AI Max 395 plus""" start="00:16:43.521" video="qanda-private-ai" id="subtitle"]] +[[!template text="""is a mini PC makes it really nice dedicated host. You used to""" start="00:16:51.232" video="qanda-private-ai" id="subtitle"]] +[[!template text="""be able to buy these for about $1200 now with the RAM price""" start="00:16:58.033" video="qanda-private-ai" id="subtitle"]] +[[!template text="""increase, you want to get 120 gig when you're pushing two""" start="00:16:58.033" video="qanda-private-ai" id="subtitle"]] +[[!template text="""brands so. It gets a little tighter. Macs work remarkably""" start="00:16:58.033" video="qanda-private-ai" id="subtitle"]] +[[!template text="""well with AI. My MacBook Air was one of my go-tos for a while,""" start="00:17:07.624" video="qanda-private-ai" id="subtitle"]] +[[!template text="""but once I started doing anything AI, I had a five-minute""" start="00:17:11.010" video="qanda-private-ai" id="subtitle"]] +[[!template text="""window before the thermal throttling became an issue. Keep""" start="00:17:11.010" video="qanda-private-ai" id="subtitle"]] +[[!template text="""in mind that's a MacBook Air, so it doesn't have the greatest""" start="00:17:19.123" video="qanda-private-ai" id="subtitle"]] +[[!template text="""ventilation. If you get the MacBook Pros and stuff, they""" start="00:17:19.123" video="qanda-private-ai" id="subtitle"]] +[[!template text="""tend to have more ventilation, but still you're going to be""" start="00:17:23.130" video="qanda-private-ai" id="subtitle"]] +[[!template text="""pushing against that. So Mac Minis and the Mac Ultras and""" start="00:17:23.130" video="qanda-private-ai" id="subtitle"]] +[[!template text="""stuff like that tend to work really well for that. Alex""" start="00:17:28.418" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Ziskin on YouTube has a channel. He does a lot of AI""" start="00:17:32.525" video="qanda-private-ai" id="subtitle"]] +[[!template text="""performance benchmarking, like I load a 70 billion""" start="00:17:36.372" video="qanda-private-ai" id="subtitle"]] +[[!template text="""parameter model on this mini PC and stuff like that. It's a""" start="00:17:36.372" video="qanda-private-ai" id="subtitle"]] +[[!template text="""lot of fun and interesting stuff there. And it's""" start="00:17:42.765" video="qanda-private-ai" id="subtitle"]] +[[!template text="""influencing my decision to buy my next AI style PC. Small""" start="00:17:45.690" video="qanda-private-ai" id="subtitle"]] +[[!template text="""domain specific LLMs are happening. An LLM that has all your""" start="00:17:50.648" video="qanda-private-ai" id="subtitle"]] +[[!template text="""code and information, it sounds like a really cool idea. 
It""" start="00:17:54.197" video="qanda-private-ai" id="subtitle"]] +[[!template text="""gives you capabilities to start training stuff that you""" start="00:17:57.666" video="qanda-private-ai" id="subtitle"]] +[[!template text="""couldn't do with like the big ones. Even with in terms of fine""" start="00:17:57.666" video="qanda-private-ai" id="subtitle"]] +[[!template text="""tuning and stuff, it's remarkable to see where that space is""" start="00:18:02.598" video="qanda-private-ai" id="subtitle"]] +[[!template text="""coming along in the next year or so. Hugging Face Co has""" start="00:18:02.598" video="qanda-private-ai" id="subtitle"]] +[[!template text="""pointers to tons of AI models. You'll find the one that works""" start="00:18:08.435" video="qanda-private-ai" id="subtitle"]] +[[!template text="""for you, hopefully there. If you're doing cybersecurity,""" start="00:18:12.801" video="qanda-private-ai" id="subtitle"]] +[[!template text="""there's a whole bunch out there for that, that have certain""" start="00:18:14.464" video="qanda-private-ai" id="subtitle"]] +[[!template text="""training on it, information. It's really good. One last""" start="00:18:14.464" video="qanda-private-ai" id="subtitle"]] +[[!template text="""thing to keep in mind is hallucinations are real. You will""" start="00:18:23.497" video="qanda-private-ai" id="subtitle"]] +[[!template text="""get BS back from the AI occasionally, so do validate""" start="00:18:26.762" video="qanda-private-ai" id="subtitle"]] +[[!template text="""everything you get from it. Don't be using it for court cases""" start="00:18:26.762" video="qanda-private-ai" id="subtitle"]] +[[!template text="""like some people have and run into those problems. So, That""" start="00:18:31.930" video="qanda-private-ai" id="subtitle"]] +[[!template text="""is my talk. What I would like you to get out of that is, if you""" start="00:18:38.945" video="qanda-private-ai" id="subtitle"]] +[[!template text="""haven't tried it, give GPTEL and LlamaFile a shot. Fire up a""" start="00:18:41.527" video="qanda-private-ai" id="subtitle"]] +[[!template text="""little small AI instance, play around with a little bit""" start="00:18:48.473" video="qanda-private-ai" id="subtitle"]] +[[!template text="""inside your Emacs, and see if it makes your life better.""" start="00:18:48.473" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Hopefully it will. And I really hope you guys learned""" start="00:18:55.499" video="qanda-private-ai" id="subtitle"]] +[[!template text="""something from this talk. And thanks for listening. And the""" start="00:18:56.821" video="qanda-private-ai" id="subtitle"]] +[[!template text="""links are at the end of the talk, if you have any questions.""" start="00:19:01.905" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Let me see if we got anything you want, Pat. You do. You've got""" start="00:19:05.908" video="qanda-private-ai" id="subtitle"]] +[[!template text="""a few questions. Hey, this is Corwin. Thank you so much.""" start="00:19:08.683" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Thank you, Aaron. What an awesome talk this was, actually.""" start="00:19:13.069" video="qanda-private-ai" id="subtitle"]] +[[!template text="""If you don't have a camera, I can get away with not having one""" start="00:19:17.255" video="qanda-private-ai" id="subtitle"]] +[[!template text="""too. I've got, I'll turn the camera on. Okay. All right. 
I'll""" start="00:19:17.255" video="qanda-private-ai" id="subtitle"]] +[[!template text="""turn mine back on. Here I come. Yeah, so there are a few""" start="00:19:23.683" video="qanda-private-ai" id="subtitle"]] +[[!template text="""questions, but first let me say thank you for a really""" start="00:19:27.089" video="qanda-private-ai" id="subtitle"]] +[[!template text="""captivating talk. I think a lot of people will be empowered""" start="00:19:27.089" video="qanda-private-ai" id="subtitle"]] +[[!template text="""from this to try to do more with less, especially locally.""" start="00:19:32.887" video="qanda-private-ai" id="subtitle"]] +[[!template text="""concerned about the data center footprint,""" start="00:19:44.538" video="qanda-private-ai" id="subtitle"]] +[[!template text="""environmentally concerned about the footprint of LLM""" start="00:19:44.538" video="qanda-private-ai" id="subtitle"]] +[[!template text="""inside data centers. So just thinking about how we can put""" start="00:19:44.538" video="qanda-private-ai" id="subtitle"]] +[[!template text="""infrastructure we have at home to use and get more done with""" start="00:19:52.918" video="qanda-private-ai" id="subtitle"]] +[[!template text="""less. Yeah, the data center impact's interesting because""" start="00:19:52.918" video="qanda-private-ai" id="subtitle"]] +[[!template text="""there was a study a while ago. Someone said every time you do a""" start="00:20:01.666" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Gemini query, it's like boiling a cup of water. Yeah, I've""" start="00:20:05.871" video="qanda-private-ai" id="subtitle"]] +[[!template text="""heard that one too. So do you want to, you know, I don't know""" start="00:20:12.399" video="qanda-private-ai" id="subtitle"]] +[[!template text="""how much direction you want. I'd be very happy to read out the""" start="00:20:14.021" video="qanda-private-ai" id="subtitle"]] +[[!template text="""questions for you. Yeah, that would be great. I'm having""" start="00:20:17.625" video="qanda-private-ai" id="subtitle"]] +[[!template text="""trouble getting to that tab. Okay, I'm there, so I'll put it""" start="00:20:21.510" video="qanda-private-ai" id="subtitle"]] +[[!template text="""into our chat too, so you can follow along if you'd like. The""" start="00:20:24.093" video="qanda-private-ai" id="subtitle"]] +[[!template text="""first question was, why is the David Bowie question a good""" start="00:20:32.106" video="qanda-private-ai" id="subtitle"]] +[[!template text="""one to start with? Does it have interesting failure""" start="00:20:32.106" video="qanda-private-ai" id="subtitle"]] +[[!template text="""conditions or what made you choose that? First off, huge fan""" start="00:20:38.436" video="qanda-private-ai" id="subtitle"]] +[[!template text="""of David Bowie. 
But I came down to it really taught me a few""" start="00:20:42.706" video="qanda-private-ai" id="subtitle"]] +[[!template text="""things about how old the models work in terms of things like""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""how many kids he had, because deep seek, which is a very""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""popular Chinese model that a lot of people are using now,""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""misidentifies him having three daughters, and he has like""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""one son and one, one, I think, two sons and a daughter or""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""something like that. so there's differences on that and it""" start="00:20:46.070" video="qanda-private-ai" id="subtitle"]] +[[!template text="""just goes over there's a whole lot of stuff because his story""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""spans like 60 years so it gives a good good feedback that's""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""the real main reason I asked that question because I just""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""needed one that sea monkeys I just picked because it was""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""obscure and just always have right I used to have it right""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""hello world and forth because I thought was an interesting""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""one as well so It's just picking random ones like that. One""" start="00:21:06.955" video="qanda-private-ai" id="subtitle"]] +[[!template text="""question asked, sorry, a lot of models is, what is the""" start="00:21:30.265" video="qanda-private-ai" id="subtitle"]] +[[!template text="""closest star to the Earth? Because most of them will say""" start="00:21:30.265" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Alpha Centauri or Proxima Centauri and not the sun. And I""" start="00:21:35.911" video="qanda-private-ai" id="subtitle"]] +[[!template text="""have a whole nother talk where I just argue with the LLM""" start="00:21:40.376" video="qanda-private-ai" id="subtitle"]] +[[!template text="""trying to say, hey, the sun is a star. And he just wouldn't""" start="00:21:40.376" video="qanda-private-ai" id="subtitle"]] +[[!template text="""accept it, so.""" start="00:21:46.383" video="qanda-private-ai" id="subtitle"]] +[[!template text="""What? Oh, I can hear that. So what specific tasks do you like""" start="00:21:53.230" video="qanda-private-ai" id="subtitle"]] +[[!template text="""to use your local AI? I like to load a lot of my code into and""" start="00:21:56.956" video="qanda-private-ai" id="subtitle"]] +[[!template text="""actually have it do analysis of it. 
I was actually going""" start="00:22:01.883" video="qanda-private-ai" id="subtitle"]] +[[!template text="""through some code I have for some pen testing, and I was""" start="00:22:05.207" video="qanda-private-ai" id="subtitle"]] +[[!template text="""having it modified to update it for the newer version,""" start="00:22:05.207" video="qanda-private-ai" id="subtitle"]] +[[!template text="""because I hate to say this, but it was written for Python 2,""" start="00:22:05.207" video="qanda-private-ai" id="subtitle"]] +[[!template text="""and I needed to update it for Python 3. And the 2 to 3 tool did""" start="00:22:05.207" video="qanda-private-ai" id="subtitle"]] +[[!template text="""not do all of it, but the actual tool was able to do the""" start="00:22:17.982" video="qanda-private-ai" id="subtitle"]] +[[!template text="""refactoring. It's part of my laziness. But I use that for""" start="00:22:17.982" video="qanda-private-ai" id="subtitle"]] +[[!template text="""anything I don't want to hit the web. And that's a lot of stuff""" start="00:22:24.850" video="qanda-private-ai" id="subtitle"]] +[[!template text="""when you start thinking about if you're doing cyber""" start="00:22:27.654" video="qanda-private-ai" id="subtitle"]] +[[!template text="""security researching. and you have your white papers and""" start="00:22:27.654" video="qanda-private-ai" id="subtitle"]] +[[!template text="""stuff like that and stuff in there. I've got a lot of that""" start="00:22:31.398" video="qanda-private-ai" id="subtitle"]] +[[!template text="""loaded into RAG in one model on my OpenWebUI system. Neat.""" start="00:22:34.988" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Have you used have you used any small domain specific LLMs?""" start="00:22:44.272" video="qanda-private-ai" id="subtitle"]] +[[!template text="""What kind of tasks? If so, what kind of tasks that they""" start="00:22:50.459" video="qanda-private-ai" id="subtitle"]] +[[!template text="""specialize in? And you know, how? Not to be honest, but there""" start="00:22:51.540" video="qanda-private-ai" id="subtitle"]] +[[!template text="""are some out there like once again, for cybersecurity and""" start="00:22:58.428" video="qanda-private-ai" id="subtitle"]] +[[!template text="""stuff like that, that I really need to dig into that's on my to""" start="00:22:58.428" video="qanda-private-ai" id="subtitle"]] +[[!template text="""do list. I've got a couple weeks off at the end of the year. And""" start="00:22:58.428" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that's a big part of my plan for that.""" start="00:23:07.778" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Are the various models updated pretty regularly? Can you""" start="00:23:12.539" video="qanda-private-ai" id="subtitle"]] +[[!template text="""add your own data to the pre-built models? Yes. The models""" start="00:23:15.495" video="qanda-private-ai" id="subtitle"]] +[[!template text="""are updated pretty reasonably. You can add data to a model in""" start="00:23:20.622" video="qanda-private-ai" id="subtitle"]] +[[!template text="""a couple of different ways. You can do something called""" start="00:23:23.184" video="qanda-private-ai" id="subtitle"]] +[[!template text="""fine-tuning, which requires a really nice GPU and a lot of""" start="00:23:25.827" video="qanda-private-ai" id="subtitle"]] +[[!template text="""CPU time. Probably not going to do that. 
You can do retrieval""" start="00:23:25.827" video="qanda-private-ai" id="subtitle"]] +[[!template text="""augmentation generation, which is you load your data on top""" start="00:23:31.694" video="qanda-private-ai" id="subtitle"]] +[[!template text="""of the system and puts inside a database and you can actually""" start="00:23:31.694" video="qanda-private-ai" id="subtitle"]] +[[!template text="""scan that and stuff. I have another talk where I go through""" start="00:23:31.694" video="qanda-private-ai" id="subtitle"]] +[[!template text="""and I start asking questions about, I load the talk into the""" start="00:23:39.422" video="qanda-private-ai" id="subtitle"]] +[[!template text="""engine and I ask questions against that. I would have one""" start="00:23:39.422" video="qanda-private-ai" id="subtitle"]] +[[!template text="""more time would have done that but it comes down to how many""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""That's that's rag rag is pretty easy to do through open web UI""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""or LM studio It's a great way you just like point a folder""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""point it to a folder and it just sucks all that state into and""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""it'll hit that data first you have like helpdesk and stuff""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""and The other options there's vector databases, which is""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""like if you use PostgreSQL. It has a PG vector I can do a lot of""" start="00:23:46.390" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that stuff. I've not dug into that yet, but that is also on""" start="00:24:07.716" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that to-do list I've got a lot of stuff planned for Cool. So""" start="00:24:09.679" video="qanda-private-ai" id="subtitle"]] +[[!template text="""what are your experience with rags? I don't even know what""" start="00:24:15.279" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that means. Do you know what that means? Do you""" start="00:24:17.964" video="qanda-private-ai" id="subtitle"]] +[[!template text="""remember this question again? What is your experience with""" start="00:24:20.911" video="qanda-private-ai" id="subtitle"]] +[[!template text="""RAGS? RAGS is great. That's Retrieval Augmentation""" start="00:24:26.080" video="qanda-private-ai" id="subtitle"]] +[[!template text="""Generation. That loads your data first, and it hits yours,""" start="00:24:31.387" video="qanda-private-ai" id="subtitle"]] +[[!template text="""and it'll actually cite it and stuff. There's a guy who wrote""" start="00:24:33.911" video="qanda-private-ai" id="subtitle"]] +[[!template text="""a RAG in 100 lines of Python, and it's an impressive piece of""" start="00:24:37.656" video="qanda-private-ai" id="subtitle"]] +[[!template text="""software. I think if you hit one of my site, I've got a private""" start="00:24:37.656" video="qanda-private-ai" id="subtitle"]] +[[!template text="""AI talk where I actually refer to that. 
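Full RAG needs an external store like the ones mentioned here, but for the lighter-weight case of handing your own files to a local model from inside Emacs, recent gptel releases include context commands that cover a lot of everyday use. A hedged sketch; the command names are as documented for recent gptel versions and the file path is a placeholder, so adjust for your setup:

```elisp
;; Attach local material to every subsequent gptel request (context
;; injection rather than a vector-database RAG):
;;   M-x gptel-add       ; add the active region or current buffer
;;   M-x gptel-add-file  ; add a file from disk
;; The same thing for a single file, from Lisp (placeholder path):
(gptel-add-file "~/research/notes/local-ai-notes.org")
```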
But retrieval""" start="00:24:43.685" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""augmentation, it's easy, it's fast, it puts your data into""" start="00:24:48.411" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the system. Yeah, start with that and then iterate on top""" start="00:24:48.411" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""of that. That's one of the great things about AI, especially""" start="00:24:53.468" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""private AI, is you can do whatever you want to with it and""" start="00:24:57.454" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""build up with it as you get more experience. Any thoughts on""" start="00:24:57.454" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""running things on AWS, DigitalOcean, and so on? AWS is not""" start="00:25:06.067" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""bad. DigitalOcean, they have some of their GPUs. I still""" start="00:25:14.140" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""don't like having the data leave my house, to be honest, or at""" start="00:25:18.868" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""work, because I tend to do some stuff that I don't want it even""" start="00:25:18.868" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""hitting that situation. But they have pretty good stuff.""" start="00:25:18.868" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Another one to consider is Oracle Cloud. Oracle has their AI""" start="00:25:30.106" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""infrastructure that's really well done. But I mean, once""" start="00:25:31.829" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""again, then you start looking at the potential there: they say your""" start="00:25:35.555" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""data is private, but I don't necessarily trust it. But they do""" start="00:25:35.555" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""have good stuff, both DigitalOcean and AWS. Oracle Cloud has""" start="00:25:41.323" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the free tier, which isn't too bad, usually a certain""" start="00:25:41.323" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""amount of stuff. And Google also has it, but I still tend to""" start="00:25:41.323" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""keep more stuff on local PCs, because I'm just paranoid that""" start="00:25:48.051" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""way. Gotcha. What has your experience been using AI? Do you""" start="00:25:48.051" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""want to get into that, using AI for cybersecurity? You might""" start="00:26:02.366" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""have already touched on this. Yeah, really, for""" start="00:26:06.251" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""cybersecurity, what I've had to do is I've dumped logs to""" start="00:26:08.434" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""have it do correlation. 
Keep in mind, the size of that llamafile""" start="00:26:08.434" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""we were using for figuring out David Bowie, writing the""" start="00:26:13.702" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""hello world, all that stuff, is like six gig. How does it get""" start="00:26:13.702" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the entire world in six gig? I still haven't figured that out""" start="00:26:20.712" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""in terms of quantization. So I'm really interested in""" start="00:26:23.476" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""seeing the ability to take all this stuff out of all my logs,""" start="00:26:26.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""dump it all in there, and actually be able to do intelligent""" start="00:26:26.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""queries against that. Microsoft has a project called""" start="00:26:26.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Security Copilot, which is trying to do that in the cloud.""" start="00:26:34.572" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""But I want to work on something to do that more locally and be""" start="00:26:39.218" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""able to actually drive this stuff over that. That's also one""" start="00:26:39.218" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""of the long-term goals.""" start="00:26:44.726" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""So, we got any other questions, or...? Those are the questions""" start="00:26:50.817" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that I see. I want to just read out a couple of comments that I""" start="00:26:53.640" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""saw in IRC though. Jay Rutabaga says, it went very well from""" start="00:26:55.161" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""an audience perspective. And G Gundam says, respect your""" start="00:27:00.305" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""commitment to privacy. And then somebody is telling us we""" start="00:27:05.090" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""might have skipped a question. So I'm just going to run back""" start="00:27:10.635" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""to my list. Updated regularly... experience... I just didn't""" start="00:27:13.037" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""type in the answer here, and there's a couple more""" start="00:27:20.855" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""questions coming in. So: is there a disparity where you go to""" start="00:27:20.855" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""paid models because they are better, and for what problems? You""" start="00:27:20.855" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""know, what would drive you to them? That's a good question. Paid""" start="00:27:34.735" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""models, I don't mind them. 
I think they're good, but I don't""" start="00:27:40.687" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""think they're actually economically sustainable under""" start="00:27:42.970" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""their current system. Because right now, if you're paying""" start="00:27:42.970" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""20 bucks a month for Copilot and that goes up to 200 bucks, I'm""" start="00:27:49.200" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""not going to be as likely to use it. You know what I mean? But it""" start="00:27:49.200" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""does do some things in a way that I did not expect. For""" start="00:27:56.030" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""example, Grok was refactoring some of my code and in the""" start="00:27:59.475" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""comments dropped an F-bomb, which I did not see coming,""" start="00:27:59.475" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""but the other code before that, which I had gotten off GitHub, had F-""" start="00:28:04.966" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""bombs in it. So it was just emulating the style, but would""" start="00:28:04.966" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that be something I'd want to turn in as a pull request? I don't""" start="00:28:10.493" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""know. But, uh, there's a lot of money going into""" start="00:28:15.619" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""these AIs and stuff, but in terms of the ability to get a""" start="00:28:16.180" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""decent one, like Llama 3.2, and load your""" start="00:28:16.180" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""data into it, you can be pretty competitive. You're not""" start="00:28:16.180" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""going to get all the benefits, but you have more control over""" start="00:28:27.534" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""it. So it's this and that; it's a balancing""" start="00:28:27.534" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""act.""" start="00:28:30.598" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Okay, and I think I see a couple more questions coming in.""" start="00:28:37.315" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""What is the largest parameter size for local models that""" start="00:28:40.821" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""you've been able to successfully run locally, and do you run into""" start="00:28:40.821" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""issues with limited context window size? The top-end""" start="00:28:40.821" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""models will tend to have a larger ceiling. Yes, yes, yes,""" start="00:28:52.560" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""yes, yes. By default, the context size is, I think, 1024. 
But""" start="00:28:57.650" video="qanda-private-ai" id="subtitle"]] +[[!template text="""I've upped it to 8192 on the on this box, the Pangolin because""" start="00:29:03.657" video="qanda-private-ai" id="subtitle"]] +[[!template text="""it seems to be some reason it's just a very working quite""" start="00:29:03.657" video="qanda-private-ai" id="subtitle"]] +[[!template text="""well. But the largest ones I've loaded have been in the have""" start="00:29:03.657" video="qanda-private-ai" id="subtitle"]] +[[!template text="""not been that huge. I've loaded this the last biggest one""" start="00:29:11.966" video="qanda-private-ai" id="subtitle"]] +[[!template text="""I've done. That's the reason why I'm planning on breaking""" start="00:29:17.573" video="qanda-private-ai" id="subtitle"]] +[[!template text="""down and buying a Ryzen. Actually, I'm going to buy an Intel""" start="00:29:22.118" video="qanda-private-ai" id="subtitle"]] +[[!template text="""i285H with 96 gig of RAM. Then I should be able to load a 70""" start="00:29:26.483" video="qanda-private-ai" id="subtitle"]] +[[!template text="""billion parameter model in that. How fast will it run? It's""" start="00:29:33.150" video="qanda-private-ai" id="subtitle"]] +[[!template text="""going to run slow as dog, but it's going to be cool to be able to""" start="00:29:38.176" video="qanda-private-ai" id="subtitle"]] +[[!template text="""do it. It's an AI bragging rights thing, but I mostly stick""" start="00:29:38.176" video="qanda-private-ai" id="subtitle"]] +[[!template text="""with the smaller size models and the ones that are more""" start="00:29:41.580" video="qanda-private-ai" id="subtitle"]] +[[!template text="""quantitized because it just tends to work better for me.""" start="00:29:41.580" video="qanda-private-ai" id="subtitle"]] +[[!template text="""We've still got over 10 minutes before we're cutting away,""" start="00:29:50.975" video="qanda-private-ai" id="subtitle"]] +[[!template text="""but I'm just anticipating that we're going to be going""" start="00:29:50.975" video="qanda-private-ai" id="subtitle"]] +[[!template text="""strong at the 10 minute mark. So I'm just, just letting, you""" start="00:29:50.975" video="qanda-private-ai" id="subtitle"]] +[[!template text="""know, we can go as long as we like here at a certain point. I may""" start="00:29:59.065" video="qanda-private-ai" id="subtitle"]] +[[!template text="""have to jump away and check in with the next speaker, but""" start="00:30:03.691" video="qanda-private-ai" id="subtitle"]] +[[!template text="""we'll post the entirety of this, even if we aren't able to""" start="00:30:03.691" video="qanda-private-ai" id="subtitle"]] +[[!template text="""stay with it all. Okay. And we've got 10 minutes where we're""" start="00:30:03.691" video="qanda-private-ai" id="subtitle"]] +[[!template text="""still going to stay live. So next question coming in, I see,""" start="00:30:14.633" video="qanda-private-ai" id="subtitle"]] +[[!template text="""are there free as in freedom, free as in FSF issues with the""" start="00:30:17.859" video="qanda-private-ai" id="subtitle"]] +[[!template text="""data? Yes, where's the data coming from is a huge question""" start="00:30:17.859" video="qanda-private-ai" id="subtitle"]] +[[!template text="""with AI. It's astonishing you can ask questions to models""" start="00:30:31.778" video="qanda-private-ai" id="subtitle"]] +[[!template text="""that you don't know where it's coming from. 
That is gonna be""" start="00:30:37.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""one of the big issues long-term. There are people who are""" start="00:30:42.532" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""working on trying to figure out that stuff, but it's, I mean,""" start="00:30:46.096" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""if you look at, God, I can't remember who it was. Somebody was""" start="00:30:46.096" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""actually out torrenting books just to be able to build them into""" start="00:30:53.205" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""their AI system. I think it might've been Meta. So there's a""" start="00:30:53.205" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""lot of that going on. The open-sourcing of this stuff is going to""" start="00:30:58.272" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""be tough. There's some models, like the""" start="00:31:00.956" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""mobile guys have got their own license, but where they're""" start="00:31:04.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""getting their data from, I'm not sure on, so that's a huge""" start="00:31:04.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""question. That's a talk in itself. But yeah, if you""" start="00:31:04.240" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""train on your RAG and your data, you know where it's""" start="00:31:14.074" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""come from, you know you have a license for that, but the other stuff is""" start="00:31:14.074" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just more of a supplement if you're using a smaller""" start="00:31:14.074" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""model, but...""" start="00:31:14.074" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""The comments online, I see a couple of them. I'll read them out""" start="00:31:27.449" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""in order here. Really interesting stuff. Thank you for your""" start="00:31:32.094" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""talk. Given that large AI companies are openly stealing""" start="00:31:34.496" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""intellectual property and copyright and therefore""" start="00:31:35.617" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""eroding the authority of such laws and maybe obscuring the""" start="00:31:35.617" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""truth itself, can you see a future where IP and copyright""" start="00:31:35.617" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""law become untenable? I think that's a great question. I'm""" start="00:31:35.617" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""not a lawyer, but it is really getting complicated. 
It is""" start="00:31:55.799" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""getting to the point... I played with""" start="00:32:01.106" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Sora a little bit, and it generated someone where you can go, like,""" start="00:32:01.106" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""oh, that's Jon Hamm, that's Christopher Walken; you start""" start="00:32:01.106" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""figuring out the people they're modeling stuff after.""" start="00:32:01.106" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""There is an apocalypse or something going to happen right""" start="00:32:12.961" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""now. There is. But this is, once again, my personal opinion,""" start="00:32:12.961" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""and I'm not a lawyer, and I do not have money, so don't sue me:""" start="00:32:17.466" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the current administration is""" start="00:32:22.812" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""very pro-AI. And there's a great deal of lobbying by""" start="00:32:22.812" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""those groups. And it's on both sides. And it's going to be""" start="00:32:29.019" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""interesting to see what happens to copyright over""" start="00:32:33.423" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the next 5-10 years. I just don't know how it keeps up without""" start="00:32:33.423" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""there being some adjustments and stuff. Okay, and then""" start="00:32:37.888" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""another comment I saw, file size is not going to be a""" start="00:32:44.180" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""bottleneck. RAM is. You'll need 16 gigabytes of RAM to run""" start="00:32:44.180" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the smallest local models and 512 gigabytes of RAM to run the""" start="00:32:50.014" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""larger ones. You'll need a GPU with that much memory if you""" start="00:32:50.014" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""want it to run quickly. Yeah. Oh, no, it also depends upon how""" start="00:32:57.912" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""your memory is laid out. Like, example being the Ultra i285H I""" start="00:33:05.421" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""plan to buy: that has 96 gig of memory. It's unified;""" start="00:33:07.364" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the GPU and the CPU share it, but they go over the same bus. So""" start="00:33:14.014" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the overall bandwidth of it tends to be a bit less, but you're""" start="00:33:17.800" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""able to load more of it into memory. So it's able to do some""" start="00:33:17.800" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""additional stuff with it as opposed to coming off disk. 
It's""" start="00:33:23.729" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""all a balancing act. If you hit Zyskin's website, that guy's""" start="00:33:27.034" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""done some great work on it, trying to figure out how big a""" start="00:33:28.516" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""model you can do, what you can do with it. And some of the stuff""" start="00:33:31.801" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""seems to be not obvious, because, like, example being that""" start="00:33:34.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""MacBook Air, for the five minutes I can run the model, it runs""" start="00:33:34.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""it faster than a lot of other things that should be able to run""" start="00:33:34.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""it faster, just because of the way the ARM cores and the""" start="00:33:34.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""unified memory work on it. So it's a learning process. But if""" start="00:33:34.826" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""you want to, NetworkChuck had a great video talking about""" start="00:33:52.151" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""building his own system with a couple of really powerful""" start="00:33:52.151" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""NVIDIA cards and stuff like that in it. And just actually""" start="00:33:57.940" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""setting it up on his system as a node and using a web UI on it. So""" start="00:34:01.864" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""there's a lot of stuff there, but it is a process of learning""" start="00:34:06.009" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""how big your data is, which models you want to use, how much""" start="00:34:06.009" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""information you need, but it's part of the learning. And you""" start="00:34:06.009" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""can run models even on Raspberry Pi 5s, if you want to;""" start="00:34:15.920" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""they'll run slow, don't get me wrong, but it's possible.""" start="00:34:15.920" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Okay, and I think there's other questions coming in too, so""" start="00:34:25.497" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""I'll just vamp for another second. We've got about five""" start="00:34:25.497" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""minutes before we'll be cutting over, but I""" start="00:34:30.162" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just want to say, in case we get close for time here, how much I""" start="00:34:30.162" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""appreciate your talk. This is another one that I'm going to""" start="00:34:30.162" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""have to study after the conference. 
We greatly appreciate,""" start="00:34:40.992" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""all of us appreciate you guys putting on the conference.""" start="00:34:44.716" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""It's a great conference. It's well done. It's an honor to be""" start="00:34:48.279" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""on the stage with the brains of the project, which is you. So""" start="00:34:52.328" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""what else we got, question-wise? Okay, so just scanning""" start="00:34:59.440" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""here.""" start="00:35:01.785" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Have you used local models capable of tool calling? I'm""" start="00:35:13.746" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""scared of agentic. I'm going to be a slow adopter of""" start="00:35:18.502" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""that. I want to do it, but I just don't have the, uh, intestinal""" start="00:35:21.005" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""fortitude right now to do it. I've had it give me""" start="00:35:25.490" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""the commands, but I still run the commands by hand. I'm""" start="00:35:30.897" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""looking into it and, once again, it's on that list, but""" start="00:35:34.922" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""I just, that's a big step for me. So.""" start="00:35:34.922" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Awesome. All right. Well, let me just scroll""" start="00:35:46.953" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""through because we might have missed one question.""" start="00:35:49.764" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Oh, I see. Here was the piggyback question. Now I see the""" start="00:36:00.908" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""question that I missed. So this was piggybacking on the""" start="00:36:03.033" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""question about model updates and adding data. And will""" start="00:36:04.997" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""models reach out to the web if they need more info? Or have you""" start="00:36:11.372" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""worked with any models that work that way? No, I've not seen""" start="00:36:16.524" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""any models that do that. There was, like, a group working""" start="00:36:18.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""on something like a package updater that would do""" start="00:36:18.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""diffs on it, but models change so much, even with""" start="00:36:18.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""minor changes and fine-tuning, that it's hard just to update""" start="00:36:18.408" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""them in place. So I haven't seen one, but that doesn't mean""" start="00:36:31.983" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""they're not out there. 
It's a curious topic, though.""" start="00:36:31.983" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Awesome. Well, it's probably pretty good timing. Let me just""" start="00:36:37.249" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""scroll and make sure. And of course, before I can say that,""" start="00:36:45.728" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""there's one more question. So let's go ahead and have that. I""" start="00:36:47.049" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""want to make sure while we're still live, though, I give you a""" start="00:36:51.656" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""chance to offer any closing thoughts. So what scares you""" start="00:36:51.656" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""most about the agentic tools? How would you think about""" start="00:36:57.503" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""putting a sandbox around that if you did adopt an agentic""" start="00:37:01.889" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""workflow? That is a great question. In terms of that, I would""" start="00:37:01.889" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just control what it's able to talk to, what machines; I""" start="00:37:09.670" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""would actually have it be air-gapped. I work for a defense""" start="00:37:09.670" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""contractor, and we spend a lot of time dealing with air-gapped""" start="00:37:16.320" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""systems, because that's just kind of the way it works out for""" start="00:37:16.320" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""us. So agentic, it's just going to take a while to get trust. I""" start="00:37:16.320" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""want to see more stuff happening. Humans screw up""" start="00:37:25.594" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""stuff enough. The last thing we need is to multiply that by""" start="00:37:27.757" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""1000. So in terms of that, I would be restricting what it can""" start="00:37:28.919" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""do. If you look at the capabilities, if I created a user and""" start="00:37:31.443" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""gave it permissions, I would lock down through sudo""" start="00:37:35.870" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""what it's able to do, what the account's able to do. I would do""" start="00:37:35.870" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""those kinds of things, but it's happening.""" start="00:37:43.863" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""It's just, I'm going to be one of the laggards on that one. So,""" start="00:37:47.068" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""air gap, jail, extremely locked-down environments, like""" start="00:37:49.171" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""we're talking about separate physical machines, not Docker. Yeah,""" start="00:37:49.171" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""hopefully. Right, fair. 
So tool calling can be read-only,""" start="00:37:59.152" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""such as giving models the ability to search the web before""" start="00:38:04.060" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""answering your question, or, you know, write access, execute""" start="00:38:04.060" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""access. I'm interested to know if local models are any good""" start="00:38:04.060" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""at that. Yes, local models can do a lot of that stuff. It's in""" start="00:38:12.253" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""their capabilities. If you load LM Studio, you can do a lot of""" start="00:38:21.052" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""wonderful stuff with that, or with Open WebUI with Ollama.""" start="00:38:22.473" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""It's a lot of capabilities. It's amazing. Open WebUI is""" start="00:38:28.561" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""actually what a lot of companies are using now to put their""" start="00:38:31.625" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""data behind that, their curated data and stuff like that.""" start="00:38:31.625" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""So it works well. I can confirm that from my own professional""" start="00:38:37.893" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""experience. Excellent. Okay, well, our timing should be""" start="00:38:38.894" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just perfect if you want to give us like a 30-second,""" start="00:38:42.976" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""45-second wrap-up. Aaron, let me squeeze in mine. Thank you""" start="00:38:42.976" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""again so much for preparing this talk and for entertaining""" start="00:38:50.320" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""all of our questions. Yeah, let me just thank you guys for the""" start="00:38:50.320" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""conference again. This is a great one. I've enjoyed a lot of""" start="00:38:56.527" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""it. I've only hit a couple of talks so far, but I'm looking""" start="00:39:00.232" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""forward to hitting the ones after this and tomorrow. But the""" start="00:39:01.333" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""AI stuff is coming. Get on board. Definitely recommend it.""" start="00:39:06.681" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""If you want to just try it out and get a little taste of it,""" start="00:39:12.028" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""my minimal viable product with just LlamaFile and GPTEL""" start="00:39:12.028" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""will get you to the point where you start figuring it out. GPTEL""" start="00:39:12.028" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""is an amazing thing. It just gets out of your way, and it works""" start="00:39:19.057" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""so well with Emacs. 
It's designed so it doesn't take your""" start="00:39:20.639" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""hands off the keyboard. It's just another buffer and you""" start="00:39:24.725" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""just put information in there. It's quite a wonderful""" start="00:39:27.789" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""time, let's put it that way. That's all I've got.""" start="00:39:30.193" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""Thank you so much once again, and we're just about to cut""" start="00:39:33.057" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""away. So I'll stop the recording and you're on your own""" start="00:39:33.057" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""recognizance. Well, I'm gonna punch out. If anybody has any""" start="00:39:40.447" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""questions or anything, my email address is ajgrothe at""" start="00:39:40.447" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""yahoo.com or at gmail.com. Thank you all for attending, and""" start="00:39:40.447" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""thanks again for the conference. Okay, I'm gonna go ahead and""" start="00:39:40.447" video="qanda-private-ai" id="subtitle"]]
+[[!template text="""end the room there, thank you. Excellent, thanks, bye.""" start="00:39:55.994" video="qanda-private-ai" id="subtitle"]]
+
 </div>Questions or comments? Please e-mail [ajgrothe@yahoo.com](mailto:ajgrothe@yahoo.com?subject=Comment%20for%20EmacsConf%202023%20private-ai%3A%20Emacs%20and%20private%20AI%3A%20a%20great%20match) |
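
For anyone who wants to try the minimal viable product mentioned in the wrap-up (llamafile plus gptel), here is a rough sketch of the wiring. It is only an illustration, not the speaker's exact configuration: the model file name, the port, and the context-size flag below are assumptions to check against the llamafile and gptel documentation.

```elisp
;; Rough sketch only -- not the speaker's exact setup.
;; 1. Start a llamafile in server mode first, for example:
;;      sh ./Llama-3.2-3B-Instruct.llamafile --server -c 8192
;;    (the file name is illustrative; -c raises the context window, as
;;    discussed in the Q&A, and 8080 is assumed as the default port).
;; 2. Point gptel at that local, OpenAI-compatible endpoint:
(require 'gptel)

(setq gptel-backend
      (gptel-make-openai "llamafile"   ; label shown in gptel's menus
        :stream t
        :protocol "http"
        :host "localhost:8080"         ; assumed llamafile server address
        :models '(local))              ; placeholder model name
      gptel-model 'local)

;; Usage: M-x gptel opens a chat buffer, or select a region in any
;; buffer and M-x gptel-send to query the local model without your
;; hands leaving the keyboard.
```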
