authorEmacsConf <emacsconf-org@gnu.org>2023-12-02 10:55:17 -0500
committerEmacsConf <emacsconf-org@gnu.org>2023-12-02 10:55:17 -0500
commitd8e7de1c0e25eb03bd24dbce5f0bedfc13f8ae70 (patch)
treef5caabb37ecf0e41d95666ddbf8009153b0dde8c
parentf95b11a19c9adcdf488e87515816eef3002a4dd5 (diff)
Automated commit
-rw-r--r--2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt41
-rw-r--r--2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt1377
-rw-r--r--2023/info/llm-after.md458
-rw-r--r--2023/info/llm-before.md17
4 files changed, 1892 insertions, 1 deletions
diff --git a/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt b/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt
new file mode 100644
index 00000000..858d0fdb
--- /dev/null
+++ b/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt
@@ -0,0 +1,41 @@
+WEBVTT
+
+
+00:00:00.000 --> 00:00:25.079
+Intro to the Talk
+
+00:00:25.080 --> 00:01:56.359
+What are LLMs?
+
+00:01:56.360 --> 00:03:32.239
+Power of LLMs (Magit Demo)
+
+00:03:32.240 --> 00:05:20.119
+Drawbacks of LLMs (Regex Demo)
+
+00:05:20.120 --> 00:07:32.799
+Embeddings
+
+00:07:32.800 --> 00:08:48.479
+Image Generation
+
+00:08:48.480 --> 00:11:05.679
+Fine-tuning
+
+00:11:08.160 --> 00:12:02.519
+Open Source
+
+00:12:02.840 --> 00:14:04.159
+The Future
+
+00:14:08.200 --> 00:18:14.439
+LLMs in Emacs - existing packages
+
+00:18:15.960 --> 00:19:04.079
+Abstracting LLM challenges
+
+00:19:04.080 --> 00:20:01.599
+Emacs is the ideal interface for LLMs
+
+00:20:01.960 --> 00:20:26.160
+Outro
diff --git a/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt b/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt
new file mode 100644
index 00000000..3ac4b34c
--- /dev/null
+++ b/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt
@@ -0,0 +1,1377 @@
+WEBVTT captioned by bala, checked by sachac
+
+NOTE Intro to the Talk
+
+00:00:00.000 --> 00:00:04.159
+Hello, I'm Andrew Hyatt and I'm going to talk to you
+
+00:00:04.160 --> 00:00:06.439
+about large language models and how
+
+00:00:06.440 --> 00:00:11.079
+they relate to Emacs.
+
+00:00:11.080 --> 00:00:14.919
+And I'm going to talk to you about the technology
+
+00:00:14.920 --> 00:00:18.279
+and how we're going to use it in Emacs.
+
+00:00:18.280 --> 00:00:21.159
+There'll be demos and there'll be talks about,
+
+00:00:21.160 --> 00:00:22.879
+I'll finish up by kind of talking about where
+
+00:00:22.880 --> 00:00:25.079
+I think this should go in the future.
+
+NOTE What are LLMs?
+
+00:00:25.080 --> 00:00:28.239
+So to start off with, let's just talk like,
+
+00:00:28.240 --> 00:00:29.759
+I just want to make sure everyone's on the same page.
+
+00:00:29.760 --> 00:00:30.919
+What are large language models?
+
+00:00:30.920 --> 00:00:34.639
+Not everyone may be caught up on this.
+
+00:00:34.640 --> 00:00:38.999
+Large language models are a way... Basically,
+
+00:00:39.000 --> 00:00:42.999
+the current versions of large language models
+
+00:00:43.000 --> 00:00:44.479
+are all based on the similar architecture
+
+00:00:44.480 --> 00:00:45.279
+called the transformer.
+
+00:00:45.280 --> 00:00:48.719
+It's just an efficient way to train and produce output.
+
+00:00:48.720 --> 00:00:51.919
+So these things are basically models
+
+00:00:51.920 --> 00:00:58.079
+that predict the next word or something like that.
+
+00:00:58.080 --> 00:01:02.119
+And they're trained on an enormous corpus of information
+
+00:01:02.120 --> 00:01:04.319
+and they get extremely good
+
+00:01:04.320 --> 00:01:06.079
+at predicting the next word.
+
+00:01:06.080 --> 00:01:09.679
+And from that basic ability, you can train
+
+00:01:09.680 --> 00:01:12.439
+through further tuning from human input,
+
+00:01:12.440 --> 00:01:13.959
+human ratings and things like that.
+
+00:01:13.960 --> 00:01:17.479
+You can train different models based on that
+
+00:01:17.480 --> 00:01:18.759
+that will do question answering.
+
+00:01:18.760 --> 00:01:22.519
+And this is how basically ChatGPT works.
+
+00:01:22.520 --> 00:01:25.599
+There's a base LLM, like GPT.
+
+00:01:25.600 --> 00:01:27.799
+And then you have a chat version of that,
+
+00:01:27.800 --> 00:01:29.959
+which is just trained to just... You give
+
+00:01:29.960 --> 00:01:32.199
+it a prompt, like what do you want it to do?
+
+00:01:32.200 --> 00:01:37.279
+And it gives you an output that does what you told it to do,
+
+00:01:37.280 --> 00:01:39.919
+or at least attempts to do it.
+
+00:01:39.920 --> 00:01:42.079
+Those are the power of large language models is
+
+00:01:42.080 --> 00:01:45.639
+they're extremely, extremely impressive.
+
+00:01:45.640 --> 00:01:47.199
+Certainly this is, in AI,
+
+00:01:47.200 --> 00:01:49.079
+this has been the biggest thing to happen
+
+00:01:49.080 --> 00:01:51.559
+probably in my lifetime,
+
+00:01:51.560 --> 00:01:56.359
+or at least my lifetime as my working lifetime.
+
+NOTE Power of LLMs (Magit Demo)
+
+00:01:56.360 --> 00:02:02.559
+So let me give you a demonstration of
+
+00:02:02.560 --> 00:02:06.679
+what kinds of stuff it could do in Emacs.
+
+00:02:06.680 --> 00:02:09.039
+So here I have an Emacs file.
+
+00:02:09.040 --> 00:02:12.479
+So this is my Emacs init file.
+
+00:02:12.480 --> 00:02:13.599
+I have a change.
+
+00:02:13.600 --> 00:02:16.879
+Let's commit that change.
+
+00:02:16.880 --> 00:02:19.439
+And, you know, I don't like writing commit messages,
+
+00:02:19.440 --> 00:02:23.039
+so I can generate it.
+
+00:02:23.040 --> 00:02:27.479
+And it did an actually just looking.
+
+00:02:27.480 --> 00:02:29.759
+So all it does is it's looking, it's just reading the diff.
+
+00:02:29.760 --> 00:02:32.479
+I'm just feeding it the diff with some instructions.
+
+00:02:32.480 --> 00:02:37.759
+And is this an incredible commit message?
+
+00:02:37.760 --> 00:02:39.399
+It's not bad, actually.
+
+00:02:39.400 --> 00:02:42.319
+You can see that it actually has really extracted
+
+00:02:42.320 --> 00:02:46.439
+the meaning of what I'm doing and has written
+
+00:02:46.440 --> 00:02:48.879
+a reasonably good commit message.
+
+00:02:48.880 --> 00:02:53.159
+Now I have to edit it because this is not quite correct.
+
+00:02:53.160 --> 00:02:55.159
+But it's kind of impressive how good it is.
+
+00:02:55.160 --> 00:03:00.039
+And my editing, it's kind of easier for me to edit this
+
+00:03:00.040 --> 00:03:01.879
+than just to write a new one.
+
+00:03:01.880 --> 00:03:04.479
+And quite often it's good enough to just submit as is.
+
+00:03:04.480 --> 00:03:08.119
+So this is kind of, you know, you could say
+
+00:03:08.120 --> 00:03:09.359
+this is just commit messages.
+
+00:03:09.360 --> 00:03:10.719
+You could respond to emails.
+
+00:03:10.720 --> 00:03:15.319
+You could, you know, using your own custom instructions
+
+00:03:15.320 --> 00:03:17.839
+about what you want your email to say.
+
+00:03:17.840 --> 00:03:19.039
+It'll write the email for you.
+
+00:03:19.040 --> 00:03:19.839
+It could do like this
+
+00:03:19.840 --> 00:03:22.519
+Emacs is a way to interact with buffers.
+
+00:03:22.520 --> 00:03:24.199
+This could basically just output text.
+
+00:03:24.200 --> 00:03:27.759
+So it's super useful for
+
+00:03:27.760 --> 00:03:30.319
+understanding something and outputting text based on that,
+
+00:03:30.320 --> 00:03:32.239
+which is just useful for Emacs.
+
+NOTE Drawbacks of LLMs (Regex Demo)
+
+00:03:32.240 --> 00:03:39.919
+So the drawback is, yeah, it's good,
+
+00:03:39.920 --> 00:03:43.359
+but it's not that reliable.
+
+00:03:43.360 --> 00:03:45.679
+And you'd think it's very easy to get caught up in like,
+
+00:03:45.680 --> 00:03:47.639
+oh my gosh, like this is so powerful.
+
+00:03:47.640 --> 00:03:50.599
+I bet it could work this, whatever idea could work.
+
+00:03:50.600 --> 00:03:52.919
+And these ideas, like they almost can.
+
+00:03:52.920 --> 00:03:55.639
+For example, I was thinking, you know what I could do?
+
+00:03:55.640 --> 00:03:57.239
+I don't like writing regexes.
+
+00:03:57.240 --> 00:04:01.199
+Why can't I have a regex replace that's powered by LLMs?
+
+00:04:01.200 --> 00:04:03.439
+And that way I could give just an instruction
+
+00:04:03.440 --> 00:04:07.399
+to regex replace.
+
+00:04:07.400 --> 00:04:12.079
+And so for example, I could do Emacs LLM regex replace.
+
+00:04:12.080 --> 00:04:12.879
+This is not checked in anywhere.
+
+00:04:12.880 --> 00:04:17.199
+These are just my own kind of private functions.
+
+00:04:17.200 --> 00:04:19.239
+My description lowercase all the org headings.
+
+00:04:19.240 --> 00:04:20.439
+Let's see if it works.
+
+00:04:20.440 --> 00:04:21.039
+It might work.
+
+00:04:21.040 --> 00:04:22.959
+No, it doesn't work.
+
+00:04:22.960 --> 00:04:26.159
+So if I, I'm not going to bother to show you
+
+00:04:26.160 --> 00:04:28.159
+what it actually came up with, but it's something,
+
+00:04:28.160 --> 00:04:29.879
+if you looked at it, it'd be like, wow,
+
+00:04:29.880 --> 00:04:31.639
+this is very close to being...
+
+00:04:31.640 --> 00:04:34.239
+It looks like it should work, but it doesn't.
+
+00:04:34.240 --> 00:04:35.839
+Okay.
+
+00:04:35.840 --> 00:04:38.719
+It's not quite good enough to get it right.
+
+00:04:38.720 --> 00:04:41.599
+And it's possible that perhaps by giving it
+
+00:04:41.600 --> 00:04:43.639
+a few examples of, or explaining more
+
+00:04:43.640 --> 00:04:46.439
+what makes Emacs regexes different.
+
+00:04:46.440 --> 00:04:47.959
+It could do a better job
+
+00:04:47.960 --> 00:04:49.279
+and maybe could solve these problems,
+
+00:04:49.280 --> 00:04:50.679
+but it's always a little bit random.
+
+00:04:50.680 --> 00:04:52.359
+You're never quite sure what you're going to get.
+
+00:04:52.360 --> 00:04:54.839
+So this is the drawback.
+
+00:04:54.840 --> 00:04:58.479
+Like there's a lot of things that look like you could do it,
+
+00:04:58.480 --> 00:05:00.999
+but when it actually comes down to trying it,
+
+00:05:01.000 --> 00:05:03.399
+it's surprisingly hard.
+
+00:05:03.400 --> 00:05:06.319
+And, you know, and whatever you're doing,
+
+00:05:06.320 --> 00:05:08.999
+it's surprisingly hard to get something
+
+00:05:09.000 --> 00:05:13.879
+that is repeatably, that's, that is always good.
+
+00:05:13.880 --> 00:05:20.119
+So yeah, that's currently the problem.
+
+NOTE Embeddings
+
+00:05:20.120 --> 00:05:23.399
+So I want to talk about embeddings.
+
+00:05:23.400 --> 00:05:26.919
+They're another thing that LLMs offer
+
+00:05:26.920 --> 00:05:28.599
+and that are extremely useful.
+
+00:05:28.600 --> 00:05:33.119
+They are, what they do is they encode from
+
+00:05:33.120 --> 00:05:38.959
+an input text that could be a word, a sentence,
+
+00:05:38.960 --> 00:05:42.159
+a small document.
+
+00:05:42.160 --> 00:05:45.399
+It encodes a vector about what the meaning,
+
+00:05:45.400 --> 00:05:46.919
+the semantic meaning of that is.
+
+00:05:46.920 --> 00:05:51.079
+That means you could, something that is,
+
+00:05:51.080 --> 00:05:52.279
+uses completely different words,
+
+00:05:52.280 --> 00:05:54.159
+but is basically talking about the same thing,
+
+00:05:54.160 --> 00:05:57.839
+perhaps in a different language, should be pretty close
+
+00:05:57.840 --> 00:06:01.999
+as a vector to the other vector.
+
+00:06:02.000 --> 00:06:05.399
+You know, as long as they're similarly semantic things,
+
+00:06:05.400 --> 00:06:12.239
+like the words
+
+00:06:12.240 --> 00:06:18.959
+highway and Camino are two different words.
+
+00:06:18.960 --> 00:06:19.639
+They mean the same thing.
+
+00:06:19.640 --> 00:06:21.319
+They should have very similar embeddings.
+
+00:06:21.320 --> 00:06:25.119
+So it is a way to kind of encode this
+
+00:06:25.120 --> 00:06:26.199
+and then you could use this for search.
+
+00:06:26.200 --> 00:06:28.919
+For example, I haven't tried to do this yet,
+
+00:06:28.920 --> 00:06:31.479
+but you could probably just make an embedding
+
+00:06:31.480 --> 00:06:33.919
+for every paragraph in the Emacs manual
+
+00:06:33.920 --> 00:06:36.239
+and the Elisp manual.
+
+00:06:36.240 --> 00:06:39.439
+And then, and then there's a very standard technique.
+
+00:06:39.440 --> 00:06:43.439
+You just... You find that you have a query,
+
+00:06:43.440 --> 00:06:45.799
+oh, how do I do whatever, whatever in Emacs again?
+
+00:06:45.800 --> 00:06:49.479
+And you could, you just find that 20 things
+
+00:06:49.480 --> 00:06:50.319
+that are closest to whatever you're
+
+00:06:50.320 --> 00:06:51.839
+trying to... the embedding of your query.
+
+00:06:51.840 --> 00:06:55.279
+You send those things to the LLM, as you know,
+
+00:06:55.280 --> 00:06:57.799
+with the original query,
+
+00:06:57.800 --> 00:06:59.919
+and you're basically telling the--asking the LLM,
+
+00:06:59.920 --> 00:07:01.279
+look, the user is trying to do this.
+
+00:07:01.280 --> 00:07:03.039
+Here's what I found in the Emacs manual.
+
+00:07:03.040 --> 00:07:04.639
+or in the Elisp manual.
+
+00:07:04.640 --> 00:07:07.439
+That's close to what they're trying to do.
+
+00:07:07.440 --> 00:07:12.159
+So can you kind of just tell the user what to do?
+
+00:07:12.160 --> 00:07:14.479
+And from this, and you could say,
+
+00:07:14.480 --> 00:07:17.639
+just use things from this, you know, that I give you.
+
+00:07:17.640 --> 00:07:20.679
+Don't just make up your own idea.
+
+00:07:20.680 --> 00:07:21.839
+You know, don't use your own ideas,
+
+00:07:21.840 --> 00:07:23.799
+because sometimes it likes to do that
+
+00:07:23.800 --> 00:07:24.359
+and those things are wrong.
+
+00:07:24.360 --> 00:07:26.719
+So you could try to, you know, do this and you get,
+
+00:07:26.720 --> 00:07:28.719
+you could get quite good results using this.
+
+00:07:28.720 --> 00:07:29.999
+So no one has done this yet,
+
+00:07:30.000 --> 00:07:32.799
+but that should not be hard to do.
+
+NOTE Image Generation
+
+00:07:32.800 --> 00:07:34.879
+Image generation is something that's, you know,
+
+00:07:34.880 --> 00:07:38.479
+it's not quite an LLM in the sense of...
+
+00:07:38.480 --> 00:07:43.079
+These are... It's a different technology,
+
+00:07:43.080 --> 00:07:48.439
+but these things are kind of packaged together
+
+00:07:48.440 --> 00:07:49.039
+in a sense.
+
+00:07:49.040 --> 00:07:51.639
+And you'll see that when I talk about Emacs packages,
+
+00:07:51.640 --> 00:07:54.279
+a lot of them bundle image generation
+
+00:07:54.280 --> 00:07:55.439
+and large language models.
+
+00:07:55.440 --> 00:07:59.039
+You know, the APIs are often bundled together by providers.
+
+00:07:59.040 --> 00:08:02.679
+And the general idea is it's kind of similar
+
+00:08:02.680 --> 00:08:04.399
+because it's very similar to large, you know,
+
+00:08:04.400 --> 00:08:06.559
+doing a chat thing where you, you know,
+
+00:08:06.560 --> 00:08:09.760
+the chat is like, you give it a text request,
+
+00:08:09.761 --> 00:08:12.759
+like write me a sonnet about, you know,
+
+00:08:12.760 --> 00:08:14.879
+the battle between Emacs and vi.
+
+00:08:14.880 --> 00:08:15.839
+And it could, it could do it.
+
+00:08:15.840 --> 00:08:17.159
+It could do a very good job of that.
+
+00:08:17.160 --> 00:08:22.519
+But you could also say, you know,
+
+00:08:22.520 --> 00:08:27.599
+draw me a picture of Emacs and vi as boxers,
+
+00:08:27.600 --> 00:08:30.359
+as a character-character boxing in a ring,
+
+00:08:30.360 --> 00:08:32.239
+like a, you know, political cartoon style.
+
+00:08:32.240 --> 00:08:34.999
+And it can do that as well.
+
+00:08:35.000 --> 00:08:37.679
+And so you could basically think of this
+
+00:08:37.680 --> 00:08:39.439
+as just sort of... it's kind of the
+
+00:08:39.440 --> 00:08:42.399
+same thing with what you're doing
+
+00:08:42.400 --> 00:08:43.359
+with large language models,
+
+00:08:43.360 --> 00:08:44.799
+but instead of outputting a text,
+
+00:08:44.800 --> 00:08:48.479
+you're outputting a picture.
+
+NOTE Fine-tuning
+
+00:08:48.480 --> 00:08:51.079
+There's also, I want to mention the concept of fine-tuning.
+
+00:08:51.080 --> 00:08:55.199
+Fine-tuning is a way to take your--
+
+00:08:55.200 --> 00:08:59.759
+take a corpus of inputs and outputs and just from
+
+00:08:59.760 --> 00:09:01.599
+a large language model, you're like, okay,
+
+00:09:01.600 --> 00:09:03.599
+given this base large language model,
+
+00:09:03.600 --> 00:09:06.679
+I want to make sure that when I give you input,
+
+00:09:06.680 --> 00:09:08.479
+you give me something like output.
+
+00:09:08.480 --> 00:09:10.119
+And this is what I'm just going to
+
+00:09:10.120 --> 00:09:11.799
+train you further on these,
+
+00:09:11.800 --> 00:09:14.879
+these mappings between input and output.
+
+00:09:14.880 --> 00:09:16.399
+And for example, you could do this. Like,
+
+00:09:16.400 --> 00:09:18.039
+let's say you wanted to fix that regex demo
+
+00:09:18.040 --> 00:09:20.999
+I had to make it good.
+
+00:09:21.000 --> 00:09:23.479
+I don't think it, I think it'd be
+
+00:09:23.480 --> 00:09:25.039
+relatively effective to train,
+
+00:09:25.040 --> 00:09:27.039
+to have regex descriptions
+
+00:09:27.040 --> 00:09:30.119
+and regex examples, Emacs regex examples
+
+00:09:30.120 --> 00:09:31.239
+as inputs and outputs.
+
+00:09:31.240 --> 00:09:33.999
+You could get, you know, maybe a hundred,
+
+00:09:34.000 --> 00:09:35.359
+a few hundreds of these things.
+
+00:09:35.360 --> 00:09:38.639
+You could train it.
+
+00:09:38.640 --> 00:09:40.759
+I think that is a reasonable way to,
+
+00:09:40.760 --> 00:09:43.879
+let's just say, I don't know how well it would work,
+
+00:09:43.880 --> 00:09:46.839
+but these things definitely work some of the time
+
+00:09:46.840 --> 00:09:47.999
+and produce pretty good results.
+
+00:09:48.000 --> 00:09:53.039
+And you could do this on your own machine.
+
+00:09:53.040 --> 00:09:58.999
+Corporations like OpenAI offer APIs with, you know,
+
+00:09:59.000 --> 00:10:01.519
+to build your fine tunes on top of OpenAI.
+
+00:10:01.520 --> 00:10:04.159
+And I think, I'm not a hundred percent sure,
+
+00:10:04.160 --> 00:10:05.719
+but I think then you can share your model
+
+00:10:05.720 --> 00:10:06.519
+with other people.
+
+00:10:06.520 --> 00:10:08.519
+But if not, then you just, you know,
+
+00:10:08.520 --> 00:10:10.839
+you could use your model for your own specialized purposes.
+
+00:10:10.840 --> 00:10:14.039
+But in the world of models that you could run,
+
+00:10:14.040 --> 00:10:16.874
+for example, based on Llama, which is like...
+
+00:10:16.875 --> 00:10:22.240
+Llama is this model you can run on your own machine from Meta.
+
+00:10:23.580 --> 00:10:26.880
+There's many fine-tuned models that you could download
+
+00:10:26.881 --> 00:10:28.960
+and you could run on your own.
+
+00:10:28.961 --> 00:10:30.839
+They can do very different things too.
+
+00:10:30.840 --> 00:10:33.399
+Some output Python programs, for example,
+
+00:10:33.400 --> 00:10:34.279
+that you could just run.
+
+00:10:34.280 --> 00:10:37.959
+So you just say...
+
+00:10:37.960 --> 00:10:40.639
+Tell me how old... Let's just say
+
+00:10:40.640 --> 00:10:41.999
+you have a random task, like
+
+00:10:42.000 --> 00:10:48.119
+tell me how old these five cities are in minutes,
+
+00:10:48.120 --> 00:10:49.799
+based on historical evidence.
+
+00:10:49.800 --> 00:10:53.639
+It's kind of a weird query, but it probably can figure,
+
+00:10:53.640 --> 00:10:55.119
+it could probably run that for you.
+
+00:10:55.120 --> 00:10:57.239
+It'll encode its knowledge into whatever
+
+00:10:57.240 --> 00:10:59.599
+the Python program, then use the Python program
+
+00:10:59.600 --> 00:11:01.039
+to do the correct calculations.
+
+00:11:01.040 --> 00:11:05.679
+So pretty, pretty useful stuff.
+
+NOTE Open Source
+
+00:11:08.160 --> 00:11:10.399
+So I also want to mention open source
+
+00:11:10.400 --> 00:11:12.679
+and basically free software here.
+
+00:11:12.680 --> 00:11:17.599
+These LLMs are mostly not free software.
+
+00:11:17.600 --> 00:11:19.159
+They're sometimes open source,
+
+00:11:19.160 --> 00:11:21.959
+but they're generally not free
+
+00:11:21.960 --> 00:11:23.799
+without restrictions to use.
+
+00:11:23.800 --> 00:11:27.279
+Most of these things, even Llama,
+
+00:11:27.280 --> 00:11:28.679
+which you can use on your own machine,
+
+00:11:28.680 --> 00:11:31.439
+have restrictions that you cannot use it
+
+00:11:31.440 --> 00:11:32.519
+to train your own model.
+
+00:11:32.520 --> 00:11:35.119
+This is something that, you know,
+
+00:11:35.120 --> 00:11:37.519
+it costs millions and millions of dollars
+
+00:11:37.520 --> 00:11:40.759
+to train and produce these models.
+
+00:11:40.760 --> 00:11:42.319
+And that's just computation costs.
+
+00:11:42.320 --> 00:11:45.519
+They do not want you
+
+00:11:45.520 --> 00:11:47.839
+stealing all that work by training your own models
+
+00:11:47.840 --> 00:11:48.799
+based on their output.
+
+00:11:48.800 --> 00:11:55.359
+But there are research LLMs that do, I believe,
+
+00:11:55.360 --> 00:11:57.999
+conform to free software principles.
+
+00:11:58.000 --> 00:11:59.519
+They're just not as good yet.
+
+00:11:59.520 --> 00:12:02.519
+And I think that might change in the future.
+
+NOTE The Future
+
+00:12:02.840 --> 00:12:04.119
+So speaking of the future,
+
+00:12:04.120 --> 00:12:07.519
+one of the things I'd like to point out
+
+00:12:07.520 --> 00:12:09.639
+is that like the demos I showed you are based on,
+
+00:12:09.640 --> 00:12:13.519
+I'm using OpenAI 3.5 model.
+
+00:12:13.520 --> 00:12:16.439
+That's more than, well, no,
+
+00:12:16.440 --> 00:12:18.199
+it's like a year old basically at this point.
+
+00:12:18.200 --> 00:12:21.079
+And things are moving fast.
+
+00:12:21.080 --> 00:12:22.039
+They came out with 4.0.
+
+00:12:22.040 --> 00:12:23.319
+4.0 is significantly better.
+
+00:12:23.320 --> 00:12:24.319
+I don't have access to it.
+
+00:12:24.320 --> 00:12:30.839
+Even though I'm using the API and I'm paying money for it,
+
+00:12:30.840 --> 00:12:33.639
+you only can get access to 4.0
+
+00:12:33.640 --> 00:12:34.439
+if you can spend a dollar.
+
+00:12:34.440 --> 00:12:36.319
+And I've never been able to spend,
+
+00:12:36.320 --> 00:12:38.199
+use so much API use that I've spent a dollar.
+
+00:12:38.200 --> 00:12:44.479
+So I have, I don't have 4.0, but I've tried it
+
+00:12:44.480 --> 00:12:46.639
+because I do pay for this
+
+00:12:46.640 --> 00:12:48.340
+so I could get access to 4.0
+
+00:12:48.341 --> 00:12:49.599
+and it is substantially better.
+
+00:12:49.600 --> 00:12:50.519
+By all reports, it's,
+
+00:12:50.520 --> 00:12:53.839
+the difference is extremely significant.
+
+00:12:53.840 --> 00:12:55.159
+I would not be surprised
+
+00:12:55.160 --> 00:12:59.759
+if some of the limitations and drawbacks I described
+
+00:12:59.760 --> 00:13:02.039
+mostly went away with 4.0.
+
+00:13:02.040 --> 00:13:06.679
+We're probably at a stage
+
+00:13:06.680 --> 00:13:09.239
+where regexes will work maybe 5% of the time
+
+00:13:09.240 --> 00:13:10.119
+if you try them.
+
+00:13:10.120 --> 00:13:13.639
+But with 4.0, it could work like 80% of the time.
+
+00:13:13.640 --> 00:13:14.559
+Now, is that good enough?
+
+00:13:14.560 --> 00:13:17.279
+Probably not, but it's a,
+
+00:13:17.280 --> 00:13:20.319
+I wouldn't be surprised if you got results like that.
+
+00:13:20.320 --> 00:13:22.919
+And in a year's time, in two years time,
+
+00:13:22.920 --> 00:13:26.679
+no one knows how much this is going to play out
+
+00:13:26.680 --> 00:13:27.519
+before progress stalls,
+
+00:13:27.520 --> 00:13:32.319
+but there are a lot of interesting research.
+
+00:13:32.320 --> 00:13:34.279
+I don't think, research wise,
+
+00:13:34.280 --> 00:13:35.759
+I don't think things have slowed down.
+
+00:13:35.760 --> 00:13:38.719
+You're still seeing a lot of advances.
+
+00:13:38.720 --> 00:13:40.999
+You're still seeing a lot of models coming out
+
+00:13:41.000 --> 00:13:41.839
+and that will come out.
+
+00:13:41.840 --> 00:13:46.279
+That will be each one, one upping the other one
+
+00:13:46.280 --> 00:13:49.959
+in terms of quality.
+
+00:13:49.960 --> 00:13:52.759
+It'll be really interesting to see how this all plays out.
+
+00:13:52.760 --> 00:13:55.919
+I think that message here is that
+
+00:13:55.920 --> 00:13:57.999
+we're at the beginning here.
+
+00:13:58.000 --> 00:14:01.239
+This is why I think this talk is important.
+
+00:14:01.240 --> 00:14:02.279
+I think this is why we should be
+
+00:14:02.280 --> 00:14:04.159
+paying attention to this stuff.
+
+NOTE LLMs in Emacs - existing packages
+
+00:14:08.200 --> 00:14:11.039
+Let's talk about the existing packages.
+
+00:14:11.040 --> 00:14:13.199
+Because there's a lot out there, people have,
+
+00:14:13.200 --> 00:14:17.039
+I think people have been integrating with
+
+00:14:17.040 --> 00:14:21.239
+these LLMs that often have a relatively easy to use API.
+
+00:14:21.240 --> 00:14:24.039
+So it's kind of natural that people
+
+00:14:24.040 --> 00:14:25.679
+have already put out a lot of packages.
+
+00:14:25.680 --> 00:14:28.319
+Coming off this problem from a lot of different angles,
+
+00:14:28.320 --> 00:14:30.639
+I don't have time to go through
+
+00:14:30.640 --> 00:14:31.959
+all of these packages.
+
+00:14:31.960 --> 00:14:33.559
+These are great packages though.
+
+00:14:33.560 --> 00:14:35.279
+If you're not familiar with them,
+
+00:14:35.280 --> 00:14:37.679
+please check them out.
+
+00:14:37.680 --> 00:14:40.999
+And they all are doing slightly different things.
+
+00:14:41.000 --> 00:14:43.959
+Some of these are relatively straightforward.
+
+00:14:43.960 --> 00:14:47.919
+Interactions, just a way to
+
+00:14:47.920 --> 00:14:52.679
+almost in a comment sort of way to kind of
+
+00:14:52.680 --> 00:14:54.199
+have just an interaction,
+
+00:14:54.200 --> 00:14:55.479
+long running interaction with an LLM
+
+00:14:55.480 --> 00:14:59.039
+where you kind of build off previous responses,
+
+00:14:59.040 --> 00:15:01.799
+kind of like the OpenAI's UI.
+
+00:15:01.800 --> 00:15:08.559
+Two more very Emacsy things where you can sort of
+
+00:15:08.560 --> 00:15:13.679
+embed these LLM responses within an org-mode block
+
+00:15:13.680 --> 00:15:15.239
+using the org-mode's context.
+
+00:15:15.240 --> 00:15:20.959
+Or GitHub Copilot integration where you can use it
+
+00:15:20.960 --> 00:15:23.319
+for auto completion in a very powerful,
+
+00:15:23.320 --> 00:15:27.319
+you know, this stuff is very useful if it could figure out
+
+00:15:27.320 --> 00:15:29.199
+what you're trying to do based on the context.
+
+00:15:29.200 --> 00:15:31.839
+It's quite effective.
+
+00:15:31.840 --> 00:15:36.359
+But I want to kind of call out one thing
+
+00:15:36.360 --> 00:15:38.239
+that I'd like to see change.
+
+00:15:38.240 --> 00:15:42.599
+Which is that users right now,
+
+00:15:42.600 --> 00:15:45.199
+not all of these have a choice of,
+
+00:15:45.200 --> 00:15:47.959
+first of all, there's a lot of them.
+
+00:15:47.960 --> 00:15:49.639
+Each one of them is doing their own calls.
+
+00:15:49.640 --> 00:15:53.999
+And each one of them is, so each one of them
+
+00:15:54.000 --> 00:15:55.239
+has their own interfaces.
+
+00:15:55.240 --> 00:15:57.719
+They're rewriting the interface to OpenAI or wherever.
+
+00:15:57.720 --> 00:16:00.119
+And they're not, they don't, most of these
+
+00:16:00.120 --> 00:16:05.119
+do not make it that configurable or at all configurable
+
+00:16:05.120 --> 00:16:06.599
+what LLM to use.
+
+00:16:06.600 --> 00:16:07.239
+This is not good.
+
+00:16:07.240 --> 00:16:09.679
+It is important that we use,
+
+00:16:09.680 --> 00:16:15.679
+we give the user a way to change the LLM they use.
+
+00:16:15.680 --> 00:16:21.079
+And that is because you might not be comfortable
+
+00:16:21.080 --> 00:16:24.439
+sending your requests over to a private corporation
+
+00:16:24.440 --> 00:16:27.799
+where you don't get to see how they use their data.
+
+00:16:27.800 --> 00:16:29.799
+Your data, really.
+
+00:16:29.800 --> 00:16:33.319
+That's especially true with things like embeddings
+
+00:16:33.320 --> 00:16:35.039
+where you might be sending over your documents.
+
+00:16:35.040 --> 00:16:37.519
+You're just giving them your documents, basically.
+
+00:16:37.520 --> 00:16:40.759
+And, you know, that does happen.
+
+00:16:40.760 --> 00:16:43.599
+I don't think really that there's a reason
+
+00:16:43.600 --> 00:16:44.639
+to be uncomfortable with this,
+
+00:16:44.640 --> 00:16:51.439
+but that, you know, people are uncomfortable and that's okay.
+
+00:16:51.440 --> 00:16:53.239
+People might want to use a local machine,
+
+00:16:53.240 --> 00:16:58.359
+a local LLM for maximum privacy.
+
+00:16:58.360 --> 00:17:00.639
+That's something we should allow.
+
+00:17:00.640 --> 00:17:04.519
+People might want to especially use free software.
+
+00:17:04.520 --> 00:17:05.839
+That's something we should definitely allow.
+
+00:17:05.840 --> 00:17:07.279
+This is Emacs.
+
+00:17:07.280 --> 00:17:08.239
+We need to encourage that.
+
+00:17:08.240 --> 00:17:12.159
+But right now, as most of these things are written,
+
+00:17:12.160 --> 00:17:13.959
+you can't do it.
+
+00:17:13.960 --> 00:17:17.839
+And they're spending precious time
+
+00:17:17.840 --> 00:17:18.879
+just doing things themselves.
+
+00:17:18.880 --> 00:17:20.839
+This is why I wrote LLM, which is...
+
+00:17:20.840 --> 00:17:23.039
+it will just make that connection to the LLM for you
+
+00:17:23.040 --> 00:17:26.719
+and it will connect to, you know, it has plugins.
+
+00:17:26.720 --> 00:17:30.279
+So if you can, the user can configure what plugin
+
+00:17:30.280 --> 00:17:31.359
+it actually goes to.
+
+00:17:31.360 --> 00:17:32.399
+Does it go to OpenAI?
+
+00:17:32.400 --> 00:17:35.239
+Does it go to Google Cloud Vertex?
+
+00:17:35.240 --> 00:17:36.999
+Does it go to Llama on your machine?
+
+00:17:37.000 --> 00:17:38.399
+We're using Ollama,
+
+00:17:38.400 --> 00:17:40.999
+which is just a way to run Llama locally.
+
+00:17:41.000 --> 00:17:47.959
+And more things in the future, I hope.
+
+00:17:47.960 --> 00:17:52.079
+So this is, I'm hoping that we use this.
+
+00:17:52.080 --> 00:17:54.839
+It's designed to be sort of maximally usable.
+
+00:17:54.840 --> 00:17:56.279
+You don't need to install anything.
+
+00:17:56.280 --> 00:17:58.359
+It's on GNU ELPA.
+
+00:17:58.360 --> 00:17:59.879
+So even if you write something
+
+00:17:59.880 --> 00:18:01.079
+that you want to contribute to GNU ELPA,
+
+00:18:01.080 --> 00:18:02.879
+you can use it because it's on GNU ELPA.
+
+00:18:02.880 --> 00:18:06.439
+It's part of the Emacs package, Emacs core packages.
+
+00:18:06.440 --> 00:18:09.879
+So, but it has no functionality.
+
+00:18:09.880 --> 00:18:11.719
+It's really just there as a library
+
+00:18:11.720 --> 00:18:14.439
+to use by other things offering functionality. Okay.
+
+NOTE Abstracting LLM challenges
+
+00:18:15.960 --> 00:18:19.839
+And it's a little bit difficult to abstract.
+
+00:18:19.840 --> 00:18:21.159
+I want to point this out
+
+00:18:21.160 --> 00:18:23.599
+because I think it's an important point
+
+00:18:23.600 --> 00:18:29.519
+is that the, it's, some of these LLMs, for example,
+
+00:18:29.520 --> 00:18:30.439
+have image generation.
+
+00:18:30.440 --> 00:18:31.279
+Some do not.
+
+00:18:31.280 --> 00:18:35.319
+Some of them have very large context windows, even for chat.
+
+00:18:35.320 --> 00:18:36.999
+You say, okay, all these things can do chat.
+
+00:18:37.000 --> 00:18:37.319
+Okay.
+
+00:18:37.320 --> 00:18:38.079
+Yeah, kind of.
+
+00:18:38.080 --> 00:18:39.999
+Some of these things you could pass a book to,
+
+00:18:40.000 --> 00:18:41.239
+like Anthropic's API.
+
+00:18:41.240 --> 00:18:43.039
+Most, you cannot.
+
+00:18:43.040 --> 00:18:45.559
+So there really are big differences
+
+00:18:45.560 --> 00:18:46.399
+in how these things work.
+
+00:18:46.400 --> 00:18:51.539
+I hope those differences diminish in the future.
+
+00:18:51.540 --> 00:18:53.800
+But it's just one of the challenges
+
+00:18:53.801 --> 00:18:57.520
+that I hope we can work through in the LLM library.
+
+00:18:57.521 --> 00:19:02.160
+So it's compatible, but there are definitely
+
+00:19:02.161 --> 00:19:04.079
+limits to that compatibility.
+
+NOTE Emacs is the ideal interface for LLMs
+
+00:19:04.080 --> 00:19:06.160
+To finish off, I want to point out that
+
+00:19:06.161 --> 00:19:12.879
+Emacs has real power here
+
+00:19:12.880 --> 00:19:15.679
+that I think nothing else in the industry is offering.
+
+00:19:15.680 --> 00:19:19.279
+First of all, people that use Emacs
+
+00:19:19.280 --> 00:19:20.439
+tend to do a lot of things in Emacs.
+
+00:19:20.440 --> 00:19:22.159
+We have our to-dos in Emacs with Org mode.
+
+00:19:22.160 --> 00:19:22.999
+We have mail.
+
+00:19:23.000 --> 00:19:25.719
+We might read email
+
+00:19:25.720 --> 00:19:27.679
+and respond to email in Emacs.
+
+00:19:27.680 --> 00:19:29.199
+We might have notes in Emacs.
+
+00:19:29.200 --> 00:19:31.359
+This is very powerful.
+
+00:19:31.360 --> 00:19:34.159
+There's nothing else like that.
+
+00:19:34.160 --> 00:19:35.759
+And you could feed this stuff to an LLM.
+
+00:19:35.760 --> 00:19:37.039
+You could do interesting things
+
+00:19:37.040 --> 00:19:38.559
+using a combination of all this data.
+
+00:19:38.560 --> 00:19:40.399
+No one else could do this.
+
+00:19:40.400 --> 00:19:41.759
+We need to start thinking about it.
+
+00:19:41.760 --> 00:19:45.039
+Secondly, Emacs can execute commands.
+
+00:19:45.040 --> 00:19:46.239
+This might be a bad idea.
+
+00:19:46.240 --> 00:19:48.399
+This might be how the robots take over,
+
+00:19:48.400 --> 00:19:51.799
+but you could have the LLMs respond with Emacs
+
+00:19:51.800 --> 00:19:54.199
+commands and run those Emacs commands
+
+00:19:54.200 --> 00:19:57.079
+and tell the LLM the response and have it do things
+
+00:19:57.080 --> 00:19:58.679
+as your agent in the editor.
+
+00:19:58.680 --> 00:20:01.599
+I think we need to explore ideas like this.
+
+NOTE Outro
+
+00:20:01.960 --> 00:20:04.279
+And I think we need to share these ideas
+
+00:20:04.280 --> 00:20:07.039
+and we need to make sure that we're pushing the
+
+00:20:07.040 --> 00:20:10.519
+envelope for Emacs and actually doing things,
+
+00:20:10.520 --> 00:20:12.959
+sharing ideas, sharing progress,
+
+00:20:12.960 --> 00:20:15.199
+and kind of seeing how far we can push this stuff.
+
+00:20:15.200 --> 00:20:20.639
+Let's really help Emacs out and
+
+00:20:20.640 --> 00:20:24.519
+take advantage of this super powerful technique.
+
+00:20:24.520 --> 00:20:26.160
+Thank you for listening.
diff --git a/2023/info/llm-after.md b/2023/info/llm-after.md
index 99bc9207..8530fb51 100644
--- a/2023/info/llm-after.md
+++ b/2023/info/llm-after.md
@@ -1,6 +1,464 @@
<!-- Automatically generated by emacsconf-publish-after-page -->
+<a name="llm-mainVideo-transcript"></a>
+# Transcript
+
+[[!template new="1" text="""Hello, I'm Andrew Hyatt and I'm going to talk to you""" start="00:00:00.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""about large language models and how""" start="00:00:04.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""they relate to Emacs.""" start="00:00:06.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And I'm going to talk to you about the technology""" start="00:00:11.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and how we're going to use it in Emacs.""" start="00:00:14.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""There'll be demos and there'll be talks about,""" start="00:00:18.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I'll finish up by kind of talking about where""" start="00:00:21.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think this should go in the future.""" start="00:00:22.880" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So to start off with, let's just talk like,""" start="00:00:25.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I just want to make sure everyone's on the same page.""" start="00:00:28.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""What are large language models?""" start="00:00:29.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Not everyone may be caught up on this.""" start="00:00:30.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Basically,""" start="00:00:34.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the current versions of large language models""" start="00:00:39.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""are all based on a similar architecture""" start="00:00:43.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""called the transformer.""" start="00:00:44.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's just an efficient way to train and produce output.""" start="00:00:45.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So these things are basically models""" start="00:00:48.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that predict the next word or something like that.""" start="00:00:51.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And they're trained on an enormous corpus of information""" start="00:00:58.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and they get extremely good""" start="00:01:02.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""at predicting the next word.""" start="00:01:04.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And from that basic ability, you can train""" start="00:01:06.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""through further tuning from human input,""" start="00:01:09.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""human ratings and things like that.""" start="00:01:12.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You can train different models based on that""" start="00:01:13.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that will do question answering.""" start="00:01:17.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And this is how basically ChatGPT works.""" start="00:01:18.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""There's a base LLM, like GPT.""" start="00:01:22.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And then you have a chat version of that,""" start="00:01:25.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""which is further trained so that you give""" start="00:01:27.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it a prompt saying what you want it to do.""" start="00:01:29.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it gives you an output that does what you told it to do,""" start="00:01:32.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""or at least attempts to do it.""" start="00:01:37.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""The power of large language models is that""" start="00:01:39.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""they're extremely, extremely impressive.""" start="00:01:42.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Certainly this is, in AI,""" start="00:01:45.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""this has been the biggest thing to happen""" start="00:01:47.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""probably in my lifetime,""" start="00:01:49.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""or at least in my working lifetime.""" start="00:01:51.560" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So let me give you a demonstration of""" start="00:01:56.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""what kinds of stuff it could do in Emacs.""" start="00:02:02.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So here I have an Emacs file.""" start="00:02:06.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So this is my Emacs init file.""" start="00:02:09.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I have a change.""" start="00:02:12.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Let's commit that change.""" start="00:02:13.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And, you know, I don't like writing commit messages,""" start="00:02:16.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""so I can generate it.""" start="00:02:19.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it did. Actually, just looking:""" start="00:02:23.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""all it's doing is reading the diff.""" start="00:02:27.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I'm just feeding it the diff with some instructions.""" start="00:02:29.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And is this an incredible commit message?""" start="00:02:32.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's not bad, actually.""" start="00:02:37.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You can see that it actually has really extracted""" start="00:02:39.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the meaning of what I'm doing and has written""" start="00:02:42.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""a reasonably good commit message.""" start="00:02:46.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Now I have to edit it because this is not quite correct.""" start="00:02:48.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But it's kind of impressive how good it is.""" start="00:02:53.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it's kind of easier for me to edit this""" start="00:02:55.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""than just to write a new one.""" start="00:03:00.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And quite often it's good enough to just submit as is.""" start="00:03:01.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So you could say""" start="00:03:04.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""this is just commit messages.""" start="00:03:08.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You could respond to emails.""" start="00:03:09.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You could, using your own custom instructions""" start="00:03:10.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""about what you want your email to say.""" start="00:03:15.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It'll write the email for you.""" start="00:03:17.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It could do things like that.""" start="00:03:19.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Emacs is a way to interact with buffers.""" start="00:03:19.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They basically just output text.""" start="00:03:22.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So it's super useful for""" start="00:03:24.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""understanding something and outputting text based on that,""" start="00:03:27.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""which is just useful for Emacs.""" start="00:03:30.320" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So the drawback is, yeah, it's good,""" start="00:03:32.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but it's not that reliable.""" start="00:03:39.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it's very easy to get caught up in thinking,""" start="00:03:43.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""oh my gosh, like this is so powerful.""" start="00:03:45.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I bet this idea could work, whatever it is.""" start="00:03:47.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And these ideas almost can.""" start="00:03:50.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""For example, I was thinking, you know what I could do?""" start="00:03:52.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't like writing regexes.""" start="00:03:55.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Why can't I have a regex replace that's powered by LLMs?""" start="00:03:57.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And that way I could give just an instruction""" start="00:04:01.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to regex replace.""" start="00:04:03.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And so for example, I could do Emacs LLM regex replace.""" start="00:04:07.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is not checked in anywhere.""" start="00:04:12.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""These are just my own kind of private functions.""" start="00:04:12.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""My description lowercase all the org headings.""" start="00:04:17.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Let's see if it works.""" start="00:04:19.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It might work.""" start="00:04:20.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""No, it doesn't work.""" start="00:04:21.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So I'm not going to bother to show you""" start="00:04:22.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""what it actually came up with, but it's something,""" start="00:04:26.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""if you looked at it, it'd be like, wow,""" start="00:04:28.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""this is very close to being...""" start="00:04:29.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It looks like it should work, but it doesn't.""" start="00:04:31.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Okay.""" start="00:04:34.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's not quite good enough to get it right.""" start="00:04:35.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it's possible that by giving it""" start="00:04:38.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""a few examples, or explaining more about""" start="00:04:41.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""what makes Emacs regexes different,""" start="00:04:43.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it could do a better job""" start="00:04:46.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and maybe could solve these problems,""" start="00:04:47.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but it's always a little bit random.""" start="00:04:49.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You're never quite sure what you're going to get.""" start="00:04:50.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So this is the drawback.""" start="00:04:52.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Like there's a lot of things that look like you could do it,""" start="00:04:54.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but when it actually comes down to trying it,""" start="00:04:58.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it's surprisingly hard.""" start="00:05:01.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And, you know, and whatever you're doing,""" start="00:05:03.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it's surprisingly hard to get something""" start="00:05:06.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that is repeatably good.""" start="00:05:09.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So yeah, that's currently the problem.""" start="00:05:13.880" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So I want to talk about embeddings.""" start="00:05:20.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They're another thing that LLMs offer""" start="00:05:23.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and that are extremely useful.""" start="00:05:26.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""What they do is take""" start="00:05:28.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""an input text that could be a word, a sentence,""" start="00:05:33.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""or a small document,""" start="00:05:38.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and encode a vector capturing""" start="00:05:42.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""its semantic meaning.""" start="00:05:45.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That means something that""" start="00:05:46.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""uses completely different words,""" start="00:05:51.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but is basically talking about the same thing,""" start="00:05:52.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""perhaps in a different language, should be pretty close""" start="00:05:54.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""as a vector to the other vector.""" start="00:05:57.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You know, as long as they're semantically similar things,""" start="00:06:02.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""like the words""" start="00:06:05.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""highway and camino are two different words.""" start="00:06:12.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They mean the same thing.""" start="00:06:18.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They should have very similar embeddings.""" start="00:06:19.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So it is a way to kind of encode this""" start="00:06:21.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and then you could use this for search.""" start="00:06:25.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""For example, I haven't tried to do this yet,""" start="00:06:26.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but you could probably just make an embedding""" start="00:06:28.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""for every paragraph in the Emacs manual""" start="00:06:31.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and the Elisp manual.""" start="00:06:33.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And then there's a very standard technique.""" start="00:06:36.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You have a query:""" start="00:06:39.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""oh, how do I do whatever, whatever in Emacs again?""" start="00:06:43.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And you just find the 20 things""" start="00:06:45.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that are closest to""" start="00:06:49.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the embedding of your query.""" start="00:06:50.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You send those things to the LLM, as you know,""" start="00:06:51.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""with the original query,""" start="00:06:55.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and you're basically asking the LLM,""" start="00:06:57.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""look, the user is trying to do this.""" start="00:06:59.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Here's what I found in the Emacs manual.""" start="00:07:01.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""or the Elisp manual,""" start="00:07:03.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That's close to what they're trying to do.""" start="00:07:04.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So can you kind of just tell the user what to do?""" start="00:07:07.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And you could say,""" start="00:07:12.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""just use the things that I give you.""" start="00:07:14.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Don't just make up your own idea.""" start="00:07:17.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You know, don't use your own ideas,""" start="00:07:20.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""because sometimes it likes to do that""" start="00:07:21.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and those things are wrong.""" start="00:07:23.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So you could do this, and""" start="00:07:24.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you could get quite good results.""" start="00:07:26.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So no one has done this yet,""" start="00:07:28.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but that should not be hard to do.""" start="00:07:30.000" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""Image generation is something that's, you know,""" start="00:07:32.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it's not quite an LLM in the sense of...""" start="00:07:34.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's a different technology,""" start="00:07:38.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but these things are kind of packaged together""" start="00:07:43.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""in a sense.""" start="00:07:48.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And you'll see that when I talk about Emacs packages,""" start="00:07:49.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""a lot of them bundle image generation""" start="00:07:51.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and large language models.""" start="00:07:54.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You know, the APIs are often bundled together by providers.""" start="00:07:55.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And the general idea is kind of similar,""" start="00:07:59.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""because it's very much like""" start="00:08:02.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""doing a chat thing, where""" start="00:08:04.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you give it a text request,""" start="00:08:06.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""like write me a sonnet about, you know,""" start="00:08:09.761" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the battle between Emacs and vi.""" start="00:08:12.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it could do it.""" start="00:08:14.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It could do a very good job of that.""" start="00:08:15.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But you could also say, you know,""" start="00:08:17.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""draw me a picture of Emacs and vi as boxers,""" start="00:08:22.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""as characters boxing in a ring,""" start="00:08:27.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""like a, you know, political cartoon style.""" start="00:08:30.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And it can do that as well.""" start="00:08:32.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And so you could basically think of this""" start="00:08:35.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""as kind of the""" start="00:08:37.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""same thing as what you're doing""" start="00:08:39.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""with large language models,""" start="00:08:42.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but instead of outputting text,""" start="00:08:43.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you're outputting a picture.""" start="00:08:44.800" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""There's also, I want to mention the concept of fine-tuning.""" start="00:08:48.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Fine-tuning is a way to""" start="00:08:51.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""take a corpus of inputs and outputs, and from""" start="00:08:55.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""a large language model, say, okay,""" start="00:08:59.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""given this base large language model,""" start="00:09:01.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I want to make sure that when I give you input,""" start="00:09:03.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you give me something like the output.""" start="00:09:06.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And I'm just going to""" start="00:09:08.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""train you further on these""" start="00:09:10.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""mappings between input and output.""" start="00:09:11.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""For example,""" start="00:09:14.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""let's say you wanted to fix that regex demo""" start="00:09:16.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I did, to make it good.""" start="00:09:18.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think it'd be""" start="00:09:21.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""relatively effective to train it""" start="00:09:23.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""with regex descriptions""" start="00:09:25.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and Emacs regex examples""" start="00:09:27.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""as inputs and outputs.""" start="00:09:30.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You could get, you know, maybe a hundred,""" start="00:09:31.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""or a few hundred of these things.""" start="00:09:34.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You could train it.""" start="00:09:35.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think that is a reasonable approach.""" start="00:09:38.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't know how well it would work,""" start="00:09:40.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but these things definitely work some of the time""" start="00:09:43.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and produce pretty good results.""" start="00:09:46.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And you could do this on your own machine.""" start="00:09:48.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Corporations like OpenAI offer APIs""" start="00:09:53.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to build your fine-tunes on top of OpenAI.""" start="00:09:59.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And I think, I'm not a hundred percent sure,""" start="00:10:01.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but I think then you can share your model""" start="00:10:04.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""with other people.""" start="00:10:05.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But if not, then you just, you know,""" start="00:10:06.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you could use your model for your own specialized purposes.""" start="00:10:08.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But in the world of models that you could run,""" start="00:10:10.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""for example, based on Llama...""" start="00:10:14.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Llama is a model from Meta that you can run on your own machine.""" start="00:10:16.875" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""There's many fine-tuned models that you could download""" start="00:10:23.580" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and you could run on your own.""" start="00:10:26.881" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They can do very different things too.""" start="00:10:28.961" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Some output Python programs, for example,""" start="00:10:30.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that you could just run.""" start="00:10:33.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So you just say...""" start="00:10:34.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Tell me how old... Let's just say""" start="00:10:37.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you have a random task, like""" start="00:10:40.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""tell me how old these five cities are in minutes,""" start="00:10:42.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""based on historical evidence.""" start="00:10:48.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's kind of a weird query, but it probably can figure,""" start="00:10:49.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it could probably run that for you.""" start="00:10:53.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It'll encode its knowledge into whatever""" start="00:10:55.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the Python program, then use the Python program""" start="00:10:57.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to do the correct calculations.""" start="00:10:59.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So pretty, pretty useful stuff.""" start="00:11:01.040" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So I also want to mention open source""" start="00:11:08.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and basically free software here.""" start="00:11:10.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""These LLMs are mostly not free software.""" start="00:11:12.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They're sometimes open source,""" start="00:11:17.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but they're generally not free""" start="00:11:19.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""without restrictions to use.""" start="00:11:21.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Most of these things, even Llama,""" start="00:11:23.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""which you can use on your own machine,""" start="00:11:27.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""have restrictions that you cannot use it""" start="00:11:28.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to train your own model.""" start="00:11:31.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is something that, you know,""" start="00:11:32.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it costs millions and millions of dollars""" start="00:11:35.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to train and produce these models.""" start="00:11:37.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And that's just computation costs.""" start="00:11:40.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They do not want you""" start="00:11:42.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""stealing all that work by training your own models""" start="00:11:45.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""based on their output.""" start="00:11:47.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But there are research LLMs that do, I believe,""" start="00:11:48.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""conform to free software principles.""" start="00:11:55.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They're just not as good yet.""" start="00:11:58.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And I think that might change in the future.""" start="00:11:59.520" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""So speaking of the future,""" start="00:12:02.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""one of the things I'd like to point out""" start="00:12:04.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""is that like the demos I showed you are based on,""" start="00:12:07.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I'm using OpenAI 3.5 model.""" start="00:12:09.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That's more than, well, no,""" start="00:12:13.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it's like a year old basically at this point.""" start="00:12:16.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And things are moving fast.""" start="00:12:18.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They came out with 4.0.""" start="00:12:21.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""4.0 is significantly better.""" start="00:12:22.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't have access to it.""" start="00:12:23.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Even though I'm using the API and I'm paying money for it,""" start="00:12:24.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you only can get access to 4.0""" start="00:12:30.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""if you can spend a dollar.""" start="00:12:33.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And I've never been able to spend,""" start="00:12:34.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""use so much API use that I've spent a dollar.""" start="00:12:36.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So I have, I don't have 4.0, but I've tried it""" start="00:12:38.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""because I do pay for this""" start="00:12:44.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""so I could get access to 4.0""" start="00:12:46.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and it is substantially better.""" start="00:12:48.341" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""By all reports, it's,""" start="00:12:49.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""the difference is extremely significant.""" start="00:12:50.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I would not be surprised""" start="00:12:53.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""if some of the limitations and drawbacks I described""" start="00:12:55.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""mostly went away with 4.0.""" start="00:12:59.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We're probably at a stage""" start="00:13:02.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""where regexes will work maybe 5% of the time""" start="00:13:06.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""if you try them.""" start="00:13:09.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But with 4.0, it could work like 80% of the time.""" start="00:13:10.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Now, is that good enough?""" start="00:13:13.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Probably not, but it's a,""" start="00:13:14.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I wouldn't be surprised if you got results like that.""" start="00:13:17.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And in a year's time, in two years time,""" start="00:13:20.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""no one knows how much this is going to play out""" start="00:13:22.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""before progress stalls,""" start="00:13:26.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but there are a lot of interesting research.""" start="00:13:27.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't think, research wise,""" start="00:13:32.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't think things have slowed down.""" start="00:13:34.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You're still seeing a lot of advances.""" start="00:13:35.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You're still seeing a lot of models coming out""" start="00:13:38.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and that will come out.""" start="00:13:41.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That will be each one, one upping the other one""" start="00:13:41.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""in terms of quality.""" start="00:13:46.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It'll be really interesting to see how this all plays out.""" start="00:13:49.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think that message here is that""" start="00:13:52.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""we're at the beginning here.""" start="00:13:55.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is why I think this talk is important.""" start="00:13:58.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think this is why we should be""" start="00:14:01.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""paying attention to this stuff.""" start="00:14:02.280" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""Let's talk about the existing packages.""" start="00:14:08.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Because there's a lot out there, people have,""" start="00:14:11.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think people have been integrating with""" start="00:14:13.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""these LLMs that often have a relatively easy to use API.""" start="00:14:17.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So it's kind of natural that people""" start="00:14:21.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""have already put out a lot of packages.""" start="00:14:24.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Coming off this problem from a lot of different angles,""" start="00:14:25.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't have time to go through""" start="00:14:28.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""all of these packages.""" start="00:14:30.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""These are great packages though.""" start="00:14:31.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""If you're not familiar with them,""" start="00:14:33.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""please check them out.""" start="00:14:35.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And they all are doing slightly different things.""" start="00:14:37.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Some of these are relatively straightforward.""" start="00:14:41.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Interactions, just a way to""" start="00:14:43.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""almost in a comment sort of way to kind of""" start="00:14:47.920" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""have just an interaction,""" start="00:14:52.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""long running interaction with an LLM""" start="00:14:54.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""where you kind of build off previous responses,""" start="00:14:55.480" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""kind of like the OpenAI's UI.""" start="00:14:59.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Two very more Emacsy things where you can sort of""" start="00:15:01.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""embed these LLM responses within a org-mode block""" start="00:15:08.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""using the org-mode's context.""" start="00:15:13.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Or GitHub Copilot integration where you can use it""" start="00:15:15.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""for auto completion in a very powerful,""" start="00:15:20.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you know, this stuff is very useful if it could figure out""" start="00:15:23.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""what you're trying to do based on the context.""" start="00:15:27.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's quite effective.""" start="00:15:29.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But I want to kind of call out one thing""" start="00:15:31.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that I'd like to see change.""" start="00:15:36.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Which is that users right now,""" start="00:15:38.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""not all of these have a choice of,""" start="00:15:42.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""first of all, there's a lot of them.""" start="00:15:45.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Each one of them is doing their own calls.""" start="00:15:47.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And each one of them is, so each one of them""" start="00:15:49.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""has their own interfaces.""" start="00:15:54.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""They're rewriting the interface to OpenAI or wherever.""" start="00:15:55.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And they're not, they don't, most of these""" start="00:15:57.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""do not make it that configurable or at all configurable""" start="00:16:00.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""what LLM use.""" start="00:16:05.120" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is not good.""" start="00:16:06.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It is important that we use,""" start="00:16:07.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""we give the user a way to change the LLM they use.""" start="00:16:09.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And that is because you might not be comfortable""" start="00:16:15.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""sending your requests over to a private corporation""" start="00:16:21.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""where you don't get to see how they use their data.""" start="00:16:24.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Your data, really.""" start="00:16:27.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That's especially true with things like embeddings""" start="00:16:29.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""where you might be sending over your documents.""" start="00:16:33.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You're just giving them your documents, basically.""" start="00:16:35.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And, you know, that does happen.""" start="00:16:37.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I don't think really that there's a reason""" start="00:16:40.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to be uncomfortable with this,""" start="00:16:43.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but that, you know, people are uncomfortable and that's okay.""" start="00:16:44.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""People might want to use a local machine,""" start="00:16:51.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""a local LLM for maximum privacy.""" start="00:16:53.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That's something we should allow.""" start="00:16:58.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""People might want to especially use free software.""" start="00:17:00.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""That's something we should definitely allow.""" start="00:17:04.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is Emacs.""" start="00:17:05.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We need to encourage that.""" start="00:17:07.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But right now, as most of these things are written,""" start="00:17:08.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you can't do it.""" start="00:17:12.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And they're spending precious time""" start="00:17:13.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""just doing things themselves.""" start="00:17:17.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is why I wrote LLM, which is...""" start="00:17:18.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it will just make that connection to the LLM for you""" start="00:17:20.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and it will connect to, you know, it has plugins.""" start="00:17:23.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So if you can, the user can configure what plugin""" start="00:17:26.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""it actually goes to.""" start="00:17:30.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Does it go to OpenAI?""" start="00:17:31.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Does it go to Google Cloud Vertex?""" start="00:17:32.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Does it go to Llama on your machine?""" start="00:17:35.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We're using Ollama,""" start="00:17:37.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""which is just a way to run Llama locally.""" start="00:17:38.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And more things in the future, I hope.""" start="00:17:41.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So this is, I'm hoping that we use this.""" start="00:17:47.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's designed to be sort of maximally usable.""" start="00:17:52.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You don't need to install anything.""" start="00:17:54.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's on GNU ELPA.""" start="00:17:56.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So even if you write something""" start="00:17:58.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that you want to contribute to GNU ELPA,""" start="00:17:59.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""you can use it because it's on GNU ELPA.""" start="00:18:01.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's part of the Emacs package, Emacs core packages.""" start="00:18:02.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So, but it has no functionality.""" start="00:18:06.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""It's really just there as a library""" start="00:18:09.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""to use by other things offering functionality. Okay.""" start="00:18:11.720" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""And it's a little bit difficult to abstract.""" start="00:18:15.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I want to point this out""" start="00:18:19.840" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""because I think it's an important point""" start="00:18:21.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""is that the, it's, some of these LLMs, for example,""" start="00:18:23.600" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""have image generation.""" start="00:18:29.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Some do not.""" start="00:18:30.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Some of them have very large context windows, even for chat.""" start="00:18:31.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You say, okay, all these things can do chat.""" start="00:18:35.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Okay.""" start="00:18:37.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Yeah, kind of.""" start="00:18:37.320" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Some of these things you could pass a book to,""" start="00:18:38.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""like Anthropic's API.""" start="00:18:40.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Most, you cannot.""" start="00:18:41.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So there really are big differences""" start="00:18:43.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""in how these things work.""" start="00:18:45.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I hope those differences diminish in the future.""" start="00:18:46.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""But it's just one of the challenges""" start="00:18:51.540" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that I hope we can work through in the LLM library.""" start="00:18:53.801" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""So it's compatible, but there's definitely""" start="00:18:57.521" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""limits to that compatibility.""" start="00:19:02.161" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""I want to point out just to finish off,""" start="00:19:04.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Emacs is the, Emacs has real power here""" start="00:19:06.161" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""that nothing else I think in the industry is offering.""" start="00:19:12.880" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""First of all, people that use Emacs""" start="00:19:15.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""tend to do a lot of things in Emacs.""" start="00:19:19.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We have our to-dos in Emacs with the org mode.""" start="00:19:20.440" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We have mail.""" start="00:19:22.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We, you know, we might read email and we might,""" start="00:19:23.000" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and respond to email in Emacs.""" start="00:19:25.720" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We might have notes in Emacs.""" start="00:19:27.680" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This is very powerful.""" start="00:19:29.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Using... there's not other stuff like that.""" start="00:19:31.360" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""And you could feed this stuff to an LLM.""" start="00:19:34.160" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""You could do interesting things""" start="00:19:35.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""using a combination of all this data.""" start="00:19:37.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""No one else could do this.""" start="00:19:38.560" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""We need to start thinking about it.""" start="00:19:40.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Secondly, Emacs can execute commands.""" start="00:19:41.760" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This might be a bad idea.""" start="00:19:45.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""This might be how the robots take over,""" start="00:19:46.240" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""but you could have the LLMs respond with Emacs""" start="00:19:48.400" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""commands and run those Emacs commands""" start="00:19:51.800" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and tell the LLM the response and have it do things""" start="00:19:54.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""as your agent in the editor.""" start="00:19:57.080" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""I think we need to explore ideas like this.""" start="00:19:58.680" video="mainVideo-llm" id="subtitle"]]
+[[!template new="1" text="""And I think we need to share these ideas""" start="00:20:01.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and we need to make sure that we're pushing the""" start="00:20:04.280" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""envelope for Emacs and actually, you know, doing things,""" start="00:20:07.040" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""sharing ideas, sharing progress,""" start="00:20:10.520" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""and kind of seeing how far we can push this stuff.""" start="00:20:12.960" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Let's really help Emacs out, be sort of,""" start="00:20:15.200" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""take advantage of this super powerful technique.""" start="00:20:20.640" video="mainVideo-llm" id="subtitle"]]
+[[!template text="""Thank you for listening.""" start="00:20:24.520" video="mainVideo-llm" id="subtitle"]]
+
+
+
+Captioner: bala
+
Questions or comments? Please e-mail [ahyatt@gmail.com](mailto:ahyatt@gmail.com?subject=Comment%20for%20EmacsConf%202022%20llm%3A%20LLM%20clients%20in%20Emacs%2C%20functionality%20and%20standardization)
diff --git a/2023/info/llm-before.md b/2023/info/llm-before.md
index 73c4d854..454a92ea 100644
--- a/2023/info/llm-before.md
+++ b/2023/info/llm-before.md
@@ -8,12 +8,27 @@ The following image shows where the talk is in the schedule for Sat 2023-12-02.
Format: 21-min talk; Q&A: BigBlueButton conference room <https://media.emacsconf.org/2023/current/bbb-llm.html>
Etherpad: <https://pad.emacsconf.org/2023-llm>
Discuss on IRC: [#emacsconf-dev](https://chat.emacsconf.org/?join=emacsconf,emacsconf-dev)
-Status: Ready to stream
+Status: Now playing on the conference livestream
<div>Times in different timezones:</div><div class="times" start="2023-12-02T15:55:00Z" end="2023-12-02T16:15:00Z"><div class="conf-time">Saturday, Dec 2 2023, ~10:55 AM - 11:15 AM EST (US/Eastern)</div><div class="others"><div>which is the same as:</div>Saturday, Dec 2 2023, ~9:55 AM - 10:15 AM CST (US/Central)<br />Saturday, Dec 2 2023, ~8:55 AM - 9:15 AM MST (US/Mountain)<br />Saturday, Dec 2 2023, ~7:55 AM - 8:15 AM PST (US/Pacific)<br />Saturday, Dec 2 2023, ~3:55 PM - 4:15 PM UTC <br />Saturday, Dec 2 2023, ~4:55 PM - 5:15 PM CET (Europe/Paris)<br />Saturday, Dec 2 2023, ~5:55 PM - 6:15 PM EET (Europe/Athens)<br />Saturday, Dec 2 2023, ~9:25 PM - 9:45 PM IST (Asia/Kolkata)<br />Saturday, Dec 2 2023, ~11:55 PM - 12:15 AM +08 (Asia/Singapore)<br />Sunday, Dec 3 2023, ~12:55 AM - 1:15 AM JST (Asia/Tokyo)</div></div><div><a href="/2023/watch/dev/">Find out how to watch and participate</a></div>
+<div class="vid"><video controls preload="none" id="llm-mainVideo"><source src="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.webm" />captions="""<track label="English" kind="captions" srclang="en" src="/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt" default />"""<track kind="chapters" label="Chapters" src="/2023/captions/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt" /><p><em>Your browser does not support the video tag. Please download the video instead.</em></p></video>[[!template id="chapters" vidid="llm-mainVideo" data="""
+00:00.000 Intro to the Talk
+00:25.080 What are LLMs?
+01:56.360 Power of LLMs (Magit Demo)
+03:32.240 Drawbacks of LLMs (regex demo)
+05:20.120 Embeddings
+07:32.800 Image Generation
+08:48.480 Fine-tuning
+11:08.160 Open Source
+12:02.840 The Future
+14:08.200 LLMs in Emacs - existing packages
+18:15.960 Abstracting LLM challenges
+19:04.080 Emacs is the ideal interface for LLMs
+20:01.960 Outro
+"""]]<div></div>Duration: 20:26 minutes<div class="files resources"><ul><li><a href="https://pad.emacsconf.org/2023-llm">Open Etherpad</a></li><li><a href="https://media.emacsconf.org/2023/current/bbb-llm.html">Open public Q&A</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--final.webm">Download --final.webm (50MB)</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--intro.vtt">Download --intro.vtt</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--intro.webm">Download --intro.webm</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main--chapters.vtt">Download --main--chapters.vtt</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.opus">Download --main.opus (12MB)</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.vtt">Download --main.vtt</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--main.webm">Download --main.webm (50MB)</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--normalized.opus">Download --normalized.opus (18MB)</a></li><li><a href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--original.mp4">Download --original.mp4 (2.8GB)</a></li><li><a 
href="https://media.emacsconf.org/2023/emacsconf-2023-llm--llm-clients-in-emacs-functionality-and-standardization--andrew-hyatt--reencoded.webm">Download --reencoded.webm (44MB)</a></li><li><a href="https://toobnix.org/w/ck1LWXvRiAGNLWFA8s4Ymi">View on Toobnix</a></li></ul></div></div>
# Description
<!-- End of emacsconf-publish-before-page --> \ No newline at end of file