diff --git a/2023/talks/matplotllm.md b/2023/talks/matplotllm.md
index dfa32233..4117374b 100644
--- a/2023/talks/matplotllm.md
+++ b/2023/talks/matplotllm.md
@@ -45,17 +45,17 @@ Emacs.
- Q: What is the license of the <https://github.com/lepisma/matplotllm>
  project? Sjo
- - A: GPLv3 or later. Sorry, I didn\'t put this in the repository,
+ - A: GPLv3 or later. Sorry, I didn't put this in the repository,
You can refer to
<https://github.com/lepisma/matplotllm/blob/main/matplotllm.el#L18C12-L29>
though.
- Q: Sometimes LLMs hallucinate. Can we trust the graph that it
produces?
  - A: Not always, but the chances of a hallucination impacting
-   \'generated code\' that causes a harmful but not identifiable
+   'generated code' in a harmful but not identifiable way are a
    little lower. Usually hallucinations in code show up as very
    visible bugs, so you can always do a retry. But I
- haven\'t done a thorough analysis here yet.
+ haven't done a thorough analysis here yet.
- Q: What are your thoughts on the carbon footprint of LLM usage?
  - (not the speaker): To add a bit more on the power usage of LLMs: it is not inherent that these models must take many megawatts to train and run. Work is happening, and seems promising, to decrease power usage.
## Notes