author    Leo Vivier <zaeph@zaeph.net>  2021-12-03 22:05:36 +0100
committer Leo Vivier <zaeph@zaeph.net>  2021-12-03 22:05:36 +0100
commit    4ee3d9451f2f3521cea14b5fdd6c36cc1080c307 (patch)
tree      d47112deb37c1b669e4d79be910cb43247af2cb8 /2021/talks
parent    f413d4c43f1910c1ab73fae714644fac69d45271 (diff)
Add pad stuff
Diffstat (limited to '2021/talks')
-rw-r--r--  2021/talks/imaginary.md  24
1 file changed, 24 insertions, 0 deletions
diff --git a/2021/talks/imaginary.md b/2021/talks/imaginary.md
index 0f9a030c..82895374 100644
--- a/2021/talks/imaginary.md
+++ b/2021/talks/imaginary.md
@@ -31,6 +31,30 @@ GPL. Please keep an open mind.
# Discussion
+Pad:
+
+- Q1: Do you have a site we can follow more of your writing on?
+  - A: Pen.el Tutorial: https://semiosis.github.io/posts/pen-el-tutorial/
+ - https://semiosis.github.io/posts/ilambda-tutorial/
+ - https://emacsconf.org/2021/talks/imaginary/
+- Q2: Re slide 27: would it mean that two such "idefined" functions would be the "same", meaning they do the same thing the same way, given that they are defined without a "body"? (I'm trying to get a better grasp on the objects that get "imagined" under the hood.)
+  - A: The first time a function is run with given parameters, the results are remembered; I use the memoize library. You can update the function every time by surrounding the call to the function with the (upd ...) macro. With idefun, body evaluation is completely short-circuited. imacro works a bit differently: it generates real code, so you can use the normal macro-expand on an imacro. (See the first sketch after this list.)
+- Q3 (Opalvaults): What are some underlying concepts/papers we could read to become more familiar with your overarching ideas? (i.e. things that inspired them)
+  - A: The paper "Pre-train, Prompt, and Predict" (a survey of prompting methods).
+- Q4: Sorry, I just don't get it: How is a function that does something different each time it's called useful?
+  - A: Each time you run one of these functions, you are getting the computer to imagine for you. It's a bicycle for the imagination. You can automate the filtration of the results you want, say by doing many generations and applying grep, or other prompts such as the semantic-search prompt, to the results. The functions are memoised, so they technically do the same thing every time if you want them to. Also, if you use a temperature of 0 for the prompt functions (I demonstrate how to override that somewhere in the slides), it will be deterministic too, even when bypassing the cache. (The first sketch after this list illustrates the memoisation.)
+- Q5: How on earth do you ensure that what ilambda gets back from GPT-3 is Lisp and not, say, Harry Potter fanfic? :)
+  - A: A combination of good prompt design, filtering the results, and validating the results. Also, you can fine-tune models to the task you want, to eliminate the possibility of unwanted generations.
+- Q6: Your views on the pluses and minuses of GPT-3?
+  - A: It's something we have to live with because of its transformative effect on computing. Unfortunately, these language models are license-blind.
+- Q7: Any interesting ideas about potential applications of GPT-3 to Emacs itself (or Emacs-adjacent things)?
+  - A: Emacs is the ultimate text-centric operating system. It will become a kernel for AGI, I think. That's what I plan on making: the power user's terminal for human-AI interaction. I'm trying to extend as many modes in Emacs as possible: org-brain, the eww browser, Org mode, comint, Emacs Lisp primitives, etc.
+- Q8: Follow-up on Q2: how does inferring functions in this manner differ from, say, how in the Haskell ecosystem functions are inferred by specifying inputs and a return type (such as when searching for a suitable function for a given purpose)?
+  - A: Whereas in Haskell, searching for a function by its type (e.g. with Hoogle) looks through a discrete set of existing functions, the domain of possible functions that can be searched for with language models is qualitatively and quantitatively infinite.
+- Q9: Are you deriving functions from their names? What do you do when this is ambiguous - for example, when the name of the function is "get-element-from-pair"?
+  - A: idefun will infer the computation and short-circuit the code. Given either the function name alone, the name + args, the name + args + docstring, or the name + args + docstring + function body, it will make use of the context you have provided and imagine the evaluation. For example, it can create functions which infer rather than properly evaluate, based on merely the name of the function. (See the second sketch after this list.)
+ - A (re: ambiguity): If you had an imaginary defun for this, you'd need to send the final list
+
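+A minimal sketch (not from the talk) of the memoised, body-less functions discussed in Q2 and Q4. The names `idefun', `imacro' and `upd' come from the answers above; the exact ilambda/Pen.el signatures may differ, and the results shown are only illustrative:
+
+    (require 'ilambda)  ; assumed feature name for the ilambda library
+
+    ;; An "imaginary" function: no body is written.  The language model
+    ;; imagines a result the first time it is called, and the memoize
+    ;; library caches it, so repeated calls return the same value.
+    (idefun thing-to-hex-color (thing)
+      "Return a hex color string that matches THING.")
+
+    (thing-to-hex-color "watermelon")   ; imagined once, e.g. "#fc6c85", then cached
+
+    ;; Wrapping a call in the `upd' macro bypasses the cache and
+    ;; re-imagines the result.
+    (upd (thing-to-hex-color "watermelon"))
+
+    ;; `imacro' works differently: it generates real code, so the normal
+    ;; expansion machinery applies.
+    (imacro iswap (a b))
+    (macroexpand '(iswap x y))
+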
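+A second sketch, for Q9: the amount of context handed to `idefun' (name alone, name + args, + docstring, + body) constrains what gets imagined, so a docstring is one way to resolve an ambiguous name such as get-element-from-pair. Again illustrative only, assuming the same hypothetical API as above:
+
+    ;; Name + argument list only: the model has to guess what
+    ;; "get-element-from-pair" should do with N.
+    (idefun get-element-from-pair (pair n))
+
+    ;; Adding a docstring narrows the imagined behaviour.
+    (idefun get-element-from-pair (pair n)
+      "Return the Nth element (0 or 1) of PAIR, a cons cell.")
+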
IRC nick: libertyprime
BBB: