[[!meta title="Transducers: finally, ergonomic data processing for Emacs!"]]
[[!meta copyright="Copyright © 2024 Colin Woodbury"]]
[[!inline pages="internal(2024/info/transducers-nav)" raw="yes"]]

<!-- Initially generated with emacsconf-publish-talk-page and then left alone for manual editing -->
<!-- You can manually edit this file to update the abstract, add links, etc. -->


# Transducers: finally, ergonomic data processing for Emacs!
Colin Woodbury (he) - <https://x.com/@fosskers>, @fosskers@m.fosskers.ca on Mastodon, <https://www.fosskers.ca>

[[!inline pages="internal(2024/info/transducers-before)" raw="yes"]]

Transducers are an ergonomic and extremely memory-efficient way to process a
data source. Here "data source" means simple collections like Lists or
Vectors,
but also potentially large files or generators of infinite data.

Transducers…

-   allow the chaining of operations like map and filter without allocating memory between each step.
-   aren't tied to any specific data type; they need only be implemented once.
-   vastly simplify "data transformation code".
-   have nothing to do with "lazy evaluation".
-   are a joy to use!

In this talk, Colin will introduce Transducers, show how to use them, and
demonstrate some Emacs-specific workflows that make live processing of large
data sets in JSON and CSV a breeze.
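
As a small taste of the style (a hedged sketch, not an excerpt from the talk: the `t-transduce`, `t-map`, `t-filter`, and `t-cons` names are assumed from the `t-` prefix that appears in the Q&A below, e.g. `t-comp` and `t-buffer-read`):

```elisp
;; Square the even numbers of a list and collect the results.
;; The filter and map steps are fused: no intermediate list is built.
;; NOTE: the `t-' names may be shorthands for longer `transducers-'
;; symbols depending on how the package is loaded; treat this as a sketch.
(require 'cl-lib)
(require 'transducers)

(t-transduce
 (t-comp (t-filter #'cl-evenp)          ; keep the even numbers...
         (t-map (lambda (n) (* n n))))  ; ...then square them
 #'t-cons                               ; reducer: collect into a list
 '(1 2 3 4 5))
;; => (4 16)
```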

About the speaker:

Colin has been active in the FOSS world since 2011, publishing libraries and
applications primarily in Haskell and Rust. Since 2023 he has been using
Lisps
more and more, and after falling in love with Transducers from Clojure has
ported the pattern to three other Lisps.

Colin is originally from Canada and lives in Japan.


# Discussion

## Questions and answers

- Q: When I tried comparing transducers.el to cl-lib and dash
  (benchmark-compiled), I got the following results:
  ```
  cl-lib:      0.5 sec,   2 GCs
  dash:        0.3 sec,   1 GC
  transducers: 0.75 sec,  2 GCs
  cl-loop:     0.025 sec, 0 GCs (note: an order of magnitude faster)
  ```
  I expected transducers to be slower than cl-loop but faster than
  cl-lib or dash. However, this isn't the case. Any idea why? (The
  benchmark is here:
  [https://old.reddit.com/r/emacs/comments/1h5c778/which_emacsconf_2024_talks_have_your_attention/m061dge/](https://old.reddit.com/r/emacs/comments/1h5c778/which_emacsconf_2024_talks_have_your_attention/m061dge/))
  - The cl-loop form used for reference:
    `(benchmark-run-compiled 1000 (cl-loop for n from 1 below 2000 by 2 sum (* n n) into total finally return total))`
  - A: Loop is always going to win in cases like this where we aren't
    chaining several (e.g.) nested calls, what I called comp steps.
    Transducers shine when we need to chain operations together; we can
    sometimes express ourselves more clearly than with loop. This may
    sound like moving the goal-posts: it's really about internal
    function calls, and transducers step through each element in ways
    that loop doesn't need to, so loop may "win".
  - Note: I'm comparing against cl-lib and dash -- the cl-loop is
    only for reference. I'm curious about the performance gap
    between transducers and cl-lib/dash. The number of function
    calls is the same for cl-lib, dash, and transducers.

- Q: Did you think about generators as a source, compared with lists,
  vectors, etc.? Maybe I got a word wrong. This is after the
  realization that generators and Series operations need each other,
  rather than being redundant as had been thought. I forget who the
  generators person was in Lisp.
  - A: (not yet answered)

- Q: Do you know of any theoretical texts on transducers?
  - A: My README and Rich Hickey's material (he invented Clojure) may
    be the key texts on transducers.
    - and his talks/videos (on the topic)
  - [https://andreyorst.gitlab.io/posts/2022-08-13-understanding-transducers/](https://andreyorst.gitlab.io/posts/2022-08-13-understanding-transducers/)
  - (not fosskers): I think AIM-1082 is interesting to read
    ([https://dspace.mit.edu/handle/1721.1/6035](https://dspace.mit.edu/handle/1721.1/6035)?)
    - Yes

- Q: Waters (lazy series in Lisp, late 70s) said that this *should
  have* been done as an additional compiler feature, but failing that
  it must be a macro package. Did you think about that with respect to
  your CL, Fennel, and Elisp ports of your transducers?
  - A: I think my work could help provide the basis for this;
    there's much more to be done.

- Q: Does t-buffer-read provide a lazy stream that's linewise, or
  charwise, or do something else entirely?
  - A: t-file-read

- Q: Can the Elisp library be combined with the stream.el API
  ([https://elpa.gnu.org/packages/stream.html](https://elpa.gnu.org/packages/stream.html))? 
  Or seq in general?
  - A: I'd say these libraries are completely orthogonal. (Re: what
    made me ask this question was seeing `t-buffer-read' and hoping
    that it doesn't convert a buffer into a string.) With seq/generics
    it should just work: the magic of defgeneric.

- Q: How does one debug a t-comp expression? Can you single step and
  see intermediate results of the different statements you declare?
  - A: In Common Lisp it is wonderful. In Emacs Lisp it is more
    complicated. I don't know if step debugging would work, but
    there is a "log" (?) function to print out values.

- Q: Is there a path for transducers to enable elisp processing of
  otherwise overly large datasets as if just normal Emacs "buffers"
  (i.e. just pulling one thing at a time so essentially stream-like
  under the hood but buffer-like in interface), with none of the usual
  perf issues with a traditional buffer structure?
  - A: Possibly so yes

- Q: Re the performance issues mentioned and that popped up recently
  in the lists/forums: to what extent could tail-call optimization and
  other mechanisms (incl. inlining, GC-friendliness, etc.) eventually
  alleviate those issues and enable their use at little to no extra
  cost?
  - A: Over time, with further work and optimizations. Some are
    already there (tail calls, notably).

- Q: Is there an option to read a csv/json and produce an alist or
  plist instead of a hash table for an entry?
  - A:  Absolutely.

- Q: Is the common lisp version ready for 'production' use? Is it
  complete enough and the API stable enough?
  - A: I use it all the time, extensively. I'm using it while
    programming a game and haven't noticed a dent in the frame rate.

- Q: Do we need a pre-written "t-" version for every already
  existing reducing function like + or is there a function to
  construct them from already defined reducer 2-arg functions?
  - A: Already defined; this is basically what fold is for. Some
    built-in functions like plus happen to work as reducers already,
    but that's a coincidence. If you call (+ 1) in Emacs Lisp or Common
    Lisp, you get 1 back: it works with a single argument. If you call
    (+) with no arguments, you get 0. So plus and multiply satisfy the
    reducer API. A function like max does not: (max 1) works, but (max)
    with no arguments is an error. For functions like that you wrap
    them in something like fold; there's an example in the README. I
    would say go look at the fold function.
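  - For illustration (not part of the answer), the arity behaviour can
    be checked directly in IELM; the `t-transduce`, `t-map`, and
    `t-fold` names are assumptions based on the `t-` prefix used
    elsewhere on this page (the answer refers to the reducer simply as
    "fold"):

    ```elisp
    (+)        ; => 0   zero arguments: `+' has an identity value
    (+ 1)      ; => 1   one argument: returned unchanged
    (+ 1 2)    ; => 3   two arguments: combined
    ;; so `+' already satisfies the reducer API.

    (max 1)    ; => 1
    ;; (max)   ; error: no zero-argument identity, so wrap it instead:
    (t-transduce (t-map #'identity) (t-fold #'max) '(3 1 4 1 5))
    ;; => 5
    ```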

- Q: Is the compelling argument for transducers that it's a better
  abstraction? Seems like a lot of concerns/objections (while
  pragmatically valid) are focused on implementation. Can this
  abstraction allow for advances in implementation?
  - A: Yes. What I've done is mostly follow the pattern of usage that
    exists in Clojure and in Scheme's SRFI 171. In theory the
    surface-level API is the same no matter where you're using this,
    and that's the idea: if you learn transducers in one Lisp, you
    should be able to use them everywhere. What it actually does under
    the hood is free for us to change. My implementations are mostly
    based on the Scheme one, with a few alterations here and there, and
    in the Common Lisp case some Common Lisp-isms to improve the UX a
    little. Overall we are free to do whatever we want internally to
    speed up performance; I just haven't done that work yet.
- Q: Is there a path for transducers to enable Elisp processing of
  otherwise excessively huge data sets as if they were just a normal
  Emacs "buffer" (i.e. pulling one thing at a time, so essentially
  stream-like under the hood), with none of the usual issues of a
  traditional buffer structure?
  - (not yet answered)
- Q: So the "reducer API" means that the function accepts a variable number of arguments in a monoidal way?
  - that's what I gathered
- Q: From your investigations and tests so far, do you think there would be the necessity of transducers to eventually go down into the C level code for things like using them to solve "infinitely-big" buffer-like interfaces and such?
  - A: Yeah, like, if we really wanted to go that hardcore, maybe there's some like C level stuff that we could you know, significant demand for such a thing. You know, so far there hasn't been such demand, but maybe there will be in the future. Yeah, perhaps there's some custom stuff we could do.
- Q: why does the reducer have to sometimes be a custom function, but other times just something like #'+?
  - it depends on whether that reducer function needs special input or not
- Q: do you have FSF copyright assignment? it is nice to get low-level libraries like transducers on ELPA so other copyright-assigned packages can use them (and so they can be included in Emacs when they reach wide adoption)
  - transducers is on MELPA
- Q: Is that #' business for some lisps and not others the lisp-1/lisp-2 distinction?
  - Sharp quote refers to (symbol-function 'my-symbol) in a lisp-2
  - yes, it emphasizes using the "function" associated with the symbol (if there's one in the "function slot" for the symbol) as opposed to some "variable"-type value
  - (and in this case pkal is not asking about the sharp quote but a t-prefixed function as opposed to a standard function like +)
  - that's because of the separate namespace for function symbols?
  - function rather than symbol-function to be extra-nitpicky (to accommodate lambda forms)
  - If I remember correctly, single quote (') does not respect when a function symbol is overridden by a let, but (pound quote) #' does?
  - yes, my question was about the t- prefix.
  - "let"s only bind the symbol value slot, not the function slot.
  - yes, iiuc; in effect, it's sort of an early-binding/late-binding distinction
  - My bad. Should've specified flet.
  - @can't speak for the elisp case, but in the clojure case using things in transducer form has a slightly different "typing" shape as you'd expect and may require a wrapper if a function can't be used as is and such, though I missed the context and that may not be relevant to your point
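  - For illustration (not from the discussion), a small Emacs Lisp
    sketch of the value slot vs. the function slot; `square` is a
    hypothetical name:

    ```elisp
    (defun square (x) (* x x))   ; fills `square's function slot
    (setq square 7)              ; fills its value slot: Elisp is a lisp-2

    square                       ; => 7, the value slot
    (symbol-function 'square)    ; => the function definition
    (funcall #'square 3)         ; => 9; funcall of a symbol uses the
                                 ;    function slot, and #' marks the intent

    ;; `let' rebinds only the value slot; rebinding a local function
    ;; needs cl-flet instead.
    (let ((square 100))
      (funcall #'square 4))      ; => 16, the global function is untouched
    ```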
- Q: Question about how the transducers video was made? Did you use Reveal.js? Do you have a pointer to the html hosted presentation? How did you generate the content for Reveal?
  - A: The presentation itself was done with RevealJS, exported from
    Org Mode. As you saw, I had a raw Org Mode buffer which was the
    presentation itself, which I then exported with a few settings and
    customizations. For screen recording I used OBS, which worked
    flawlessly on Arch Linux with Sway and Wayland; all of that just
    worked, which was very impressive. As for where the HTML
    presentation is hosted: I don't have it hosted anywhere.
  - Text is a little small btw
  - no keep the latin on screen, i was trying to read that
    - It was Lorem Ipsum
      - Translating lorem ipsum plus plus
  - Thanks for larger text

## Notes

-   What made transducers click for me way back when was
    realizing that the usual operations (think map/reduce/filter/etc and
    their generalizations) implicitly or explicitly demand realized
    collections, plus intermediate ones between those operations (and
    GC), instead of composing such transformations into one operation
    and then making this applicable to streams of unknown-sizes and
    such.
    -   ah, like... *thinks very hard* ...stream fusion, iirc?
        [http://lambda-the-ultimate.org/node/2192](http://lambda-the-ultimate.org/node/2192)
        that makes a lot of sense
    -   "Rich Hickey has a point" need never be said :)
-   Sorry but map is collect and filter is select for me :)
-   there are many ways to get to them (some may already think of those
    functions as folds, etc.), but for the bulk of people familiar with
    map/reduce/filter/etc., it's useful to start from the realization
    that you may have infinite streams (of any type, even) as input, and
    may not want realized collections at every step of the usual
    function applications; hence the need to compose the pieces that
    make up a map/reduce/filter/etc. before applying them, one item at a
    time, to a continuing stream of things (see the sketch after this
    list)
-   ellis: I wrote about half of one in binutils libctf
    (generators, anyway). See the `*_next()` functions. Having seen this
    I'll try to mutate the internals to something closer; right now
    every iterator is independently implemented (mostly).
    -   (inspired by generators, not transducers, since I
        didn't know about them)
    -   still *far* less ugly than "call a function with
        every item" `*_iter()` stuff
-   Thanks for the answers fosskers, I'm quite hopeful
    with transducers working their way into many things, so thinking
    ahead to where that may happen and to solving perf hurdles
-   I'm totally sold. I'm working on a CL codebase
    right now, and these are going in there immediately
-   (also CL does not require TCO but good compilers support
    it to the extent that extensive use of FP is practical)
-   it's a tricky protocol to get perfect the first time for
    sure, is something that transcends language barriers so always fun
    to see more impls :)
-   CLTL2 docs on SERIES for those who are curious
    [http://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository/ai/html/cltl/clm/node347.html#SECTION003400000000000000000](http://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository/ai/html/cltl/clm/node347.html#SECTION003400000000000000000)
    -   (lisp.se mirror in case the ai repository disappears
        someday:
        [http://cltl2.lisp.se/cltl/clm/node347.html](http://cltl2.lisp.se/cltl/clm/node347.html)
        )
-   definitely watching this one more carefully. if it's
    CLOS-oriented i'm going to like it
-   note that full TCO is strictly more expressive than
    special-case optimizations as with emacs's cl-labels
- in the general case, there's no need to actually process such a composed operation on an item from the input until it is needed/"pulled" from the output side, so yes, in a way
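
A hedged sketch of the "compose first, then pull items one at a time" idea mentioned above: `t-ints` (an endless source of integers) and `t-take` are assumed names modeled on the Clojure/SRFI-171 vocabulary, and the other `t-` names follow the prefix used elsewhere on this page.

```elisp
;; Odd squares, pulled lazily from an endless stream of integers.
;; The whole pipeline is composed once; items flow through it one at a
;; time, and `t-take' stops pulling after five results.
(t-transduce
 (t-comp (t-filter #'cl-oddp)
         (t-map (lambda (n) (* n n)))
         (t-take 5))
 #'t-cons
 (t-ints 1))   ; 1, 2, 3, ... without end
;; => (1 9 25 49 81)
```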

- yea i think the next step in terms of performance would be using a 'plan' object internally to rewrite the lambda calls
- Julia does loop fusion through its broadcast operator.
- that gets very hairy and makes it far less simple imo
- Re the current answer, it doesn't yet, but it's on the path where it could eventually (i.e. introspecting on the composed transformation and simplifying it, be it fusion-type or otherwise)
- Waters agreed that ergonomics was the key thing. (In the PhD world, because of studies showing that people have trouble reading other people's iteration/loop code.)
- "galaxy brain" is great expression.
- i'm curious about what mathematical structures are related to transducers but that's probably a README question
- Q: transducers C impl when
- ecl slime repl + transducers or series
- absolutely, I will try :) (re: the libctf iterators mentioned in the notes above) of course libctf is ABI-stable so I can't just change what the user sees, but maybe I can make the internals less repetitive :)
- (and maybe add another API that is more transducery.)
- Nice ielm
- @pkal: that's what I meant back up, that some functions happen to have the right form for use in transducers, but fosskers expounds much better, small incompatibilities (think order of params, variadicity, etc)
- I used emacs for so long before finding out that ielm existed.  (I wish I knew sooner.)

Feedback:

- 👏👏
- impressive
- cool presentation
- (still, very nice!)
- I am so, so thrilled with this. This blew my mind. Thank you! 😊
- 👏
- nice 👏👏
- deeply not ugly
- Thank you for the excellent talk!
- Thanks for the talk
- Thanks fosskers
- 👏
- awesome 👏
- always a pleasure to watch transducers in action - thanks fosskers!
- Inspiring talk, promptly added to my aoc.asd. Will come in handy.
- Thank you fosskers! it is bound to end up in some, if not all of my common lisp projects

- YouTube comment: Nice, this seems to do basically the same thing as C++ ranges, which I've enjoyed using a lot
- YouTube comment: Good
- YouTube comment: Nice presentation!

[[!inline pages="internal(2024/info/transducers-after)" raw="yes"]]

[[!inline pages="internal(2024/info/transducers-nav)" raw="yes"]]