
I finally got around to posting the slides for a talk I gave twice this summer: Probability Smoothing for NLP: A case study for functional programming and little languages. The first version of the talk was presented at the McMaster Workshop on Domain Specific Languages (Ed Kmett has posted a video of that version on YouTube); that presentation focused on EDSLs, with smoothing as the running example. The second version was presented at the AMMCS minisymposium on Progress and Prospects in Model-Based Scientific Software Development, where the focus was more on the domain itself and on how using a DSL helps ensure correctness, modularity, and maintainability of code for developing probability models. The slides are essentially the same for both talks, with the benchmarks updated a bit in the latter.
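To give a flavor of the "little languages" idea for readers who haven't seen the slides: one common way to embed a smoothing DSL in a functional host language is to represent estimators as first-class values and build composite models with combinators. The sketch below is purely illustrative — none of these names (`Smoother`, `mle`, `addLambda`, `interpolate`) come from the actual Posta code; it's a minimal hedged example of the general technique, assuming a count-based setting.

```haskell
-- Hypothetical sketch of a tiny smoothing EDSL, in the spirit of the
-- talk's "little languages" theme. These names are illustrative
-- assumptions, not the actual Posta API.

-- A smoother turns an observed count and a total count into a probability.
newtype Smoother = Smoother { runSmoother :: Double -> Double -> Double }

-- Maximum-likelihood estimate: count / total.
mle :: Smoother
mle = Smoother (\c n -> c / n)

-- Add-lambda (Lidstone) smoothing over a vocabulary of size v.
addLambda :: Double -> Double -> Smoother
addLambda lam v = Smoother (\c n -> (c + lam) / (n + lam * v))

-- Linear interpolation of two smoothers. Combinators like this let
-- the host language's type checker enforce that models compose, which
-- is one way a DSL can help with correctness and modularity.
interpolate :: Double -> Smoother -> Smoother -> Smoother
interpolate w (Smoother f) (Smoother g) =
  Smoother (\c n -> w * f c n + (1 - w) * g c n)

main :: IO ()
main = print (runSmoother (addLambda 1 5) 2 10)  -- (2+1)/(10+5) = 0.2
```

The payoff of the embedded approach is that smoothing schemes become ordinary values: they can be interpolated, backed off, unit-tested, and benchmarked without touching the code that consumes them.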

As you may have surmised, this is but a small facet of the Posta project I was working on last year. I had meant to submit it as a functional pearl for ICFP, but the timing didn't work out. After giving the McMaster version of the talk, Ed convinced me that I should publish the code for the smoothing DSL separately from the rest of Posta. So he's the one to blame for my being so slow in releasing the Posta code I promised this summer. Though seriously, I'd been considering breaking up and reorganizing the code anyway. Now that I'm back from ICFP and all my traveling over the summer, I hope to get that code pushed out soon. Sorry for the delay, y'all.
