in Code

Recent Entries (Page 8)

  • Auto as Category, Applicative & Arrow (Intro to Machines/Arrows Part 2)

    Welcome back! It’s been a while since the last post, admittedly; sorry! In this post we’ll be continuing on from the previous post. In particular, we’re going to be looking at the Auto type as part of a pretty powerful pattern of abstraction, and try to exploit it to write concise, expressive code using Auto composition and proc notation. We’ll also see first-hand the principles of locally stateful composition, and how much more expressive and safe it makes our code.
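
    To fix ideas, here is a minimal sketch of the kind of stream-transformer type the series builds on, together with a Category instance that shows the “locally stateful composition” idea; the names here are placeholders, so refer to the linked Auto.hs files for the post’s actual definitions.

      import Control.Category
      import Prelude hiding ((.), id)

      -- A stream transformer: feed it an input, get back an output and
      -- an updated transformer carrying its own internal state.
      newtype Auto a b = ACons { runAuto :: a -> (b, Auto a b) }

      -- Composing two Autos runs them in sequence; each one keeps its
      -- own private state, which is the "locally stateful" part.
      instance Category Auto where
        id    = ACons $ \x -> (x, id)
        g . f = ACons $ \x ->
          let (y, f') = runAuto f x
              (z, g') = runAuto g y
          in  (z, g' . f')

      -- A small example: an Auto whose internal state is a running sum.
      summer :: Int -> Auto Int Int
      summer n = ACons $ \x -> let s = n + x in (s, summer s)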

    One motivating factor that I will eventually write about is that we can use this to implement the semantics of Functional Reactive Programming, yay! But through this, I hope you can actually see that it is useful for much, much more!

    As always, feel free to leave a comment if you have any questions, or try to find me on twitter, or drop by the #haskell Freenode IRC channel! (I go by jle`)

    Note that all of the code in this post can be downloaded (from Auto.hs for the last post, and Auto2.hs for this post’s new material) so you can play along in GHCi, or write your own code using the concepts and types here :) You can also run it online interactively.

    A fair warning: at times this post might feel a bit fragmented, but remember that we really are just going to be exploring and getting very familiar with the Auto type and building an intuition. Everything comes to a mini-climax at the end, and a final satisfying one in the next post — kind of like every Part 2 in every trilogy ever, you know? :)


  • Pipes: Streaming Huffman Compression in Haskell (Part 3)

    Let’s finally finish up our Streaming Huffman Compression project by actually implementing the “streaming” part :) In part 1 we looked at the data structures which we used to implement our compression logic; in part 2 we looked at the actual compression/decompression algorithm and implemented it. Finally, let’s wrap it all up and actually implement a streaming interface!

    If we were using an imperative approach, this would usually involve some sort of loop — read a byte, process it, write the resulting byte, read the next, process it, write it…it’s a set of instructions that a computer can perform step-by-step.

    In Haskell, when we can, we try to look for a pure, declarative approach based on compositions of abstractions. That’s what Haskell does best, after all. So let’s see what we can do!
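
    To give a rough taste of what that looks like, here is a minimal sketch using the pipes library; it just doubles a stream of numbers rather than compressing bytes, but the overall shape (a producer, a processing stage, and a consumer glued together with >->) is the same.

      import Pipes
      import qualified Pipes.Prelude as P

      -- A stand-in "processing" stage; the real project would encode or
      -- decode Huffman-coded data here instead of doubling numbers.
      process :: Monad m => Pipe Int Int m r
      process = P.map (* 2)

      -- The whole "loop" is just a composition of stages, run to
      -- completion with runEffect.
      main :: IO ()
      main = runEffect $ each [1 .. 5] >-> process >-> P.print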

    (All of the code in this article and the ones before can be downloaded from github, so you can download it and try it out yourself!)


  • Looking forward: A Doctorate Program

    So, a bit of personal news (which you can safely ignore if you’re not interested in my personal life!) — I’m excited to announce that I have decided to accept an offer to join the Computational and Data Science Doctorate Program at Chapman University in California. I came to this decision after a fairly long period of deliberation, weighing the opportunities at Chapman against my other offers. Having just finalized everything this week, I am ready to announce Chapman as my home for my doctoral studies in the upcoming years.

    There are two main aspects to my decision — why computational science, and why Chapman?

    Williams Hall — Chapman University (Photo by Tom Arthur)


  • Inside My World (Ode to Functor and Monad)

    I like Haskell because it lets me live inside my world.

    There are a lot of special worlds out there! And Haskell lets me stay in those worlds, and use all of the tools I normally have when I’m not there. I get to transform normal tools into tools that work in my world.
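
    As a tiny, concrete taste of that idea (not taken from the post itself): fmap takes a perfectly normal function and lets it work on values inside the Maybe “world”, without ever leaving it.

      -- (+ 1) is an ordinary Int -> Int tool; fmap transports it into
      -- the Maybe world, where missing values simply stay missing.
      main :: IO ()
      main = do
        print (fmap (+ 1) (Just 5 :: Maybe Int))   -- Just 6
        print (fmap (+ 1) (Nothing :: Maybe Int))  -- Nothing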

    (This post is meant to be approachable by people unfamiliar with Haskell! That being said, if there is a concept you don’t understand, feel free to leave a comment, tweet me, stop by on irc at freenode’s #haskell, or give Learn You a Haskell a quick read!)


  • Streaming Huffman Compression in Haskell (Part 2: Binary and Searches)

    Continuing this series of beginner/intermediate projects for newer Haskell users, let’s return to our Huffman encoding project.

    In our last post we went over two types of binary trees implemented as algebraic data structures in Haskell, and also a scheme for assembling a Huffman encoding tree using the State monad.

    Now let’s look at serializing and unserializing our prefix trees for easy storage, and then at actually using them to encode and decode!
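
    For a rough idea of the serialization side, here is a minimal round-trip with the binary library; it uses a plain list as a stand-in, since the actual prefix-tree type and its Binary instance are the subject of the post itself.

      import Data.Binary (encode, decode)

      main :: IO ()
      main = do
        -- encode produces a lazy ByteString that could be written to disk...
        let bytes = encode ([1, 2, 3] :: [Int])
        -- ...and decode reads the same value back out of it.
        print (decode bytes :: [Int])  -- [1,2,3]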


  • A (Dead End?) Arrowized Dataflow Parallelism Interface Attempt

    So I’ve had several ‘dead end’ projects in Haskell recently that I’ve sort of just scrapped and moved on from, but I decided that it might be nice to document some of them :) For reading back on later, for people to possibly help/offer suggestions, for people to peruse and possibly learn something, or for people to laugh at me. Here is my most recent semi-failure — implicit dataflow parallelism through an Arrow interface.

    tl;dr:

    1. Compose parallelizable computations using expressive proc notation.
    2. Consolidate and join forks to maintain maximum parallelization.
    3. All data dependencies implicit; allows for nice succinct direct translations of normal functions.
    4. All “parallelizable” functions can also trivially be typechecked and run as normal functions, due to arrow polymorphism.

    The main problem:

    • Consider ParArrow a c, ParArrow b d, ParArrow (c,d) (e,f), ParArrow e g, and ParArrow f h. We execute the first two in parallel, apply the third, and execute the last two in parallel. Basically, we want two independent arrows, ParArrow a g and ParArrow b h, that we can fork. And this is possible, as long as the “middle” arrow does not “cross-talk” — that is, it can’t be something like arr (\(x,y) -> (y,x)). (There’s a rough sketch of this shape right below.)
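
    Here is that shape sketched with plain functions as the arrow (ParArrow itself is the interface the post experiments with): when the middle stage does not cross-talk, the whole pipeline factors into two independent chains.

      import Control.Arrow

      -- Stand-ins for the five stages, using (->) as the arrow.
      f1, f2, g1, g2 :: Int -> Int
      f1 = (+ 1)        -- plays the role of ParArrow a c
      f2 = (* 2)        -- plays the role of ParArrow b d
      g1 = subtract 3   -- plays the role of ParArrow e g
      g2 = (`div` 2)    -- plays the role of ParArrow f h

      -- The middle stage; no cross-talk, so it factors into two halves.
      middle :: (Int, Int) -> (Int, Int)
      middle = (+ 10) *** (+ 20)

      -- The full pipeline: parallel, then the middle stage, then parallel.
      whole :: (Int, Int) -> (Int, Int)
      whole = (f1 *** f2) >>> middle >>> (g1 *** g2)

      -- Because middle factors, the same pipeline splits into two
      -- independent chains that could each be forked off on their own.
      whole' :: (Int, Int) -> (Int, Int)
      whole' = (f1 >>> (+ 10) >>> g1) *** (f2 >>> (+ 20) >>> g2)

      main :: IO ()
      main = print (whole (1, 2), whole' (1, 2))  -- both give (9, 12)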


  • Streaming Huffman Compression in Haskell (Part 1: Trees and State)

    So you’re learning Haskell and are looking for some projects that aren’t super trivial, are familiar enough that you can use what you already know, and are difficult enough to maybe help you learn new things. Hey, maybe this is for you :)

    Let’s have a go at Huffman encoding in Haskell. We will look at two types of binary trees, which we use to implement immutable/persistent priority queues and prefix trees. We’ll play around with the State monad a bit, explore some useful typeclasses, learn how to serialize, marshal, and unmarshal data structures using the binary library, and also look at how to load data from a file and write to another in a pure way, avoiding lazy IO using the ever-more-popular pipes library. And hopefully we learn some neat Haskell idioms!
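
    If you would like a picture of what “binary trees as algebraic data structures” means before diving in, here is a minimal sketch of a Huffman-style prefix tree (the names are placeholders rather than the post’s exact types):

      -- A leaf holds a symbol; an internal node has exactly two children,
      -- so each symbol's code is the left/right path down to its leaf.
      data PreTree a = PTLeaf a
                     | PTNode (PreTree a) (PreTree a)
        deriving Show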

    We’re going to be assuming some basic Haskell knowledge, like algebraic data types, higher order functions, basic monad usage, and some basic familiarity with the functions in Prelude/base, the standard library. If you have any questions, feel free to leave a comment, drop by on #haskell on freenode, throw me a tweet, or give the great Learn You A Haskell a quick read!


  • Intro to Machines & Arrows (Part 1: Stream and Auto)

    So I’m going to be running a series soon on computation and (physical) simulations using AFRP (Arrowized Functional Reactive Programming) principles.

    I consider (A)FRP to actually be a pretty game-changing paradigm. It provides us with semantics by which to compose and build time-varying, reactive behaviors and completely changes the way we approach any sort of simulation/state-like project.

    AFRP has its own elegant way of approaching problems, but to be able to properly use it for simulations, we’re going to have to start by learning about the fundamental abstraction behind its implementation: machines.1

    (This post will assume a somewhat basic knowledge of Haskell. I’ll try explaining concepts here and there if I feel that they might not be very commonly known. But if you have any questions, feel free to leave a comment or stop by freenode’s #haskell on irc!)

    (A short disclaimer: this article doesn’t have much to do with the great machines library by Rúnar Bjarnason and Edward Kmett.)


    1. It is somewhat important to note here that the semantics of FRP do not inherently involve machines. We’ll learn more about this later. For now, remember that this series will chiefly study the low-level implementation of AFRP, which may or may not be related to the semantics/abstractions of FRP — in an ideal world we wouldn’t even have to worry about implementation and could just work on the level of the abstractions. Unfortunately, we don’t live in an ideal world :(
