Top products from r/ProgrammingLanguages

We found 29 product mentions on r/ProgrammingLanguages and ranked the resulting products by the number of redditors who mentioned them. Here are the top 20.

Top comments that mention products on r/ProgrammingLanguages:

u/jdreaver · 72 points · r/ProgrammingLanguages

Oh wow, I just went down the rabbit hole of CPS, SSA, and ANF while developing my compiler for a strict Haskell-like functional programming language.

I read the outstanding book by Appel on compiling using CPS and was all set to refactor my pre-LLVM IR to use CPS. Then I did more research and realized that while a number of optimizations are very natural in CPS, compiling CPS to machine code is not as simple. It felt like a really daunting project, and after wrestling with my CPS transformations for about a week I filed the CPS IR away in the "research again someday" bucket.
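
To make the trade-off concrete, here is a minimal, entirely hypothetical sketch of a naive higher-order CPS transform for a toy expression language (the names `Expr`, `CExpr`, `cps` are made up, and fresh-name generation is omitted, so it isn't capture-safe). It only illustrates how even a single application turns into nested continuation lambdas, which is exactly the shape that is awkward to lower directly to machine code:

```haskell
-- Hypothetical toy example: a naive CPS transform (not from Appel's book,
-- and not capture-safe; fresh-name generation is omitted for brevity).
module CPS where

data Expr
  = Var String
  | Lit Int
  | App Expr Expr
  deriving Show

-- In CPS, evaluation order and intermediate results are made explicit:
-- every non-trivial subexpression is handed to a continuation.
data CExpr
  = CVar String
  | CLit Int
  | CLam [String] CExpr   -- continuation / function abstraction
  | CApp CExpr [CExpr]    -- fully-applied call, never nested non-trivially
  deriving Show

-- | @cps e k@ builds a term that evaluates @e@ and passes its value to @k@.
cps :: Expr -> CExpr -> CExpr
cps (Var x) k = CApp k [CVar x]
cps (Lit i) k = CApp k [CLit i]
cps (App f x) k =
  cps f (CLam ["fv"]                       -- evaluate the function first
    (cps x (CLam ["xv"]                    -- then the argument
      (CApp (CVar "fv") [CVar "xv", k])))) -- then call it, passing k along

-- ghci> cps (App (Var "f") (Lit 1)) (CVar "halt")
-- CApp (CLam ["fv"] (CApp (CLam ["xv"]
--   (CApp (CVar "fv") [CVar "xv",CVar "halt"])) [CLit 1])) [CVar "f"]
-- (output wrapped for readability)
```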

The best intermediate representation for a functional language I've found is A-Normal Form (ANF). Here is the original paper on the subject. The argument goes that ANF is much more compact and easier to understand than CPS, and still enables almost all of the same optimizations. Some recent work with join points in GHC and a few other papers/theses I read (linked below) convinced me that ANF was going to be my choice of IR.
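
For comparison, here is an equally minimal (and equally hypothetical) sketch of A-normalization for the same kind of toy language: every non-atomic argument gets a fresh let-bound temporary, which keeps the term flat and close to the eventual SSA/LLVM form. The names (`Expr`, `Atom`, `ANF`, `anf`) are placeholders, not taken from any compiler mentioned here:

```haskell
-- Hypothetical toy example of A-normalization.
module ANF where

-- Source language: nested applications and lets.
data Expr
  = Var String
  | Lit Int
  | App Expr Expr
  | Let String Expr Expr
  deriving Show

-- In ANF, arguments must be atoms; every intermediate result is let-bound.
data Atom = AVar String | ALit Int
  deriving Show

data ANF
  = AAtom Atom
  | AApp Atom Atom
  | ALet String ANF ANF
  deriving Show

anf :: Expr -> ANF
anf e = fst (go e 0)
  where
    -- The Int is a counter for generating fresh temporary names.
    go :: Expr -> Int -> (ANF, Int)
    go (Var x)       n = (AAtom (AVar x), n)
    go (Lit i)       n = (AAtom (ALit i), n)
    go (Let x e1 e2) n = let (a1, n1) = go e1 n
                             (a2, n2) = go e2 n1
                         in (ALet x a1 a2, n2)
    go (App f x)     n =
      atomize f n  $ \fa n1 ->
      atomize x n1 $ \xa n2 ->
      (AApp fa xa, n2)

    -- Name a subexpression with a fresh temporary unless it is already atomic.
    atomize :: Expr -> Int -> (Atom -> Int -> (ANF, Int)) -> (ANF, Int)
    atomize (Var x) n k = k (AVar x) n
    atomize (Lit i) n k = k (ALit i) n
    atomize e       n k =
      let (a, n1)    = go e n
          tmp        = "t" ++ show n1
          (body, n2) = k (AVar tmp) (n1 + 1)
      in (ALet tmp a body, n2)

-- ghci> anf (App (App (Var "f") (Lit 1)) (App (Var "g") (Lit 2)))
-- ALet "t0" (AApp (AVar "f") (ALit 1))
--   (ALet "t1" (AApp (AVar "g") (ALit 2))
--     (AApp (AVar "t0") (AVar "t1")))   -- (output indented for readability)
```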

I highly recommend sticking with LLVM. It is a very mature ecosystem and it gives you so much "for free". I think it's neat that my optimization pipeline will look like:

  1. Core -> Core optimizations
  2. Some small ANF optimizations
  3. Compilation to LLVM where I can have LLVM do some optimizations as well before spitting out machine code

Even now, I only have some very rudimentary optimizations implemented for ANF, but turning on -O3 when compiling to LLVM makes my toy programs just as fast as equivalent programs I wrote in C. I feel like using LLVM gives you the best of both worlds between ANF and SSA; you hand-write your ANF transformations in your compiler, and let LLVM do the neat things that can be done with SSA optimizations. Note: I am no compiler expert. Maybe I'm being naive in thinking the LLVM optimizations after ANF optimizations give me that much. I'd be happy for someone else to chime in here :)
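
As a rough sketch (with entirely made-up type and function names; this isn't anyone's actual API), the three-stage pipeline above amounts to a straight composition of stages, with LLVM's SSA passes doing the heavy lifting at the end:

```haskell
-- A hedged sketch of the pipeline shape described above. The stage names
-- (Core, ANF, LlvmModule, optimizeCore, ...) are placeholders; the bodies
-- are stubs that only show where each kind of work would live.
module Pipeline where

data Core       = Core       deriving Show  -- desugared front-end IR
data ANF        = ANF        deriving Show  -- A-normal-form IR
data LlvmModule = LlvmModule deriving Show  -- generated LLVM IR

optimizeCore :: Core -> Core
optimizeCore = id              -- 1. Core -> Core optimizations

toANF :: Core -> ANF
toANF _ = ANF                  -- A-normalization

optimizeANF :: ANF -> ANF
optimizeANF = id               -- 2. small ANF-level optimizations

emitLLVM :: ANF -> LlvmModule
emitLLVM _ = LlvmModule        -- 3. lower to LLVM and let its SSA passes
                               --    (e.g. -O3) do the rest

compile :: Core -> LlvmModule
compile = emitLLVM . optimizeANF . toANF . optimizeCore
```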

Lastly, you mention ease of use and the ability to get started as important criteria. In that case something like ANF to LLVM is the obvious choice.

Good luck!

---

If anyone is interested, I gathered a lot of resources while researching CPS/ANF/SSA. I'll just dump them here:

Andrew Appel wrote a book called Compiling with Continuations (https://www.amazon.com/Compiling-Continuations-Andrew-W-Appel/dp/052103311X), where he explains how continuations can be used as the back end of a compiler. A lot has been written since then on how continuations make many optimizations simpler, and on how CPS is pretty much equivalent to SSA.

More stuff:

u/oilshell · 1 point · r/ProgrammingLanguages

Yeah there was nothing in the talk about it.

As far as I can tell, it's a somewhat silly reference to the fact that the influential book "Modern C++ Design" was written in the C++ 98 era.

This 2001 book advocated a clean break from the C heritage in C++:

https://www.amazon.com/Modern-Design-Generic-Programming-Patterns/dp/0201704315

It's heavy on template metaprogramming. I actually prefer "C with classes" for many problems, but it's clear that C++11, 14, and 17 are going further in this direction.

And I believe that Immer relies on many features from C++11 and 14. Those features let the C++ language delegate all of its data structures to libraries rather than building them into the language itself.

-----

For a completely different and more useful application of the word "postmodern" in programming, I recommend this paper:

http://www.mcs.vuw.ac.nz/comp/Publications/CS-TR-02-9.abs.html

It's very insightful and fun, and it has only become more relevant in the 15 years since it was written. I haven't blogged about this, but this kind of thinking underlies the shell in general and the design of the Oil shell in particular. In large systems there is "no grand narrative", and it makes sense to use models opportunistically when they are useful and throw them out when they are not.

There is a tendency among programming language designers to assume that they are at the center of the universe. A big part of shell is about making existing pieces written in different languages work together.

And that includes the operating system. To me, it's obvious that runtime is more important than compile time, and too many programming languages are ignorant of the other enormous piece of code that runs when your program runs -- the operating system.

u/RobertJacobson · 1 point · r/ProgrammingLanguages

Here's my attempt to be helpful!

  • Borrow or buy Simon Peyton Jones' The Implementation of Functional Programming Languages (Amazon, free PDF version).
  • Also read Implementing functional languages: a tutorial, which is a reimagining of the above for use in the classroom.
  • Read through The ZINC Experiment, Xavier Leroy's exposition of his earliest OCaml implementation.
  • I really like the LLVM Kaleidoscope tutorial series. It's not about compiling functional languages; rather, it walks through implementing a compiler in OCaml.
  • I second u/sociopath_in_me's advice to try to tackle Crafting Interpreters again.
  • Check out The Optimal Implementation of Functional Programming Languages by Andrea Asperti and Stefano Guerrini (Amazon). There are PDFs of it all over the internet, but I don't know what its copyright status is.

Regarding Asperti and Guerrini, there are a few people on this subreddit who are working on cutting-edge research compilers for functional languages based on term rewriting. I've found this subreddit as well as r/Compilers to be very friendly and helpful in general, so I encourage you to take advantage of them. Ask questions, bounce ideas off of people, etc.

u/DonaldPShimoda · 8 points · r/ProgrammingLanguages

I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference, with more discussion of these sorts of topics!

Another option is a "real" textbook.

My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.

But the dragon book is the go-to reference on compilers; it's slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.

Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.

u/ErrorIsNullError · 6 points · r/ProgrammingLanguages

TAPL is great for type theory stuff.

I'm working through Compiling with Continuations right now, and it's pretty good as a practical way to specify semantics that also has a history as useful in compilers. Matt Might's writeup gives a flavor.

u/sv0f · 3 points · r/ProgrammingLanguages

All of your questions are pretty much the set of reasons why the Lisp family of languages was invented. Have a look at Common Lisp, Scheme, and the many programming languages books based on these languages. An "introductory" one is here, an advanced one here, and an even more advanced one here.

(Check whether there are newer editions of these. You'll probably want the newest ones so that you can easily type their code into a modern Scheme or Common Lisp implementation.)

u/samrjack · 2 points · r/ProgrammingLanguages

I would say go with whatever your computer uses so that you can follow along (unless your computer uses something really obscure).

As for books, I can only really recommend the places I learned x86 from. The first is Hacking: The Art of Exploitation, since it puts assembly in the context you'll find it in most often (looking through assembled code), so you learn many useful tools along the way. The other is the textbook I had in college (you can find it cheaper if you look around), which also covers many other topics relating to computer memory and whatnot.

Though for just learning some basic assembly, look for some simple resources online. It's not too hard to learn generally speaking so you should be fine.

u/suhcoR · 3 points · r/ProgrammingLanguages

Yes. Here are some papers about it if you're interested: https://web.archive.org/web/20050510122857/http://www.iis.sinica.edu.tw/~trc/languages.html They refer to earlier work, which in turn refers to Lisp and a precursor of CLOS.

The Art of the Metaobject Protocol describes the MOP. If you're looking for a general book about CLOS then you could e.g. have a look at https://www.amazon.com/Object-Oriented-Programming-COMMON-LISP-Programmers/dp/0201175894.

u/timlee126 · 3 points · r/ProgrammingLanguages

Thanks.

Are MOP and CLOS the same thing?

Now there are three books mentioned.

u/ablakok · 3 points · r/ProgrammingLanguages

Nice list. But how about DCPL, Design Concepts in Programming Languages?

u/cparen · 1 point · r/ProgrammingLanguages

I worked on the Windows operating system and Internet Explorer at various points in my career. On very large projects like those, even normally trivial problems become significant engineering tasks at that scale. Just compiling the core library of IE (allocators, text handling, DOM, etc.) could take half an hour, so you really came to appreciate incremental builds.

To get a sense of the scale of such problems, I might recommend the book Large Scale C++ Design. I remember finding the book very dry, but in the author's defense, it's a very dry problem space.