With the libclang functionality working reasonably well as a Python module, it was time to get it working with Emacs. Naturally, my first instinct was to simply use Pymacs to interface with the Python side and then figure out a way of making the calls asynchronous. So I spent a couple of hours and made a functioning version with Pymacs, then explored my options for asynchrony. Two solutions stood out: emacs-deferred and emacs-async.
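For the curious, the Pymacs side of that first version needs very little wiring. Here’s a minimal sketch, assuming a hypothetical Python module named completion that exposes a get_completions function (the names are mine, not from the actual code):

```elisp
;; Minimal Pymacs sketch -- the "completion" module and its
;; get_completions function are hypothetical stand-ins.
(require 'pymacs)

;; pymacs-load imports the Python module and exposes its functions in
;; elisp under the given prefix (underscores become dashes).
(pymacs-load "completion" "completion-")

;; The call is fully synchronous: Emacs blocks until Python returns.
(completion-get-completions (buffer-file-name)
                            (line-number-at-pos)
                            (current-column))
```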
I set up a neat little deferred chain, but it didn’t seem to help: my Emacs window would still freeze up for several seconds while libclang was busy parsing the file to find a list of completions. I asked around on the #emacs IRC channel and found out that emacs-deferred is actually just timer-based; the callbacks still execute synchronously inside Emacs, so a long-running call blocks the UI all the same. So that was out. I then tried emacs-async, but quickly discarded that option because the forked Emacs instance doesn’t inherit the environment of the caller (not even the closure). My implementation needs the compilation and completion cache to stay in memory as Python objects between calls, something that just wasn’t possible with the forking approach. So that was out too.
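My failed attempt looked roughly like this; a sketch, reusing the hypothetical Pymacs binding from above:

```elisp
;; A deferred chain like this LOOKS asynchronous, but deferred.el only
;; schedules each step on a timer -- the lambda bodies still run in
;; Emacs's single thread, so the slow call below freezes the UI anyway.
(require 'deferred)

(deferred:$
  (deferred:next
    (lambda ()
      ;; Blocks Emacs for the full duration of the libclang parse.
      (completion-get-completions (buffer-file-name)
                                  (line-number-at-pos)
                                  (current-column))))
  (deferred:nextc it
    (lambda (completions)
      (message "Got %d completions" (length completions)))))
```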
Then I headed over to emacs-epc and python-epc, which combine to give you an RPC mechanism for communicating between Emacs and Python. From elisp you can start a Python server process, then invoke functions on that instance by name, passing along the arguments to call them with. You can also attach a callback, which gets invoked with the result of the RPC call when it arrives. The call comes in two flavours, epc:call-deferred and epc:call-sync, so that fit the bill perfectly.
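Here’s a minimal sketch of what that looks like from the elisp side, closely following the epc.el examples; the echo-server.py script and its echo method are stand-ins for the real completion server, which on the Python side would register its functions with python-epc in the same way:

```elisp
;; EPC sketch -- "echo-server.py" is assumed to be a python-epc server
;; exposing an `echo' method; it stands in for the completion server.
(require 'epc)

;; Spawn the Python server as a subprocess and connect to it.
(defvar my-epc (epc:start-epc "python" '("echo-server.py")))

;; Asynchronous call: epc:call-deferred returns immediately with a
;; deferred, and the callback attached via deferred:nextc fires when
;; the result comes back, so Emacs stays responsive in the meantime.
(deferred:$
  (epc:call-deferred my-epc 'echo '(10))
  (deferred:nextc it
    (lambda (result)
      (message "Got %S" result))))

;; Synchronous call: blocks until the server replies.
(message "%S" (epc:call-sync my-epc 'echo '(10 40)))
```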
With this, it took me a couple of hours (I’m still getting used to elisp; development should get faster with time) to get a functional version up and running, and if you’re so inclined you can clone the repo to check it out. I still don’t have any code for doing anything with the list of completions returned, but they’re there, and Emacs doesn’t get stuck while it’s happening, so I’m happy. The first call takes 2-3 seconds to return; subsequent calls take around 200-400ms, since the results are cached from the previous call.