May 28 2016
UPDATE: Now that the official HOPE schedule has been published I can say that I'll be speaking in the Noether room on Sunday, 24 July 2016 at 2:00pm EDT.
UPDATE: The Internet Society will be livestreaming video of the talks as they happen. Here's the page listing all of the livestreams.
I found out last weekend (yes, I've been sitting on this - timed posts are the busy blogger's friend) that the talk I submitted for The Eleventh HOPE in July of 2016 was accepted. I will be giving a presentation on Exocortex, my latest work (of mad science), entitled Constructing Exocortices with Huginn and Halo, at some point that weekend. I'll be talking about both Huginn and Exocortex Halo. (I asked Andrew if he would present with me; he declined because he may not be able to attend HOPE this year. Andrew, if you can somehow fit it into your busy schedule, I'd really like it if you did.) To be more specific, I'll be talking a little bit about how they work - what the agents do individually and how they fit together to carry out more complex information processing tasks. I'll also be talking about how Halo's constructs exchange information with Huginn to accomplish more sophisticated things, like generating the speech that gets played over a VoIP link or sending commands to a personal search engine to index an entire site for sorting through later.
This also puts me on the hook to come up with some really off-the-wall but useful stuff to show off. Thankfully I've got several hundred off-the-wall ideas already written down. Now where are my d10's...
When I know where my talk fits into the HOPE schedule I'll post with the specifics. I'd really appreciate it if everyone spread the word about my talk (and thank you in advance if you do).
May 26 2016
I'd beg the forgiveness of my readers for not posting since early this month, but chances are you've been just as busy as I've been these past few weeks. Life, work, et cetera, et cetera. So, let's get to it.
As I've mentioned once or twice, I've been slowly getting an abscessed molar cleaned out and repaired over the past couple of months. It's been slow going, in part because the body needs time to fight off an infection (assisted by antibiotics or not), and depending on how deep the infection runs, that can take a while. Now I can concentrate on getting the molar in front of it, which has long been a thorn in my side, er, mouth, worked on. Between its close proximity to a rather nasty infection and the general stresses applied to molars during everyday life, the seal on the crown broke at some point, leaving it somewhat loose and making squishing sounds when I chew. I don't know the extent of the involvement, but from coming home from work wiped out just about every night I'm starting to suspect that something nasty is going on in there as well; it's a pattern I've come to recognize over the years as suggestive of an immune response. There's a good chance that this particular pain-in-the-ass is going to need major repairs and, given how little of the original tooth is left (I lost count of the number of surgeries and root canals performed on it a couple of years ago), I'm pretty much resigned to losing the tooth entirely. If it does get pulled I'll probably wind up getting an implant in its place, for the sole reason that it'll prevent the rest of the teeth in my mandible from slowly drifting in to fill the space. Of course, if I do get an implant I'll try to stick a magnet to it, and if it works I'll post the pictures.
Mar 26 2016
In my last post on the topic of exocortices I discussed the Huginn project: how it works, what the code for the agents actually looks like, and some of the things I use Huginn's agent networks for in my everyday life. In short, I call it my exocortex - an extension of the information processing capabilities of my brain running in silico instead of in vivo. Now I'm going to talk about Exocortex Halo, a separate suite of bots which augments Huginn by carrying out tasks that Huginn by itself isn't designed to handle very easily, and thus extends my personal capabilities significantly.
Now, don't get me wrong, Huginn has a fantastic suite of agents built into it already, and more are being added every day. However, good design requires one to recognize when an existing software architecture is suited for some things and not others, and to make allowances for that. To put it another way, it was highly unlikely that I would be able to shoehorn the additional functionality I wanted into Huginn and have a hope in hell of it working. What Huginn does have, though, is a multitude of interfaces for getting events into and out of itself, and I could make use of those interfaces to plug my own bots into it. The Website Agent is ideal for pinging REST API interfaces of my own design; the Jabber Agent implements a simple XMPP client which can send events to an address on an XMPP server (assuming it has its own login credentials); oversimplifying a bit, the Webhook Agent sets up a custom REST API endpoint that external software can use to send events into Huginn for processing; and the Data Output Agent sends events out of Huginn in the form of an RSS feed or a JSON document that can be consumed and parsed by other software.
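To make the Webhook Agent side of this concrete, here's a minimal sketch of what an external bot's half of the conversation looks like: it packages some result as a flat JSON document and POSTs it to the URL Huginn generates when you create the agent. The URL, the "demo_bot" name, and the payload keys below are hypothetical stand-ins, not anything Huginn or Halo actually ships with.

```python
# Sketch of an external bot handing an event to a Huginn Webhook Agent.
# The URL below is a made-up example; Huginn generates the real path
# (including a per-agent secret) when the Webhook Agent is created.
import json
import urllib.request

HUGINN_WEBHOOK = "http://huginn.example.com:3000/users/1/web_requests/42/secret"

def build_event(text, source="demo_bot"):
    """Package a result as a flat dict; once inside Huginn, each key
    becomes available to downstream agents in the pipeline."""
    return {"source": source, "text": text}

def send_event(payload, url=HUGINN_WEBHOOK):
    """POST the event to the Webhook Agent as a JSON document and
    return the HTTP status code of the response."""
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    # Only attempt the network call when run directly against a live
    # Huginn instance.
    send_event(build_event("hello from outside Huginn"))
```

Going the other direction is just as simple: the bot periodically fetches the JSON feed a Data Output Agent publishes and parses out whatever events Huginn has emitted for it.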