Well, what a ride this past year’s been, on several fronts. I’ve been away from the blog for nearly six months, by the looks. Mostly, that’s because a) I managed to lock myself out of my WordPress account, and then b) was too damn distracted with everything else to deal with “this too”. So, now that the amazing staff at WordPress.com have gotten me back into my own account, let’s try for a catch-up summary, shall we?
On the Home Front
Honestly, I’m not sure what to say here. We’ve had developments on the home front that have made things more challenging than anticipated. In discussing the pros and cons of making it public, it’s now clear that this story is not mine to tell to the general public. Before you ask, I’m not likely to tell you unless I think it may somehow directly affect you. For most people, it simply won’t.
The tl;dr summary though is that I’ve been occupied lately by things other than this blog in my free time.
On the Work Front
My last blog entry, in July this year, came just after I’d completed a pretty large project that produced two software artifacts, going under the working names MWCIT and MWRDTools. I did a post about MWCIT, but have only indirectly mentioned MWRDTools.
MWRDTools (Murrumbidgee Wetlands Relational Database Tools) was originally written for the NSW Government by another party. I was tasked with fusing a number of diverse data-sets into it, expanding its decision support functionality, and getting it as far down the path of a multi-user web-based product as possible.
I managed the decision support extensions and got it served out of a SQL Server database for them, but web-rendering one particularly monstrous data-set turned out to be something modern web-browsers simply can’t do natively. None of the workarounds were particularly desirable, so we left it as an ArcMap for Desktop plugin.
Towards the end of the project, the owners of the software agreed to my suggestion that it be made open-source, so I hosted it up on GitHub for them.
The project itself was plagued by a number of technical issues, such as a) the codebase needing upgrading to suit ArcMap API changes, b) the codebase carrying the legacy of tightly binding the user-interface with the back-end database, c) surprise fine-print restrictions stopping the deployment of multi-user databases without scads of unexpected licence fees, and d) surprise shifts in 3rd-party data encoding standards at the 11th hour.
Still, the final code-base pleases me. The only thing I’m unhappy about is not having had the time to build a unit-testing framework to exercise the code with. I abandoned the idea when I took my first close look at the original code-base and realised it wouldn’t accept a web front-end or an alternate database back-end without a real fight. I’d need extra time that wasn’t budgeted for, and the unit-testing framework was the first casualty of discovering that the estimate was wildly optimistic.
I did manage to restructure the codebase along Model-View-Presenter (MVP) lines though, to allow them to replace the UI with something web-based if they ever find a way to deal with that monster data-set. Now that it’s MVP throughout, it’s also at least possible to build a rich set of unit-tests for it, which wasn’t possible when it was a “big ball of mud”.
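For readers unfamiliar with the pattern, the shape of an MVP split looks roughly like the sketch below. The real codebase is .NET and the names here are entirely hypothetical — this is just an illustration (in Java, for brevity) of why the restructuring makes UI-swapping and unit-testing possible:

```java
// Minimal Model-View-Presenter sketch. All names are hypothetical; the point
// is that the presenter only ever talks to the view through an interface,
// so a desktop UI can be swapped for a web one without touching the logic.

interface WetlandView {                      // any UI (desktop, web) implements this
    void showSiteCount(int count);
}

class WetlandModel {                         // back-end data access
    int countSites() {
        return 42;                           // stand-in for a real database query
    }
}

class WetlandPresenter {                     // UI-agnostic application logic
    private final WetlandModel model;
    private final WetlandView view;

    WetlandPresenter(WetlandModel model, WetlandView view) {
        this.model = model;
        this.view = view;
    }

    void refresh() {                         // push model state to whatever view is attached
        view.showSiteCount(model.countSites());
    }
}
```

Because the presenter sees only the `WetlandView` interface, a unit-test can hand it a fake view and assert on what gets pushed to it — exactly the kind of seam a “big ball of mud” doesn’t have.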
Since then, I’ve gone on to helping a researcher convert his species conservation action optimisation prototype (dubbed MarxACT) from R to .NET. Why? A single optimisation run dropped from four hours to three minutes while delivering the same results.
I can’t talk about the research just yet, as it’s unpublished. But, I can tell you that it’s remarkably similar to a previous contract here. So much so that I was able to re-use a model support framework that I built for the EFlows contract.
That’s probably enough for now. I could, nay should, follow this up with a post on some insights gained in cutting a large R model across to a more “traditional” language. No promises on timeframe. I like to think that with admin access again, the next post won’t be another six months out though.