Inspired by Ambisinister's commission theater, I thought I'd start my own commission request page. The service I'm offering, however, is much different from Ambisinister's. Instead of creative services, I'm offering automatic wiki services. What does this mean?
Using as little bandwidth as possible (and with delays to prevent flooding), I keep a file-based directory of all the text on this wiki. I can then write code that parses, searches and processes these files to build additional wiki content. As an example, the contents of the Favorites page are determined using such code. I began accumulating this data when it became clear that ReaderCentricDesign wasn't going to happen and that the single-character code change needed to fix InterPageLinks was never going to be made. While I have no intention of posting this data archive, I will take requests for code to search it, which is what this page is for.
First, some caveats:
- I cannot guarantee response time to, or even completion of, commission requests. Some may simply not be possible/practical.
- I cannot create information that isn't there. If the wiki code of a page doesn't have the data, I can't really do jack.
- The less regular the source data is, the less likely the result you want is possible.
- I have no control or input over the server hosting the wiki. (If I did, it wouldn't be running such lame wiki software.) One of the results of this is that I cannot do anything in real time, nor can I modify the wiki software itself.
- I will not post results from a job more frequently than monthly. Posting more frequently than this would be a burden on the server hosting the wiki, my own hardware and my own personal time. By preference, I will post job results each solstice and equinox, unless a good reason is provided to do otherwise.
Open Commission Requests
Place your requests here.
Commissions in Progress
- Martial Arts Matrix: While I like the idea of the MartialArtsMatrix (that's why I created it), it requires manual updating. It appears that very few people care to do this. I have made a request to authors of martial arts styles to help automate a new version of the matrix that authors can opt into if they like. So far there have not been many takers. On the other hand, I've not yet built the page, either.
- Revealing the Scholarly Tapestry: Build a graphic that represents the interlinking pages of the LexiconOfElderDays. This should also indicate which nodes are still phantoms. The easiest way to do this will probably be to create a .dot file for use with GraphViz.
- Senseless Mandala Generation: As an offshoot of the Lost Eggs project, it should be easy to build a .dot file of the link network and, from it, generate a massive (probably really ugly) graph of the nodes of the wiki and the links between them. Unfortunately, GraphViz really can't handle .dot files this large. If this project is ever done, it will probably work by generating a sparse matrix image, where each axis contains the entire set of page names, and if the page on axis x leads to the page on axis y, a dot is drawn at the x,y point. Since there are over 20,000 pages in this wiki, the resulting bitmap would be pretty large, but could be compressed by, for example, dividing it into 8x8 pixel squares and finding a greyscale value to match. Or doing the same, but using color. Alternatively, there are some other "large graph" drawing algorithms that could probably handle a graph like the wiki links.
- Favorites: Though no one requested it, the Favorites page is an example of the type of work that can be done.
- Collected Demons: Based on a request made on the ExaltedCompendium, ATaxonomyOfMadness is parsed to automatically extract the demons therein and reorganize them by first, second and third circles.
- Lost Eggs: To help the UserFriendlyCategories initiative find "orphaned" pages, this project will look for pages that are linked only by user pages. The result will be an Orphans page, listing links to pages that may need to be linked to from the directory pages.
- Project Gezlak: Doing battle with script gremlins, this project uses the Python wikipedia robot framework to automatically fix a bunch of links that were broken by Xyphoid's conversion script.
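The .dot output mentioned under Revealing the Scholarly Tapestry could be sketched roughly like this. The input shapes are my assumptions: a mapping from each Lexicon page to the pages it links to, plus a set of phantom (not-yet-written) nodes, which get drawn dashed.

```python
# Hypothetical sketch: emit GraphViz .dot text for the Lexicon link network.
# `links` maps a page name to the pages it links to; `phantoms` is the
# assumed set of pages that do not yet exist (drawn with dashed borders).
def lexicon_to_dot(links, phantoms):
    lines = ["digraph lexicon {"]
    # Declare every node once, marking phantoms so they stand out.
    for page in sorted(set(links) | set(phantoms)):
        style = " [style=dashed]" if page in phantoms else ""
        lines.append('    "%s"%s;' % (page, style))
    # One directed edge per link.
    for src in sorted(links):
        for dst in sorted(links[src]):
            lines.append('    "%s" -> "%s";' % (src, dst))
    lines.append("}")
    return "\n".join(lines)
```

The resulting text can be fed straight to GraphViz's `dot` tool to lay out the graph.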
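The sparse-matrix rendering described under Senseless Mandala Generation might look something like the following sketch: page names index both axes, each link sets one cell, and the bitmap is shrunk by averaging 8x8 blocks down to greyscale values. The function name and input shape are assumptions for illustration.

```python
# Hypothetical sketch of the sparse-matrix rendering idea: build the full
# link bit-matrix, then compress each 8x8 block to a greyscale value (0-255)
# proportional to how many links fall inside it.
def link_matrix_greyscale(links, block=8):
    pages = sorted(set(links) | {d for dsts in links.values() for d in dsts})
    index = {p: i for i, p in enumerate(pages)}
    n = len(pages)
    # Full matrix: bits[y][x] = 1 if page y links to page x.
    bits = [[0] * n for _ in range(n)]
    for src, dsts in links.items():
        for dst in dsts:
            bits[index[src]][index[dst]] = 1
    # Shrink by averaging block x block squares.
    size = (n + block - 1) // block
    grey = [[0] * size for _ in range(size)]
    for by in range(size):
        for bx in range(size):
            total = sum(bits[y][x]
                        for y in range(by * block, min((by + 1) * block, n))
                        for x in range(bx * block, min((bx + 1) * block, n)))
            grey[by][bx] = 255 * total // (block * block)
    return grey
```

For the full wiki, the 20,000-page matrix would never be held as a Python list of lists like this; the sketch only shows the downsampling arithmetic.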
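The Lost Eggs check reduces to inverting the link table and keeping pages whose inbound links all come from user pages. This is a minimal sketch under assumed inputs: `links` maps each page to its outgoing links, and `is_user_page` is a predicate I've invented for illustration (the real criterion might be membership in a known list of user pages).

```python
# Hypothetical sketch of the Lost Eggs orphan check: a page is "orphaned"
# if every page linking to it satisfies is_user_page.
def find_orphans(links, is_user_page):
    # Invert the link table: for each page, who links to it?
    linked_from = {}
    for src, dsts in links.items():
        for dst in dsts:
            linked_from.setdefault(dst, set()).add(src)
    return sorted(
        page for page, sources in linked_from.items()
        if all(is_user_page(s) for s in sources)
    )
```

Pages with no inbound links at all would need separate handling, since they never appear in the inverted table.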