Nothing to do with OpenGL-ctypes. Spent what little time I was awake yesterday on getting the upstream channel signal-quality monitor graphs written for Cinemon. Matplotlib is fine, but somehow the interface feels... mushy... like I just dump data in and hope that it comes out okay.
That kind of mushiness is good for what the package is trying to do (automatically build graphs for non-programmers in an interactive or semi-interactive context), but it freaks me out a little when I'm trying to code something that will be used across thousands of machines with no human oversight or control.
Any number of times during testing I mistakenly passed in a null set of data (or did something else wrong) and wound up bringing the machine to its knees rendering thousands of pieces of text. Sure, I coded catches for all the ones I found, but I'm going to have to test the beetlejuice out of it to be confident it's going to be reliable across all possible situations.
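The catches I mean are basically input guards that refuse to plot at all rather than let a degenerate data set reach the renderer. A minimal sketch of the idea (the names here are hypothetical, not Cinemon's actual API):

```python
def safe_series(samples, min_points=2):
    """Return a cleaned list of (timestamp, value) pairs, or None if
    there isn't enough usable data to graph.

    Guards against the empty/None data sets that can otherwise send
    the renderer off drawing thousands of stray labels.  Purely a
    sketch; the real checks would be tuned to the monitor's data.
    """
    if not samples:          # catches None and the empty sequence
        return None
    cleaned = [(t, v) for (t, v) in samples if v is not None]
    if len(cleaned) < min_points:
        return None          # nothing meaningful to draw
    return cleaned
```

The calling code would then skip graph generation entirely whenever the guard returns None, instead of handing Matplotlib something pathological.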
Interestingly, the set of data I grabbed from a channel (at random) for testing turned out to be hideously bad looking; a huge drop with lots of high-amplitude jitter. I gather I really don't know what the data should look like, because to me it looked like the channel should be collapsing. Will have to get our domain expert to interpret a few graphs for me at some point. Probably also need to expand the length of time we keep the data (maybe up to a week or even a month), as the graph I have (from 8 hours) just begins to show a single waveform for the overall graph (one rise, one dip).
Today it's off to grams' for pool, then I should probably try (again) to get the silly nVidia card working. I have managed to get mplayer working on the machine, so I can at least clear out the old PVR recordings, but MythTV is still failing with what looks like audio-buffer problems (I have to use SDL's audio driver for mplayer to work; alsa causes the same problems as seen in MythTV).