A query came up on PyOpenGL-dev this morning about how to handle GLchar pointer arguments. These are binary-specified arguments: they are human-readable text *most* of the time (ASCII source code and identifiers, that kind of thing), but nothing about a GLchar pointer requires that they be ASCII. They *are* 8-bit character strings (that's what GLchar pointer means).
But it looks awkward for Python 3 users to have to write b'identifier' and b'#shader-code...', so they are likely going to expect to be able to pass unicode values in. That, I think, we can support without any real problems, but then users are going to be using unicode to store and process their ASCII 8-bit shaders...
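For the ingest side, here's a sketch (the helper name is made up, nothing PyOpenGL actually exposes) of what accepting both spellings might look like:

    # Hypothetical ingest coercion for GLchar * arguments: accept bytes
    # unchanged, encode unicode (str) down to 8-bit, reject everything else.
    def coerce_glchar_in(value, encoding='utf-8'):
        if isinstance(value, bytes):
            return value                    # already 8-bit, pass through
        if isinstance(value, str):
            return value.encode(encoding)   # unicode shader text / identifier
        raise TypeError(
            'GLchar * argument must be bytes or str, not %s'
            % type(value).__name__
        )

    # Either spelling then reaches the C API as the same 8-bit value:
    assert coerce_glchar_in('gl_Position') == coerce_glchar_in(b'gl_Position')

UTF-8 as the assumed encoding is itself a guess; for genuinely ASCII shader source it makes no difference.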
The question is what to produce when we *return* something that is a GLchar pointer. Most of the time these are ASCII human-readable strings; some of the time they are 8-bit character pointers carrying binary data. It seems that whichever way we go, some corner cases will pop up where a user tries to compare, search, or otherwise mix a byte-string and a unicode object and blows up because the two can't be auto-converted.
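To make the "blows up" concrete, this is the sort of thing that bites in plain Python 3, with no OpenGL involved, if we hand back bytes and the user treats the result as text:

    # A GLchar * result handed back as bytes, then used as if it were text:
    source = b'#version 120\nvoid main(){}'

    source == '#version 120\nvoid main(){}'   # False: bytes never equal str
    # '#version' in source                    # TypeError in Python 3
    # source.split('\n')                      # TypeError as well

The reverse choice, decoding on return, fails differently: binary data that isn't valid in the chosen encoding raises a UnicodeDecodeError at the call site.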
So, what is best practice here?

1. Raise errors on ingest (refuse to guess, require explicit conversion to 8-bit)?
2. Return unicode even if the data might be binary (make it convenient for the user in the common case of not caring)?
3. Allow unicode ingest, but produce 8-bit output (introduce some corner cases that are likely to blow up "elsewhere" in the code)?
4. Or do we have to explicitly code every single GLchar pointer entry point according to whether that entry point is dealing with text or arbitrary data (and hope it always does the same)? There's a rough sketch of this approach below.
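For concreteness, that last, per-entry-point option might look something like this (the names here are invented for illustration; this is not PyOpenGL's actual API):

    # Per-entry-point annotation: decode GLchar * results only where the
    # entry point is known to return text, leave everything else as bytes.
    TEXT_RESULTS = {
        'glGetShaderSource': 'utf-8',     # shader source code
        'glGetShaderInfoLog': 'utf-8',    # compile log text
        'glGetProgramInfoLog': 'utf-8',   # link log text
        # anything not listed keeps its raw 8-bit value
    }

    def convert_glchar_result(entry_point, raw):
        """Decode to unicode only for entry points known to return text."""
        encoding = TEXT_RESULTS.get(entry_point)
        return raw.decode(encoding) if encoding else raw

That gives the "natural" behaviour per call, but every GLchar pointer entry point has to be classified by hand, and a missing or wrong classification just moves the surprise somewhere else.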
My sinking feeling is that the only way to provide both a "natural" interface and a safe/sane one will be the last of those. I'd rather go for the first or third option, just to make it simple and easy to explain. Python 3 experts, care to weigh in?