Friday, 23 March 2012

OpenGL context and window management

Before I can do any actual 3D graphics work, I need a framebuffer and an OpenGL context.  As mentioned previously, this is a toolkit- and/or platform-specific problem, and it carries with it a bunch of implementation issues around object life-cycle management.

First of all, we have to make sure that the program is in a state where it knows enough about its environment to actually create a window and get a handle to an OpenGL context.  Second, we have to make sure that we aren't inadvertently creating extra windows -- or, worse, aliasing the existing window and clobbering it when a passed-by-value GlContext gets destroyed.  And finally, this is exactly the sort of platform-specific crap I want to keep out of Tungsten's application code and quarantine as much as possible.

Fortunately, we've taken care of much of that with the AppFramework factory class (and its SdlAppFramework implementation).  But setting up and creating an OpenGL window involves a lot more state than anything else I've written about, so it merits some extra discussion.


We can look at the GlContext class declaration (GlContext.h -- BitBucket), but I think it's better to look first at how a GlContext actually gets initialized and used.  Here's some relevant code from tungsten.cc:

AppFrameworkPtr framework(new SdlAppFramework);
GlContextPtr context = framework->glContext();
// ...

context->setGlVersion(3, 2)
        .setDepthSize(24)
        .setColorSize(8)
        .setAlphaSize(8)
        .setStencilSize(0)
        .setDoubleBuffer(true)
        .setMultiSample(0)
        .setWindowName("Dogfight: Tungsten")
        .setWindowResolution(640, 480)
        .setFullScreen(false);

context->initialize(); // FIXME: Either check retval or return void

(AppFrameworkPtr and GlContextPtr are typedefs for std::tr1::shared_ptr<foo>.  I kind of hate having template instantiations and scope resolution operators floating around my code.)
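
Concretely, those typedefs amount to something like this -- the exact spelling lives in the headers, but this is the shape:

typedef std::tr1::shared_ptr<AppFramework> AppFrameworkPtr;
typedef std::tr1::shared_ptr<GlContext>    GlContextPtr;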

Three separate phases of the context's lifecycle are on display here: initial creation from the AppFramework object; parameter specification via a nifty chain of Named Parameter-ish calls; and finally window/context initialization.  After that, the client code mostly just calls context->swapBuffers() in an event loop.

There are a couple of design decisions exposed here that I'm not entirely sure about.  One of them is the big chain of parameter-setting calls: a GlContext's configuration parameters all start out invalid (with the exception of the bools, which is awkward), and failing to explicitly set every parameter to something valid will cause context->initialize() to fail.  The other is the separation of the parameter chain from the initialize() call, and the question of whether a failed initialize() is recoverable.

My rationale for forcing the client to initialize (almost -- damn bools) all of the parameters in a GlContext is avoiding bugs-by-omission.  This theory contends that "reasonable defaults" are easy enough to ignore, until some assumption changes that makes one of the defaults wrong.  At that point it's hard to look through the code and figure out where that default comes from if it's never made explicit outside, say, a ctor's initialization list.  By forcing the client to initialize everything, I can force the client to think about everything that should be initialized, and that initialization code lives conceptually closer to the GlContext's initialization in the client code -- making it more likely that the inits will change when the assumptions do.  And with chained methods, it's not horrendously ugly.  Hardly any worse than a named parameter list, really.
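
For illustration, here's a minimal sketch of how the chained setters and the invalid-by-default parameters fit together.  This isn't the real GlContext -- the member names and the -1 sentinel are stand-ins:

class SketchContext {
public:
    SketchContext() : depthSize_(-1), colorSize_(-1) {} // invalid until set

    // Each setter returns *this so the calls can be chained.
    SketchContext& setDepthSize(int bits) { depthSize_ = bits; return *this; }
    SketchContext& setColorSize(int bits) { colorSize_ = bits; return *this; }

    bool initialize() {
        // Any parameter still at its invalid sentinel fails the whole call.
        if (depthSize_ < 0 || colorSize_ < 0)
            return false;
        // ... hand the validated parameters to the underlying toolkit ...
        return true;
    }

private:
    int depthSize_;
    int colorSize_;
};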

On the other hand, by starting off with invalid default parameters, I raise the question of whether any given parameter value is in fact valid.  And since the chained setters have to return a reference to the object itself, they can't return a status code, so I can't easily test any particular parameter value for validity as it's set (perhaps by querying the underlying toolkit to see whether I really can get a 32-bit depth buffer) -- the only alternative would be horrifying chains of if-elses.  If I specify a dozen parameters and context->initialize() fails, all I know is that one or more of those parameters didn't pass muster with the underlying toolkit.  Kind of hard to recover from that.

This leads into the question of whether a failed context->initialize() call is really recoverable.  At least in the SDL implementation (SdlGlContext.cc -- BitBucket), there are two ways that the call can fail: first by failing to create a window, and second by failing to create an OpenGL context from that window.  The whole point of the framework class is to abstract that reality away from the user, but even so the question arises of whether initialization failed for avoidable reasons (the graphics driver doesn't support 32-bit depth buffers, only 24-bit) or utterly unavoidable ones (I'm trying to run the program over an SSH connection and don't have an X session open at all).
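
A paraphrase of what that implementation does -- the member names here are my stand-ins, and the SDL calls assume the 1.3/2.0-era API:

bool SdlGlContext::initialize() {
    // Failure point one: no window at all (e.g. no X session to talk to).
    window_ = SDL_CreateWindow(windowName_.c_str(),
                               SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                               windowWidth_, windowHeight_, SDL_WINDOW_OPENGL);
    if (!window_)
        return false;

    // Failure point two: the window exists, but the driver rejects the
    // requested attributes (say, a 32-bit depth buffer).
    glContext_ = SDL_GL_CreateContext(window_);
    if (!glContext_)
        return false;

    return true;
}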

It might be more convenient (for the client code, obviously) to start by generating a minimal or known-good set of parameters (use SDL to query for a nice high-res graphics mode, leave off multisample anti-aliasing, and so on) and let the client explicitly switch to a different mode (perhaps by restarting), chosen from queried known-good values.  For now I'm going to leave things as-is, because I'm optimizing for programmer time and this is "good enough" for the present.
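
As a sketch of what that might look like -- SDL_GetDesktopDisplayMode() is the 1.3/2.0-era query, and the fallback values here are placeholders rather than tested known-goods:

SDL_DisplayMode mode;
if (SDL_GetDesktopDisplayMode(0, &mode) == 0) {
    // The desktop's current mode is about as known-good as it gets.
    context->setWindowResolution(mode.w, mode.h);
} else {
    context->setWindowResolution(640, 480); // conservative fallback
}
context->setMultiSample(0); // leave MSAA off until explicitly requested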

One last thing: SdlGlContext references sdldie() and checkSDLError(), which is a particularly ugly way to do error reporting.  I'll need a better way to do that, ideally something that keeps track of SDL error states and OpenGL error states -- and maybe catches any exceptions floating up and down the call stack -- in a centralized place (also something connected to the AppFramework, because at least the SDL errors are toolkit-specific).  And speaking of which: that SDL_CreateWindow() call sets an "Invalid Window" error that doesn't seem to affect the program's behaviour at all.  SDL_CreateWindow() returns a perfectly valid handle, and the OpenGL context I create from it seems to do all the right things.  Still, I'm not happy.
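
A centralized checker might end up looking something like this hypothetical helper (not the existing sdldie()/checkSDLError() pair), draining both error channels in one place:

#include <iostream>
#include <SDL.h>
#include <SDL_opengl.h>

// Hypothetical helper: report and clear any pending SDL and OpenGL errors.
void checkErrors(const char* where) {
    const char* sdlError = SDL_GetError();
    if (sdlError && *sdlError) {
        std::cerr << where << ": SDL error: " << sdlError << "\n";
        SDL_ClearError();
    }
    // glGetError() reports one error at a time, so drain the whole queue.
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::cerr << where << ": GL error 0x" << std::hex << err << std::dec << "\n";
}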
