Azoth - a developer blog

Quo Vadis?

Every journey starts with the most important step - the first one. I knew, going into this, that it was going to be a long haul project, but retirement beckons and I'm going to have to find something to do with all this free time, a concept I'm not overly familiar with …

The Goal(s)
The goal is complex, or at least the path towards it is. One goal is to create an Objective-C-driven user-interface framework that I can compile for Mac, iOS, Windows, and Linux, and get all the benefits of Objective-C (memory management, blocks, the Foundation framework). The end result should be skinnable, so I can use it in different scenarios, but it should also provide a default "professional"-looking interface as its standard appearance.

But this is not the actual goal. The real goal is to write something similar to the original Baldur's Gate game, again in Objective-C, and again for all the above operating systems. The UI and non-gameplay sections of that game will be written in a game-skinned version of the framework, providing me with built-ins for scrollbars, collection views ("inventory"), drag-and-drop, etc. That's the real goal. As I said, it's going to be a long-haul project [grin]

The vision is something similar to AppKit on the Mac. Something that's familiar, if not exactly the same. It's unlikely to be anywhere near as comprehensive as AppKit, and it'll have a lot more limitations, but it ought to provide a base level of functionality that allows fairly complex user interfaces to be constructed.

Getting there
In order to run-test or "dog-food" the framework once it's minimally functional, I'll be building a tool application (not for release, so it's OK if it still has rough edges or goes the wrong way a few times before finally being useful) to make sure I shake out the bugs and gotchas that will inevitably creep in. I also intend to write small test apps along the way to exercise the major functionality and act as "how-to" guides - very simple, cut-down examples.

I mentioned above that the platforms I'm targeting are the Mac/iOS as well as Linux and Windows. Building cross-platform functionality like that is a huge deal in and of itself, so my plan is to use the newly-released SDL3 as the rendering underpinnings, providing the graphics abstraction…

Platforms

  • Windows: there is a build of Objective-C that integrates into Visual Studio, and is supported (and supplied) by Microsoft. There is also a port of the GNUstep Foundation framework (which provides modern ObjC features like blocks and libdispatch) for this toolset, so in theory it ought to be plug-and-play. It won't be - things are never that simple - but I have got a simple "hello world" type of application with SDL3 and ObjC working together.

  • Mac / iOS: The Mac is actually where I'm building/testing the majority of the code - well, all of it right now - so that's not going to be an issue. iOS ought to "just work", because the only common dependencies are SDL3 and Foundation, both of which iOS has. I'm expecting there to be complications with the virtual keyboard (for example), amongst other issues like sandboxing. I'll cross that bridge when I get there.

  • Linux: It would be nice to support Linux too. Again, the dependencies are SDL3 and Foundation - and Linux is the prime port for GNUstep's Foundation, so in theory it ought to just work. Again, my expectation is that theory and practice will … differ a little.


Nomenclature
Why "Azoth"? Well, although it sounds like it, there's no association with any demon-worshipping going on here. The background is that the last major framework I wrote was an object-relational database interface, which I called QuickSilver - an old name for the element mercury. I was going to name this framework 'Mercury', but Apple already ships a framework called Mercury, so that was out. Looking up synonyms for Mercury revealed Azoth, which is another old name for the element. I guess people really liked giving mercury names… Anyway, that's the background, and as a result I claim the 'AZ' namespace for all the classes, constants, etc. So we have AZApplication (not NSApplication), AZWindow, AZView, and so on.


Picking the right technology

Choosing the rendering back-end is a big deal: there's not a lot of commonality between the available engines out there. Choose Qt, for example, and not only are you locked into their pricing model, but the environment demands that you write very specific code that won't transfer to any other back-end should you wish to switch. It's a choice that locks you in, and you're making it right at the start, when you know the least about the project. Not a great combination.

There were other things to consider as well, beyond just the display output. They're not an issue when you're only thinking about one platform, but when cross-platform is the goal from the outset, things like input-device management (joysticks, controllers, etc.) and audio need thought too - playing simple WAV files is generally fine, but spatial audio, synchronising output to events, multiple audio streams playing simultaneously, and so on, can all get more complicated.

So, looking elsewhere…

  • OpenGL used to be the standard for cross-platform performance graphics, but with Vulkan arriving everywhere except the Mac, and Apple doing its own thing with Metal for Mac/iOS rather than supporting Vulkan (well, apart from what MoltenVK can translate), it's a bit of a mess down at that low level.

  • Other libraries I considered included raylib, which pretends to still be OpenGL but really does whatever the platform needs under the hood - but it doesn't have the shader support I'm after, and I ultimately want compute shaders for the game…

  • SFML seemed to be dead, but then it suddenly announced a new release - though that seems to be more internal rewrite than new features and abilities, and it also seems to be lacking in the shader department.

  • bgfx looked like a possibility, but it also looked pretty complicated for what it provides.

  • And then SDL version 3 was released: a massive rewrite of SDL2 (which came out about a decade ago), providing a GPU abstraction that allows vertex, fragment, and compute shaders, and being very, very cross-platform. It's also written in C, which (in my experience) makes integrating with Objective-C a bit easier than when C++ is involved. It's open-source, so there are no licensing issues to worry about, and it supports all the platforms I'm looking at. SDL3 also provides more cross-platform facilities than just graphics: it gives easy access to filesystems, audio, and input devices (joysticks, paddles, etc.). All told, it seemed the clear "winner".

So the die was cast, SDL3 it was.

There were some wrinkles once I got down to the fine print. I was planning on using the SDL Render API because (a) it's really simple to use - it essentially provides a hardware-accelerated blit mechanism, (b) I don't need 3D graphics, and (c) I've used it before, actually with shaders doing some of the heavy lifting, so I was very hopeful… until I realised that the shader functionality is locked into the "GPU" API, and the "Render" API has no access to it. Nor can you run the Render and GPU APIs together in the same window. There are good reasons for this, but it was still the proverbial spanner in the works.

The GPU API is a superset (a far more capable superset) of the Render API, but it's a lot more complex to set up and use (still a lot easier than Metal or Vulkan, though). So the plan was to divide and conquer: make things work under the simple API first, then use the GPU API to provide the same functionality as the Render API while also exposing the extra functionality (e.g. shaders) afterwards. I planned to abstract the rendering interface into a protocol, which would then be implemented by two concrete classes - AZRenderer2d and AZRenderer3d for the Render and GPU APIs respectively.

"It's a bold plan, Cotton, let's see how well it works out for him"


Baby steps... Fundamentals are important

Design - it always comes down to design. The model I'm going with is basically AppKit, which has stood the test of time. It's a framework I've grown very used to over the years, and I prefer the familiar feel of AppKit to the declarative UIs that are more the current vogue. I blame it on being a physicist rather than a computer-science graduate… In any event, there was a fair amount of boilerplate to get things even remotely bootstrapped…

Getting Going

SDL3 has introduced a "callbacks" mechanism, so there isn't even a call to main in the code any more: you link with SDL3 (after defining some constants to indicate that you want the callbacks in place) and you get a call to SDL_AppInit, quickly followed by calls to SDL_AppEvent when there are keyboard/mouse events and SDL_AppIterate when a new frame should be displayed. Eventually you'll get an SDL_AppQuit, and your app should terminate.

This seemed to tie in nicely with a singleton pattern for AZApplication (as a counterpart to NSApplication), which is responsible for bootstrapping the window (we're currently one-window-only), setting up a content view for that window, and establishing it as the root view that events filter down from. The first callback sets up the singleton, and the others are redirected into it, propagating events and/or draw requests in the more familiar AppKit style.

/*****************************************************************************\
|* Handle next frame
\*****************************************************************************/
- (SDL_AppResult) nextFrameWithAppState:(void *)state
    {
    /* convert from milliseconds to seconds. */
    const double now = ((double)SDL_GetTicks()) / 1000.0;

    /* choose the color for the frame we will draw. The sine wave trick
       makes it fade between colors smoothly. */
    const float red   = (float) (0.5 + 0.5 * SDL_sin(now));
    const float green = (float) (0.5 + 0.5 * SDL_sin(now + SDL_PI_D * 2 / 3));
    const float blue  = (float) (0.5 + 0.5 * SDL_sin(now + SDL_PI_D * 4 / 3));

    /* new color, full alpha. */
    SDL_SetRenderDrawColorFloat(_renderer, red, green, blue,
                                SDL_ALPHA_OPAQUE_FLOAT);

    /* clear the window to the draw color. */
    SDL_RenderClear(_renderer);

    /* put the newly-cleared rendering on the screen. */
    SDL_RenderPresent(_renderer);

    /* carry on with the program! */
    return SDL_APP_CONTINUE;
    }
┌─────────────────────────────────────────────────────────────────────────────┐
│                          SDL3 events, redraw etc.                           │
└─────────────────────────────────────────────────────────────────────────────┘
                   ▲                                       ▲                   
                   │                                       │                   
┌─────────────────────────────────────┐ ┌─────────────────────────────────────┐
│            AZApplication            │ │              AZWindow               │
└─────────────────────────────────────┘ └─────────────────────────────────────┘
                                                           ▲                   
                                                           │                   
                                        ┌─────────────────────────────────────┐
                                        │               AZView                │
                                        └─────────────────────────────────────┘

Where we are and what's next
At this point, the only event we're handling is SDL_EVENT_QUIT. In nextFrameWithAppState we simply clear the screen with a new colour each frame, using some maths to make it fade gradually… Writing a framework is almost as much about testing it as about creating it. As I go along, I'll be creating test applications - the one above is the very first.

The code in the callout isn't really very reminiscent of AppKit, but it's a start... Eventually all these SDL calls will be subsumed into AZ, but for now it produces the application in the video...

Next up is to remedy some of this: make the views responsible for their own content, and have them regulate their subviews.


Taking a view - organising the view hierarchy

Two main things are being added at this stage…

Event processing.
We start off with the mouse events, since they're the most common. We do a depth-first search (so all subviews first); the procedure is:

  • Does the event co-ordinate lie within the subview?
  • If so, does -hitTest respond back with YES? (it does by default)
  • If so, does the view-specific mouse-handler return YES?
  • If so, we're done
  • If not, loop to the next subview
  • If all subviews deny the event, try ourselves (since this is a recursive call)
  • Return whether the event was handled

Note that this differs slightly from the AppKit way of doing things, in that an event handler returns a BOOL rather than being a void method. This lets us short-circuit the handling once an event has been "claimed" by a view.
