
2.06.2017

Post-Processing Shaders and Effects




Bitphoria has had a simple post-processing setup in place for some time now. It comprised a single framebuffer object with a color and depth texture attached, read by a single fragment shader on a full-screen quad to produce a simple mipmap blur and contrast boost. The aim was just to achieve a basic effect beyond what rendering the scene straight to the screen/window could offer. It could not perform multiple post-processing passes, or effects requiring multiple shader stages.
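As a rough sketch of what that single old effect amounted to in GLSL (illustrative names and constants, not the engine's actual shader): sample the scene at a coarse mip level for a cheap blur, blend it with the sharp image, and push the result away from mid-grey.

```glsl
#version 330
uniform sampler2D u_scene;   // color attachment of the scene framebuffer, with mipmaps
in vec2 v_texcoord;
out vec4 fragColor;

void main()
{
    vec3 sharp = texture(u_scene, v_texcoord).rgb;
    vec3 blur  = textureLod(u_scene, v_texcoord, 4.0).rgb;  // a coarse mip level acts as a cheap blur
    vec3 c     = mix(sharp, blur, 0.35);                    // soft glow from the mipmap blur
    c = (c - 0.5) * 1.2 + 0.5;                              // contrast boost around mid-grey
    fragColor = vec4(c, 1.0);
}
```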

I've always had the nagging sensation that something's visually missing from Bitphoria, beyond the general lack of coherent visuals - which I attribute to not having sat down and actually designed a cohesive appearance for a game via the engine's scripting functionality. With the ability to add various post-processing effects to Bitphoria, I feel a more coherent visual aesthetic can be achieved than what one simple post-processing shader offered.

For the new post-processing effect system I wanted to be able to easily add more shaders and route their inputs/outputs between them. At the beginning of each frame a 'raw framebuffer' is bound that has three color texture attachments - RGBA color, XYZW reflection vectors, and an HSL 'overbright' emission texture - plus a depth texture. The reflection vectors texture is generated by all of the shaders used to render world geometry, procedural models, etc. If a surface is not supposed to be reflective, this is indicated by a reflection vector facing the viewpoint (Z = 0). The HSL overbright emission texture is for the 'spectral trails' shader effect, which creates a quickly-fading 'rainbow' trail behind any objects that write to that texture when rendered.
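Roughly, each world/model fragment shader writes to the raw framebuffer's attachments along these lines (a hedged sketch; names and exact packing are illustrative, not Bitphoria's actual shaders):

```glsl
#version 330
in vec3 v_reflect;                            // reflection vector from the vertex shader (see the SSR post below)
in vec4 v_surfaceColor;                       // hypothetical shaded surface color

layout(location = 0) out vec4 o_color;        // RGBA scene color
layout(location = 1) out vec4 o_reflect;      // camera-space reflection vector (or a 'facing the viewpoint' value)
layout(location = 2) out vec4 o_overbright;   // HSL emission driving the spectral-trails effect

void main()
{
    o_color      = v_surfaceColor;
    o_reflect    = vec4(v_reflect, 1.0);
    o_overbright = vec4(0.0);                 // only trail-leaving objects write a nonzero value here
}
```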

Objects leaving spectral trails on the screen. This can be annoying, so it has been toned down and is used sparingly for momentary effects like explosions.

At the end of the frame the postfx system is executed, rendering a fullscreen quad for each registered shader effect - binding all necessary textures and setting all prescribed GLSL uniforms for each.

Creating an effect stage involves loading its shader, creating an FBO, and attaching a single color texture to the FBO to serve as the effect's "output". A depth texture is also created and attached to the FBO only for what's called "FBO completeness", even though the full-screen quads do not convey any depth information. Still, a shader effect *could* generate depth values that would be written to that depth texture using gl_FragDepth, and the depth texture could then be utilized by subsequent effect stages.
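For example, a stage that wanted to pass depth along to later stages could do something like this minimal sketch (illustrative names, not an actual Bitphoria effect):

```glsl
#version 330
uniform sampler2D u_sceneColor;   // color input from a previous stage
uniform sampler2D u_sceneDepth;   // the raw framebuffer's depth texture
in vec2 v_texcoord;
out vec4 fragColor;

void main()
{
    fragColor    = texture(u_sceneColor, v_texcoord);     // whatever this effect actually computes
    gl_FragDepth = texture(u_sceneDepth, v_texcoord).r;   // forward depth so later stages can use it
}
```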

From there, shader uniforms can be registered from the engine with "pfx_parm()": ints, floats, vec3s, and mat4s (4x4 matrices) can have a pointer to their memory stored, which is used to set whatever values the effect's shader requires.
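On the shader side these show up as ordinary GLSL uniform declarations; a hypothetical effect might declare something like the following (illustrative names, not Bitphoria's actual uniforms):

```glsl
// Uniforms an effect shader might expose; each would be registered once from
// the engine via pfx_parm() with a pointer to the corresponding value in memory.
uniform float u_time;           // e.g. elapsed time for animated effects
uniform vec3  u_viewOrigin;     // e.g. camera position
uniform mat4  u_invProjection;  // e.g. inverse of the scene's projection matrix
```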


Bitphoria's current postfx shader pipeline. Green boxes are texture outputs from effect shaders.


Similarly, "pfx_input()" is used for referencing other effects, allowing their FBO color or depth textures to be bound when the current effect is being rendered, for routing the output of stages as inputs into others. Some effect shaders will use the FBO texture output of previous stages (or the output of later stages from the previous frame) as inputs for certain effects.




1.22.2017

Todo List Prioritization, Screen Space Reflections, and a Long Break


After Bitphoria's initial release, I spent a few weeks with those who helped test it out and discovered a few bugs and things that desperately needed changing/fixing/etc. I also started working on fleshing out the three default games that come with it, and in doing so it became clear what other features the scripting system was lacking. As a result of the public beta release the todo list grew quickly. Even though I was knocking out new features and bug fixes in those first few weeks, the list continued to grow faster than I could keep up.


Bitphoria's accumulated "changes.txt" since v1.0a release (formerly 'v1.00b').


I'd since decided to take a break. The few months leading up to the release date I'd set for myself were a real burn-out session, and the break was meant to strengthen my resolve in finishing Bitphoria as a project. The end goal is to produce a product that could hopefully generate an extra income for my family - my wife and two baby girls. I owe it to them to see this thing through, after all of the time I have spent working on it.

The master server has been running this whole time - well, at least 95% of the time - and I see that people have been looking for online games (of which there are usually none) and then starting their own to check out Bitphoria. I'm not going to continue actively promoting Bitphoria until things are further along. If other people want to show Bitphoria to others and share it, I'm not going to stop them. My goal here is a sort of soft-launch, whereby people can start playing with Bitphoria while it's being developed, and the 'die-hard-core' fans are the ones who keep it alive and seed a community and fanbase. Sounds good on paper; we'll see how it all pans out in practice.

In the meantime I'm sticking with incremental public alpha/beta releases on itch.io until the scripted games are all fleshed out and everything's at least had a once-over - at which point I think I'll invest in a Steam Greenlight strategy, or maybe even a crowd-funding campaign for a virtually finished game! (That *is* what it takes nowadays to crowdfund a game, right?) At any rate, there's no point in an all-too-premature push for exposure, which has backfired on many an indie game project due to negative exposure from the poor quality of an unfinished product. That can permanently taint a game's reputation regardless of how the finished product turns out down the road.

At this point I've started diving back into Bitphoria's code and become reacquainted with the todo list and where things are at, and it's become clear that there's still a long way to go if I want this thing to be one-hundred-percent uncompromising. However, I don't think I have it in me to pull that off. So, the next step was organizing the todo list by priority. What things are absolutely essential to the core idea of Bitphoria? What things can it go without? What things can I add in later that are aesthetic-only? What ideas are pipe-dreams? This is where the project is currently at. By my estimation Bitphoria should be on sale in the summer.


I forgot how awesome it was to upgrade. Been out of the hardware game for over a decade.

The todo list is fully consolidated, after another sleepless night, and I have my work cut out for me. I also recently bought myself a GPU for Christmas, a 4GB XFX Radeon RX 460 - literally the first GPU I have gone out of my way to acquire in 15+ years. Half the reason I bought it was to play DOOM. Incidentally, the last GPU I made it a point to acquire was a GeForce3, back in 2001, so that I'd be ready to play Doom3, which ended up not being released for three more years. The new GPU has afforded me the opportunity to get crazy with Bitphoria's graphics, and makes it possible to record at least 720p gameplay footage at 30fps. The original plan with Bitphoria was to design a cool-looking game/engine that would run on older and/or budget hardware at playable speeds - where 'playable', in my gamer opinion, is 60+ frames per second, minimum.


Now for a blurring post-process shader to hide the jitter-induced dithering!

I just finished implementing a decent screen-space reflection shader, written by hand from scratch, based on my own ideas about how to get the information where it needs to be in order to make it happen. Instead of having multiple FBO textures storing surface normal, surface position, etc., I am only storing the reflection vector calculated in the vertex shader and interpolated across the fragments of reflective surfaces. This is in contrast with how most implementations calculate the reflection vector in the fragment shader, using the fragment's normalized camera-space coordinate and the fragment's normal - both output by the reflecting surface's shader into (typically) separate FBO textures.
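Very roughly, the per-vertex side looks something like the following sketch (illustrative names and uniforms, not Bitphoria's actual shader): compute the camera-space reflection vector once per vertex and hand it to the rasterizer to interpolate.

```glsl
#version 330
uniform mat4 u_modelView;
uniform mat4 u_projection;
in vec3 a_position;
in vec3 a_normal;
out vec3 v_reflect;   // interpolated across the triangle, then packed by the fragment shader

void main()
{
    vec3 eyePos    = (u_modelView * vec4(a_position, 1.0)).xyz;
    vec3 eyeNormal = normalize(mat3(u_modelView) * a_normal);  // assumes no non-uniform scaling
    v_reflect      = reflect(normalize(eyePos), eyeNormal);    // view direction reflected about the normal
    gl_Position    = u_projection * vec4(eyePos, 1.0);
}
```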


Camera-space reflection vector, depicted as RGB representing XYZ -1/1.

In Bitphoria, my idea was to pre-calculate the reflection vector in the vertex shader, let it be interpolated across surface triangles, and then pack it into the red-green channels of a single RGBA texture from the fragment shader using a spheremap transform - much more compact than packing both a coordinate *and* a surface normal to accomplish the same result. However, I still need the coordinate of the fragment to perform the actual raytrace, which is reconstituted from the fragment's W coordinate, stored in the alpha channel of the same texture. With the W coordinate and the depth buffer texture, as well as the pre-calculated inverse of the projection matrix used to render the scene, I can very cheaply calculate the camera-space coordinate of the fragment.
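To make the packing concrete, here is a minimal sketch of one common spheremap transform plus a standard camera-space position reconstruction from depth and the inverse projection matrix; the exact encoding and the way the stored W coordinate is used in Bitphoria may differ, and all names are illustrative.

```glsl
// Encode a normalized camera-space reflection vector into two channels
// (a common spheremap transform), leaving the other channels free.
vec2 encodeSpheremap(vec3 r)
{
    float p = sqrt(r.z * 8.0 + 8.0);
    return r.xy / p + 0.5;
}

// Decode it back in the screen-space reflection shader.
vec3 decodeSpheremap(vec2 enc)
{
    vec2  fenc = enc * 4.0 - 2.0;
    float f    = dot(fenc, fenc);
    float g    = sqrt(1.0 - f / 4.0);
    return vec3(fenc * g, 1.0 - f / 2.0);
}

// Reconstruct the camera-space position of a fragment from its depth sample
// and the inverse of the projection matrix used to render the scene.
vec3 reconstructEyePos(vec2 uv, float depth, mat4 invProj)
{
    vec4 ndc = vec4(vec3(uv, depth) * 2.0 - 1.0, 1.0);
    vec4 eye = invProj * ndc;
    return eye.xyz / eye.w;
}
```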


3D texture materials give surfaces a parallax depth unlike anything you've seen in a game.

I also gave the sky shader a much-needed redesign, along with tweaking the 3D material rendering shader to literally shade the depths of the procedural 3D materials. The naive way to shade a 3D material would be to sample a 2x2x2 area and estimate the overall density gradient as a normal; the dot product of that normal and the incoming light vector would then determine the brightness of that point in the material during rendering. However, as Inigo Quilez explains in one of his articles (link below), you only really need two samples of the 3D volume - taken along the incident light vector itself.
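As a rough sketch of the two-sample idea (after Quilez's directional-derivative trick; names and constants are illustrative): sample the material's density at the shaded point and a short distance along the light vector, and use the difference as the diffuse term instead of estimating a full gradient normal.

```glsl
uniform sampler3D u_material;   // procedural 3D material density field
uniform vec3      u_lightDir;   // normalized direction toward the light, in the material's texture space

float materialDiffuse(vec3 p)
{
    const float eps = 0.02;                                    // short step along the light direction
    float here   = texture(u_material, p).r;
    float toward = texture(u_material, p + u_lightDir * eps).r;
    // Less density toward the light means the point is exposed to it; more means it is occluded.
    return clamp((here - toward) / eps, 0.0, 1.0);
}
```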


The sky is like techno water, slowly drifting and undulating like pool water caustics.

Even with a new GPU there's room for optimization, and the plan now is to refactor the ultra-basic post-processing system I wrote a while back to perform a basic bloom glow shader, and properly integrate everything. The sky is procedurally generated by its own fragment shader, and could stand to be downscaled by a factor of two, or possibly four, rendered to a texture, and then upscaled to the full view size. This is also the plan for the screen-space reflections, which are ironically slower with a higher jitter factor (the jitter is used to disguise banding artifacts by introducing a sort of dither). It's almost faster to just use a smaller raytrace step increment than it is to use a larger one and hide the banding with jitter. It's quite possible that this is specific to my GPU. In either case, some kind of antialiasing/blur post-processing shader is in order.
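For reference, the jitter amounts to something like this sketch (illustrative constants, not the engine's actual shader): each pixel offsets its ray start by a pseudo-random fraction of one step, trading coherent bands for dither-like noise.

```glsl
// A common screen-space hash used to jitter the ray start per pixel.
float jitter(vec2 fragCoord)
{
    return fract(sin(dot(fragCoord, vec2(12.9898, 78.233))) * 43758.5453);
}

// Inside the SSR ray march, the jittered start would look roughly like:
//   vec3 p = rayOrigin + rayStep * jitter(gl_FragCoord.xy);
//   for (int i = 0; i < MAX_STEPS; ++i) { p += rayStep; /* compare against the depth buffer */ }
```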

Another thing I am exploring is the generation and use of a max-z quad-tree, where the depth buffer is successively reduced into higher and higher mipmap levels that each represent the nearest-most point of the region beneath them. Raytracing then occurs through the resulting quad-tree. But resorting to simply rendering SSRs at 1/4 the area of the full-resolution screen seems like it will speed things up to the point where constructing a depth-buffer quad-tree would be slower - unless I were to utilize it for more than just SSRs (screen-space reflections).
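One reduction pass of such a hierarchy might look like this minimal sketch (illustrative; whether min or max represents "nearest" depends on the depth convention used):

```glsl
#version 330
uniform sampler2D u_prevDepthMip;   // the previous (finer) mip level of the depth hierarchy
out vec4 fragColor;

void main()
{
    // Each output texel keeps the extreme depth of the 2x2 block beneath it,
    // so the ray march can reject whole screen regions with a single sample.
    ivec2 src = ivec2(gl_FragCoord.xy) * 2;
    float d0 = texelFetch(u_prevDepthMip, src,               0).r;
    float d1 = texelFetch(u_prevDepthMip, src + ivec2(1, 0), 0).r;
    float d2 = texelFetch(u_prevDepthMip, src + ivec2(0, 1), 0).r;
    float d3 = texelFetch(u_prevDepthMip, src + ivec2(1, 1), 0).r;
    fragColor = vec4(max(max(d0, d1), max(d2, d3)));
}
```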

A few other things on the todo list: team-assigned 3D materials, whereby different areas of the world can have different 3D texture materials ascribed to them so as to more clearly depict team territories. Alongside that, I am toying with the idea of allowing game server admins to choose different procedural algorithms for each team, so that different teams' areas of the world can contrast dramatically with one another. At the very least I aim to allow server admins to choose from different world-generation algorithms that would dictate what the entire world is like.


More visitors coming from www.itch.io itself lately, as opposed to old Reddit/Gamedev posts.

Anyway, I noticed that I've been getting a trickle of Twitter followers, when I normally never even look at Twitter, so I've begun tweeting more. I also noticed that on itch.io both my developer page (which shows my three projects on there) and the Bitphoria page are getting more traffic from itch.io itself - whereas in the past the traffic came almost entirely from Reddit posts and gamedev forums where I shared a link. It would seem that Bitphoria has reached some kind of low-level critical mass for it to now be more prominent in search results. So that's neato.



Links:


9.02.2016

v1.00 Public Beta Testing


The biggest Bitphoria game in history.

Looks like some people have started taking interest in Bitphoria. There are a lot of kinks to work out; some people are having issues starting a game and having their friends join, and I'm looking into it. Lots of little things are being figured out: players can be hard to see, chat messages disappear too quickly, the server I'm running crashed in the middle of the night while two people were playing, the AI guys are more aggressive than they need to be, etc.

The end result is inevitably a new release, v1.01, which will have a new network protocol version. The master server only relays game servers that are running the same network protocol as your version of Bitphoria. I will be updating my game server to v1.01 when it's released, so people who are still playing with v1.00 will not be able to see it or play on it. An auto-update feature will be implemented down the road; for now my focus is on the engine.

It was also suggested that an automated installer would make it easier for some people to start playing Bitphoria. I already have a process in place for achieving this and wasn't planning on utilizing it until Bitphoria was out of beta, but I have decided to include an automated installer with the next release.

I estimate v1.01 to be released some time next week, maybe sooner.

Thanks to everybody who downloaded the current version and has been playing with it. This has been more fun than I could have imagined :)

8.30.2016

Bitphoria v1.00 Beta Released

Pre-release early screenshot of Bitphoria.
Bitphoria has been released. It's in public beta, and this is the very first (aka 'worst') version ever. Start games (don't forget to forward your UDP port if behind a router) and play with others. Dive into the scripting system and make whatever your heart desires. Let's see what Bitphoria can do.

An engine guide is in the works now, to help users better understand the various console variables and how to make full use of them as a sort of 'power user'. I'm sure anybody with their wits about them could figure out most of them just by surfing the console, typing a few characters, and pressing 'tab'. Compared to most engines with a console, the confusion will lie in the fact that the vast majority of console commands are for scripting, and are not intended to be executed while in a game or while idling in the main menu system.

If you're the type of person who likes to get into new things, unafraid, and share your findings with others, then by all means download Bitphoria, start screwing around, and YouTube your experiences. Share this thing with the world. If you could notify me of your intention to release a video of your time with Bitphoria, that would be great - just so I can follow along and gain some insight as to what I should work on or do differently.