
7.05.2017

Long See no Time


It's been five months since my last blog post! In all actuality, I simply burned out there for a while. I did finally manage to upgrade my CPU from a dual-core to a quad-core, allowing much better testing and enhancement of Bitphoria's multi-threading system. Many things have been added to Bitphoria since my last post.

I finally decided to add in a world-wide fluid dynamics simulation system, with LOD to minimize computation. This effectively serves as a sort of 'windmap' for the world, and allows objects to drag the 'air' around, pulling other objects, dynamic meshes, and particles along with them. Entities can also cause a momentary change in pressure at their location, for blast or black-hole vortex style effects that actually affect the smoke and entities surrounding them. Rockets can now leave swirling trails of smoke, and cause particles to wisp around as they zoom through a cloud of them. It's pretty neato, but purely a superficial effect. It's something I've wanted to implement in Bitphoria since I first sat down and sketched out ideas I wanted to see, before I even started writing the engine.
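To make the idea concrete, here's a minimal sketch of what a coarse 'windmap' cell grid might look like, ignoring the LOD aspect. This is not Bitphoria's actual code - the grid size, function names, and falloff are all invented for illustration:

```c
#include <assert.h>
#include <math.h>

#define GRID 16
/* Hypothetical coarse "windmap": one velocity vector per cell. */
static float wind[GRID][GRID][GRID][3];

/* An entity moving through a cell drags the local air toward its velocity. */
void wind_drag(int x, int y, int z, const float vel[3], float drag)
{
    for (int i = 0; i < 3; i++)
        wind[x][y][z][i] += (vel[i] - wind[x][y][z][i]) * drag;
}

/* A pressure pulse pushes air radially away from (or, with negative
 * strength, toward) a cell - the blast / black-hole vortex effect. */
void wind_pulse(int cx, int cy, int cz, float strength)
{
    for (int x = 0; x < GRID; x++)
    for (int y = 0; y < GRID; y++)
    for (int z = 0; z < GRID; z++) {
        float dx = x - cx, dy = y - cy, dz = z - cz;
        float d2 = dx*dx + dy*dy + dz*dz;
        if (d2 < 1e-6f || d2 > 9.0f) continue;   /* limited radius */
        float s = strength / d2;                 /* falloff with distance */
        wind[x][y][z][0] += dx * s;
        wind[x][y][z][1] += dy * s;
        wind[x][y][z][2] += dz * s;
    }
}
```

Particles and smoke would then simply add the local cell's wind vector to their own velocity each frame.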




I've also added signed distance function primitives to the procedural modeling stuff, for easily constructing polygonal, wireframe, and point cloud geometries for entities - CSG style. This has made it much easier to model more interesting entity appearances, and scripters are no longer forced to plot individual vertices to create triangles.

Instead of explaining it much further, I'll just show you a bunch of development screenshots from when I was working on the SDF modeling stuff:

When things were first starting to work: plotting points for individual SDF voxels - sized according to the 'density' of the voxel they represent.

Generating a point cloud only on the surface, where the voxels are zero distance from the surface of the final distance field - produced from a cube, with a ring of spheres subtracted and a single green sphere added to the top.

Utilizing the same voxel triangulation code to yield triangle mesh geometry from a SDF model.

In my attempt to add the ability to smooth an isomesh per the distance field, a lot of things were going wrong (these are supposed to just be smooth spheres).

A sphere that hasn't been smoothed, showing the base isomesh.

Some different tests: a smoothed cone (with crooked tip, this has since been fixed), a purple sphere with a pink capsule merged/added to it, as well as a green sphere with an orange capsule 'blended' into it (note the smooth transition between surfaces) while also being smoothed properly. You can smoothly blend/merge primitives while still producing a coarse isomesh with 45 degree angles.

A ring of spheres blend-merged and the result smoothed over.

Testing a different player appearance, using various things, while having the shield powerup - a set of overlapping point cloud spheres that all spin independently by rolling around against the surrounding world surfaces while the player moves.

A screenshot of the last of the testing/development game before I started scripting a new base game from scratch.

A bunch of other things have been added to Bitphoria as well, but I haven't touched it in at least 2 months and it's currently not something I'm particularly motivated to pursue. After making all the progress that I did on Bitphoria a few months ago I began scripting a base/default game that I would then use as a template for creating various game modes to release with the next version.

The base game is a sort of deathmatch game with simple AI drones and obstacles and hazards for players to negotiate while battling it out with one another, which always seemed more interesting to me than just raw PvP deathmatch gameplay. Anyway, I just didn't find working on it rewarding anymore. I can come up with hundreds of little ideas and mechanics, knowing how to go about implementing them by exploiting the capabilities of Bitphoria's scripting system, but it just doesn't excite me like it did when I was younger. Back then I was working with the Quake engine. I'm sure scripting stuff in Bitphoria would be a blast for many other young spry minds out there, but I'm not of a mind to seek those kids out, even though they were what I was thinking of when I designed the whole thing.


Showing the effectiveness of Bitphoria's new FXAA post-processing shader at smoothing out super-jaggy aliasing on stair-step edge pixels. One of the many new things added to Bitphoria over the last while.

My goal has always been to make enough interesting stuff to show off Bitphoria's capabilities as a platform for creating, sharing, and playing custom games with other people - and hopefully have it be inspiring enough to motivate people to engage their own creative minds within the paradigm Bitphoria's scripting system provides. Well, as it stands, this probably won't be happening any time soon. I've had to make my peace with this fact over the last few months. I've been struggling to allow myself to work on anything else or pursue any of my other passions without berating myself for letting Bitphoria development go idle. My resolve has been to look at this situation knowing that I owe it to myself to take care of my own mental well-being and *let myself* pursue other projects and passions, because nothing good comes of sitting around, not working on anything else, purely out of guilt.

Yes, I wish that I could knock out Bitphoria in "record-manic-stay-up-all-night-not-caring-about-anything-else-in-life" time, which was naively my plan from the beginning, but it's just not in the stars. Am I lazy? Eh... But if that were the case, I don't think I'd be feeling like there's not enough time in the day to work on what I *do* want to work on, especially after having overcome the self-inflicted shame I'd been enduring. I was tempted to release Bitphoria completely FOSS - just dump the code on the interwebs and abandon any and all aspirations of trying to monetize it. I'd just be giving all of my work away for free. Alas, me lady talked me out of it, and explained that I should just let it sit until I was ready to come back to it. So that's the plan.


Bitphoria in its current form.

I've always been excellent at arcane technical pursuits and hacking away at them into the night, even now at thirty years old. But as far as actually designing a fun game or dealing with PR and promotion are concerned, I am seriously lacking in drive and/or spirit. With recent developments I've become more inclined to focus on keeping my creative spirits high and working on what I love to work on: tackling difficult algorithmic problems. I've pretty much resigned myself to being the Wozniak to someone else's Jobs. I haven't met my 'Jobs' yet, and I hope I do someday, because I think that I have a lot to offer and share with the world, but lack the ability to really get it out there.


The good old days.

In my relative slump I did manage to muster the gumption to start playing around on my CNC once again - the product of yet another 'abandoned' project that I had begun feeling guilty about for allowing to sit untouched and unloved for so long. It's really nice to have something to work on with my hands though :D

I've since explored a few ideas and have somehow finally convinced my wife that making stuff on the CNC is a financially worthwhile pursuit - one that doesn't require dealing with nearly as many customers as our current crafting products do through our online business. We could be selling a few big-ticket, high-end CNC-milled items rather than many smaller, cheaper decorative ones. In other words, we could be making more money for less work, and dealing with fewer customers, if we both transitioned our business toward producing large quality works as a team.

It would definitely be nice if we could spend more time together again like the old days, and I see CNC projects as being the nearest of several keys to unlocking that future for us, but it must be as a team. I don't believe she's the Jobs to my Woz, but I do believe we have the potential to achieve great things together. It has worked thus far with our online business, and I feel that she's fully capable of meeting me half-way while we engage a new medium together.

I've also been sketching out and outlining some ideas for an old project my late father had proposed and actively tried to encourage me to pursue. I'll save the details on that for a later blog post.



2.06.2017

Post-Processing Shaders and Effects




Bitphoria has had a simple post-processing effects setup in place for some time now. It comprised a single framebuffer object with a color and depth texture attached, which a single fragment shader on a full-screen quad would read in order to generate a simple mipmap blur and contrast boost. The aim was just to achieve a basic effect beyond what rendering the scene straight to the screen/window could achieve. This setup could not perform multiple post-processing passes, or effects that required multiple shader stages.

I've always had the nagging sensation that something's visually missing from Bitphoria, beyond the general lack of coherent visuals - which I attribute to not having sat down and actually designed a cohesive appearance to a game via the engine's scripting functionality. With the ability to add various post processing effects to Bitphoria I feel that a more coherent visual aesthetic can be achieved beyond what one simple post processing shader offered.

For the new post-processing effect system I wanted to be able to easily add more shaders, and route their inputs/outputs between them. At the beginning of each frame a 'raw framebuffer' is bound that has three color texture attachments - RGBA color, XYZW reflection vectors, and an HSL 'overbright' emission texture - plus, of course, a depth texture. The reflection vectors texture is generated by all of the shaders used to render world geometry, procedural models, etc. If a surface is not supposed to be reflective, this is indicated by a reflection vector facing the viewpoint (Z = 0). The HSL overbright emission texture is for the 'spectral trails' shader effect, which creates a sort of quickly-fading 'rainbow' trail behind the objects that write to that texture when rendered.

Objects leaving spectral trails on the screen. This can be annoying and so has been toned down and is mostly used sparingly for momentary effects like explosions.

At the end of the frame the postfx system is executed, which then goes ahead and renders a fullscreen quad for each registered shader effect - binding all necessary textures and setting all prescribed GLSL uniforms for each.

Creating an effect stage involves loading its shader, creating an FBO, and attaching a single color texture to the FBO to be used as the effect's "output". A depth texture is also created and attached to the FBO only for what's called "FBO completeness", even though the full-screen quads do not convey any depth information. Still, a shader effect *could* generate depth values that would be written to the depth textures using gl_FragDepth, and that depth texture could be utilized by succeeding effect stages.

From there shader uniforms can be added from the engine with "pfx_parm()", where ints, floats, vec3s and mat4s (4x4 matrices) can have a pointer to their memory stored and used to set any necessary values the effect's shader may require.


Bitphoria's current postfx shader pipeline. Green boxes are texture outputs from effect shaders.


Similarly, "pfx_input()" is used for referencing other effects, allowing their FBO color or depth textures to be bound when the current effect is being rendered, for routing the output of stages as inputs into others. Some effect shaders will use the FBO texture output of previous stages (or the output of later stages from the previous frame) as inputs for certain effects.
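To illustrate the routing idea, here's a toy stage registry in C. The names here (pfx_add, pfx_route) are invented stand-ins - the real pfx_parm()/pfx_input() calls bind GL textures and uniforms, while this sketch only models the data flow, with each stage "reading" the outputs of the stages routed into it:

```c
#include <assert.h>

/* Hypothetical sketch of a postfx stage registry.  Each stage owns an
 * "output" (standing in for an FBO color texture) and a list of inputs
 * referencing other stages' outputs. */
#define MAX_STAGES 16
#define MAX_INPUTS 4

typedef struct {
    const char *name;
    int   inputs[MAX_INPUTS];  /* indices of stages whose output we read */
    int   n_inputs;
    float output;              /* stand-in for the stage's color texture */
} pfx_stage_t;

static pfx_stage_t stages[MAX_STAGES];
static int n_stages = 0;

int pfx_add(const char *name)            /* register an effect stage */
{
    stages[n_stages].name = name;
    stages[n_stages].n_inputs = 0;
    return n_stages++;
}

void pfx_route(int stage, int source)    /* source's output feeds stage */
{
    stages[stage].inputs[stages[stage].n_inputs++] = source;
}

/* Run all stages in registration order.  Each stage here just sums its
 * inputs, where a real stage would bind the routed textures, set its
 * uniforms, and draw a fullscreen quad into its own FBO. */
void pfx_run(float raw)
{
    for (int i = 0; i < n_stages; i++) {
        float acc = (stages[i].n_inputs == 0) ? raw : 0.0f;
        for (int j = 0; j < stages[i].n_inputs; j++)
            acc += stages[stages[i].inputs[j]].output;
        stages[i].output = acc;
    }
}
```

Because stages run in order but can reference any registered stage, a stage that reads a *later* stage's output naturally picks up the previous frame's result - the feedback case mentioned above.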




1.22.2017

Todo List Prioritization, Screen Space Reflections, and a Long Break


After Bitphoria's initial release, I spent a few weeks with those who participated in testing it out, and discovered a few bugs and things that desperately needed changing/fixing. I also started working on fleshing out the three default games that come with it, and in doing so it became clear what other features the scripting system was lacking. As a result of the public beta release, the todo list grew quickly. Even though I was knocking out new features and bug fixes in those first few weeks, the list continued to grow faster than I could keep up.


Bitphoria's accumulated "changes.txt" since v1.0a release (formerly 'v1.00b').


I'd since decided to take a break. The few months leading up to the release date I'd set for myself were one long power burn-out session, and the break was meant to strengthen my resolve to finish Bitphoria as a project. The end goal is to produce a product that could hopefully generate an extra income for my family - my wife and two baby girls. I owe it to them to see this thing through after subjecting them to all of the time I have spent working on it.

The master server has been running this whole time - well, at least 95% of the time - and I see that people have been looking for online games (of which there are usually none) and then starting their own to check out Bitphoria. I'm not going to continue actively promoting Bitphoria until things are further along, but if other people want to show Bitphoria to others and share it, I'm not going to stop them. My goal here is a sort of soft launch, whereby people can start playing with Bitphoria while it's being developed, and the 'die-hard-core' fans are the ones who keep it alive and seed a community and fanbase. Sounds good on paper; we'll see how it all pans out in practice. In the meantime I'm sticking with incremental public alpha/beta releases on itch.io until the scripted games are all fleshed out and everything's at least had a once-over - at which point I think I'll invest in a Steam Greenlight strategy, or maybe even a crowd-funding campaign for a virtually finished game! (That *is* what it takes nowadays to crowdfund a game, right?) At any rate, there's no point in an all-too-premature push for exposure, which has backfired on many an indie game project due to the negative exposure generated by the poor quality of an unfinished product. That can permanently taint and mar a game's reputation regardless of how the finished product turns out down the road.

At this point I've started diving back into Bitphoria's code and become reacquainted with the todo list and where things are at, and it's become clear that there's still a long way to go if I want this thing to be one-hundred-percent uncompromising. However, I don't think I have it in me to pull that off. So, the next step was organizing the todo list by priority. What things are absolutely essential to the core idea of Bitphoria? What things can it go without? What things can I add in later that are aesthetic-only? What ideas are pipe dreams? This is where the project is currently at. By my estimation, Bitphoria should be on sale in the summer.


I forgot how awesome it was to upgrade. Been out of the hardware game for over a decade.

The todo list is fully consolidated, after another sleepless night, and I have my work cut out for me. I also recently bought myself a GPU for Christmas, a 4GB XFX Radeon RX 460 - literally the first GPU I have gone out of my way to acquire in 15+ years. Half the reason I bought it was to play DOOM. Incidentally, the last GPU I made it a point to acquire was a GeForce3, back in 2001, so that I'd be ready to play Doom3, which ended up not being released for three more years. The new GPU has afforded me the opportunity to get crazy with Bitphoria's graphics, and makes it possible to record at least 720p gameplay footage at 30fps. The original plan with Bitphoria was to design a cool looking game/engine that would run on older and/or budget hardware at playable speeds - where 'playable' in my gamer opinion is 60+ frames per second, minimum.


Now for a blurring post-process shader to hide the jitter-induced dithering!

I just finished implementing a decent screen-space reflection shader, written by hand from scratch, based on my own ideas about how to get the information where it needs to be in order to make it happen. Instead of having multiple FBO textures storing surface normal, surface position, etc., I am only storing the reflection normal calculated in the vertex shader and interpolated across reflective surfaces. This is in contrast with how most implementations calculate the reflection normal in the fragment shader, using the fragment's normalized camera-space coordinate and normal - both output by the reflecting surface's shader into (typically) separate FBO textures.


Camera-space reflection vector, depicted as RGB representing XYZ -1/1.

 In Bitphoria, my idea was to pre-calculate the reflection vector in the vertex shader to be interpolated across surface triangles, and then pack it into the red-green channels of a single RGBA texture from the fragment shader using a spheremap transform, which is much more compact than packing both a coordinate *and* surface normal to accomplish the same result. However, I still need the coordinate of the fragment to perform the actual raytrace, which happens to be reconstituted from the fragment's W coordinate, stored in the alpha channel of the same texture. With both the W coordinate and depth buffer texture, as well as the pre-calculated inverse of the projection matrix that's used to render the scene, I can very cheaply calculate the camera-space coordinate of the fragment.


3D texture materials give surfaces a parallax depth to them unlike anything you've seen in a game.

I also gave the sky shader a much-needed redesign, along with tweaking the 3D material rendering shader to literally shade the depths of the procedural 3D materials. The naive approach to shading a 3D material would be to sample a 2x2x2 area and estimate the overall density gradient normal; the dot product of that normal and the incoming light vector would then determine the brightness of that point in the material. However, as Inigo Quilez explains in one of his articles (link below), you only really need two samples of the 3D volume - taken along the incident light vector itself.


The sky is like techno water, slowly drifting and undulating like pool water caustics.

Even with a new GPU there's room for optimization, and the plan now is to refactor the ultra-basic post-processing system I wrote a while back to perform a basic Bloom glow shader, and properly integrate everything. The sky is procedurally generated by its own fragment shader, and could stand to be downscaled by a factor of two, or possibly four, and rendered to a texture that is then upscaled to the full view size. This is also the plan for the screen-space reflections, which are ironically slower with a higher jitter factor (the jitter is used to disguise the banding artifacts by introducing a sort of dither). It's almost faster to just use a smaller raytrace step increment than to use a larger one and hide the banding with jitter - though it's quite possible that this is specific to my GPU. In either case, some kind of antialiasing/blur post-processing shader is in order.

Another thing I am exploring is the generation and use of a max-z quad-tree, where the depth buffer is successively reduced into higher and higher mipmap levels whose texels each represent the nearest-most point beneath them, with the raytrace then stepping through the resulting quad-tree. But resorting to simply rendering SSRs at 1/4 the area of the full-resolution screen seems like it will speed things up to the point where constructing a depth-buffer quad-tree would actually be slower - unless I were to utilize it for more than just SSRs (screen-space reflections).
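A single reduction step of such a depth pyramid is simple enough to sketch in C. (Hedged: this assumes the usual convention that smaller depth values are closer; for a reversed-Z buffer the min becomes a max, which may be where the "max-z" name comes from.)

```c
#include <assert.h>

/* Reduce a depth buffer one mip level: each output texel keeps the
 * nearest depth of its 2x2 source block.  Applying this repeatedly
 * builds the pyramid a hierarchical raytrace can step through - if a
 * ray segment never comes closer than a cell's nearest depth, the
 * whole cell (and everything under it) is skipped at once. */
void depth_reduce(const float *src, int sw, int sh, float *dst)
{
    for (int y = 0; y < sh / 2; y++)
        for (int x = 0; x < sw / 2; x++) {
            float a = src[(y*2)     * sw + x*2];
            float b = src[(y*2)     * sw + x*2 + 1];
            float c = src[(y*2 + 1) * sw + x*2];
            float d = src[(y*2 + 1) * sw + x*2 + 1];
            float m = a < b ? a : b;
            if (c < m) m = c;
            if (d < m) m = d;
            dst[y * (sw/2) + x] = m;
        }
}
```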

A few other things remain on the todo list, like team-assigned 3D materials, whereby different areas of the world can have different 3D texture materials ascribed to them so as to more clearly depict team territories. Alongside that, I am toying with the idea of allowing game server admins to choose a different procedural algorithm for each team, so that different teams' areas of the world can contrast dramatically with one another. At the very least I aim to allow server admins to choose from different world-generation algorithms that dictate what the entire world will be like.


More visitors coming from www.itch.io itself lately, as opposed to old Reddit/Gamedev posts.

Anyway, I noticed that I've been getting a trickle of Twitter followers, when I normally never even look at Twitter, so I've begun tweeting more. I also noticed that on itch.io, both my developer page (which shows my three projects) and the Bitphoria page are getting more traffic from itch.io itself - whereas in the past the traffic came almost entirely from Reddit posts and gamedev forums where I'd shared links. It would seem that Bitphoria has reached some kind of low-level critical mass, enough to be more prominent in search results. So that's neato.



Links: