DEFTWARE's DEVBLOG: Chronicling the life and times of a pseudo-visionary jack-of-all-trades.<br />
<h3>
Work, Play, and the (Oculus) Rift that Divides (2018-02-22)</h3>
<i>I currently have three ongoing projects:</i><br />
<b><i><br /></i></b>
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiEArvhiWbO0e2PhsvQyUhSWhEoZG4LIrgi5sGAA7Nz9bhoY-iiGyDZzCAkB6LirZh3yT5dnGrwtYu4_sp8WoUE9lGOne9j3u_X4_okl41cPuAfGrYLnzAWXpvK0bixuFidV-SWBvhrG98/s1600/pixelcnc_blogger.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="880" data-original-width="1600" height="350" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiEArvhiWbO0e2PhsvQyUhSWhEoZG4LIrgi5sGAA7Nz9bhoY-iiGyDZzCAkB6LirZh3yT5dnGrwtYu4_sp8WoUE9lGOne9j3u_X4_okl41cPuAfGrYLnzAWXpvK0bixuFidV-SWBvhrG98/s640/pixelcnc_blogger.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Creating toolpaths in PixelCNC using its own logo as image input.</td></tr>
</tbody></table>
<br />
<h3>
PixelCNC:</h3>
My most recent endeavor, PixelCNC, was started at the end of summer 2017. It has since been released in an alpha early-access state, with a few big items left on the todo list in order to get it where I really want it to be.<br />
<br />
PixelCNC relies on image-based operations to generate CNC toolpaths from image input. A massive speedup in toolpath generation - potentially orders of magnitude - could be had by moving the image processing code to the GPU. With nearly 20 years of experience with OpenGL this isn't much of a hurdle, at least as far as planning it out and solving the problem itself. The most difficult aspect is committing the time and mental effort.<br />
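To give a flavor of what "image-based operations" means here: a common technique in image-based CAM (this is not PixelCNC's actual code, just a minimal sketch of the standard inverse-tool-offset idea, with hypothetical names like `ball_nose_profile`) treats the image as a heightmap and finds, for each pixel, the lowest Z the tool tip can reach without the cutter gouging the surface anywhere under its footprint:

```python
import math

def ball_nose_profile(radius, step):
    """Precompute the tool footprint: for each (dx, dy) cell offset,
    how far the cutter surface sits above the tool tip there."""
    offsets = []
    cells = int(math.ceil(radius / step))
    for iy in range(-cells, cells + 1):
        for ix in range(-cells, cells + 1):
            d = math.hypot(ix * step, iy * step)
            if d <= radius:
                above_tip = radius - math.sqrt(radius * radius - d * d)
                offsets.append((ix, iy, above_tip))
    return offsets

def tool_tip_height(hmap, x, y, offsets):
    """Lowest Z the tool tip can sit at over (x, y) without the cutter
    gouging the surface anywhere under its footprint."""
    h, w = len(hmap), len(hmap[0])
    best = float("-inf")
    for ix, iy, above_tip in offsets:
        px, py = x + ix, y + iy
        if 0 <= px < w and 0 <= py < h:
            best = max(best, hmap[py][px] - above_tip)
    return best
```

Because every pixel is computed independently, this kind of kernel maps naturally onto a GPU fragment or compute shader, which is where the big speedup would come from.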
<br />
A larger and less clearly defined goal would shift PixelCNC toward the realm of image editing and manipulation. Adding the ability to create and edit images as a whole new program mode would remove the need for a separate dedicated image editor just to produce input for PixelCNC, further streamlining the workflow for artistic CNC work - which is, after all, the entire point of PixelCNC. I have a few ideas about what an image editing mode would comprise, including some things I've never seen in an image editing program that lend themselves really well to sculpting depthmaps easily and intuitively - and that would build on the image processing functionality I've already written.<br />
<br />
One more decently sized feature I'd like to add is an auto-update system, which would be introduced once PixelCNC enters beta, and would free up users' time so that they no longer need to manually download and install updated versions of PixelCNC as they are released.<br />
<br />
There's a bunch of other little things on PixelCNC's to-do list that aren't strictly on the critical path and just require time and effort - less consequential but still useful or handy features. A few, to give you an idea:<br />
<br />
<ul>
<li>Functionality to detect when a series of toolpath segments fits a circular arc within a given tolerance and replace them with G02/G03 circular arc motions.</li>
<li>User-defined presets for CNC operations, so users can quickly create an operation they use frequently without having to edit each parameter and rebuild operations from scratch.</li>
<li>Defining rectangular/cylindrical stock shapes for confining generated toolpaths to. This is trickier than it sounds, simply because of how PixelCNC works.</li>
<li>Inlay generation mode, which would build on the existing medial-axis carving operation to allow the creation of a negative carving - whatever operations that would entail - to perfectly fit over an existing medial-axis carve operation.</li>
<li>Automatic G-code export by tool, which would build on the existing ability to toggle which operations are included in exported G-code, so that users could easily create CNC programs which will perform all operations concerning each tool individually.</li>
<li>Polygon Operation: similar to the spiral operation, except the spiral would be an N-sided polygon so that toolpaths could be concentric triangles, squares, hexagons, and so on.</li>
<li>Mesh export: allowing users to export the heightfield meshes that PixelCNC generates for visualization and certain CAM algorithms.</li>
</ul>
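To illustrate the arc-detection item above: one common approach (not necessarily what PixelCNC will do) is to fit a circle through the first, middle, and last points of a run of segments, and accept the run as an arc only if every point lies within the tolerance of that circle. A minimal sketch, with hypothetical helper names:

```python
import math

def circle_through(p1, p2, p3):
    """Circumcenter and radius of the circle through three points,
    or None if the points are nearly collinear."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax*ax + ay*ay) * (by - cy) + (bx*bx + by*by) * (cy - ay)
          + (cx*cx + cy*cy) * (ay - by)) / d
    uy = ((ax*ax + ay*ay) * (cx - bx) + (bx*bx + by*by) * (ax - cx)
          + (cx*cx + cy*cy) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def fits_arc(points, tol):
    """True if every point lies within tol of the circle through the
    first, middle, and last points of the run."""
    fit = circle_through(points[0], points[len(points) // 2], points[-1])
    if fit is None:
        return False
    (cx, cy), r = fit
    return all(abs(math.hypot(x - cx, y - cy) - r) <= tol for x, y in points)
```

Once a run passes, the sign of the cross product between the first and last segment directions tells you whether to emit a G02 (clockwise) or G03 (counter-clockwise) move.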
<br />
Another thing I'm planning and setting up for is a demonstration video for PixelCNC, showing the entire process: going from an image to a PixelCNC project, defining tools, setting up operations, using the simulation mode, exporting the G-code, loading it into a CNC controller, and actually cutting material. This could also be cut down into a short, concise promotional video that people could pass around to spread the word about PixelCNC. I'm still deciding exactly what to demonstrate, because running the CNC is always a commitment of energy, time, and raw material, so I want to be sure of what I'm making before I go ahead with the expenditure.<br />
<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgIJA2yxsdLZYdOXcGDXLmvVHjl1nZVnjA0a9NisUWljUdrm3MbZlkYMjHNUuDMzcS8ExP1hWHOYgtBdQrZFDL0VtZ2kYXlU8U5tRA47xFaApDXnI4_Pie2jH96K8pl1IDDoRqAQtJiIWk/s1600/holocraft_logo.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="640" data-original-width="640" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgIJA2yxsdLZYdOXcGDXLmvVHjl1nZVnjA0a9NisUWljUdrm3MbZlkYMjHNUuDMzcS8ExP1hWHOYgtBdQrZFDL0VtZ2kYXlU8U5tRA47xFaApDXnI4_Pie2jH96K8pl1IDDoRqAQtJiIWk/s640/holocraft_logo.jpg" width="640" /></a></div>
<br />
<h3>
Holocraft:</h3>
A less recent but related project is Holocraft, a much more esoteric CNC pursuit whose program is in a less user-friendly state of partial disrepair. I could fix it up a bit and begin selling it as well, which was the tentative plan at one time. The real plan was to sell actual holograms, but lacking access to a CNC capable enough of realizing that vision put a relative end to it for the time being. An old friend I recently got back in touch with has been setting up a hackerspace which possesses machines that could make my original dream a reality.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgt7NCDFhVBgIJAIS23aCGUU16fMXxisEXCrjsX3aAcmu6XsQAsSkYifHcySLhbETQIaknsNznpxi3jijLBcKNUJ4HvPcLYMrqoS1g6h5xOrHHDMn9XmfwKYete1IepUJ1m_3O9AYRgl5s/s1600/holocraft2.0_blogger.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="685" data-original-width="924" height="470" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgt7NCDFhVBgIJAIS23aCGUU16fMXxisEXCrjsX3aAcmu6XsQAsSkYifHcySLhbETQIaknsNznpxi3jijLBcKNUJ4HvPcLYMrqoS1g6h5xOrHHDMn9XmfwKYete1IepUJ1m_3O9AYRgl5s/s640/holocraft2.0_blogger.jpg" width="640" /></a></div>
<br />
<br />
Another idea is to incorporate Holocraft into PixelCNC instead, as an operation the user would generate a toolpath for on a loaded image. The trick there is that Holocraft specifically relies on 3D geometry input, from which it generates toolpaths for forming reflective optics that recreate some representation of that geometry when viewed under a point light source.<br />
<br />
Decisions, decisions..<br />
<br />
<br />
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU_OiM-s8fzk2gFzcxg_suGW51NNLpuP_Z6udWk4Dg6c5PPN3ngqd_YBez1cTsvOSw7LfVlMM7TNxuaTmySh1EL6RoFY7VbOHSYXpDyVcPJOazu6oWcBs2fvXmeRFio3san8KCZTHGMzE/s1600/screen161.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU_OiM-s8fzk2gFzcxg_suGW51NNLpuP_Z6udWk4Dg6c5PPN3ngqd_YBez1cTsvOSw7LfVlMM7TNxuaTmySh1EL6RoFY7VbOHSYXpDyVcPJOazu6oWcBs2fvXmeRFio3san8KCZTHGMzE/s640/screen161.jpg" width="640" /></a></div>
<br />
<h3>
Bitphoria:</h3>
My biggest project - at least the one I've invested the most time and energy into over recent years - is my game/engine Bitphoria. It's the culmination of a lifetime of learning all things gamedev, and of virtually every novel game idea I've ever had. I take pride in the fact that it's written from scratch and does things differently, but it's not quite "there" as far as visual polish and aesthetic are concerned. The actual 'game' aspect is largely incomplete, but the engine is basically ready to be made into a wide array of games. Due to recent developments I've begun tinkering with Bitphoria again, in between incremental PixelCNC updates, with a newfound vision for what it's meant to be.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><img height="360" src="https://pbs.twimg.com/media/DL34rU6VAAARpzA.jpg:large" style="margin-left: auto; margin-right: auto;" width="640" /></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Oculus Go being announced at OC4 by Zuckerberg hisself.</td></tr>
</tbody></table>
<br />
By 'recent developments' I mean that I finally caught the VR bug back in October, when the Oculus Go was announced during the OC4 event down in San Jose (it was San Jose, right?). Something just clicked, and I decided I would acquire a Go as soon as humanly possible and make 2018 the year I dive into VR development after wrapping up PixelCNC. I'm convinced that the lower price point - and dropping the requirement of owning a high-end smartphone, PC, or game console - will prove fruitful for the VR industry as a whole. More people will suddenly find low-end VR affordable, which means many more people exposed to at least some form of quality VR - not some shoddy makeshift excuse for VR like Google Cardboard (at least when used with phones that have poor sensor quality, or clogged-up Android systems that saddle apps with unbearable motion-to-photon latency). Only then will they know why VR is here to stay!<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqbuTKz7ROwNww7y8OEaHWVpyua97UaybHtBTLBY8xdC_BHa35Nm6ZSybGTrm4R1LWsI03GB_Ok-YBTcMKSbWkabmh0bnRNvj4zySSUL1a-SVUSuiIXuWQP0SiAWjbZrdDfdR_zLoiNY4/s1600/firstcuntakt.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1000" data-original-width="1600" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqbuTKz7ROwNww7y8OEaHWVpyua97UaybHtBTLBY8xdC_BHa35Nm6ZSybGTrm4R1LWsI03GB_Ok-YBTcMKSbWkabmh0bnRNvj4zySSUL1a-SVUSuiIXuWQP0SiAWjbZrdDfdR_zLoiNY4/s640/firstcuntakt.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">"First Contact", the Oculus Rift demo I tried at Best Buy.</td></tr>
</tbody></table>
<br />
After a few months of watching PC VR headsets drop in price I decided that I didn't want to limit myself to 3DOF (three degrees of freedom) or the little directional remote controller, and started looking at different headsets. Eventually I found myself at Best Buy demoing "First Contact" on the Rift. What a far cry from the DK2 I had tried at a friend's house a few years prior! The Touch controllers made a world of difference - being able to actually interact with a virtual scene a universe away, with my own hands, was unlike anything I had ever imagined.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><img height="323" src="https://static.electronicsweekly.com/wp-content/uploads/2017/01/25110149/oculus-rift-touch.jpg" style="margin-left: auto; margin-right: auto;" width="640" /></td></tr>
<tr><td class="tr-caption" style="text-align: center;">You forget you're holding controllers, and feel like you're grabbing things.</td></tr>
</tbody></table>
<br />
While my wife was ordering a new PSU on Newegg I told her to go ahead and order a Rift as well. I've been a proud owner of the Rift for a month now and have already been integrating the Oculus PC SDK into Bitphoria. There was a lot of work to be done: since the vast majority of a Bitphoria game is described in external text files, I needed to introduce some means of connecting entities to the controllers and responding to the different input states of the buttons and thumbsticks.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLmF8s39EIMdWhY0RMR6hjkqphrTTv9m7aMQJ3OQb7p4FdboQZ3fQzSmlJJGwpXTryM9gByzPpQQh-33omc0LpaV-BKafZulGwJlB46zMtw52YiANappH5jwp_JFCHs6cDzR5RI5plxaM/s1600/bitphoria_controller_bloggy.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="667" data-original-width="786" height="542" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLmF8s39EIMdWhY0RMR6hjkqphrTTv9m7aMQJ3OQb7p4FdboQZ3fQzSmlJJGwpXTryM9gByzPpQQh-33omc0LpaV-BKafZulGwJlB46zMtw52YiANappH5jwp_JFCHs6cDzR5RI5plxaM/s640/bitphoria_controller_bloggy.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This is what scripting simple VR player flying controls to Bitphoria looks like.</td></tr>
</tbody></table>
<br />
There's still a bit of work to go before it's fully integrated, at which point I can release Bitphoria as a means for people to quickly and easily script all manner of multiplayer games without any real gamedev or modding experience. Ultimately I'd like to expand on the existing game bitfile system and circumvent the scripting system altogether with a WYSIWYG game editor that lets users craft games directly in VR. Bitphoria would then be the first VR-based game-making system! No more screwing around with Unity or Unreal - just fire up Bitphoria and start building games.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><img height="360" src="https://forums.unrealengine.com/filedata/fetch?id=1353134" style="margin-left: auto; margin-right: auto;" width="640" /></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Even if you already know everything, you have to learn how *they* do it!</td></tr>
</tbody></table>
<br />
However, Bitphoria's codebase has begun showing its age. The things I wish I'd done differently are piling up and making it difficult to work inside of. So, the plan for now is to focus on making a single cool game out of Bitphoria while promoting the scripting side of things, to get people interested in making their own VR games with it. Ultimately, though, I plan to rebuild the engine from scratch - borrowing a lot of code and structure from the existing engine, but re-implementing everything more cleanly and with in-engine game editing in mind.<br />
<br />
There are several components that would require specially designed and implemented WYSIWYG interfaces for crafting Bitphoria games:<br />
<br />
<ol>
<li>Entity Types - Describes each possible entity type, serving as a sort of template. Dictates the various aspects pertaining to a game entity, such as what physics behaviors they have, what effects flags they have set, their collision volume and its size, what entity functions various logic states can trigger, any ambient/looping audio, particle emissions, appearance, etc..</li>
<li>Entity Functions - These are executable lists of instructions that produce specific entity behaviors as a result of different internal and external logic states or triggers that the engine detects through physics, player interaction, scripted timers and conditional statements becoming satisfied, and the like.</li>
<li>Model Procedures - Lists of modeling operations which produce geometry for entity appearances by plotting out points, lines, triangles, and signed-distance function primitives for modeling voxel-based geometries with constructive solid geometry conventions. These can be made to vary in a number of ways with each generated instance of a procedure, allowing entities to not appear exactly identical to others of the same type.</li>
<li>Dynamic Meshes - 3D point clouds of 'nodes' attached together using springs. Springs can be assigned procedural models to give the 'dynamesh' visual form and the appearance of multiple conjoined moving parts. Dynameshes allow entities to appear to have more dynamic physics interactions with the environment and other entities as well as allow for simple/crude animations. Otherwise entities would be restricted to appearing only as rigid static geometry.</li>
<li>Entity HUD Overlays - Procedural models assigned to entity types to display various stats and visual indicators conveying the state of the entity. Entity state and properties can drive modifiers to animate color, orientation, size, and position of the models drawn to allow for a variety of interesting HUD elements (aka 'widgets') to be drawn over an entity in a player's perspective.</li>
<li>World Prefabs - A more recent idea that I'm still toying with: world prefabs would consist of simple voxel models which the world-generation algorithm randomly places around the map per various modifier flags and statistical 'tendencies' specified for each prefab, providing some semblance of structure and design to worlds beyond what little is offered by the existing random plateaus/caves/pits. These could also have entity spawns placed in them so they can serve an actual function during gameplay.</li>
</ol>
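As a rough illustration of the dynamesh idea in item 4 - nodes attached by springs that give entities crude dynamic motion - here's a minimal Verlet-style mass-spring sketch. This is not Bitphoria's code, just the standard technique under assumed parameters (`stiffness` controls how aggressively springs relax toward rest length):

```python
def step_dynamesh(pos, prev, springs, dt, gravity=-9.8, stiffness=0.5):
    """One Verlet step for a 'dynamesh': pos/prev are lists of (x, y, z)
    node positions, springs are (i, j, rest_length) tuples."""
    # Verlet integration: new = 2*pos - prev + accel * dt^2
    new = []
    for (x, y, z), (px, py, pz) in zip(pos, prev):
        new.append([2 * x - px, 2 * y - py, 2 * z - pz + gravity * dt * dt])
    # relax each spring toward its rest length, moving both endpoints
    for i, j, rest in springs:
        dx = new[j][0] - new[i][0]
        dy = new[j][1] - new[i][1]
        dz = new[j][2] - new[i][2]
        d = (dx * dx + dy * dy + dz * dz) ** 0.5 or 1e-9
        push = stiffness * 0.5 * (d - rest) / d
        for k, delta in enumerate((dx, dy, dz)):
            new[i][k] += push * delta
            new[j][k] -= push * delta
    return [tuple(p) for p in new], pos  # (new positions, new 'prev')
```

Running this every tick, with the springs skinned by procedural models, gives the appearance of conjoined moving parts without a full animation system.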
<div>
Designing and implementing intuitive interfaces for users to define Bitphoria games with - and have immediate feedback for quick iteration/turnaround time - is a task unto itself. VR is a young medium that we're still becoming familiar with, and exploring the language of, so there's the added challenge of discovering what even works and what doesn't. I imagine VR will allow for much more intuitive interfaces than 2D does, especially when it comes to crafting 3D content. We're all still figuring it out, all developers, collectively. It's a bit of a wild west.</div>
<div>
<br /></div>
<div>
Regardless, I believe giving the average person the right tools to quickly and easily create quality interactive VR experiences will be a powerful thing. Bitphoria's scripting system is designed to reduce each possible dimension of a game to just a few simple options, but the number of dimensions and the freedom to connect all sorts of pieces together is what allows for such a vast universe of permutations.</div>
<div>
<br /></div>
<div>
Ultimately, the goal has always been to create a system somewhat akin to SnapMap for DOOM, which coincidentally follows in the same vein as my original vision for Bitphoria - a platform for people to easily create and share their own mods/games. SnapMap's idea of a WYSIWYG editor trumps my wearisome plan for users to work in (relatively) tedious script files. Admittedly I was somewhat awe-struck and simultaneously irked when id Software unveiled DOOM at E3 2015, after I had already been working on Bitphoria for a year. "They stole my idea!" I'm just glad SnapMap was born of the community modding spirit which Carmack was always such a proponent of before his departure from id.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgIbaQiDMC6LWzFP54QCDhTvaXtceGRBRM1dA2YYOtEO0ET2ci31YRYW4hT-f4GUuDpBdlRDbKrpA-f4B1NHRsYqWFaRoxgUwfszUFY77RN9Pp0d8lJCeqaCs9puut-bTqJojBztD_Das8/s1600/image_3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="576" data-original-width="1024" height="354" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgIbaQiDMC6LWzFP54QCDhTvaXtceGRBRM1dA2YYOtEO0ET2ci31YRYW4hT-f4GUuDpBdlRDbKrpA-f4B1NHRsYqWFaRoxgUwfszUFY77RN9Pp0d8lJCeqaCs9puut-bTqJojBztD_Das8/s640/image_3.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Editing some logic gates in DOOM's Snapmap.</td></tr>
</tbody></table>
<br /></div>
<div>
<br /></div>
<div>
Anyway, that's pretty much that. There's also a bunch of things I need to revisit, such as the post-processing effects system and the windmapping, as these sopped up whatever CPU and GPU headroom was previously left on the table. When running in VR, however, they push the envelope a bit too far to be viable - at least on my bare-minimum VR-spec system. I'll have to buckle down and really push to keep certain things in there. Screenspace reflections? Likely out of the question now, but maybe I can hack something together that provides a similar effect. It was always more of an aesthetic thing than an attempt at visual realism. Particles and wind fluid dynamics could perhaps be moved to the GPU, but we'll see. They might just be effects reserved for the highest of system specs.</div>
<div>
<br /></div>
<div>
I should be able to re-engage at least some minimal post-processing effects. My FXAA implementation was pretty solid, surely faster than supersampling, and possibly faster than multisampling. I'll just have to see for myself. The postfx system was also responsible for final color adjustment, and featured a really cool spectral tracer effect which let particles and entities leave overbright residual trails across the screen. It was subtle, but it really accentuated the whole aesthetic and feel, making certain objects like lasers and explosions seem like they were glowing blindingly bright. The windmapping really lent itself to the overall feel as well. Maybe I can figure out some kind of layered screenspace/frustum-space fluid dynamics solver that projects onto the scene when particles and entities query the windmap. It was always purely a visual effect, never intended to affect gameplay-relevant entities, and it gave a whole new dimension to the feel of Bitphoria. I miss my wind.</div>
<div>
<br /></div>
<div>
<br /></div>
<h3>
Conclusion:</h3>
<div>
Now I have a customer base with PixelCNC - customers who invested in it as early-access software on the promise of new features coming down the pipe. I owe it to them to stay focused and productive on PixelCNC. As far as I'm concerned, supporters who have made a financial investment in something take precedence over anything else I may have going on.</div>
<div>
<br /></div>
<div>
<br /></div>
<br />
<br />
<br />
<h3>
PixelCNC: Images to CNC G-Code (2018-01-24)</h3>
Sometime early last year I finally won my wife over with the idea of making and selling various CNC-milled/routed items on our Etsy store, where we've been selling crafts and prints for years. She developed her own process for creating designs, which I would then run through my process to produce a final product on my tabletop CNC.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-QL4pd2iypiCjVXs3mfWc_GDTkGwt2Klw680HJdCixYUlzwHhOJrBjus2u68ZEhAfw23hiQUCGMVO1IGjDY-pkuEYnZwa7diaXQ7XHtEDBbniK5v4t9-vo4S1Hs320RpA011fBbOJ8LA/s1600/treeoflife3d.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-QL4pd2iypiCjVXs3mfWc_GDTkGwt2Klw680HJdCixYUlzwHhOJrBjus2u68ZEhAfw23hiQUCGMVO1IGjDY-pkuEYnZwa7diaXQ7XHtEDBbniK5v4t9-vo4S1Hs320RpA011fBbOJ8LA/s640/treeoflife3d.jpg" width="640" /></a></div>
<br />
<br />
However, this process was somewhat cumbersome and tedious. It involved meshing the image using Blender (which could take a while when applying a decimate operation to get the polycount down to something workable) and then fiddling around in a conventional CAM program to actually generate toolpaths. The whole thing was very tweaking-intensive, requiring constant refinement and adjustment, and consumed more time than I thought should be necessary. Isn't there a way to just get from an image to G-code?<br />
<br />
To improve the process I (apparently) wrote a program, 'TGA2STL' (<a href="https://github.com/DEF7/TGA2STL" target="_blank">https://github.com/DEF7/TGA2STL</a>), in early 2016, before I had convinced my lady of the profitable nature of the CNC - which was mostly sitting idle while I worked on my game engine Bitphoria. I had completely forgotten about TGA2STL until I stumbled across it in my projects folder a few months ago. I was surprised at both my thoroughness in developing it and my total forgetfulness of the fact that I had created it at all. At any rate, it became part of my new process for converting my wife's designs into finished products. But I was still left at the mercy of whatever toolpath generation software I had at my disposal, however tedious or uninspired it may be. I knew I could do better!<br />
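The core of an image-to-STL conversion like this is treating the grayscale image as a heightmap and tessellating it into two triangles per pixel cell. This is just a minimal sketch of that idea, not TGA2STL's actual source (the `scale` parameter, the grid spacing, is an assumption):

```python
def heightmap_to_triangles(hmap, scale=1.0):
    """Tessellate a row-major grayscale heightmap into a triangle list,
    two triangles per cell - the core of an image-to-STL conversion."""
    tris = []
    rows, cols = len(hmap), len(hmap[0])

    def v(x, y):
        # pixel (x, y) becomes a vertex at that grid position,
        # with its brightness as the Z height
        return (x * scale, y * scale, hmap[y][x])

    for y in range(rows - 1):
        for x in range(cols - 1):
            a, b = v(x, y), v(x + 1, y)
            c, d = v(x, y + 1), v(x + 1, y + 1)
            tris.append((a, b, c))
            tris.append((b, d, c))
    return tris
```

Writing the result out as binary or ASCII STL is then just a matter of emitting each triangle with its facet normal; the decimation step I used Blender for exists to tame the two-triangles-per-pixel polycount.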
<br />
A project my father had always encouraged me to work on was a CAD/CAM package to undercut the professional packages that cost thousands of dollars, and that 90% of job shops only use 10% of the functionality from. My dad's big idea was for me to write a CAD/CAM program that only featured that 10% of functionality for all those shops, and sell it to them for the low price of $500. It was a project that always interested me from an algorithmic engineering standpoint, though it was never enough for me to drop existing projects to work on it.<br />
<br />
Now, my father's original idea was a professional CAD/CAM program that could be used for precision machining, and as awesome as that sounds, I don't have a professional CNC, and I've ventured into CNC on my own from more of a creative/hobby/artistry angle. I don't really have a personal use for a professional CAD/CAM program of my own making - aside from selling it for money. These sorts of projects are a bit of an arcane art form that I *could* easily spend years on, but I'd rather not. Especially considering how many subscription pro-level CAM packages exist nowadays, it would ultimately be a mental-masturbation project of sorts. What I do have a use for is a program that can take my wife's designs and turn them into G-code on the fly. In other words, I have a use for a hybrid project that merges my dad's idea for a bare-minimum CAM program with the workflow my wife and I prefer for our CNC endeavors.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2AxnhGL2y1EVgD7eK4jOIk68Tl4W2wXhCfcfbaPUfgTMB__0xGvdBKo6a4JRE-MQs6GKe3CLRjQkK6B2PWBwOP1DQ0pLVV6JJWQYeokJeVe-Xk91f10eMab9sxYRxmZU_7wsd3YL8OHc/s1600/logos_itch.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="250" data-original-width="315" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2AxnhGL2y1EVgD7eK4jOIk68Tl4W2wXhCfcfbaPUfgTMB__0xGvdBKo6a4JRE-MQs6GKe3CLRjQkK6B2PWBwOP1DQ0pLVV6JJWQYeokJeVe-Xk91f10eMab9sxYRxmZU_7wsd3YL8OHc/s1600/logos_itch.jpg" /></a></div>
<br />
<br />
Enter PixelCNC...<br />
<br />
I've been working on PixelCNC since the end of last summer, and it's finally released. You can check it out at http://deftware.itch.io/PixelCNC/. It's available as an early-access program selling for $55.00, but there's a free trial version that can do all the same stuff, except that it can't load images larger than 65k pixels or load/save project files. Hopefully that's enough to get people hooked without giving it away for free.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjepVr7weAlgEjzQuJq49BsrMZzURtTw0jkNc3tM0A2pA7GpG-rVL-5DQp1UyCd9OixDNWAdXBER-qdLhE8k-6zIty6eBtFXodDVjWBtL-QbwT3sJCJHTPizPl2yEteM8q-GMZpExR7I14/s1600/pixelcnc+horzontal+ballnose.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="880" data-original-width="1600" height="352" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjepVr7weAlgEjzQuJq49BsrMZzURtTw0jkNc3tM0A2pA7GpG-rVL-5DQp1UyCd9OixDNWAdXBER-qdLhE8k-6zIty6eBtFXodDVjWBtL-QbwT3sJCJHTPizPl2yEteM8q-GMZpExR7I14/s640/pixelcnc+horzontal+ballnose.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">PixelCNC generating a 'horizontal milling' operation.</td></tr>
</tbody></table>
<br />
<br />
<h3>
Long See no Time (2017-07-05)</h3>
It's been 5 months since my last blog post! In all actuality I simply burned out for a while. I did manage to finally upgrade my CPU from a dual-core to a quad-core, allowing much better testing and enhancement of Bitphoria's multi-threading system. Many things have been added to Bitphoria since my last post.<br />
<br />
I finally decided to add a world-wide fluid dynamics simulation system with LOD to minimize computation. This effectively serves as a sort of 'windmap' for the world, allowing objects to drag the 'air' around and pull other objects, dynamic meshes, and particles along with them. Entities can also cause a momentary change in pressure at their location, for blast or black-hole-vortex style effects that actually affect the smoke and entities surrounding them. Rockets can now leave swirling trails of smoke, and cause particles to wisp around as they zoom through a cloud of them. It's pretty neato, but purely a superficial effect. It's something I've wanted to implement in Bitphoria since I first sat down and sketched out ideas, before I even started writing the engine.<br />
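To give a rough flavor of what a coarse windmap might look like - and this is emphatically not Bitphoria's implementation, just a minimal 2D grid sketch under assumed parameters - entities inject their motion into the local cell, and a diffuse-and-damp pass spreads and decays the velocities each tick:

```python
def windmap_step(u, v, damping=0.99):
    """One diffuse-and-damp pass over a 2D wind velocity grid (u, v
    components): each cell blends toward its neighborhood average,
    then decays by `damping`."""
    h, w = len(u), len(u[0])

    def blur(f):
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                s = n = 0.0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            s += f[ny][nx]
                            n += 1
                out[y][x] = damping * s / n
        return out

    return blur(u), blur(v)

def inject_impulse(u, v, x, y, ix, iy):
    """An entity dragging the 'air' adds its motion into its cell."""
    u[y][x] += ix
    v[y][x] += iy
```

Particles and entities would then sample the grid at their position to pick up the local wind; an LOD scheme like the one described above would simply run coarser grids, or fewer passes, far from the player.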
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/YTlwEmjC1JQ/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/YTlwEmjC1JQ?feature=player_embedded" width="320"></iframe></div>
<br />
<br />
I've also added signed distance function primitives to the procedural modeling stuff for easily constructing polygonal, wireframe, and point cloud geometries for entities - CSG style. This has made it much easier to model more interesting entity appearances, and now scripters aren't forced to plot individual vertices to create triangles.<br />
<br />
Instead of explaining it much further I'll just show you a bunch of development screenshots from when I was working on the SDF modeling stuffs:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrGALZWrkfvCCOwqKyGCOse9UPfzMqGAA_of5QjLDqE_69Ccf-1OhiHXvJezOYPdmZiByjZmnFZ2O25FSxYWkIMETSKVEyhpgdEsEMXt0YlsHEQiuHmDlzCF6_XzzHL-CqA6FJPRRsG9s/s1600/screen000.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrGALZWrkfvCCOwqKyGCOse9UPfzMqGAA_of5QjLDqE_69Ccf-1OhiHXvJezOYPdmZiByjZmnFZ2O25FSxYWkIMETSKVEyhpgdEsEMXt0YlsHEQiuHmDlzCF6_XzzHL-CqA6FJPRRsG9s/s640/screen000.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">When things were first starting to work: plotting points for individual SDF voxels - sized according to the 'density' of the voxel they represent.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQZtdHW93MHZWG1Egzk4BkeVJSNc0TU11VrBDSWvNmXSisZJOpYYA_2gt7xydO3aa3-BYrULa2g357i5lrTApIlFbwTtZITIqZu9RnigLCSqQ1nFo6bDmx4V1o5Vbq8mnFnaW_r62wps8/s1600/screen002.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQZtdHW93MHZWG1Egzk4BkeVJSNc0TU11VrBDSWvNmXSisZJOpYYA_2gt7xydO3aa3-BYrULa2g357i5lrTApIlFbwTtZITIqZu9RnigLCSqQ1nFo6bDmx4V1o5Vbq8mnFnaW_r62wps8/s640/screen002.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Generating a point cloud only where voxels sit at zero distance from the surface of the final distance field: a cube added, a ring of spheres subtracted, and a single green sphere added to the top.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjf6rn6iRXM619cuCVQYvT_z69h9lIHRmX9KETbCNYg3Hcq9mIoq4J97VlSERZITnqc5hkYkEAqs1o1QIgClrQERerB9Q4WFtKlDxiQHCFHz86fSRqoJ6KDYoBECtimIVPqipfLDUl_eo/s1600/screen003.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjf6rn6iRXM619cuCVQYvT_z69h9lIHRmX9KETbCNYg3Hcq9mIoq4J97VlSERZITnqc5hkYkEAqs1o1QIgClrQERerB9Q4WFtKlDxiQHCFHz86fSRqoJ6KDYoBECtimIVPqipfLDUl_eo/s640/screen003.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Utilizing the same voxel triangulation code to yield triangle mesh geometry from an SDF model.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjz2EcFccDadZvCujQfNKiSbQX96lC0T_8StXrswqMoe4S5B-334rXQT4TPJtBUCVTUYNXo4TVgmB12yl5OXqpYrTwRzIiHRytWXmv3h0DuKOJoMjGEK8rwAnPk4ok28nP2PUWh0CivYKU/s1600/screen012.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjz2EcFccDadZvCujQfNKiSbQX96lC0T_8StXrswqMoe4S5B-334rXQT4TPJtBUCVTUYNXo4TVgmB12yl5OXqpYrTwRzIiHRytWXmv3h0DuKOJoMjGEK8rwAnPk4ok28nP2PUWh0CivYKU/s640/screen012.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">While adding the ability to smooth an isomesh against the distance field, a lot of things went wrong (these are supposed to be smooth spheres).</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMO_AAb3naccLtX-sI-gX_EB7Gk5Cv6ReDJSH-liFfcUNi4sD4eHVsvuyDvRPXjQTq9Tt-zOBtDpLAAPbv_MBb-vWZppInITpPfc4jdTVCPbP08YGWqu9Zr6qGDOW0s-8R-DZVOARP1cE/s1600/screen014.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMO_AAb3naccLtX-sI-gX_EB7Gk5Cv6ReDJSH-liFfcUNi4sD4eHVsvuyDvRPXjQTq9Tt-zOBtDpLAAPbv_MBb-vWZppInITpPfc4jdTVCPbP08YGWqu9Zr6qGDOW0s-8R-DZVOARP1cE/s640/screen014.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A sphere that hasn't been smoothed, showing the base isomesh.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6gsK7v6YYeHUk9wKNHJsFBOdNRbLiyb4svu7FNeUkSVCHg7ZbQg_VWmf52LHfZWYYMtYRmP0ucS7xY7nGB9efaNygJLDjs8JZFxw-_x1Q_6KDHU47cSOHiZ9_yBkqHvIND0B3vVLyGFo/s1600/screen019.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6gsK7v6YYeHUk9wKNHJsFBOdNRbLiyb4svu7FNeUkSVCHg7ZbQg_VWmf52LHfZWYYMtYRmP0ucS7xY7nGB9efaNygJLDjs8JZFxw-_x1Q_6KDHU47cSOHiZ9_yBkqHvIND0B3vVLyGFo/s640/screen019.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Some different tests: a smoothed cone (with a crooked tip, since fixed), a purple sphere with a pink capsule merged/added to it, and a green sphere with an orange capsule 'blended' into it (note the smooth transition between surfaces) while also being smoothed properly. Primitives can be smoothly blended/merged while still producing a coarse isomesh with 45-degree angles.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgiQAgIeH1QIdklRclZDIQi2z4dIXyr65-2xANN7Vrphb9zKbNE74cKl2KeEmB5xIPwBhu4NnwiJQLdECIfcBbP_X5wD11k0bprxRjeWMarx9C1saenDZ7blxOiJBIPhcaLggnbx7KUu-Y/s1600/screen020.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgiQAgIeH1QIdklRclZDIQi2z4dIXyr65-2xANN7Vrphb9zKbNE74cKl2KeEmB5xIPwBhu4NnwiJQLdECIfcBbP_X5wD11k0bprxRjeWMarx9C1saenDZ7blxOiJBIPhcaLggnbx7KUu-Y/s640/screen020.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A ring of spheres blend-merged and the result smoothed over.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv7eW9T70ci-yGZ11w8Ck9lUx9x6EKP24-bdyoTJf7nVdUsnSFKjEWktH16QZCb4GAXk3o9quQ3u2BzZs_orMbOcG55ejLYql0LYr3BOKr6qBsL-wQ-t4E63nlSz8WEU1gBwh6gZ-SupY/s1600/screen023.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv7eW9T70ci-yGZ11w8Ck9lUx9x6EKP24-bdyoTJf7nVdUsnSFKjEWktH16QZCb4GAXk3o9quQ3u2BzZs_orMbOcG55ejLYql0LYr3BOKr6qBsL-wQ-t4E63nlSz8WEU1gBwh6gZ-SupY/s640/screen023.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Testing a different player appearance, built from various primitives, while carrying the shield powerup - a set of overlapping point-cloud spheres that each spin independently, rolling against the surrounding world surfaces as the player moves.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCQ694VNBmvn3iCA2_Vy6OwiTNF3JZUxO7Ksy_vFJ2QcgD51BtQAaVZRPGY3VahNu-9wRP1jq66YE7KsSyIdgfQnUOzGQDu5WIiXKAamrElQrO4B4Lrca6r-LB_TfoXBrf7PtZcHzZxjw/s1600/screen161.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCQ694VNBmvn3iCA2_Vy6OwiTNF3JZUxO7Ksy_vFJ2QcgD51BtQAaVZRPGY3VahNu-9wRP1jq66YE7KsSyIdgfQnUOzGQDu5WIiXKAamrElQrO4B4Lrca6r-LB_TfoXBrf7PtZcHzZxjw/s640/screen161.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A screenshot of the last of the testing/development game before I started scripting a new base game from scratch.</td></tr>
</tbody></table>
<br />
A bunch of other things have been added to Bitphoria as well, but I haven't touched it in at least two months and it's currently not something I'm particularly motivated to pursue. After making all the progress that I did a few months ago, I began scripting a base/default game to use as a template for creating the various game modes that will release with the next version.<br />
<br />
The base game is a sort of deathmatch with simple AI drones, obstacles, and hazards for players to negotiate while battling it out with one another - which always seemed more interesting to me than raw PvP deathmatch gameplay. Anyway, I just don't find working on it rewarding anymore. I can come up with hundreds of little ideas and mechanics, and I know how to implement them by exploiting the capabilities of Bitphoria's scripting system, but it just doesn't excite me like it did when I was younger, working with the Quake engine. I'm sure scripting stuff in Bitphoria would be a blast for plenty of spry young minds out there, but I'm not of a mind to seek those kids out - even though they were who I was thinking of when I designed the whole thing.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAbsEVaFYa0QAd_Kd63OSsMIYO9xWAEJTIEKpKh2HDWAN3VSyUg6eo6uR41Cy6wkMTkv7Hg0ciWDqa9eiPRxYdK_8b5DBxdmCr0C6dJLDPlFb4kfQ0zs5eKFoG_3ywn_LxGutwU3f6JcI/s1600/swimmers_fxaa_cut.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="233" data-original-width="294" height="505" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAbsEVaFYa0QAd_Kd63OSsMIYO9xWAEJTIEKpKh2HDWAN3VSyUg6eo6uR41Cy6wkMTkv7Hg0ciWDqa9eiPRxYdK_8b5DBxdmCr0C6dJLDPlFb4kfQ0zs5eKFoG_3ywn_LxGutwU3f6JcI/s640/swimmers_fxaa_cut.gif" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Showing the effectiveness of Bitphoria's new FXAA post-processing shader at smoothing out jaggy, stair-stepped edge pixels - one of many things added to Bitphoria recently.</td></tr>
</tbody></table>
<br />
My goal has always been to make enough interesting stuff to show off Bitphoria's capabilities as a platform for creating, sharing, and playing custom games with other people - and hopefully have it inspire people to engage their own creative minds within the paradigm Bitphoria's scripting system provides. Well, as it stands, that probably won't be happening any time soon, and I've had to make my peace with it over the last few months. I've been struggling to let myself work on anything else or pursue any of my other passions without berating myself for letting Bitphoria development go idle. My resolve has been to accept that I owe it to myself to take care of my own mental well-being and *let myself* pursue other projects and passions, because nothing good comes from sitting around working on nothing at all, purely out of guilt.<br />
<br />
Yes, I wish that I could knock out Bitphoria in "record-manic-stay-up-all-night-not-caring-about-anything-else-in-life" time, which was naively my plan from the beginning, but it's just not in the cards. Am I lazy? Eh... if that were the case, I don't think I'd feel like there aren't enough hours in the day for what I *do* want to work on, especially after overcoming the self-inflicted shame I'd been enduring. I was tempted to release Bitphoria completely FOSS - just dump the code on the interwebs and abandon any and all aspirations of monetizing it, giving all of my work away for free. Alas, me lady talked me out of it, and convinced me to just let it sit until I'm ready to come back to it. So that's the plan.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHTdW4PGxhnWy9yKox8n04J8r2Au2dw_-VOsCfIftbsjdvgzNG-w-m5LTkVTDW8FH6_WkfdsFzsuYZIOfgpHK4BgB7znxdRGN8YqrV4k8Gz2Vu3duEKHITP9WC3ISjo8wvkcCpyvHpiRU/s1600/screen000.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="900" data-original-width="1600" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHTdW4PGxhnWy9yKox8n04J8r2Au2dw_-VOsCfIftbsjdvgzNG-w-m5LTkVTDW8FH6_WkfdsFzsuYZIOfgpHK4BgB7znxdRGN8YqrV4k8Gz2Vu3duEKHITP9WC3ISjo8wvkcCpyvHpiRU/s640/screen000.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Bitphoria in its current form.</td></tr>
</tbody></table>
<br />
<div>
I've always been excellent at arcane technical pursuits, hacking away at them into the night even now at thirty years old. But when it comes to actually designing a fun game, or dealing with PR and promotion, I am seriously lacking in drive and spirit. With recent developments I've become more inclined to focus on keeping my creative spirits high and working on what I love: tackling difficult algorithmic problems. I've pretty much resigned myself to being the Wozniak to someone else's Jobs. I haven't met my 'Jobs' yet, but I hope I do someday, because I think I have a lot to offer and share with the world, and lack the ability to really get it out there.</div>
<div>
<br /></div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8zuDgYtuMpTWzFmU9CKI9kxwzCOgYCFLTqWMwCtkJBQd_sOqePEd__KG68Am06K0qWLxRKhsyCq9DxsyP0gdlxStzS5lmH_aR2T9lz8Cnq9lVETQYFyY8ppm6jOKFzS4kD6eYWyuZZoQ/s1600/Steve-Jobs-and-Steve-Wozniak.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="451" data-original-width="623" height="462" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8zuDgYtuMpTWzFmU9CKI9kxwzCOgYCFLTqWMwCtkJBQd_sOqePEd__KG68Am06K0qWLxRKhsyCq9DxsyP0gdlxStzS5lmH_aR2T9lz8Cnq9lVETQYFyY8ppm6jOKFzS4kD6eYWyuZZoQ/s640/Steve-Jobs-and-Steve-Wozniak.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The good old days.</td></tr>
</tbody></table>
<br />
In my relative slump I did manage to muster the gumption to start playing around on my CNC once again - the product of yet another 'abandoned' project that I'd begun feeling guilty about for letting it sit untouched and unloved for so long. It's really nice to have something to work on with my hands though :D<br />
<br />
I've since explored a few ideas and have finally convinced my wife that making stuff on the CNC is a financially worthwhile pursuit - one that doesn't require dealing with nearly as many customers as the crafting products of our current online business do. We could sell fewer big-ticket, high-end CNC-milled items rather than many smaller, cheaper decorative ones. In other words, we could make more money for less work, and deal with fewer customers, if we both transitioned our business toward producing large quality works as a team.<br />
<br />
It would definitely be nice if we could spend more time together again like the old days, and I see CNC projects as the nearest of several keys to unlocking that future for us - but it must be as a team. I don't believe she's the Jobs to my Woz, but I do believe we have the potential to achieve great things together. It has worked thus far with our online business, and I feel that she's fully capable of meeting me half-way as we take on a new medium together.<br />
<br />
I've also been sketching out and outlining some ideas for an old project my late father had proposed and actively tried to encourage me to pursue. I'll save the details on that for a later blog post.<br />
<br />
<br />
<br />
<h3>Post-Processing Shaders and Effects (2017-02-06)</h3>
<center>
<iframe allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/H3VS-FIc0oY" width="560"></iframe></center>
<br />
<br />
Bitphoria has had a simple post-processing effects setup in place for some time now: a single framebuffer object with a color and depth texture attached, read by a single fragment shader on a full-screen quad to produce a simple mipmap blur and contrast boost. The aim was just to achieve a basic effect beyond what rendering the scene straight to the screen/window could offer - but it could not perform multiple post-processing passes, or effects that required multiple shader stages.<br />
<br />
I've always had the nagging sensation that something's visually missing from Bitphoria, beyond the general lack of coherent visuals - which I attribute to never having sat down and actually designed a cohesive appearance for a game via the engine's scripting functionality. With the ability to add various post-processing effects, I feel a more coherent visual aesthetic can be achieved than what one simple post-processing shader offered.<br />
<br />
For the new post-processing effect system I wanted to be able to easily add more shaders, and route their inputs/outputs between them. At the beginning of each frame a 'raw framebuffer' is bound with three color texture attachments - RGBA color, XYZW reflection vectors, and an HSL 'overbright' emission texture - plus, of course, a depth texture. The reflection vectors texture is written by all of the shaders used to render world geometry, procedural models, etc. If a surface is not supposed to be reflective, this is indicated by a reflection vector facing the viewpoint (Z = 0). The HSL overbright emission texture is for the 'spectral trails' shader effect, which creates a sort of quickly-fading 'rainbow' trail behind the objects that write to that texture when rendered.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZciyOarw6fIVb05oCLOamISR7Vh_0HfNd2OJTl6k6j8lkt5g0VfKhdPB_mmZH8McTcdHHh-Jl0vQspubP2LArU2j9vsqbVfOpsalg7-nS8EjWRGhwHgTpTWYEstQ69hKkkZtrfa2-Bow/s1600/screen050.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZciyOarw6fIVb05oCLOamISR7Vh_0HfNd2OJTl6k6j8lkt5g0VfKhdPB_mmZH8McTcdHHh-Jl0vQspubP2LArU2j9vsqbVfOpsalg7-nS8EjWRGhwHgTpTWYEstQ69hKkkZtrfa2-Bow/s640/screen050.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Objects leaving spectral trails on the screen. This can be annoying and so has been toned down and is mostly used sparingly for momentary effects like explosions.</td></tr>
</tbody></table>
<br />
At the end of the frame the postfx system executes, rendering a fullscreen quad for each registered shader effect - binding all necessary textures and setting all prescribed GLSL uniforms for each.<br />
<br />
Creating an effect stage involves loading its shader, creating an FBO, and attaching a single color texture to the FBO to serve as the effect's "output". A depth texture is also created and attached to the FBO purely for what's called "FBO completeness", even though the full-screen quads do not convey any depth information. Still, a shader effect *could* write depth values to that texture using gl_FragDepth, and succeeding effect stages could then utilize it.<br />
<br />
From there, shader uniforms can be registered from the engine with "pfx_parm()": ints, floats, vec3s, and mat4s (4x4 matrices) have a pointer to their memory stored, which is used to set whatever values the effect's shader requires each frame.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDYBIP_LYrzFtSq7NUBil8JUJ3rIqYCXf0RzwTWVGwsEsA9hH-qNG4p9dyjWyO1XamUe8Y8VsZNyg9LEGe2RzjRvPosUo4a-3nWyHb_0Zd4muQdlTzocjTEfOkceNUGxC0S-wHxErr3U8/s1600/bitphoria_postfx_pipeline.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDYBIP_LYrzFtSq7NUBil8JUJ3rIqYCXf0RzwTWVGwsEsA9hH-qNG4p9dyjWyO1XamUe8Y8VsZNyg9LEGe2RzjRvPosUo4a-3nWyHb_0Zd4muQdlTzocjTEfOkceNUGxC0S-wHxErr3U8/s640/bitphoria_postfx_pipeline.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 12px;">Bitphoria's current postfx shader pipeline. Green boxes are texture outputs from effect shaders.<br />
<div>
<br /></div>
</td></tr>
</tbody></table>
<br />
Similarly, "pfx_input()" is used for referencing other effects, allowing their FBO color or depth textures to be bound while the current effect is being rendered - routing the output of one stage into others. Some effect shaders use the FBO texture output of previous stages (or the output of later stages from the previous frame) as inputs.<br />
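The key detail of pfx_parm() is that it stores a *pointer* to engine memory rather than a value, so the current contents are re-read every frame when the effect renders. A minimal C sketch of that registration idea - the struct layout, enum, and constants below are illustrative guesses, not Bitphoria's actual code:

```c
#include <string.h>

/* One registered uniform: a name plus a pointer to engine-side memory.
 * At draw time the render loop would look up the GLSL uniform location
 * by name and upload the pointed-to value with the matching glUniform*. */
enum pfx_type { PFX_INT, PFX_FLOAT, PFX_VEC3, PFX_MAT4 };

typedef struct {
    const char   *name;
    enum pfx_type type;
    const void   *ptr;   /* engine-side value, re-read every frame */
} pfx_uniform;

#define MAX_UNIFORMS 32

typedef struct {
    pfx_uniform uniforms[MAX_UNIFORMS];
    int         num_uniforms;
} pfx_effect;

/* Register a uniform for an effect; returns 0 on success, -1 when full. */
static int pfx_parm(pfx_effect *fx, const char *name,
                    enum pfx_type type, const void *ptr) {
    if (fx->num_uniforms >= MAX_UNIFORMS) return -1;
    fx->uniforms[fx->num_uniforms++] = (pfx_uniform){ name, type, ptr };
    return 0;
}
```

Because only the pointer is stored, the engine can change the value anywhere in its update code and the shader automatically sees the new value on the next frame, with no explicit "set uniform" call at the change site.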
<br />
<br />
<br />
<br />
<h3>Todo List Prioritization, Screen Space Reflections, and a Long Break (2017-01-22)</h3>
<div>
<br /></div>
<div>
After Bitphoria's initial release, I spent a few weeks with those who participated in testing it out and discovered a few bugs and things that desperately needed changing/fixing. I also started fleshing out the three default games that come with it, and in doing so it became clear what other features the scripting system was lacking. As a result of the public beta release the todo list grew quickly; even though I was knocking out new features and bug fixes in those first few weeks, the list continued to grow faster than I could keep up.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihwRexYAnCCtx_f1YJR-lA8oGYo8-_vuntMiG4xhjMco6up-kIq1qpOjBI1IfyuNz3Z-ybujxNIazM4LYO9LwGKKqeOzVZYsKvYwEGi0I3vQc1EZaR3b8qbHW1pquGF8YD-o3GZoza1pc/s1600/changes.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihwRexYAnCCtx_f1YJR-lA8oGYo8-_vuntMiG4xhjMco6up-kIq1qpOjBI1IfyuNz3Z-ybujxNIazM4LYO9LwGKKqeOzVZYsKvYwEGi0I3vQc1EZaR3b8qbHW1pquGF8YD-o3GZoza1pc/s640/changes.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Bitphoria's accumulated "changes.txt" since v1.0a release (formerly 'v1.00b').</td></tr>
</tbody></table>
<br />
<br />
I'd since decided to take a break; the few months leading up to the release date I'd set for myself were one long burn-out session. The break was meant to strengthen my resolve to finish Bitphoria as a project. The end goal is a product that can hopefully generate an extra income for my family - my wife and two baby girls. I owe it to them to see this thing through after subjecting them to all of the time I have spent working on it.</div>
<div>
<br />
The master server has been running this whole time - well, at least 95% of the time - and I see that people have been looking for online games (of which there are usually none) and then starting their own to check out Bitphoria. I'm not going to continue actively promoting Bitphoria until things are further along, though if other people want to show it around and share it, I'm not going to stop them. My goal here is a sort of soft launch, whereby people can start playing with Bitphoria while it's being developed, and the 'die-hard-core' fans are the ones who keep it alive and seed a community and fanbase. It sounds good on paper; we'll see how it pans out in practice. In the meantime I'm sticking with incremental public alpha/beta releases on itch.io until the scripted games are all fleshed out and everything's had at least a once-over - at which point I think I'll invest in a Steam Greenlight strategy, or maybe even a crowd-funding campaign for a virtually finished game. (That *is* what it takes nowadays to crowdfund a game, right?) At any rate, there's no point in an all-too-premature push for exposure, which has backfired on many an indie game project when the poor quality of an unfinished product drew negative attention - permanently tainting and marring the game's reputation regardless of how the finished product turns out down the road.</div>
<div>
<br /></div>
<div>
At this point I've dived back into Bitphoria's code and become reacquainted with the todo list and where things stand, and it's become clear that there's still a long way to go if I want this thing to be one-hundred-percent uncompromising. However, I don't think I have it in me to pull that off. So the next step was organizing the todo list by priority: What is absolutely essential to the core idea of Bitphoria? What can it go without? What aesthetic-only things can be added later? Which ideas are pipe dreams? This is where the project is currently at. By my estimation Bitphoria should be on sale in the summer.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhodjl0nMfCyRrADv3QRBa1pJh13sXxkLmqJ_2bo5RjRG1ihbapqGHWIPqWOoH8UkeBQZTgO-ab8jsWXUcLSKkXs0Ci6M-1GwS6IYdcW27i0UTahYTxHm_FMwFF9oUrpFFIVGCLVA63EW0/s1600/4604gbddbox.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="465" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhodjl0nMfCyRrADv3QRBa1pJh13sXxkLmqJ_2bo5RjRG1ihbapqGHWIPqWOoH8UkeBQZTgO-ab8jsWXUcLSKkXs0Ci6M-1GwS6IYdcW27i0UTahYTxHm_FMwFF9oUrpFFIVGCLVA63EW0/s640/4604gbddbox.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I forgot how awesome it was to upgrade. Been out of the hardware game for over a decade.</td></tr>
</tbody></table>
<br />
The todo list is fully consolidated, after another sleepless night, and I have my work cut out for me. I also recently bought myself a GPU for Christmas, a 4GB XFX Radeon RX 460 - literally the first GPU I have gone out of my way to acquire in 15+ years. Half the reason I bought it was to play DOOM. Incidentally, the last GPU I made it a point to acquire was a GeForce3, back in 2001, so that I'd be ready to play Doom3 - which ended up not being released for three more years. The new GPU has afforded me the opportunity to get crazy with Bitphoria's graphics, and makes it possible to record at least 720p gameplay footage at 30fps. The original plan with Bitphoria was to design a cool-looking game/engine that would run on older and/or budget hardware at playable speeds - where 'playable', in my gamer opinion, is 60+ frames per second, minimum.<br />
<br /></div>
<div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFl6xPSY-W-e31Y99qa7P4Ib29bzJghcnUakX1bFRhCvkW7ADlHcnF0dtfq7J-A3N222bVrHceZl82FJ-zakf7b_bByjXybNPVTlrKcu0ArDWYPmPk9gRih-DSgcDksyk1QS2Kms6si_I/s1600/screen041.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFl6xPSY-W-e31Y99qa7P4Ib29bzJghcnUakX1bFRhCvkW7ADlHcnF0dtfq7J-A3N222bVrHceZl82FJ-zakf7b_bByjXybNPVTlrKcu0ArDWYPmPk9gRih-DSgcDksyk1QS2Kms6si_I/s640/screen041.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Now for a blurring post-process shader to hide the jitter-induced dithering!</td></tr>
</tbody></table>
<br />
I just finished implementing a decent screen-space reflection shader, written by hand from scratch, based on my own ideas about how to get the information where it needs to be. Instead of having multiple FBO textures storing surface normal, surface position, etc., I am only storing the reflection normal, calculated in the vertex shader and interpolated across reflective surfaces by the fragment shader. This is in contrast with how most implementations calculate the reflection normal in the fragment shader, using the fragment's normalized camera-space coordinate and its surface normal - both output by the reflecting surface's shader into (typically) separate FBO textures.<br />
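For reference, the per-vertex reflection calculation is the standard reflect() formula (in GLSL it's a built-in); here is the same math in C, where `i` is the incident view-to-surface direction and `n` the unit surface normal:

```c
#include <math.h>

/* r = i - 2*(i.n)*n : mirror the incident direction about the normal.
 * For unit-length n, the result preserves the length of i. */
static void reflect3(const float i[3], const float n[3], float r[3]) {
    float d = i[0]*n[0] + i[1]*n[1] + i[2]*n[2];
    for (int k = 0; k < 3; k++)
        r[k] = i[k] - 2.0f * d * n[k];
}
```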
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiiJdim0CdybE70GF6JkFqGG1APEjxJZhyphenhyphenLV11QiyPgm2oAiNH8KWn_wX1dI5tNZ3iMq85pbr2mx-sWpHIj81KuNBPISdyAcE9Kh8X-7Th9xXYgbnqwEY7hcED7ASNUVeGYcxcS7PO2FUU/s1600/screen001.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiiJdim0CdybE70GF6JkFqGG1APEjxJZhyphenhyphenLV11QiyPgm2oAiNH8KWn_wX1dI5tNZ3iMq85pbr2mx-sWpHIj81KuNBPISdyAcE9Kh8X-7Th9xXYgbnqwEY7hcED7ASNUVeGYcxcS7PO2FUU/s640/screen001.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Camera-space reflection vector, depicted as RGB representing XYZ -1/1.</td></tr>
</tbody></table>
<div>
<br /></div>
In Bitphoria, my idea was to pre-calculate the reflection vector in the vertex shader, let it interpolate across surface triangles, and then pack it into the red-green channels of a single RGBA texture from the fragment shader using a spheremap transform - much more compact than packing both a coordinate *and* a surface normal to accomplish the same result. However, I still need the fragment's coordinate to perform the actual raytrace, which I reconstitute from the fragment's W coordinate, stored in the alpha channel of the same texture. With the W coordinate, the depth buffer texture, and the pre-calculated inverse of the projection matrix used to render the scene, I can very cheaply calculate the camera-space coordinate of the fragment.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAP4doj5H3S0MGTYG8Hr-pP8D8X6Vc9c5d4_9qcj2doPbwa2cy1oIGcRryInAtJaRQp07tspMYGfQrceK7HCL13gh2fjVB92HyA2jviG1TkNKmRehWobLispjCQ-HnG8oHiZP2pbHrgZY/s1600/screen042.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAP4doj5H3S0MGTYG8Hr-pP8D8X6Vc9c5d4_9qcj2doPbwa2cy1oIGcRryInAtJaRQp07tspMYGfQrceK7HCL13gh2fjVB92HyA2jviG1TkNKmRehWobLispjCQ-HnG8oHiZP2pbHrgZY/s640/screen042.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">3D texture materials give surfaces a parallax depth to them unlike anything you've seen in a game.</td></tr>
</tbody></table>
<br />
I also gave the sky shader a much-needed re-design, along with tweaking the 3D material rendering shader to literally shade the depths of the procedural 3D materials. The naive implementation of 3D material shading would sample a 2x2x2 area and gauge the overall density gradient normal; the dot product of that normal and the incoming light vector would then determine the brightness of that point in the material during rendering. However, as Inigo Quilez explains in one of his articles (link below), you only really need two samples of the 3D volume - taken along the incident light vector itself.<br />
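The two-sample trick amounts to taking the directional derivative of the density along the light direction instead of estimating the full gradient. A minimal sketch in C, with a stand-in density field (Bitphoria's real materials are procedural and entirely different):

```c
#include <assert.h>
#include <math.h>

/* Two-sample lighting via the directional derivative: instead of
   six-plus samples to estimate the gradient normal N and then
   computing dot(N, L), approximate dot(grad(density), L) with a
   finite difference along the light direction L - two samples. */

typedef struct { float x, y, z; } v3;

/* Example density field (a smooth spherical blob), used only so
   this sketch is self-contained. */
static float density(v3 p)
{
    return 1.0f - (p.x * p.x + p.y * p.y + p.z * p.z);
}

/* Diffuse term at p for unit light direction l and step h:
   dot(grad(density), l) ~= (density(p + h*l) - density(p)) / h  */
static float directional_diffuse(v3 p, v3 l, float h)
{
    v3 q = { p.x + h * l.x, p.y + h * l.y, p.z + h * l.z };
    float d = (density(q) - density(p)) / h;
    return d > 0.0f ? d : 0.0f; /* clamp away the back-facing side */
}
```

In a shader this means one extra texture fetch per shaded point instead of a full gradient reconstruction, which is why it pays off for dense volumetric materials.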
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUbLGbFyO3qjZkXU5FIFVNGOAC4ht5xnwEY2SPRimzb-bq-JJwh_5t42W54-6uBZq7R3JJz63ekus-NYiSosycZsBSntubCqV5TtBkGSjPgymNHBNrXOOQPivxnjGaz08_Q3hcF49iQ8A/s1600/screen044.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUbLGbFyO3qjZkXU5FIFVNGOAC4ht5xnwEY2SPRimzb-bq-JJwh_5t42W54-6uBZq7R3JJz63ekus-NYiSosycZsBSntubCqV5TtBkGSjPgymNHBNrXOOQPivxnjGaz08_Q3hcF49iQ8A/s640/screen044.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The sky is like techno water, slowly drifting and undulating like pool water caustics.</td></tr>
</tbody></table>
<br />
Even with a new GPU there's room for optimization. The plan now is to refactor the ultra-basic post-processing system I wrote a while back to perform a basic bloom/glow shader, and properly integrate everything. The sky is procedurally generated by its own fragment shader, and could stand to be rendered at half or quarter scale to a texture that is then upscaled to the full view size. The same goes for the screen-space reflections, which are ironically slower with a higher jitter factor - the jitter being what disguises the banding artifacts by introducing a sort of dither. It's almost faster to use a smaller raytrace step increment than to use a larger one and hide the banding with jitter, though it's quite possible that this is specific to my GPU. In either case, some kind of antialiasing/blur post-processing shader is in order.<br />
<br />
Another thing I am exploring is the generation and use of a max-z quad-tree, where the depth buffer is successively reduced into higher and higher mipmap levels, each texel representing the nearest-most point beneath it, and raytracing then occurs through the resulting quad-tree. But simply rendering SSRs at 1/4 the area of the full-resolution screen seems like it will speed things up to the point where constructing a depth-buffer quad-tree would be slower - unless I were to utilize it for more than just SSRs (screen-space reflections).<br />
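The reduction step itself is simple: each mip level halves the resolution and each texel keeps the extreme depth of the 2x2 block beneath it, so a ray can skip whole regions it provably cannot hit. A CPU-side sketch of one reduction pass (on the GPU this would be a shader rendering into successive mip levels; whether "nearest" is min or max depends on your depth convention - this keeps the max, as in the post):

```c
#include <assert.h>

/* Reduce an n x n depth buffer (n even) into an (n/2) x (n/2)
   buffer where out[y][x] is the max of the corresponding 2x2
   block in src. Applying this repeatedly builds the quad-tree. */
static void depth_reduce_max(const float *src, float *out, int n)
{
    int half = n / 2;
    for (int y = 0; y < half; y++) {
        for (int x = 0; x < half; x++) {
            const float *row0 = src + (2 * y) * n + 2 * x;
            const float *row1 = row0 + n;
            float m = row0[0];
            if (row0[1] > m) m = row0[1];
            if (row1[0] > m) m = row1[0];
            if (row1[1] > m) m = row1[1];
            out[y * half + x] = m;
        }
    }
}
```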
<br />
A few other things on the todo list: team-assigned 3D materials, whereby different areas of the world can have different 3D texture materials ascribed to them so as to more clearly depict team territories. Alongside that I am toying with the idea of allowing game server admins to choose different procedural algorithms for each team, so that different teams' areas of the world can contrast dramatically with one another. At the very least I aim to allow server admins to choose from different world-generation algorithms that dictate what the entire world will be like.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwJvU18mWf34GbpHgJA2jHlj1EOclNEq81SKfpBLJidhMlHDLa7zOa5YgCaa4bEwAZMD_R2fwTvlElK1o4u1w7r8kQXlo63hrEVTi-AzMqNBrfN413EBdjedRsqVkhvEo13phEn79JDNA/s1600/itchioreferers.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwJvU18mWf34GbpHgJA2jHlj1EOclNEq81SKfpBLJidhMlHDLa7zOa5YgCaa4bEwAZMD_R2fwTvlElK1o4u1w7r8kQXlo63hrEVTi-AzMqNBrfN413EBdjedRsqVkhvEo13phEn79JDNA/s1600/itchioreferers.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">More visitors coming from <a href="http://www.itch.io/">www.itch.io</a> itself lately, as opposed to old Reddit/Gamedev posts.</td></tr>
</tbody></table>
<br />
Anyway, I noticed that I've been getting a trickle of Twitter followers, when I normally never even look at Twitter, so I've begun tweeting more. I also noticed that on itch.io both my developer page, which lists my three projects there, and the Bitphoria page itself are getting more traffic from itch.io - whereas in the past the traffic came almost entirely from Reddit posts and gamedev forums where I shared a link. It would seem that Bitphoria has reached some kind of low-level critical mass that makes it more prominent in search results. So that's neato.<br />
<br />
<br />
<br />
Links:<br />
<div style="font-family: verdana; margin: 0px;">
<a href="http://jcgt.org/published/0003/04/04/paper.pdf">Efficient GPU Screen-Space Ray Tracing</a></div>
<div style="font-family: verdana; margin: 0px;">
<a href="http://roar11.com/2015/07/screen-space-glossy-reflections/">Screen Space Glossy Reflections</a> - Roar11.com</div>
<div style="font-family: verdana; margin: 0px;">
<a href="http://aras-p.info/texts/CompactNormalStorage.html">Compact Normal Storage for small G-Buffers</a> - aras-p.info<br />
<a href="http://www.iquilezles.org/www/articles/derivative/derivative.htm">Directional Derivative Based Lighting</a> - iquilezles.org</div>
<br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-64097915008429031952016-09-02T12:28:00.000-07:002016-09-07T14:41:36.464-07:00v1.00 Public Beta Testing<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0vqbDqZeHnEhfKdA1r2n72BY3HzTqVSiwW0yltErPV5uVnY2wRhalXGc0Z-olOkqeam801QItNBeBzqSga2cPnq73T3JFGg4B1N-txUtv2iVuxUCyDgOT-xHhJsZgMmHXrRlIFZsTrJs/s1600/screen116.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0vqbDqZeHnEhfKdA1r2n72BY3HzTqVSiwW0yltErPV5uVnY2wRhalXGc0Z-olOkqeam801QItNBeBzqSga2cPnq73T3JFGg4B1N-txUtv2iVuxUCyDgOT-xHhJsZgMmHXrRlIFZsTrJs/s640/screen116.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The biggest Bitphoria game in history.</td></tr>
</tbody></table>
<br />
Looks like some people have started taking interest in Bitphoria. There are a lot of kinks to work out: some people are having issues starting a game and having their friends join, and I'm looking into it. Lots of little things are being figured out: players can be hard to see, chat messages disappear too quickly, the server I'm running crashed in the middle of the night while two people were playing, AI guys are more aggressive than they need to be, etc.<br />
<br />
The end result is inevitably a new release, v1.01, which will have a new network protocol version. The master server only relays game servers running the same network protocol as your version of Bitphoria, so when I update my game server to v1.01, people still playing v1.00 will not be able to see it or play on it. An auto-update feature will be implemented down the road; for now my focus is on the engine.<br />
<br />
It was also suggested that an automated installer would make it easier for some people to start playing Bitphoria. I already have a process in place for achieving this and wasn't planning on utilizing it until Bitphoria was out of beta, but I have decided to include an automated installer with the next release.<br />
<br />
I estimate v1.01 to be released some time next week, maybe sooner.<br />
<br />
Thanks to everybody who downloaded the current version and has been playing with it. This has been more fun than I could have imagined :)deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-17112090188548691092016-08-30T11:52:00.002-07:002016-09-07T14:41:59.297-07:00Bitphoria v1.00 Beta Released<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhr8L4P6O6CEwVUNACKy0mNIclhrpEUyvh9NvslK2Bteiy5LeLM0CIg8YDUxZ8wnFMQfCKlhPLHY4d6SyDvmxyOidj81r4B5rQWL2QWFA-zBFYMip5m63nFltCByDh_ZnVxaqWiqGccULE/s1600/screen064.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhr8L4P6O6CEwVUNACKy0mNIclhrpEUyvh9NvslK2Bteiy5LeLM0CIg8YDUxZ8wnFMQfCKlhPLHY4d6SyDvmxyOidj81r4B5rQWL2QWFA-zBFYMip5m63nFltCByDh_ZnVxaqWiqGccULE/s640/screen064.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Pre-release early screenshot of Bitphoria.</td></tr>
</tbody></table>
Bitphoria has been released. It's in public beta, and this is the very first (aka 'worst') version ever. Start games (don't forget to forward your UDP port if behind a router) and play with others. Dive into the scripting system and make whatever your heart desires. Let's see what Bitphoria can do.<br />
<br />
An engine guide is in the works now, to help users better understand the various console variables and make full use of them as a sort of 'power user'. I'm sure anybody with their wits about them could figure most of them out just by surfing the console, typing a few characters and pressing 'tab'. Compared to most engines with a console, the confusion will lie in the fact that the vast majority of console commands are for scripting, and not intended to be executed while in a game or while idling in the main menu system.<br />
<br />
If you're the type of person who likes to get into new things, unafraid, and share your findings with others, then by all means download Bitphoria, start screwing around, and Youtube your experiences. Share this thing with the world. If you could notify me of your intentions to release a video of your time with Bitphoria that would be great, just so I could follow along and gain some insight as to what I should work on or do differently.<br />
<br />
<br />deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-61854469151361845782016-08-29T13:45:00.001-07:002016-09-07T14:42:16.126-07:00Final Day Thoughts<br />
I started Bitphoria in April of 2014, focusing solely on how the world was generated, represented, rendered, etc.. I had a pretty clear vision of what I wanted the engine to be capable of, and I feel I have achieved that.<br />
<br />
Yesterday I finally reached the 20k lines of actual code that I predicted Bitphoria would have by its release date. The scripting manual for making games out of Bitphoria is complete. I'll be uploading the game tonight for release. It's a public beta version that has all of the features I wanted, though they are not fully polished and probably have some bugs. The goal is to see what kind of feedback comes from releasing it in its current state. I haven't been actively promoting Bitphoria very much at all, and figure that if it's any good it will sell itself by spreading via word of mouth: people will post screenshots and videos, start making games, play against each other online, etc.<br />
<br />
Only time will tell.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiaBhGzBMaFa2ujOaJZZ9-Ztuw5S6MApc6qdmroqO7-yFuPxXbRSQYSUenqKrRBFhytE9sepV6Xe0XQYh-m83bfNemTCdUVLyP3M1vqPZRfG4txxXJgNdiBDhlqPvbYsvVhZnXLrchuP48/s1600/bitphoriasourcefolder.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiaBhGzBMaFa2ujOaJZZ9-Ztuw5S6MApc6qdmroqO7-yFuPxXbRSQYSUenqKrRBFhytE9sepV6Xe0XQYh-m83bfNemTCdUVLyP3M1vqPZRfG4txxXJgNdiBDhlqPvbYsvVhZnXLrchuP48/s640/bitphoriasourcefolder.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Bitphoria's source code folder, comprised of 20k actual lines of code.</td></tr>
</tbody></table>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
Tomorrow I will be sharing its release on Reddit and various other online communities, so we'll see what sort of response develops. This blog itself has just been a place for me to share my thoughts, and it does get a few dozen hits a day now, but nothing I would consider a great internet success, not by a long shot. I hope some people have found enjoyment and value in my blog, and perhaps more would if more people knew about it.<br />
<br />
<div>
<br /></div>
I developed Bitphoria on my own time, while being a full-time stay-at-home dad and running an Etsy business with my wife Heidi. It's a project I've always wanted to do, one I've attempted many times over the past 20 years, but life always seemed to get in the way. It was only a matter of time before I finally made something.<br />
<br />
Bitphoria's future is in the hands of the people after today. Once it's out there I'll probably stop working on it altogether unless it starts developing some kind of following or fanbase and I begin receiving feedback. At that point, it will serve as a great portfolio piece demonstrating how well-rehearsed I am with all aspects of game development, from graphics programming to network programming, procedural techniques, and everything in between. This is just what I could do by myself in a limited amount of time, and I could do a lot, because I always dreamed of doing it.<br />
<br />
Bitphoria avoids issues like asset management and art pipelines because I didn't plan on having artists - I planned on the players being the artists, almost in the same way that Quake and Half-Life were made popular by Team Fortress and Counter-Strike respectively. It wasn't the base game that made those games famous so much as the creativity of the masses. I'm counting on this, somewhat, with Bitphoria, and on the fact that people like to make stuff. Bitphoria is meant to serve as a platform for people to create and share their creations, abstracting things in a highly simple fashion that is easily accessible through a script-based interface.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrSj2zpew2ASuMwz6xz4_a0wtfeJ7iUjs695e9HplVdChIO3srKtQ-XHh7Nrz1eHOd4vRntujg29v00XkL9qtz4flRFQcpxewuEsJxUeNNmhDYUKtE8_Xa39tQPIewbUQ8Yk3gu583h5A/s1600/bitphoriainitialization.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrSj2zpew2ASuMwz6xz4_a0wtfeJ7iUjs695e9HplVdChIO3srKtQ-XHh7Nrz1eHOd4vRntujg29v00XkL9qtz4flRFQcpxewuEsJxUeNNmhDYUKtE8_Xa39tQPIewbUQ8Yk3gu583h5A/s640/bitphoriainitialization.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Bitphoria's initialization function.</td></tr>
</tbody></table>
<br />
<br />
My goal was to have three separate, somewhat fleshed-out games for players to play. I will probably flesh them out in time, but I spent more time in the engine code than I had thought I would, so this first release will only really feature the deathmatch game, plus the barely-started CTF and instakill games that are far from complete. Fortunately, every feature of the scripting system is utilized somewhere in at least one of the three games' scripts, because I had to test each one while developing it. Hopefully this, in combination with the scripting manual I toiled to produce, will allow someone to start making something more inspired than the default games. The capabilities required to make a great game are there - that's a promise.<br />
<br />
The master server will be running once I upload and link the release. I haven't fully tested it yet, so if you're one of the first people to download and playtest with a friend, please let me know if you encounter any issues immediately, so I can rectify them as quickly as possible and get something that people can fully dive into without having to worry about being able to connect with other players.<br />
<br />
This is going to be a sort of Wild-West time if Bitphoria picks up and people start taking interest. There will probably be bugs and simple tricks to take advantage of the game, perhaps malicious server-crashing bugs, etc. that will be discovered and exploited. If these are found I will solve them just like everything else throughout the game's development, and they will only be found if people are actively trying to enjoy it. If people want more from Bitphoria, I will work to provide it, and it will be a process but we can get it there. I've been developing Bitphoria in a virtual vacuum, without much outside influence or suggestion, aside from the few comments coming from my long-time friend Paul Hindt. Other than that, this is the first time Bitphoria will see the light of day. This is going to be the worst Bitphoria release yet, and better ones are to come if enough people are interested.<br />
<br />
If the game is DOA and nobody takes interest, I will eventually release the engine's source code as well, because I feel it is important for people to have resources to gain inspiration, insight, and ideas from. I took a lot of inspiration from the Quake engine, and the original Cube engine, insofar as some of the conventions for game logic are concerned.<br />
<br />
Following in the same vein, Linux and MacOS ports will not be hard to produce, being that the game runs almost entirely on top of SDL2, but I don't plan on taking the steps necessary to release ports unless a demand arises. The only platform-specific code in the engine, as a matter of fact, is the code that lists the games in the games folder. There is no platform-independent method for listing the files in a directory, so I simply shell out to a 'dir' command, dump the result into a temp file, load that file, and extract the game file names from it. Other than that everything goes through SDL, so ports shouldn't take very much time to compile.<br />
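The temp-file trick above can be sketched like so. This is a hedged illustration, not Bitphoria's code: the Windows build described in the post runs 'dir', while this sketch uses 'ls -1' so it runs on a POSIX system, and the temp path and buffer sizes are made up:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* List a directory by shelling out to the platform's listing
   command, dumping its output to a temp file, and reading the
   names back one per line. Fills names[0..max-1]; returns the
   number of entries read. */
static int list_dir(const char *dir_path, char names[][256], int max)
{
    char cmd[512];
    const char *tmp = "listing.tmp"; /* illustrative temp path */
    int count = 0;
    FILE *f;

    /* 'ls -1' here; the Windows equivalent would be 'dir /b'. */
    snprintf(cmd, sizeof(cmd), "ls -1 \"%s\" > %s", dir_path, tmp);
    if (system(cmd) != 0)
        return 0;

    f = fopen(tmp, "r");
    if (!f)
        return 0;
    while (count < max && fgets(names[count], 256, f)) {
        names[count][strcspn(names[count], "\r\n")] = '\0';
        count++;
    }
    fclose(f);
    remove(tmp);
    return count;
}
```

It's inelegant next to `readdir()`/`FindFirstFile()`, but as the post notes, it keeps the engine down to exactly one platform-specific code path.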
<br />
All-in-all this release could have happened sooner, but I didn't want to have an Alpha release that lacked many features from the final product. At least at this point they are present and accounted for. There are maybe two or three little things I'd still like to add in, but at this point the project needs to get into the hands of the public so we can see what happens with it and where it should go next, if it is something that's bound to go anywhere at all.<br />
<br />
Ultimately, if Bitphoria becomes somewhat popular, I will be pushing for actually selling it. To mitigate piracy I will be selling online player accounts, following the Minecraft model, which will integrate with the master server and be required in order to play on servers that are listed on the master server. That's down the road, but is solidly the plan if, again, people show interest.<br />
<br />deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-49535719747243097392016-08-02T19:08:00.001-07:002016-08-11T13:03:46.756-07:00Bitphoria's Release Date Announcement<br />
I have decided to make my 30th birthday the release date for Bitphoria, which happens to be August 30th 2016. There are several things I'd like to finish, among bug-fixes and networking improvements, before the first release and I thought I'd share that list here. Whatever state Bitphoria is in on that day it will be released.<br />
<br />
It is obviously crunch time now (a great motivator for me) so the things on my list that I'd like to have done might be cut short, but I will continue working on it well into the future beyond the release. This will manifest itself in the form of new and updated releases. The initial release itself is meant to serve further development of Bitphoria by allowing other people to finally start messing around, playing games, scripting games, and providing me with bugs and feedback about everything.<br />
<br />
I plan on releasing Bitphoria as a sort of public beta, for people to get to know it and see what I've dumped almost two years of my life into. Maybe down the road I will sell it for a few bucks on Steam/Itch.io. Maybe I should crowdfund polishing it, because I clearly have something that's worth something to someone, somewhere, out there, and I think it has a chance at being a crowdfunding success (after all the hard work is done already).<br />
<br />
Hardware contributions/donations are always welcome! ;) I'm currently developing Bitphoria on four different machines: two low-end laptops from a few years ago, a low-end pre-built HP desktop from the same era, and a custom-built desktop - all with integrated graphics. I did manage to borrow an nVidia GTX 680 for the custom-built desktop, which allowed me to record some footage of Bitphoria earlier on, but I have since been without it and could definitely use a decent GPU - both for tuning Bitphoria's graphics so high-end systems can be fully taken advantage of by hardcore PC gamers, and for recording updated videos of what Bitphoria looks and behaves like. Recording video is actually my primary goal insofar as GPU acquisition is concerned, because without it I will have a harder time demonstrating the game. Worst-case scenario, hopefully fans out there will take it upon themselves to youtube videos of Bitphoria. If nobody does that then I've failed at making something that sells itself, obviously, and that's the goal of any project of mine!<br />
<br />
If the case is that nobody ever cares about Bitphoria, and it just gathers cobwebs and dust in a dark corner of the web (on this blog, and in a few Reddit posts), never amounting to anything more than a "portfolio piece" for acquiring a dead-end soul-crushing software engineering job, then at that point I will just release the code in its entirety for everyone to tinker with and learn from. At any rate, the code will be released eventually; the 'when' of that is still up in the air, and probably will be for at least another year. But if I can earn a living for my family via my lifetime hobby then I'm certainly going to try before giving it all away. Over two decades of programming experience have gone into this project, and there are things I learned along the way that I'd like to invest into a new game engine project - but those I'm going to hang onto until I feel this project is 'done', through and through, which includes promoting it to a respectable degree that hopefully provides enough exposure to ascertain whether there is real interest in it.<br />
<br />
As far as my pre-release todo list is concerned, I believe that setting up a master server for people to find each other's games is imperative; otherwise the whole project is pretty pointless if people can only start a game server, run around by themselves, and then quit out and never play it again. Scripting features and functionality seem less consequential and could be added along the way after the initial release; the existing setup is rather competent. Finishing the scripting documentation also remains a top priority, because I wish for people to be able to start playing around with the engine itself and see what they can come up with within the creative sandbox it represents. It has its quirks and nuanced requirements insofar as game performance is concerned (graphically, physics-wise, and networking), which is the case with any game engine out there. However, I would like to point out that it takes no game-development know-how or modding experience to make something out of, or with, Bitphoria. Anybody should be able to simply open up the script files that comprise the default games included with Bitphoria and start surfing the documentation to figure out how it all comes together.<br />
<br />
Here's my list of things left to do for Bitphoria before releasing it that I'm currently focusing on for the next few weeks:<br />
<br />
<br />
Master Server: Write and launch a master server, run here from my home, so players can start and find/join each other's games. Along with this I'd prefer to invest in a domain name that points to the master server (my house IP) for the in-game server browser to download the server list from. This would actually kill two birds with one stone. Firstly, it would eliminate the learning curve of PHP and MySQL that I'd need to traverse if I were to create an HTTP master server. I *do* have enough experience with both to make it happen, but I'm not competent or well-rehearsed enough to simply knock it out within a day or two - and it would require that I implement HTTP request functionality into Bitphoria, be it by hand or by integrating CURL/libCURL. Secondly, running a custom-coded master server from home would allow me to easily implement a NAT-punchthrough handshake protocol, letting anybody start a game without having to deal with their router/firewall and port-forward. Conversely, anyone else would also be able to join any game regardless of whether or not they themselves are behind a router/firewall. This functionality requires that the master server notify a game server when someone is trying to join it, and from what IP/port, allowing the game server to send a 'trailblazing' packet to that client - tunneling through its own NAT so that packets from the client's address get routed to the game server. A console application running in the background on my desktop, written in C using winsock or SDL_net, would be easy to manage and much simpler and cheaper than any remote/online option I've come across so far. If at some point there's too much traffic for my home connection, I'd move the master server program to a dedicated virtual host and simply point the domain name at that, and everything will just continue working without any changes for end-users.<br />
<br />
<br />
Network Buffering: Implement a network buffering system that, for one, allows for the simulation of latency and its fluctuation for testing/tuning purposes. So far I've only been able to generate repulsive/gross/erratic network behavior when my daughter watches Youtube/Netflix on her machine while I have two machines sharing a Bitphoria game over our wireless LAN. Also, in spite of my efforts to fight network update jitter using extrapolation when an update packet arrives later than intended, it would seem that Gaffer's strategy (gafferongames.com) - simply buffering network updates long enough to encompass most network jitter, then emitting them to the engine internals at the interval they were intended to arrive at - would be vastly more effective in its precision, with no need for interpolated correction. Buffering network packets this way would also allow me to simulate internet conditions locally without having to track down willing testers (I'm not really a 'people-person', and I have no more computer friends left) and fine-tune everything for what I like to think of as the 'fringe market': people who don't have high-end gaming systems (a netbook) or fiber-optic connections. I'm of the mind to release a game that looks prettier the better the setup, but is also completely performant on less ideal setups. Why focus on one area of the market if you can focus on the entire market?<br />
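The buffering strategy above can be sketched as a simple playout buffer: hold each incoming update for a fixed delay past its send time, then hand updates to the engine in sequence order once they're due, absorbing jitter up to that delay. This is a hedged sketch after the Gaffer On Games idea referenced in the post; all names and sizes here are illustrative, not Bitphoria's:

```c
#include <assert.h>

#define JITTER_CAP 64

typedef struct {
    int   sequence;   /* sender's update number          */
    float send_time;  /* timestamp stamped by the sender */
} update_t;

typedef struct {
    update_t slots[JITTER_CAP];
    int      count;
    float    delay;   /* playout delay in seconds */
} jitter_buf_t;

/* Stash an incoming update (order of arrival doesn't matter). */
static void jb_push(jitter_buf_t *jb, update_t u)
{
    if (jb->count < JITTER_CAP)
        jb->slots[jb->count++] = u;
}

/* Pop the lowest-sequence update whose playout time has arrived,
   or return 0 if nothing is due yet at local time 'now'. */
static int jb_pop_ready(jitter_buf_t *jb, float now, update_t *out)
{
    int best = -1;
    for (int i = 0; i < jb->count; i++)
        if (now >= jb->slots[i].send_time + jb->delay &&
            (best < 0 || jb->slots[i].sequence < jb->slots[best].sequence))
            best = i;
    if (best < 0)
        return 0;
    *out = jb->slots[best];
    jb->slots[best] = jb->slots[--jb->count]; /* swap-remove */
    return 1;
}
```

The same structure doubles as the latency simulator mentioned above: artificially inflating `delay` (or randomizing it per packet) reproduces bad-network conditions locally.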
<br />
Scripting System Documentation: Finish the scripting system documentation, including indications as to where, inside the sample/default games packaged with the initial release, users can find examples of each script command, along with tips and tricks. Also, early on I made it a point to document everything that I script for Bitphoria, because people out there should be encouraged to tinker around and pull everything apart.<br />
<br />
Artificial Intelligence: Add simple AI functionality that allows an entity to seek out other entities, designate one as its target, and then follow or aim at it. This would make it possible to have simple zombie-type enemies that just follow players and harass them while they try to carry out other things. I'd like to include some kind of simple navmesh generation derived from the distance field of the world, but I'm thinking I'm just going to literally index the distance field and use that for obstacle avoidance while entities pursue one another. If a target is unreachable an entity could back out, following the distance field outward, until the target is visible again or a new target is found.<br />
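A minimal sketch of the 'just index the distance field' idea (hypothetical grid size and names, not Bitphoria code): each AI tick, the entity greedily steps to the neighboring cell nearest its target among cells that still have enough clearance from the walls, so the distance field does the obstacle avoidance for free:<br />

```c
#define W 8
#define H 8

/* hypothetical wall-distance field: distance (in cells) to nearest solid cell */
static float g_dist[H][W];

/* one greedy pursuit step: move to the 4-neighbor nearest the target (tx,ty)
   that still has at least 'clearance' cells of space around it;
   returns 1 if we moved, 0 if stuck */
int ai_step(int *x, int *y, int tx, int ty, float clearance)
{
    static const int dx[4] = { 1, -1, 0, 0 };
    static const int dy[4] = { 0, 0, 1, -1 };
    int   best = -1, i;
    float best_d = 1e30f;
    for(i = 0; i < 4; i++)
    {
        int nx = *x + dx[i], ny = *y + dy[i];
        float d;
        if(nx < 0 || ny < 0 || nx >= W || ny >= H)
            continue;
        if(g_dist[ny][nx] < clearance)  /* too close to a wall, avoid */
            continue;
        d = (float)((nx - tx) * (nx - tx) + (ny - ty) * (ny - ty));
        if(d < best_d)
        {
            best_d = d;
            best = i;
        }
    }
    if(best < 0)
        return 0;
    *x += dx[best];
    *y += dy[best];
    return 1;
}
```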
<br />
World Generation Modes: Add options for the style of the world generation itself. Right now it's a fixed algorithm: starting a game server entails selecting a random seed and a vertical scale, which are forwarded to clients so they can generate the same world themselves and play. The world is generated as a 128^3 volume, and there's a lot more that could be done with it than leaving one algorithm in there for people to experience.<br />
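The determinism that makes seed-forwarding work can be sketched like this (a toy example, not Bitphoria's actual generator): a tiny PRNG plus a coordinate hash means any machine handed the same seed samples the identical world volume:<br />

```c
#include <stdint.h>

/* tiny deterministic PRNG (xorshift32): given the same seed, server and
   client produce the exact same sequence, hence the exact same world */
uint32_t xorshift32(uint32_t *state)
{
    uint32_t x = *state;
    x ^= x << 13;
    x ^= x >> 17;
    x ^= x << 5;
    *state = x;
    return x;
}

/* sample one voxel's value for a trivial 'world mode': the same
   (seed, x, y, z) always yields the same value on every machine */
uint32_t world_sample(uint32_t seed, int x, int y, int z)
{
    uint32_t s = seed ^ (uint32_t)(x * 73856093 ^ y * 19349663 ^ z * 83492791);
    if(s == 0)
        s = 1;                  /* xorshift state must be nonzero */
    return xorshift32(&s);
}
```

Adding world-generation modes then just means forwarding one more number (the mode index) alongside the seed and vertical scale.<br />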
<br />
Physics Attachment: Add entity attachment, which allows an entity to literally attach itself to another entity and inherit its position/motion, with or without an offset oriented relative to the entity being attached to. This could be used for CTF games or power-up modes that want to display some kind of visible effect.<br />
<br />
<br />
These are more-or-less listed in order of priority. What actually happens by the release date is still completely variable, these are just my intentions and goals, and it's hard to say what exactly will occur as I pursue each bullet point.deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-83203167843747633172016-06-19T17:53:00.002-07:002017-02-17T10:54:08.054-08:00Bitphoria Screenshots<br />
I just felt compelled to post some new screenshots. Not much has changed as I've been working on underlying stuffs and writing the scripting manual, but it sure looks purdy.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj2UlqTn_IMcsDFW_Qd6LgPaRn7tXQCWmjkyIaU1KO20XccnqYyHFjb3xQH6Tu6sU9uH9B39BFV_lD6LoQzm_pjP3ZZQEMU1-iYKpagcKcgihm4xPxLFyI3FXMRAqfqlncngQJKlaNFAUE/s1600/screen001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj2UlqTn_IMcsDFW_Qd6LgPaRn7tXQCWmjkyIaU1KO20XccnqYyHFjb3xQH6Tu6sU9uH9B39BFV_lD6LoQzm_pjP3ZZQEMU1-iYKpagcKcgihm4xPxLFyI3FXMRAqfqlncngQJKlaNFAUE/s640/screen001.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8eh8DyZwS_taf-qmtOWR48OQn9t9Ohj1WIcIRCtQjj7ZdVo1B7gtjvo-7ibxM3ub8P2LMd5Kk_AvtsdebNSpEB8ldp6C_ya8Pzv0FQfsWHeNvQHytbtmQeOQhcq-aB32Ajq-7QYlkXsc/s1600/screen005.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8eh8DyZwS_taf-qmtOWR48OQn9t9Ohj1WIcIRCtQjj7ZdVo1B7gtjvo-7ibxM3ub8P2LMd5Kk_AvtsdebNSpEB8ldp6C_ya8Pzv0FQfsWHeNvQHytbtmQeOQhcq-aB32Ajq-7QYlkXsc/s640/screen005.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzv9IhxeULTogQuWlfAVALn6kA6QfsLl63fm0Mqtzc1__P1uwX0xmsVEedNwi-i2jVyxUtQ7jKPzpAh1Jee9HelDArxTdVeyGbKan0T7L_VKkuiEojfyEnlDybnfZVy4Tdel67aX3aldc/s1600/screen007.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzv9IhxeULTogQuWlfAVALn6kA6QfsLl63fm0Mqtzc1__P1uwX0xmsVEedNwi-i2jVyxUtQ7jKPzpAh1Jee9HelDArxTdVeyGbKan0T7L_VKkuiEojfyEnlDybnfZVy4Tdel67aX3aldc/s640/screen007.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmvGLFeLEsIgZTd42TbT-HqNL2ExtcpXla5TK8cOUWihiD01kRVNH3vib94WhQ6tshNhwQHncck-C_kLu831Olmemyt4woWZsTE51ME9w3O-DqPTNE4L3oiErAPVCGi__WrA4jBelKUl8/s1600/screen010.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmvGLFeLEsIgZTd42TbT-HqNL2ExtcpXla5TK8cOUWihiD01kRVNH3vib94WhQ6tshNhwQHncck-C_kLu831Olmemyt4woWZsTE51ME9w3O-DqPTNE4L3oiErAPVCGi__WrA4jBelKUl8/s640/screen010.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8aZDsW9mfNdVQp6PkR3WebktPdZnrn-Hqz-kgkTHsuQ5u4AQqf_EY2-6ZLtzCvALrqDxpYyW-t7u9m7Kz7oNPFStI6uVEMeMn5TSFvCEmEhtpx7eXjEtwecWenpjsL0zPmdOsVel_jYk/s1600/screen011.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8aZDsW9mfNdVQp6PkR3WebktPdZnrn-Hqz-kgkTHsuQ5u4AQqf_EY2-6ZLtzCvALrqDxpYyW-t7u9m7Kz7oNPFStI6uVEMeMn5TSFvCEmEhtpx7eXjEtwecWenpjsL0zPmdOsVel_jYk/s640/screen011.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_S0wKOKMH8Lrg6I0cv1vDKBxK6bOmTby5qkh3IUHwZbx6BYwQnLSlBefZMbeQ0edL0G1yx-bAS20ACe6lw56lzGzmbwgKMLFeQqX8j6_BfMvr8m9ZOHWQRmp9kTD3ulHF_R43Kg8ZLvk/s1600/screen012.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_S0wKOKMH8Lrg6I0cv1vDKBxK6bOmTby5qkh3IUHwZbx6BYwQnLSlBefZMbeQ0edL0G1yx-bAS20ACe6lw56lzGzmbwgKMLFeQqX8j6_BfMvr8m9ZOHWQRmp9kTD3ulHF_R43Kg8ZLvk/s640/screen012.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh16v6hslDJutUw3I-TEkVnrpQzl4KzBy72-LwpnWCtYRS6zO-xFFXsLwfEoJSI4OrGdh_nMdyvXc6Zeu1efuqmQM0iK3fARfmQgEOWpSkJ9FgMgu5XcLv9F9qKm5YF2XoJA6oIrnln4Ik/s1600/screen013.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh16v6hslDJutUw3I-TEkVnrpQzl4KzBy72-LwpnWCtYRS6zO-xFFXsLwfEoJSI4OrGdh_nMdyvXc6Zeu1efuqmQM0iK3fARfmQgEOWpSkJ9FgMgu5XcLv9F9qKm5YF2XoJA6oIrnln4Ik/s640/screen013.jpg" width="640" /></a></div>
<br />
Check back soon for more updates. I'm in the midst of designing the master server situation so players can start and find one another's games easily without any port-forwarding nonsense. I'm also hammering out the Bitphoria Scripting Manual as a guide and reference for people who want to make their own games out of Bitphoria and share them with the world via the in-game server browser. Players do not have to download anything externally to play custom Bitphoria games. It's a thing of beauty.deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-70378586706792411882016-04-29T17:26:00.002-07:002016-06-23T14:49:21.741-07:002D Rectangle Overlap Algorithms<br />
While working (not so) diligently on the re-write of Holocraft I encountered a situation where the need for an age-old algorithm arose. The algorithm, and its implementation, has passed through many minds over the years, which have boiled it down to a few bare-necessity lines of code.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY4GGH3qouJ2MjEUxWxsnT46u89Gr83N3Vjkhl51dOkNsD-K8bcf8sI3vP-vXiMBvjJux-7jTzbmM0e7lH5eJhR_Pr3OrA6t8AgZ3iiziXLzePd_s39zKF6M7OeDLMRUFaLTsIzh_9vug/s1600/6-Both-Collisions.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="120" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY4GGH3qouJ2MjEUxWxsnT46u89Gr83N3Vjkhl51dOkNsD-K8bcf8sI3vP-vXiMBvjJux-7jTzbmM0e7lH5eJhR_Pr3OrA6t8AgZ3iiziXLzePd_s39zKF6M7OeDLMRUFaLTsIzh_9vug/s400/6-Both-Collisions.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example of the problem and some possible conditions.<br />
From: <a href="http://www.owenpellegrin.com/articles/vb-net/simple-collision-detection/" target="_blank">http://www.owenpellegrin.com/articles/vb-net/simple-collision-detection/</a></td></tr>
</tbody></table>
<br />
Determining whether two rectangles overlap or intersect is one of those problems that finds application in many a project. It's rather simple to implement, and most novice programmers take pride in discovering a solution, even for the case where the algorithm is extended into the third dimension in the form of 'axis-aligned bounding boxes', or 'AABBs'.<br />
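For reference, the 3D AABB version of the test looks like this (a generic sketch, not from any particular engine): two boxes overlap if and only if their extents overlap on every axis, so finding a single separated axis is enough to bail out:<br />

```c
typedef struct { float min[3], max[3]; } aabb3d;

/* two axis-aligned boxes overlap iff their extents overlap on every axis */
int aabb_overlap(const aabb3d *a, const aabb3d *b)
{
    int axis;
    for(axis = 0; axis < 3; axis++)
        if(a->max[axis] < b->min[axis] || b->max[axis] < a->min[axis])
            return 0;   /* separated on this axis: no overlap possible */
    return 1;
}
```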
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOUXjkH_O57a82AXpENN0gK4NFdXzMW93Pw4E_QgnFOOf_Bf7PuEJB9ZNh_knFM_60cyRSWdbuch1MB3q1OSKRGbJmwTV7VqRwKc1hFrxXgjTSFD0aPSa5QgTmP39tZjstFkcuKM_N9ck/s1600/aabbtoaabb.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOUXjkH_O57a82AXpENN0gK4NFdXzMW93Pw4E_QgnFOOf_Bf7PuEJB9ZNh_knFM_60cyRSWdbuch1MB3q1OSKRGbJmwTV7VqRwKc1hFrxXgjTSFD0aPSa5QgTmP39tZjstFkcuKM_N9ck/s1600/aabbtoaabb.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">3D axis-aligned bounding boxes intersecting.<br />
From: <a href="http://www.miguelcasillas.com/?p=30" target="_blank">http://www.miguelcasillas.com/?p=30</a></td></tr>
</tbody></table>
<br />
Many games have utilized either the 2D or 3D variant of this algorithm to determine whether two objects in a game world are colliding with one another. The original Quake is an example of a game where entities had their collision volume defined by an axis-aligned bounding box. This caused players to slide axially off other objects as if they were flat cubical shapes. Personally, I prefer using a cylindrical or capsule collision volume for players and NPCs, because it gives a smoother collision resolution where the player just sort of slips and slides around other players and objects. Most modern games use this sort of collision volume for players for low-fidelity intersection tests, and sometimes as a pre-test to avoid the cost of more intricate intersection tests in the case of hitscan weapons and the like.<br />
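A minimal sketch of the upright-cylinder test I'm describing (hypothetical parameter layout, with z as the cylinder's base and h its height): the circular footprints must overlap in the horizontal plane AND the vertical spans must overlap too:<br />

```c
/* upright cylinders collide when their circular footprints overlap in XY
   and their vertical spans overlap in Z (z = base, h = height) */
int cylinders_collide(float ax, float ay, float az, float ar, float ah,
                      float bx, float by, float bz, float br, float bh)
{
    float dx = bx - ax, dy = by - ay;
    float r  = ar + br;
    if(dx * dx + dy * dy > r * r)       /* footprints don't touch */
        return 0;
    if(az + ah < bz || bz + bh < az)    /* no vertical overlap */
        return 0;
    return 1;
}
```

This is why players slide smoothly around each other: horizontally the contact is against a circle rather than a box corner.<br />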
<br />
At any rate, during my outset to re-write Holocraft it became clear that there needed to be a way to detect where groove optics overlap or intersect one another, because many of my trial holograms thus far have been muddied in areas where the density of intersecting groove optics is too high. Too many intersections and too much overlap between grooves can completely annihilate the clarity of their reflectivity. The solution is to prevent the machining process from scribing over the same areas, allowing the forefront optics of the hologram to take precedence over the background optics, optimizing the overall clarity of the hologram as a whole.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhbUV7KJURjszzwToZLa8knzM64tKMl3jeE7fAnNgmRp8q4MOvn59pbCulQNUNZVObcrDSplYWf4ejr1RtTQYzYj9UQ27hyphenhyphenHo02cfKfP0Ez5ZbR49WwSUcdrJrb3YkHhe4Pg9n03_CXEbc/s1600/010.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="265" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhbUV7KJURjszzwToZLa8knzM64tKMl3jeE7fAnNgmRp8q4MOvn59pbCulQNUNZVObcrDSplYWf4ejr1RtTQYzYj9UQ27hyphenhyphenHo02cfKfP0Ez5ZbR49WwSUcdrJrb3YkHhe4Pg9n03_CXEbc/s640/010.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A screenshot of the new Holocraft showing the geometry and optics of a moderately detailed specular hologram. As you can see it is very easy to create many overlapping and intersecting optics (the curved colored lines) which will reflect light to depict the white dots along the edges of the model's geometry. Blue optics will appear nearest, red the furthest, and green will appear with mid-range depth.</td></tr>
</tbody></table>
<br />
Now, per Matt Brand's whitepaper on specular holography, groove optics are hyperbolas calculated from the assumed altitude angle of the incident light ray that will illuminate the hologram. I have chosen to calculate from the hyperbola a cubic Bezier curve, which is defined by a starting and ending point and two control points that dictate the curvature of the line traversing the space between the endpoints. This curve then serves as the internal representation of the base shape from which all optics for the hologram are derived, based on the 3D position of the point chosen from the source geometry's vertices, edges, or surfaces that the reflective optic will depict.<br />
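For anyone unfamiliar, evaluating a cubic Bezier at a parameter t is just the Bernstein-polynomial blend of the two endpoints and two control points (this is the generic textbook form, not Holocraft's internals):<br />

```c
#include <math.h>

typedef struct { float x, y; } pt2;

/* evaluate a cubic Bezier at t in [0,1] using the standard Bernstein
   form: p0/p3 are the endpoints, p1/p2 the control points */
pt2 bezier3(pt2 p0, pt2 p1, pt2 p2, pt2 p3, float t)
{
    float u  = 1.0f - t;
    float b0 = u * u * u;
    float b1 = 3.0f * u * u * t;
    float b2 = 3.0f * u * t * t;
    float b3 = t * t * t;
    pt2 p;
    p.x = b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x;
    p.y = b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y;
    return p;
}
```

Stepping t from 0 to 1 at some increment is how a curve like this gets turned into the discrete points that the later intersection tests compare.<br />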
<br />
These optic curves are segmented by an occlusion-culling pass, which designates which spans of the curve should reflect light based on whether the hologram geometry would be 'blocking' their visibility, on a per-degree-of-viewing-angle (or 'azimuth') basis.<br />
<br />
Once these segments of visibility are determined I then need to determine which ones intersect, or otherwise overlap one another closely enough to cause degradation in the final hologram. One way is to brute-force compare the distance of every single point of each optic segment against every single point of every other optic segment to determine whether a segment needs to be clipped, split, culled, etc.<br />
<br />
To speed this process up I have resorted to the use of a simple bounding rectangle overlap comparison. Effectively, this is a 2D axis-aligned bounding box intersection/overlap test. This is what I call an 'early out', where the core loop that is performing these comparisons and tests can quickly and cheaply pre-determine that there is no possible way for two given optic segments to be intersecting (because they are, for example, on opposite sides of the specular hologram).<br />
<br />
If there *is* an overlap detected between the two rectangles bounding two optic segments, then Holocraft proceeds with a more detailed comparison, effectively comparing each point on one segment with each point on the other. This is a simplified explanation of the actual process, as the real algorithm uses a heuristic to dynamically adjust the increment at which it 'steps' along each segment to the next point to distance-check against the points of the other segment. This helps accelerate the process, which is admittedly fast enough even without the heuristic, since Holocraft only performs intersection detection when the user saves the holographic output, currently as either an SVG vector image file or directly to CNC g-code.<br />
<br />
Now, the whole point of this blog post is to share the method by which I devised a rectangle intersection test that serves more utility for Holocraft's purposes. It would seem that with my growing years as a programmer I have developed a rabid aversion to doing things the tried-and-true way, with a sort of desperation to find a novel way to go about things that people have already been doing another way for decades.<br />
<br />
A part of the heuristic by which Holocraft accelerates optic intersection/overlap detection relies on knowing the rectangle of overlap itself, within which two given segments could possibly be intersecting. Knowing merely that their bounding rectangles overlap, as a boolean, is not sufficient, for it would require comparing *every* point on each segment against *every* point on the opposing segment. To minimize this, only the points on each segment that fall within the overlapping area of the two rectangles are considered.<br />
<br />
Here's a copy-pasta of what Holocraft does:<br />
<pre><code>
//
typedef struct
{
    float x, y;
} vec2;
//
// 2d rectangle
typedef struct
{
    vec2 min, max;
} rect2d;
//
#define MAX(a, b) ((a) > (b) ? (a) : (b))
#define MIN(a, b) ((a) < (b) ? (a) : (b))
// returns zero if no overlap, otherwise returns the
// area of overlap and optionally the rectangle of overlap via '*o'
float intersect_rects(rect2d a, rect2d b, rect2d *o)
{
    rect2d c;
    float dx, dy;
    // find horizontal overlap
    c.min.x = MAX(a.min.x, b.min.x);
    c.max.x = MIN(a.max.x, b.max.x);
    // not overlapping horizontally...
    if((dx = c.max.x - c.min.x) < 0)
        return 0;
    // find vertical overlap
    c.min.y = MAX(a.min.y, b.min.y);
    c.max.y = MIN(a.max.y, b.max.y);
    // not overlapping vertically...
    if((dy = c.max.y - c.min.y) < 0)
        return 0;
    // caller requires the rectangle of overlap?
    if(o)
        *o = c;
    // return area of overlap (zero if the rectangles merely touch)
    return dx * dy;
}
//
</code></pre>
<br />
This is with the goal of quickly calculating the rectangle of overlap itself, while prioritizing the horizontal check over the vertical check (perhaps in Holocraft's case this should be reversed?), bailing out if the horizontal overlap fails before bothering with the vertical check.<br />
<br />
This is in contrast to most of the functions you will find on the internet, which perform a few simple greater-than/less-than checks against the min/max of the bounding rectangles/volumes and provide no other useful information.<br />
<br />
In the past, for game physics, I would just make everything a sphere and perform a simple Pythagorean distance check against the combined radii of the two entities in question. I almost opted to use circles as early-out bounding volumes for optic segments, until I actually sketched on paper how much extra space a circle would typically have over a simple bounding rectangle.<br />
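That sphere check looks like this (a generic sketch): compare the squared distance between centers against the square of the combined radii, which avoids the square root entirely:<br />

```c
/* classic sphere-vs-sphere test: entities collide when the squared
   distance between centers is within the square of the combined radii
   (comparing squared values avoids the sqrt entirely) */
int spheres_collide(float ax, float ay, float az, float ar,
                    float bx, float by, float bz, float br)
{
    float dx = bx - ax, dy = by - ay, dz = bz - az;
    float r  = ar + br;
    return dx * dx + dy * dy + dz * dz <= r * r;
}
```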
<br />
Incidentally, the rectangle boundary calculated for each segment is derived from the convex hull of each curve, which is defined by both the endpoints of the curve and the control points that define its shape. The entirety of a cubic Bezier curve is guaranteed to lie within this hull, and if there were a fast way to perform intersections against these convex hulls directly then that is what I would do, but there really isn't: it would require a series of line intersection tests and be quite a bit more complicated than it's worth.<br />
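Deriving that conservative bounding rectangle is just a min/max over the curve's four defining points: since the curve never leaves their convex hull, the axis-aligned box around those points also bounds the curve, loosely but cheaply (an illustrative sketch, not Holocraft's code):<br />

```c
typedef struct { float x, y; } cp2;
typedef struct { cp2 min, max; } crect;

/* conservative bounding rectangle of a cubic Bezier: the curve is
   contained in the convex hull of its 4 defining points, so the
   min/max of those points bounds the curve too */
crect bezier_bounds(const cp2 p[4])
{
    crect r = { p[0], p[0] };
    int i;
    for(i = 1; i < 4; i++)
    {
        if(p[i].x < r.min.x) r.min.x = p[i].x;
        if(p[i].x > r.max.x) r.max.x = p[i].x;
        if(p[i].y < r.min.y) r.min.y = p[i].y;
        if(p[i].y > r.max.y) r.max.y = p[i].y;
    }
    return r;
}
```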
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv44990gfdi22PG_1nhaBIVfp8uxy9CPo4dkY4_pQ6Y6oiWvccWr15mgsRJJoDVqSC3jk7WrFjTTv4rYsig3g_EFiKqbCGY_E_-ibLM5i2dszGEjh4QJX5Ajp2Qn9Pv9t1FVQOYw09i5U/s1600/convexhull.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv44990gfdi22PG_1nhaBIVfp8uxy9CPo4dkY4_pQ6Y6oiWvccWr15mgsRJJoDVqSC3jk7WrFjTTv4rYsig3g_EFiKqbCGY_E_-ibLM5i2dszGEjh4QJX5Ajp2Qn9Pv9t1FVQOYw09i5U/s1600/convexhull.gif" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Demonstrating the confinement of a cubic Bezier curve to its convex-hull, as defined by its endpoints and control points.<br />
From: <a href="http://www.scratchapixel.com/lessons/advanced-rendering/bezier-curve-rendering-utah-teapot" target="_blank">http://www.scratchapixel.com/lessons/advanced-rendering/bezier-curve-rendering-utah-teapot</a></td></tr>
</tbody></table>
<br />
One of the advantages of using cubic Bezier curves as an intermediate representation is that outputting a vector image via the SVG file format allows for cubic Bezier curves to be defined as-is, without converting or transforming to any other form.<br />
<br />
This is in contrast with outputting CNC g-code, in which machine toolpaths can only be defined as either linear or circular motions, requiring an intermediate step in Holocraft which deduces a series of circular arcs chained together to form an optic curve. This is done within a user-supplied tolerance value, to give some degree of control over how large/complex the final CNC g-code output is, and the degree to which the machined grooves are faithful to the calculated optic hyperbola.deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com3tag:blogger.com,1999:blog-4395646461527310891.post-68155228139815005842016-04-03T19:33:00.000-07:002016-05-07T22:29:19.075-07:00NewbieTime is Live<br />
I made a game for my children called NewbieTime. It's an idea I had a few years back but didn't get around to working on until last summer in the middle of Bitphoria development. I had everything pretty much finished and polished and ready to release and somehow decided to move on to other things instead. Well, I stumbled across it and decided it was time that it saw the light of day and now it's live for download/sale at <a href="http://deftware.itch.io/" target="_blank">deftware.itch.io</a>.<br />
<br />
<center>
<iframe frameborder="0" height="167" src="https://itch.io/embed/28679?dark=true&linkback=true" width="552"></iframe></center>
<br />
<br />
Check it out and let me know what you think!<br />
<br />
I'm also working on a re-write of Holocraft that is much cleaner. After much aluminum scoring and grooving I came to the realization that I needed to mitigate the fact that groove optics were allowed to intersect with abandon. This caused spots on holograms that are densely populated with optics to appear 'dirty' and effectively ruined the holographic effect. Properly dealing with this within the way Holocraft was slapped together would have been just more sloppy work, so I opted to re-write it to allow for much better... well... everything.<br />
<br />
Occlusion culling will be on-point, and the program itself will run much faster, and deal with larger and more complex geometry much better. More on this once it's finished.<br />
<br />
Also on the todo list is documenting Bitphoria's scripting system, which is the final step before an initial public release. I want people to be able to start playing with the scripting system and see what they come up with. Multiplayer functionality is all there, and entire games can be made out of Bitphoria, but I have not been inclined to actually make a game over the past year.<br />
<br />
I still need to figure out a master-server setup that will allow players to easily start and join one another's games. This will be included in a later release.deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-87437983026015063952016-02-19T14:08:00.001-08:002016-06-23T14:49:54.760-07:00Holocraft - The CNC Machining Adventures<br />
Much has happened with Holocraft since the last blog post. I now have an X-Carve tabletop CNC routing/milling machine, which I spent the entire holiday season building from the ground up. I needed a place to put the machine, because I had no space large enough to accommodate its massive 31" square footprint (except the kitchen/dining table, but that wouldn't have gone over well with the wife). So I invested in two smaller tables from IKEA that, when butted up against one another, create a perfectly sized platform for the machine.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRwECgqWRasa_P7b_W3OP95aBP4vbPrDZ8SsnUb51OW1Ndm7RLw7wZcrGkj8FbKGjbG_qTebYdOUq0LN9Q8pIHyGOPbJx_xyy85QFPae1Y42Vmd-cAE6meweMUFJoc86ob50u-7dK3Sto/s1600/tarend-table-black__0241636_PE381441_S4.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRwECgqWRasa_P7b_W3OP95aBP4vbPrDZ8SsnUb51OW1Ndm7RLw7wZcrGkj8FbKGjbG_qTebYdOUq0LN9Q8pIHyGOPbJx_xyy85QFPae1Y42Vmd-cAE6meweMUFJoc86ob50u-7dK3Sto/s320/tarend-table-black__0241636_PE381441_S4.JPG" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The TARENDO table from IKEA.</td></tr>
</tbody></table>
<br />
The tables are sturdier than they look. I assumed the legs would be wood, just as the top surface of the table is, but they are actually steel, and the whole underside of the table is also braced with similar steel beams. The tables, however, are not completely impervious to wobbling when the machine is dancing about across a hologram, which required some extra fastening and stabilization to be put in place.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiyDThD2VE0ApeHCSBKur7-bqNVWug6NUk-7T0-AtEXG-oj5esATsUIOaBkqx_-GzfyPrHQoDcIcDIfWFUa_K1Z8nd32FOwAQVa_I3_kwYqB30cGIM5x4-bw1CT5MMA7RfyYDN6HkpbyJg/s1600/1230151708-00.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiyDThD2VE0ApeHCSBKur7-bqNVWug6NUk-7T0-AtEXG-oj5esATsUIOaBkqx_-GzfyPrHQoDcIcDIfWFUa_K1Z8nd32FOwAQVa_I3_kwYqB30cGIM5x4-bw1CT5MMA7RfyYDN6HkpbyJg/s400/1230151708-00.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The machining setup nearing completion atop the pair of TARENDO tables ordered from IKEA</td></tr>
</tbody></table>
<br />
When I ordered my X-Carve from <a href="http://www.inventables.com/" target="_blank">Inventables.com</a> I went ahead with the option to get the Dewalt 611 router to use as a spindle in the machine. Unfortunately both the DWP611 and the power supply interface board were out of stock and wouldn't be shipped until January 8th at the latest. I ordered my machine on the 16th of December, a Wednesday. Everything (minus the spindle and PSU) arrived the following Wednesday in a pair of boxes.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgoMqbUO7TlHXSkMN4k0JcltgEhcQ-x07Af6xIlFCe9d3867hfCKwRusP_eQoUH9hPglq5KmufpwY2x_KhPYQDy9S_9BqJSkMNB0ApAhh-WHvXFkY8DD1imjMrGxipOauzgnqlljP04BWA/s1600/1223151528-01.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgoMqbUO7TlHXSkMN4k0JcltgEhcQ-x07Af6xIlFCe9d3867hfCKwRusP_eQoUH9hPglq5KmufpwY2x_KhPYQDy9S_9BqJSkMNB0ApAhh-WHvXFkY8DD1imjMrGxipOauzgnqlljP04BWA/s320/1223151528-01.jpg" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">UPS had just left my driveway when this happened.</td></tr>
</tbody></table>
<br />
<br />
Since the Dewalt spindle was going to take nearly a month from my X-Carve order being placed to arrive at my homestead, I opted for a backup solution and invested in a high-value package deal I found on Amazon for a Konmison 48V DC spindle that comes with its own PSU and a set of collets up to 7mm in size (just over a quarter inch). It also comes with its own CNC mounting hardware, but that isn't usable on the X-Carve, so I opted for the 'universal spindle mount' that Inventables.com sells on their website, and it has been working out just fine.<br />
<br />
Being that both the X-Carve and the Konmison spindle have their own PSUs, I opted to stack them so that the fan in the XC PSU would help cool the Konmison spindle PSU as well, by placing the Konmison PSU upside down on top of the XC PSU.<br />
<br />
This was the end result:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI2f93PDxsKemQNmRvCO7JDqn1_QUmVZ9LmW8mJlzZG2AHi1vkI5E7WETduUPghpEPw1tP956Sbc-Qtd1obq5mE4EKszQD40b6d7WXLQv0GvLYYyeLJZkIm3cvrBIfFIfVpT8C2uvAtl8/s1600/1230151707-01.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI2f93PDxsKemQNmRvCO7JDqn1_QUmVZ9LmW8mJlzZG2AHi1vkI5E7WETduUPghpEPw1tP956Sbc-Qtd1obq5mE4EKszQD40b6d7WXLQv0GvLYYyeLJZkIm3cvrBIfFIfVpT8C2uvAtl8/s640/1230151707-01.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I call it the 'Frankensupply'.</td></tr>
</tbody></table>
<br />
As you can see, the XC Arduino and GShield are on the left, and the Chinese PSU is strapped on top of the XC PSU, with the spindle potentiometer strapped down on the corner of the spindle's PSU. I figured this configuration would be best suited to whatever I come up with insofar as an encasement or enclosure is concerned, cutting holes for the potentiometer, airflow, and power/USB lines. So far I have not been electrocuted, so that's good.<br />
<br />
Another issue that arose was how I was going to mount my aluminum into the machine, as I opted to save 250 bucks by not going for the default wasteboard that comes with the XC machine. Instead, I tried mounting the aluminum directly to the machine frame itself. This *does* work, but it allows a lot of torque against the X-axis gantry. The workpiece sits so far below the gantry that there is a considerable amount of leverage against the gantry itself, since the tool is not directly below the gantry but instead juts out in front of it where the spindle is.<br />
<br />
The Konmison DC spindle itself has been serving rather well. I have managed to create a few different things with it. On the plus side it is very light (compared to the DWP611) and runs at a rather decent 12k RPM, according to the seller, though this is something I've yet to verify with something like a tachometer.<br />
<br />
<br />
On McMaster-Carr's website there is a slew of different metals to choose from. I opted for either the 1100 alloy or the 3003 alloy, both of which are extremely pure forms of aluminum. Being that they are pure, they are also extremely corrosion resistant, simply because aluminum itself is highly non-corrosive. The other property of pure aluminum is that it is very soft: its hardness falls somewhere between that of lead and copper. You can easily scratch it with a pushpin. It's not exactly the softness of lead, but it's definitely not steel.<br />
<br />
I had originally opted for the softest and purest aluminum McMaster offers, the 1100 alloy, simply because it was the purest available and pretty cheap. I started out with five 6x6x0.063" (1/16" thick) sheets just to try out. At the time I was still milling the aluminum with tiny 1/16" and 1/32" ball-nose end mills, and cutting this soft aluminum was wretched at best: it would pile the removed aluminum up along the sides of the grooves. This was disgusting.<br />
<br />
I then tried the 3003 alloy, which is actually a bit cheaper than the already-cheap 1100 alloy. It seems to be virtually the same, machining-wise, but since it's cheaper it's going to be my go-to hologram metal.<br />
<br />
The 3003 also comes in a wider variety of thicknesses. Even though I will only be making grooves a few thousandths of an inch deep, I still need the plate itself to not be flimsy and prone to bending under slight forces, so thus far I have been opting for sheets that are .050" thick. I have some .032" sheets on order just to see what those are like (plus they're even cheaper), so hopefully that works out. Between the 1100 and the 3003, I think I'll be sticking with the 3003 simply due to its price.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiguW1ZXHMBht-xw1kWVhgNZOSIaC88CylDsikAAvsDchn-_HJ0Qw2Xvxv_-4WXJA2-m3xZKbsI_ecSWmI2Xt1eeprj4f4xVEdcrQajM9decIjpyfQugWrDasOPaN_qHkfPDfyvkxMnfjU/s1600/makercam.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="345" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiguW1ZXHMBht-xw1kWVhgNZOSIaC88CylDsikAAvsDchn-_HJ0Qw2Xvxv_-4WXJA2-m3xZKbsI_ecSWmI2Xt1eeprj4f4xVEdcrQajM9decIjpyfQugWrDasOPaN_qHkfPDfyvkxMnfjU/s640/makercam.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Makercam.com made life somewhat easy for a while.</td></tr>
</tbody></table>
<br />
Now, since Holocraft's original means of output was spitting out SVG paths for the eCraft paper-crafting machine, it was a miracle to discover that out of all the free CAM software out there, only one could handle the thousands, even tens of thousands, of grooves output by Holocraft: MakerCAM.<br />
<br />
MakerCAM made it a snap to convert my hologram toolpaths from an SVG file into a CNC g-code file. This was not without its caveats, of course. For one, the user cannot control how the toolpaths are generated for the paths of a given SVG file. Many times I found that MakerCAM would cut every groove from the same side: over and over it would enter the material, cut a groove through to the other end, then raise the tool up over the surface just to move back to the starting side again. To my mind, it would be much quicker to finish a groove, step a little deeper into the material, and continue back the other way in reverse. That wasn't happening with MakerCAM.<br />
<br />
In order to actually control my machine and run my hologram-groove CNC programs on real metal, I needed a program that would drive my GRBL-based CNC machine. The popular online communities have a few recommendations that seem to satisfy everybody's needs. Unfortunately, I had no luck with these suggestions because they were not suited to massive g-code programs with tens of thousands of lines. These programs could only handle *maybe* a thousand lines of g-code at a time, which was useless for specular holography.<br />
<br />
Lo and behold, I came across grblControl, a relatively new GRBL controller program that features all the bells and whistles of the other popular programs, with one exception: it runs FAST. It handles the largest g-code programs I can throw at it without breaking a sweat. To anybody using a GRBL-based CNC, I highly recommend you check it out. It is the only program I have ever used with my CNC.<br />
<br />
It is simple, efficient, and has plenty of features that make machining as painless as possible (which is still rather painful, but at least grblControl doesn't contribute to the pain).<br />
<br />
There were a few things about grblControl that were not exactly desirable, but since it is open source I was able to load it up in Qt Creator, dive into the code, and start making my own custom version. The first thing I changed was the fact that it operates strictly in metric, while all of my experience, expertise, and tools are imperial. So, after some hunting and pecking, I made my own imperial version of grblControl.<br />
<br />
Aside from that I have made a slew of other changes, both visual and functional, to get grblControl best suited to what I am trying to accomplish with it. Thus far I am really happy with where it's at.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwV78xe_rDCTCAWdoBP23ahDCe6WIRmGy9dE4YTGEjzpPhJNpqXkVbuyDem5LzNTq55kqfhPhjEUI1EqJconcJMKglUhXwdC6IKQSbN0Z8iS4uwVUHOHhJlnWrNfTueeACA_cFfL3EkT4/s1600/grblcontrol_doom.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="344" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwV78xe_rDCTCAWdoBP23ahDCe6WIRmGy9dE4YTGEjzpPhJNpqXkVbuyDem5LzNTq55kqfhPhjEUI1EqJconcJMKglUhXwdC6IKQSbN0Z8iS4uwVUHOHhJlnWrNfTueeACA_cFfL3EkT4/s640/grblcontrol_doom.png" width="640" /></a></div>
<br />
<br />
<br />
After a while it became apparent that if I wanted finer control over how the grooves were cut, I would need to implement functionality in Holocraft that directly outputs g-code for a CNC machine, completely obviating the need for any CAM software to generate the toolpaths in the first place.<br />
<br />
The way MakerCAM interpreted the SVG files was random, at best. I could not rely on a path defined from point A to point B being machined in that order. In some cases it seemed to make weak attempts at optimizing the toolpath by alternating directions between successive optical grooves, so that after finishing one groove it would move to the far end of the next and cut back toward its starting side, but there was no metric by which to reliably make this happen, or not happen.<br />
<br />
As it stands, my machine has issues machining holograms (or metal generally) because when it is cutting in the Y+ direction (away from the front of the machine) the leverage on the gantry causes it to lift up, which prevents it from cutting as deep as it is supposed to; it is too gentle on the surface when moving in Y+.<br />
<br />
Conversely, when the machine is cutting in the Y- direction, the gantry leverage sucks the tool down harder and deeper into the material, gouging it much deeper than intended. The end result is that I must cut my holograms with the grooves all formed in one direction along the Y axis. I chose Y+ simply because I'd rather have lighter grooves than the inevitably and irreversibly deep gouges formed otherwise.<br />
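Since the plan is for Holocraft to emit g-code directly, a Y+-only constraint like this is easy to enforce at generation time. Below is a minimal, hypothetical Python sketch of the idea; the depth, feed, and retract values are invented for illustration, and this is not Holocraft's actual output code:

```python
# Hypothetical sketch of direct g-code output: given a list of grooves
# (each a polyline of (x, y) points, in inches), emit a program that always
# engraves in the Y+ direction, reversing any groove defined the other way.
# SAFE_Z, CUT_Z, and FEED are made-up illustrative values.

SAFE_Z = 0.050   # retract height above the surface (assumption)
CUT_Z = -0.003   # groove depth, a few thousandths (assumption)
FEED = 20.0      # cutting feed in in/min (assumption)

def grooves_to_gcode(grooves):
    lines = ["G20", "G90", f"G0 Z{SAFE_Z:.4f}"]  # inches, absolute, retract
    for groove in grooves:
        # Force Y+ travel: if the groove ends lower in Y than it starts, flip it.
        if groove[-1][1] < groove[0][1]:
            groove = list(reversed(groove))
        x0, y0 = groove[0]
        lines.append(f"G0 X{x0:.4f} Y{y0:.4f}")       # rapid to groove start
        lines.append(f"G1 Z{CUT_Z:.4f} F{FEED:.1f}")  # plunge into the surface
        for x, y in groove[1:]:
            lines.append(f"G1 X{x:.4f} Y{y:.4f}")     # scribe along the groove
        lines.append(f"G0 Z{SAFE_Z:.4f}")             # retract before the next one
    lines.append("M2")
    return "\n".join(lines)
```

The same hook is where smarter ordering (e.g. sorting grooves to minimize rapid moves) would go, which is exactly the kind of control MakerCAM never offered.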
<br />
This was discovered when originally cutting holograms with the grooves traveling from X- to X+, in a left-to-right fashion. The left side of each groove (cut while traveling in the Y+ direction and moving X+ rightward) came out lighter than the downslope cut in the Y- direction. This is purely a product of the design of my machine, which was never meant to be used as a sort of drag-engraving machine, so ways and means have to be put in place to work around this weakness.<br />
<br />
Another issue was the resolution of the machine itself. Was the X-Carve even capable of scribing distinct grooves into the surface of the aluminum without obvious stair-stepping artifacts? Well, running 20-tooth pulleys on 2mm-pitch belts means one motor revolution is 40mm of travel along the belt. With 200 steps per motor revolution and 8x microstepping, that works out to 40mm / 1600 steps = 0.025mm per step. So we should have a resolution of, at worst, about .001" for our grooves, which seems plenty fine provided we make those grooves fast and smooth rather than stepping to each exact motor increment while scribing the metal's surface.<br />
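The arithmetic above is easy to double-check; here is a tiny Python sketch using the drivetrain figures quoted in this post (nothing here is measured, it's all spec-sheet numbers):

```python
# Estimate the X-Carve's linear resolution from its drivetrain numbers.

BELT_PITCH_MM = 2.0   # GT2-style belt: 2 mm per tooth
PULLEY_TEETH = 20     # 20-tooth pulleys
STEPS_PER_REV = 200   # standard 1.8-degree stepper motor
MICROSTEPS = 8        # 8x microstepping on the gShield

def step_resolution_mm():
    """Linear travel per microstep, in millimetres."""
    mm_per_rev = BELT_PITCH_MM * PULLEY_TEETH        # 40 mm of belt per revolution
    return mm_per_rev / (STEPS_PER_REV * MICROSTEPS) # 40 / 1600 = 0.025 mm

def step_resolution_in():
    """Same figure converted to inches."""
    return step_resolution_mm() / 25.4

if __name__ == "__main__":
    print(f"{step_resolution_mm():.4f} mm ({step_resolution_in():.5f} in) per microstep")
```

That comes out to just under a thousandth of an inch per microstep, matching the back-of-the-envelope figure above.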
<br />
Now, in practice, what I've found is that by running the machine at the highest possible speed (going into the GRBL config via the '$$' command, playing with max speeds and accelerations, and tweaking the current dials on the gShield feeding the steppers) I've managed to get my CNC to fly like none other. The problem is that when I exert force into the surface of the aluminum with a carbide bit at such speeds, there is enough leverage at play to produce what we in the States call 'speed wobble'. This is not a machining term; it's what happens when you are flying down a hill on a bicycle or a skateboard and your speed just becomes too much... Too... Much... The handlebars or skateboard begin resonating side-to-side, uncontrollably, until a crash of some kind is usually inevitable.<br />
<br />
In this case, it results in wobbly little grooves, which are exactly *NOT* what we want, because we are trying to scribe optically accurate/useful grooves into the surface of the aluminum.<br />
<br />
So it has become a balancing act: slowing the machine down enough to minimize the wobbles without sacrificing too much speed, and without introducing the sort of stippling that arises when the machine's discrete position increments manifest themselves in the grooves, which is equally ugly for optical applications.<br />
<br />
Playing with the depth of the groove and the speed at which it is formed has been a bit of a journey, as well as taking other measures to mitigate the 'wobble' by raising the workpiece closer to the gantry to minimize the leverage that the tooling edge has against it. Stiffening up everything on the machine has been another project as well.<br />
<br />
<br />
Here is a low-fi video of a hologram I have been working on for my younger sister's birthday. When I can manage a better camera I will post better footage (feel free to donate!).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.blogger.com/video.g?token=AD6v5dyp_ydCL1poqigFxS41kS587EOB0rqlj7DP-Q8Ttzoq0ExpLTaqc1jMhtx_44iRItNhMTcZe-tr6E612lqRvQ' class='b-hbp-video b-uploaded' frameborder='0'></iframe></div>
<br />
<br />
<br />
Here is yet another video of another test hologram featuring some random abstract cuboidal shapes merged together in a sort of splatted configuration. Again, feel free to donate better camera ware!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.blogger.com/video.g?token=AD6v5dwoA6WtcgvAdD6cnq91TamkNDwVw-iUo_MgSf0vqqgRL8WfQgbR3j6qLW-dhO3i-Hn31dsN08z5DLQDek2j-w' class='b-hbp-video b-uploaded' frameborder='0'></iframe></div>
<br />
<br />
<br />
Links:<br />
<br />
<a href="http://www.ikea.com/us/en/catalog/products/S99000483/" target="_blank">IKEA TARENDO Table</a> - ikea.com<br />
<a href="http://www.amazon.com/gp/product/B0154LENB6?psc=1&redirect=true&ref_=od_aui_detailpages00" target="_blank">Konmison 300w Spindle Motor with PSU and 13pcs ER11</a> - Amazon.com<br />
<a href="http://www.makercam.com/" target="_blank">MakerCAM</a> - makercam.com<br />
<br />
<br />deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com1tag:blogger.com,1999:blog-4395646461527310891.post-6048053308842887902015-12-04T19:16:00.000-08:002016-06-23T14:50:44.208-07:00Holocraft - Process for Fabricating Specular Holograms<div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKR2aJ_6IZoAZRfrWzTBboqMllOPodiLehTOhnpHsbs0XUCAfabQIHCJhfHrQQ0LA7JyTwwOvsq69JEKATZRG94YxTQtUngz2irCn-utfxeTViui8EgIxXRHYbWNQncyLc6GOMhBwByUg/s1600/specularholography.gif" imageanchor="1" style="margin-left: auto; margin-right: auto; text-align: center;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKR2aJ_6IZoAZRfrWzTBboqMllOPodiLehTOhnpHsbs0XUCAfabQIHCJhfHrQQ0LA7JyTwwOvsq69JEKATZRG94YxTQtUngz2irCn-utfxeTViui8EgIxXRHYbWNQncyLc6GOMhBwByUg/s1600/specularholography.gif" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The clearest video of a Matthew Brand hologram on the web.</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
</div>
<div>
It would seem that my blog is getting a bit more traffic now due to my previous post about my adventures with specular holography. I decided it would be prudent to offer up my holograms in a crowdfunded fashion. I just launched a relatively modest campaign for $5000, which would fund a low-end CNC machine, aluminum stock, and a few months' rent on a tiny office/studio space where I can set up the machine and 'get my hologram on'.</div>
<div>
<br /></div>
<div>
<a href="http://igg.me/at/OMi-1W8Aikw/x/12915392" target="_blank">If you want to contribute and receive prototype reflective cardstock holograms, or even solid metal plate holograms machined from the metal and with the machine that your contribution would be helping to fund, then please click this text to check out my Indiegogo campaign.</a></div>
<div>
<br /></div>
<div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyNDIqhFB4IYlw8IQ6RfNR0kvUScGna_ij2qgSz_Fv2PWqM_8OmzZsZ7sKSSPk86sfho8qXSDb81w7IqY2SwHatKNoYLJP8RqJZc-7j6i6uWk_hMoL27GvUicCmrnlB6BSvhrxV5nWRT0/s1600/holocraft.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="340" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyNDIqhFB4IYlw8IQ6RfNR0kvUScGna_ij2qgSz_Fv2PWqM_8OmzZsZ7sKSSPk86sfho8qXSDb81w7IqY2SwHatKNoYLJP8RqJZc-7j6i6uWk_hMoL27GvUicCmrnlB6BSvhrxV5nWRT0/s400/holocraft.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">'Holocraft', a program that generates toolpaths for a hologram's fabrication.</td></tr>
</tbody></table>
<br />
Originally, when I set out on this project, my plan was to refine the cardstock holograms into a viable product that could eventually fund a CNC machine for creating high-quality metal plate holograms, but the cardstock holograms are just not quite "there". The cost/benefit of getting them to look pristine just isn't viable. I am sure someone would like to have one as a novelty item, or maybe even to frame and hang on a wall, but I am not comfortable directly marketing them as a finished product, because I personally wouldn't want one myself. I want a heavy-duty, super-duper-shiny metal hologram!</div>
<div>
<br /></div>
<div>
I ask that if you think this project is cool, and want to see more stuff happen, please spread the word about it and tell your friends. Facebook, Tweet, and otherwise social-network the snot out of this blog and/or my Indiegogo campaign. I have this strong feeling that this medium is the tip of the iceberg, and that a lot of energy will change form as a result of it. It just needs to get out there in front of people and on their mind.</div>
<div>
<br /></div>
<div>
I've already figured out virtually everything there is to figure out about running Holocraft output on the CNC machine, which cutting tools to experiment with, and what grade of aluminum to use, and I have already sourced every single thing I will need to buy to make it all happen.</div>
<div>
<br /></div>
<div>
After much consideration, weighing the pros and cons of every available desktop CNC Google would show me, I settled on an X-Carve by Inventables, which is controlled by an Arduino running the open-source g-code interpreter GRBL. Worst case, it holds a tolerance of 5 thousandths of an inch, which is pretty sloppy by professional fabrication standards but just good enough for my holographic endeavors. It also boasts a 31x31-inch work area, which is a far cry better than what most other machines offer: some provide less than a foot of working space, and most have working dimensions under two feet. I want to be able to make relatively large-format holograms. Once you get too big, though, the fact that the light source isn't infinitely far away starts to interfere and distort the hologram, because each area of the hologram is then receiving incident light at more widely varying angles. This can be compensated for, but it's going to require special attention and more math.</div>
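To get a feel for how quickly that angle variation grows with plate size, here is a small illustrative Python sketch. The one-meter lamp height is an arbitrary assumption for the example; the real compensation math would be more involved than this:

```python
# Rough sketch of why plate size matters with a nearby light source: the
# incident angle varies across the plate, and the variation grows with
# plate width. The 1 m light height is an arbitrary illustrative figure.

import math

def angle_spread_deg(plate_width_m, light_height_m=1.0):
    """Difference in incident angle between the plate centre and its edge,
    for a point light directly above the centre of the plate."""
    half = plate_width_m / 2.0
    return math.degrees(math.atan2(half, light_height_m))

# a 12-inch plate vs the X-Carve's full 31-inch envelope
small = angle_spread_deg(12 * 0.0254)   # roughly 8.7 degrees of spread
large = angle_spread_deg(31 * 0.0254)   # roughly 21.5 degrees of spread
```

With the same lamp, the 31-inch plate sees well over twice the angular spread of a 12-inch plate, which is why the grooves near the edges would need per-location correction.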
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUaohdrK_-WinTPHWc__nGrCVmgOihdIlvZQb_nMqkNHlIpS-5JWYwaUzrBykGf4IHWm0uagZnPAb-4AdPYY2sTXXV4tvVQNqyIYo3yeuveQsqCMrLlyy2EWozdy5_8IZLbB18EKMN65Q/s1600/X-Carve_1000_Angle.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="270" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUaohdrK_-WinTPHWc__nGrCVmgOihdIlvZQb_nMqkNHlIpS-5JWYwaUzrBykGf4IHWm0uagZnPAb-4AdPYY2sTXXV4tvVQNqyIYo3yeuveQsqCMrLlyy2EWozdy5_8IZLbB18EKMN65Q/s400/X-Carve_1000_Angle.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 12.8px;">The X-Carve CNC mill I aim to acquire for holographic purposes.</td></tr>
</tbody></table>
After a bit of research I've opted for 1/16-inch plates of 1100 aluminum alloy. 1100 aluminum is 99% pure, purer than every other common alloy except 1050 and 1060. Pure aluminum is much softer to work with and machine, with a Brinell hardness of 28 (copper is about 35, lead is 5, steel is around 150). This means the tiny cutters used to machine the hologram's reflector grooves will not wear down as quickly, which is a good thing. 1100's softness and purity also lend themselves well to optical applications, because it can be polished to a mirror shine without impurities mucking it up.</div>
<div>
<br /></div>
<div>
Another issue that cropped up is that CAM software typically isn't designed to handle the sort of input Holocraft generates. When importing a path, these programs like to assume it's a closed shape that you want to use as either a hole or an island/extrusion of some kind. In this case it's neither; I just want the machine to cut an arbitrary groove as output by Holocraft.</div>
<div>
<br /></div>
<div>
Of all the different high-end CAM packages I could locate trial versions of, none were going to import Holocraft data and use it properly. I was about to give up when I came across MakerCAM, which is actually just a Flash applet that runs in a browser (<a href="http://www.makercam.com/" target="_blank">http://www.makercam.com</a>). By the grace of awesomeness it happens to do exactly what I need: it imports the SVG that Holocraft outputs, offers a 'follow path' operation, and lets me output the resulting g-code. It does, however, struggle a *little* with the sheer number of paths Holocraft outputs, which can run to tens of thousands of individual little curved reflecting-groove optics. That tempts me to just implement direct g-code generation in Holocraft; it would not be the first time I wrote a program that generates g-code.</div>
<div>
<br /></div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWTKvctC1sWRfqPQzGYhGiV0r7lElOhWntR3Ni-Sw3kl7CK3s9J-WtGzQGMyl6eVSheZS-deKM-Cijs8akQLbHPpHRaZAfEDwfsH34bU2PAp52JZTbpoYpZwoWf2Y60ChxtTrlo11KA8w/s1600/chamferking.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="418" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWTKvctC1sWRfqPQzGYhGiV0r7lElOhWntR3Ni-Sw3kl7CK3s9J-WtGzQGMyl6eVSheZS-deKM-Cijs8akQLbHPpHRaZAfEDwfsH34bU2PAp52JZTbpoYpZwoWf2Y60ChxtTrlo11KA8w/s640/chamferking.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">'Chamfer King' - One of the several CNC machining utilities that I wrote for my dad's CNC shop back in the day.</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
What I really need to do now is brush up on my 3D modeling skills and start producing my own content to 'holographize', content that actually represents my own creative self-expression, of which I have plenty to draw upon. So far I've only been testing Holocraft with models from the various online repositories of 3D-printer models. There are some really decent paid models that would make good holograms, but I'm more interested in making good holograms that depict what I want them to depict, not just what works.</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Links:</div>
<div>
<a href="https://www.inventables.com/technologies/x-carve" target="_blank">Inventables: X-Carve desktop CNC mill</a></div>
<div>
<a href="http://www.mcmaster.com/" target="_blank">McMaster-Carr: Industrial/Manufacturing Supplier</a></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com4tag:blogger.com,1999:blog-4395646461527310891.post-80360584559249222312015-12-02T05:36:00.002-08:002016-06-23T14:51:14.078-07:00Holocraft - Adventures with Light, in Time and Space<div>
<br />
Bitphoria is not dead. I did, however, take a break over the summer to take care of some real-life situations. Recently I have mostly been cleaning up code, doing a second pass to add error-checking wherever I didn't when I initially wrote it. I also managed to work out the bugs and kinks in the networking that were highly problematic. There were a handful of bugs that seemed like total project-killers at the time I decided to take a break; once I came back, they turned out to be simple one-line fixes that I had originally feared would require revamping entire swaths of code. Exciting times. Bitphoria is alive and well, and I will be posting more updates about further progress sooner than later.<br />
<br />
In the meantime, I have recently found myself distracted with another project. My wife and I run a business out of our home, selling crafts we create ourselves using various computer controlled printers and cutting machines. There is a sort of pressure to come up with new products to make and sell to keep ourselves relevant in the online marketplace. This side project seems to me to be quite a lucrative endeavor that I am excited to be able to work on.<br />
<br />
Now, in the late '90s, when I was barely a teenager, I discovered a website by a man named William Beaty about his discovery of, and adventures with, something now referred to as 'scratch' or 'abrasion' holography. At the time I thought it was interesting, but I only cared about learning conventional holography: using lasers to produce interference patterns in photographic media. I hadn't really thought much about scratch holograms since then, until one night a few weeks ago.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEji2qSAdbGP4CtVIpgG3qP97d6UocYsNmPLczBjzRnKmXeygDLD1h-KWukNyIICORVxw7Z6tgHW5Fv9-RtjZqfmP-dBd206dydU8LHngBoamGtnqNrimeYZEfp_GksDee9WHvEjfUuN4cs/s1600/scratchsite.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEji2qSAdbGP4CtVIpgG3qP97d6UocYsNmPLczBjzRnKmXeygDLD1h-KWukNyIICORVxw7Z6tgHW5Fv9-RtjZqfmP-dBd206dydU8LHngBoamGtnqNrimeYZEfp_GksDee9WHvEjfUuN4cs/s1600/scratchsite.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">William Beaty's Website as of 1999 (via Archive.org) I specifically remember this ASCII art from my initial discovery of his webpage.</td></tr>
</tbody></table>
<br />
I was goofing around in the dark in our bathroom with my four-year-old daughter and a flashlight. Incidentally, I was telling her about the 'ghosts' I wanted to make in Bitphoria, that chase the player around just like the ones in Pac-Man. She asked what they looked like, and being quick on my feet I smudged one into the mirror above the sink. We discovered and observed a few neat optical properties of this greasy smudge. Shining the flashlight on it cast a shadow of the smudge on the ceiling, because the smudge wasn't reflecting the light to the ceiling the way the rest of the mirror was. I also noticed a holographic depth effect, produced by the fact that each of my eyes was receiving a different specular configuration of light reflecting from the smudge. This reminded me of the scratch-hologram website, particularly where Beaty explains that he witnessed the same holographic effect and behavior in a car's windshield.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXa5C-OLi9wTXVuS8Ofc6-quk31_4n1CzmQgBmc3LbY3aS_SBHQw4iTDib95xM_9JBybDbEOS6bU4F7SiiyCdvkZfUZdtOfGv4VTFLkHP0CJYjcro9yjfY65xMrEhgrtsz-c45T17Twdc/s1600/hood1a.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXa5C-OLi9wTXVuS8Ofc6-quk31_4n1CzmQgBmc3LbY3aS_SBHQw4iTDib95xM_9JBybDbEOS6bU4F7SiiyCdvkZfUZdtOfGv4VTFLkHP0CJYjcro9yjfY65xMrEhgrtsz-c45T17Twdc/s1600/hood1a.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Beaty's 20-year old photograph of the hand smudges that inspired his experiments.</td></tr>
</tbody></table>
<br />
<br />
The site I originally stumbled across as a teenager is still up today, and I managed to track it down that night. I don't know exactly when or why I became fixated. Maybe I had unconsciously wanted to make these holograms all my life, and it seemed that point in time had come.<br />
<br />
The gist of scratch holography is that by using a drafting compass, or some other means of producing circular arcs, one can embed scratches, or grooves, into a reflective surface, which then catch the light, producing specular glints that shift depending on the angles between viewer, light, and surface. The end result is that one can produce points of light that behave as if they are suspended behind or in front of the material. The circular arc scratches that produce these points of light can be arranged to yield various shapes and designs that exhibit a holographic effect.<br />
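As a toy illustration of this principle: under the commonly cited ~45-degree lighting assumption, a virtual point at depth d behind the surface corresponds roughly to a circular scratch centered at the point's plan position with a radius on the order of d. The Python sketch below is a heavily simplified, hypothetical version of that mapping, not the actual math used by any of the programs discussed here:

```python
# Toy version of the scratch-hologram rule: a virtual point at depth d maps
# (under a ~45-degree lighting simplification) to a circular scratch centred
# on the point's plan position with radius roughly equal to |d|. Points
# behind the surface arc on one side of the centre; points in front, the other.

import math

def arc_for_point(px, py, depth, sweep_deg=60.0, samples=16):
    """Sample an arc of radius |depth| centred at (px, py), swept
    symmetrically about the vertical axis of the plate."""
    r = abs(depth)
    base = -90.0 if depth > 0 else 90.0  # arc bulges away from the viewer side
    pts = []
    for i in range(samples):
        a = math.radians(base - sweep_deg / 2 + sweep_deg * i / (samples - 1))
        pts.append((px + r * math.cos(a), py + r * math.sin(a)))
    return pts
```

A whole hologram is then just many such arcs, one (or more) per point of the 3D scene, which is exactly the kind of orchestration the programs below try to automate.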
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLWx_awtnqgSHnR4tB0nZTKeXLHSM4oRpiRl1Y2eICYdWTdLx249ungK8cCjMDUEiyLmyUjVAFdQ-3uaPH9xKQWUd20vAi7Yl6qXeKci_yXtM62IX92YJwc1c2vvHTXhgIzHx1i9dy4xQ/s1600/abrasionholography_2-up_cc.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLWx_awtnqgSHnR4tB0nZTKeXLHSM4oRpiRl1Y2eICYdWTdLx249ungK8cCjMDUEiyLmyUjVAFdQ-3uaPH9xKQWUd20vAi7Yl6qXeKci_yXtM62IX92YJwc1c2vvHTXhgIzHx1i9dy4xQ/s1600/abrasionholography_2-up_cc.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A collection of Beaty's hand-drawn holograms, and a diagram depicting producing a 'V' hologram on an appropriate surface medium, showing the resulting configuration of arc-scratches that produce a V-shaped hologram out of reflected light.</td></tr>
</tbody></table>
</div>
<div>
<br />
My first thought was to automate the process, to produce the best possible scratch holograms. It quickly became apparent, in my googleage, that I was not the only person to have this idea. I did a lot of searching, and it would seem there is only one program that ever became somewhat popular online, called 3DSilhouette. It is a VisualBasic application and is simply not available online anymore. From what I gather, you can email the author for a copy, and he will provide instructions that it must be installed to a specific directory path on your computer in order to work.<br />
<br />
3DSilhouette creates a variety of output. It can output the actual scratch-arc positions, or it can output a pattern of vertical lines that let you use a compass: anchor the compass at one end of a line, reach the scratching end to the other end to set the proper arc radius, and then make the arc itself. This all seems great, if you want to make scratch holograms by hand.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsj26_dedoKAdCwV0O0q4gdtgeKlt0Li7Ux47dHktgRbpdBjkseh3hB8OW1Ki4vWYktihUkEcfWLUWnqj04qTtUj6kQvpJ8Y4OXvY40-57eN3vq4RXgJ78hy_QnOESAHEe1UJnntodsy8/s1600/get_silhouette.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsj26_dedoKAdCwV0O0q4gdtgeKlt0Li7Ux47dHktgRbpdBjkseh3hB8OW1Ki4vWYktihUkEcfWLUWnqj04qTtUj6kQvpJ8Y4OXvY40-57eN3vq4RXgJ78hy_QnOESAHEe1UJnntodsy8/s1600/get_silhouette.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Raul's scratch-hologram generator program, which loads .3DS model files (as output by 3D Studio Max) and calculates circular arcs that are to be made to reproduce the model in a holographic form via scratch holography.</td></tr>
</tbody></table>
<br />
<br />
The only other program I was able to find anything about appeared in a video, as part of a project by a team of MIT students who were creating both the software and a machine to produce the holograms in a completely automated fashion. This YouTube video is all that seems to exist of the work, as no updates about the program, the machine, or its output were ever posted.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/JaGZ651U4j4/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/JaGZ651U4j4?feature=player_embedded" width="320"></iframe></div>
<br />
<br />
This program appeared very promising, and possibly of higher quality than Raul's 3DSilhouette program. But there is not one other shred of evidence that it was ever available online to the public, or that it ever actually did anything; it could just as easily have sat on someone's hard drive since the making of this video. Judging by the video, however, it does appear to produce the best possible scratch hologram by orchestrating the plethora of circular arcs required to reproduce the supplied 3D model data, to some degree.<br />
<br />
After some reading and YouTubing, I learned that circular arcs per se are not the ideal geometry for an optical surface if you want a rigid holographic effect that doesn't distort and collapse the 3D scene being displayed. This distortion can be seen in Raul's videos of the scratch holograms his software output patterns for: the 3D effect is obviously distorted and over-exaggerated, in that the perspective of the scene rotates too much when the viewer moves only slightly horizontally. This is not desirable.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/A40e2PgHPCQ/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/A40e2PgHPCQ?feature=player_embedded" width="320"></iframe></div>
<br />
<br />
Several people have explored scratch holography in search of ways to optimize these mechanically produced holograms. One person, Matthew Brand, had the means and the know-how to figure out exactly what was needed to produce the best possible holographic effect in the scratch-hologram medium. He managed to discern the exact math for calculating an optical surface topology that yields distortion-free holograms via the specular reflection of light.<br />
<br />
Brand began this project in 2008, and demonstrated that an optical surface could be machined which foliates (approximates) the mathematically exact surface that produces the desired holographic effect for a given set of 3D points. His extension of scratch holography is referred to as 'specular holography', in that it hinges on calculating how to shape a surface, and thereby its specular reflectivity, so that a much higher-fidelity holographic effect is produced. Brand made scratch/abrasion holography into child's play, and refined the art into something that nobody else has been able, or willing, to explore since.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/reM6XhbkQXM/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/reM6XhbkQXM?feature=player_embedded" width="320"></iframe></div>
<br />
<br />
He went on to produce over one hundred works, most of which were part of an art installation at the Museum of Mathematics ('MoMath') in New York. Currently, the specular holography installation is being sold off at $1,200 per hologram via <a href="http://momath.org/home/light-grooves/" target="_blank">http://momath.org/home/light-grooves/</a><br />
<br />
That is quite a pretty penny these holograms are fetching. In the meantime, Brand has moved on to bigger and better things (i.e. lumography). He wrote a whitepaper on the workings of specular holography, which was published in 2010. Since then, it seems nobody has taken an interest in his work to expand on it or explore it further. Brand himself has not even followed through with the second paper that his existing paper mentions as necessary. It would seem that specular holography, in all of its glorious precision and beauty, has yet to catch on as a medium.<br />
<br />
To my mind, it's something that has the potential to 'catch-on' and go mainstream.<br />
<br />
At any rate, it became clear that if I wanted an automated means of producing marketable holograms, I would have to write my own 'hologram generator' program incorporating Brand's mathematical derivations. I spent a few days decrypting the academic conventions Brand conveys his ideas through, and managed to glean from his paper what is required of a surface to depict rigid, beautiful holographic imagery.<br />
<br />
By happenstance, one of our cutting machines had recently 'died': a Craftwell eCraft Die Cutting Machine, which we used to create half of the products we sell online to support ourselves. The machine died while being used to produce prototype Halloween decorations in early September of this year. When I say 'died', I mean that the bearings inside the blade-holding assembly, which performs the actual cutting of the paper/cardstock, became jammed (long story).<br />
<br />
The machine still functions as designed, moving the blade head where it's supposed to go over the cutting medium, except that it can no longer actually cut material: the swiveling blade mechanism it relies on is essentially stuck in one orientation and cannot follow the direction of the cut anymore. The software and machine are otherwise still operational.<br />
<br />
In the face of my wife telling me for weeks, or even months, to throw the machine out, I resisted, knowing all the while that someday I would figure out *something* I could do with the poor feat of engineering that had lost its way.<br />
<br />
That night with my daughter, being reminded of Beaty's website from my childhood, I had finally found an application to re-purpose our otherwise 'dead' machine for, breathing some life back into its utility and value, and turning it from a liability just taking up space and collecting dust into a valuable asset. All that needed to be done was to bridge the gap between the idea of making holograms and an actual marketable product, and the resulting profits would further fund life, ventures, BITPHORIA, etcetera.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgW9RkQAamavAAciUFhyeP2mlCyzVBOa44o5XqjXfbFsAs5TUFKZC9ZT-JcCPju92ZHLx48L5QxkdT-K2igxgLcTO1MjQrkgdXlDte64JKAj-flEau7dPcQkv2RphU72eZ0yUuIFu1JJM0/s1600/hqdefault.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgW9RkQAamavAAciUFhyeP2mlCyzVBOa44o5XqjXfbFsAs5TUFKZC9ZT-JcCPju92ZHLx48L5QxkdT-K2igxgLcTO1MjQrkgdXlDte64JKAj-flEau7dPcQkv2RphU72eZ0yUuIFu1JJM0/s1600/hqdefault.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Craftwell eCraft Die Cutting Machine, since discontinued after we acquired ours during the holiday season of 2012. It's a machine that required some 'finesse', thus frustrating many buyers and leading to its demise.</td></tr>
</tbody></table>
<br />
My work was cut out for me. I had to figure out how to write a program that generates the arcs necessary to produce a viable hologram and outputs an SVG file that could be imported into the eCraft software, and then fashion a means for the machine itself to produce the proper effect on some yet-unknown material that would yield a hologram as the end result. I began exploring my options for a material or medium I could scribe light-catching grooves into.<br />
<br />
Initial thoughts were of foil adhered to cardstock, using the eCraft's built-in pen tool to press relatively cylindrical grooves that would catch the light reliably. After some preliminary tests with foil/cardstock and a ballpoint pen, it was clear that the arcs could not create sharp enough 'glints' of light; they were too hazy and blurry to be usable.<br />
<br />
The next idea was something like Mylar adhered to cardstock. Surely Mylar was reflective enough, and flexible enough, to let these ballpoint-pen-embedded grooves form and produce the necessary specular glints for a hologram to work. After trial and error with Elmer's glue, cardstock, and Mylar, it became clear that the moisture in the glue warped the cardstock far too much to be usable. My next idea was to use petroleum jelly (Vaseline, Aquaphor) instead, because I knew it would not absorb into the cardstock and cause it to expand and warp. This *did* produce perfectly flat sheets of cardstock with Mylar adhered to them, but it was a tedious process involving a big greasy mess and a rubber roller to flatten the greasy blob between the Mylar and the cardstock.<br />
<br />
It became clear that simply using reflective cardstock was the answer. It is effectively cardstock with Mylar adhered to it at the factory, so it can be expected to be the best possible material for my limited fabrication means. I picked some up and it seems to be working out as well as one could expect. It took some trial and error to engineer a tool (a finishing nail I filed down using my late father's tools) capable of embedding grooves into the reflective surface that catch the light at a shallow enough angle to make a viable holographic product we can market to the average consumer.<br />
<br />
The only requirement for viewing these holograms is that they are positioned below a light source that's not too far in front of them. Brand's work on specular holography allows one to compute an optical surface that depicts a given set of 3D points for a given illumination altitude relative to the hologram itself. One can calculate the best possible foliation of the hologram's optical surface for various illumination altitudes, with the light source progressively more in front of, and less above, the hologram. If you take a hologram calculated for an illumination point almost directly above it, the 3D depth effect is magnified as the light source moves further and further in front of the hologram. The grooves catch the light at steeper and steeper angles, and the glints travel across the curve of the grooves more and more as one's perspective changes. This effectively turns the hologram into a smear of light, because the specular glints stretch out as every position along a groove approaches an angle that can reflect light to the viewer.<br />
<br />
It is my belief that with a hologram designed to operate ideally at a 22.5-degree illumination angle (where the light source is half-way between a 45-degree angle and directly above the hologram), one could reliably decorate a wide variety of rooms with an overhead light source and have the holographic effect work as well as can be expected. This is what I believe to be a marketable hologram, because it will work in the widest variety of situations these holograms will probably wind up in.<br />
<br />
Once I had a viable material to create the holograms on, the next step was actually writing a program that could properly calculate toolpaths for the hologram grooves, output them as an SVG file which could be loaded into the eCraft software, and then actually produce the hologram on the reflective cardstock. Somehow I opted for the name 'Holocraft', combining 'hologram' and 'eCraft'.<br />
<br />
I spent some time surfing 3D model websites and reading up on various model formats. I opted for the STL ('stereolithography') file format, which happens to be one of the primary formats used to store and convey 3D-printer models. Many of the websites that serve as platforms for communities of 3D-printing enthusiasts are chock-full of STL models one can download and print on one's own 3D printer. At the same time, the STL file format itself is extremely simple to load and parse.<br />
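To give an idea of just how simple binary STL is: it's an 80-byte header, a 32-bit triangle count, then 50 bytes per triangle (twelve little-endian floats plus a 16-bit attribute). A minimal loader sketch along those lines, with a function name of my own invention (this isn't Holocraft's actual code), might look like:

```python
import struct

def load_binary_stl(data):
    """Parse binary STL bytes into a list of triangles.

    Each triangle is (normal, v0, v1, v2), all 3-tuples of floats.
    Layout: 80-byte header, uint32 triangle count, then 50 bytes
    per triangle (12 little-endian float32s + uint16 attribute).
    """
    count = struct.unpack_from('<I', data, 80)[0]
    tris = []
    offset = 84
    for _ in range(count):
        f = struct.unpack_from('<12f', data, offset)
        tris.append((f[0:3], f[3:6], f[6:9], f[9:12]))
        offset += 50  # 48 bytes of floats + 2-byte attribute count
    return tris
```

That's the whole format, which is why so many 3D-printing tools support it.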
<br />
After a few days I had something usable. Initially I worked with circular arcs, just to get everything up and running and produce something interesting. I had to figure out the best size and width/height ratio for the arcs, and I also learned to use the arc path command in SVG so that I wouldn't have to output individual points for each arc, but could instead utilize the built-in capabilities of the SVG vector image format. This effectively allows the eCraft software to determine the best possible set of points to re-create each arc itself, and keeps the output SVG files as small as possible.<br />
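For reference, an SVG arc is a single `A` command inside a path's `d` attribute: move to the start point, then give the radii, rotation, large-arc and sweep flags, and the end point. A small helper of my own (a sketch, not Holocraft's actual output code) that emits a circular arc from a center, radius, and start/end angles:

```python
import math

def svg_arc(cx, cy, r, a0, a1):
    """Return an SVG path 'd' string for a circular arc of radius r
    centered at (cx, cy), swept from angle a0 to a1 (radians, a1 > a0)."""
    x0, y0 = cx + r * math.cos(a0), cy + r * math.sin(a0)
    x1, y1 = cx + r * math.cos(a1), cy + r * math.sin(a1)
    large = 1 if (a1 - a0) > math.pi else 0  # large-arc flag
    sweep = 1                                # positive-angle direction
    return "M %.3f %.3f A %.3f %.3f 0 %d %d %.3f %.3f" % (
        x0, y0, r, r, large, sweep, x1, y1)
```

Wrapped in a `<path d="..."/>` element, one short command per groove replaces what would otherwise be dozens of plotted points.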
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQm9-Z45847RxOI324NEwIezTLiaOl-jCp6D_S3FQbsBQVYOOosMEef0DUXY6PvmRFJE1E_0MATW1nxBusiRjreFwCp5Nj8QXmAg342ChTeXlub8GSDgU7jsL_4-btlL4kcGbjvKGnxZg/s1600/holocraft.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="288" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQm9-Z45847RxOI324NEwIezTLiaOl-jCp6D_S3FQbsBQVYOOosMEef0DUXY6PvmRFJE1E_0MATW1nxBusiRjreFwCp5Nj8QXmAg342ChTeXlub8GSDgU7jsL_4-btlL4kcGbjvKGnxZg/s320/holocraft.gif" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Holocraft, in its earliest form, before it was actually generating correct toolpaths.</td></tr>
</tbody></table>
<br />
After things began shaping up, I re-worked the arc-path code to create the proper hyperbolic toolpaths described in Matthew Brand's whitepaper on specular holography. This allowed me to produce holograms that do not distort and warp when viewed away from a position directly in front of the hologram. It also meant that my arc-based SVG output was no longer viable. Initially it appeared I would have to start outputting giant SVG files with many points plotted along each curve, but after some more reading about the SVG format, I found I could utilize its cubic Bezier curve functionality to keep the output small. It then became a matter of calculating the starting and ending control points that shape each Bezier curve from a set of points lying on each hyperbola. This took a day or two to fully figure out and write into Holocraft.<br />
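One standard way to get control points from points on a curve (a sketch of the general interpolation trick, not necessarily what Holocraft does internally): sample the hyperbola at parameter values t = 0, 1/3, 2/3, 1, then solve the cubic Bezier basis for the two inner control points that make the curve pass through the middle samples exactly:

```python
def bezier_controls(q0, q1, q2, q3):
    """Given four points sampled on a curve at t = 0, 1/3, 2/3, 1,
    return cubic Bezier control points (p0, p1, p2, p3) whose curve
    passes through all four samples."""
    def combine(w):
        # Weighted combination of the four sample points, divided by 6.
        return tuple((w[0]*q0[i] + w[1]*q1[i] + w[2]*q2[i] + w[3]*q3[i]) / 6.0
                     for i in range(len(q0)))
    p1 = combine((-5.0, 18.0, -9.0, 2.0))
    p2 = combine((2.0, -9.0, 18.0, -5.0))
    return q0, p1, p2, q3

def bezier_eval(p0, p1, p2, p3, t):
    """Evaluate the cubic Bezier at parameter t."""
    s = 1.0 - t
    return tuple(s*s*s*p0[i] + 3*s*s*t*p1[i] + 3*s*t*t*p2[i] + t*t*t*p3[i]
                 for i in range(len(p0)))
```

Since a hyperbola segment is smooth, one cubic per short span approximates it closely, and the SVG `C` command can then carry just the two control points and the endpoint.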
<br />
At this point Holocraft is still unfinished, and I am still working on it. I have added a few other features. One in particular allows the user to select from several modes of generating the points the hologram is constructed from, instead of being limited to a model's vertices or a selection of points on polygon edges. Using procedural texturing techniques, I generate a variety of patterns and textures of points on the surface of a model. This allows a wider variety of models to be used: instead of being limited to simple low-polygon models, larger and more complex models can now be used, with a surface pattern generated to simplify their appearance and prevent too many overlapping grooves from being generated, which only corrupts their ability to reflect light.<br />
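The simplest form of surface point generation (my own illustration, not Holocraft's actual technique) is to scatter points uniformly over the model's triangles using the square-root barycentric trick; a procedural pattern can then keep or reject each candidate point:

```python
import random

def sample_triangle(v0, v1, v2, rng=random):
    """Return a point uniformly distributed over triangle (v0, v1, v2).
    The sqrt trick keeps samples from clustering near one vertex."""
    r1 = rng.random() ** 0.5
    r2 = rng.random()
    a = 1.0 - r1          # barycentric weight of v0
    b = r1 * (1.0 - r2)   # barycentric weight of v1
    c = r1 * r2           # barycentric weight of v2
    return tuple(a*v0[i] + b*v1[i] + c*v2[i] for i in range(len(v0)))
```

Swapping the uniform random source for a procedural pattern function is what turns this into a controllable texture of points.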
<br />
I am also in the middle of finishing some occlusion-culling code that allows the model to obstruct itself, 'chopping' the grooves into smaller or shorter segments so that a point of light can appear to disappear behind foreground parts of the model as the viewer's perspective changes.<br />
<br />
As great as all of this sounds, reflective cardstock isn't the best medium, nor is the eCraft the best machine for producing holograms. The cardstock can only support so many grooves before the hologram turns into a mess that no longer catches the light properly. The cardstock is also prone to warping under enough grooves, and the reflective material will even shred if the tool goes over the same spot too many times. Putting the finished holograms into a heat press has proven somewhat effective at flattening out the warping, but it's not one hundred percent effective. My fabrication method requires some finessing in Holocraft, to squeeze as many grooves into the hologram as possible without them hindering one another by overlapping too much, which is extremely easy to do.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/AAXAEIFl1vQ/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/AAXAEIFl1vQ?feature=player_embedded" width="320"></iframe></div>
<br />
<br />
Ideally I'd like to offer the reflective-cardstock holograms as a product in our online store. The best way to use them is to frame them in a way that keeps them as flat as possible, and hang them or set them where a light shines on them from above. The reality, though, is that better holograms are to be had. By fabricating holograms in metal using a CNC machine, many more grooves can be embedded, creating a more vivid and detailed hologram with greater specularity than the reflective cardstock can produce, making holograms brighter and sharper. This can be seen in images of Brand's holograms. They are simply beautiful.</div>
<div>
<br />
That is why I will be launching a crowdfunding campaign, either to fund a simple low-end CNC setup, or to fund a CNC retrofit kit for my late father's manual Bridgeport milling machine, which is set up and ready to go but which nobody is using for anything at all. I have a nagging sense that I owe it to him to put his old equipment to use as much as I can, in his spirit, and in celebration of his life and who he was. A part of me wishes I had discovered this project while he was still alive and well; it could have been one of those father-son projects that will always seem too few and far between.<br />
<br />
<br />
Links:<br />
<a href="http://amasci.com/amateur/holo1.html" target="_blank">William Beaty's Hand Drawn Holograms Page</a> - Amasci.com<br />
<a href="http://3dalter.50megs.com/" target="_blank">Scratch Holography Software</a> - 50megs.com<br />
<a href="https://www.youtube.com/watch?v=JaGZ651U4j4" target="_blank">Abrasion Hologram Printer Video</a> - Youtube.com<br />
<a href="http://www.zintaglio.com/" target="_blank">Light and Illusion by Matt Brand</a> - Zintaglio.com<br />
<a href="http://arxiv.org/pdf/1101.0301.pdf" target="_blank">M.Brand's Specular Holography Paper</a> - Arxiv.org<br />
<br />
<br />
<br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-37400483977008120942015-04-24T21:18:00.002-07:002016-08-11T13:04:45.902-07:00BITPHORIA, The Game Itself<div>
<br /></div>
<div>
It occurred to me that most of what I write about on this blog has been the technical side of my thoughts and ideas while working on my game BITPHORIA. I haven't really been posting much in the way of actual progress on the game itself.</div>
<div>
<br /></div>
<div>
I thought I'd take a moment to share what is going on with BITPHORIA. As of now, by my estimation, the game engine is 80% done, and the game itself is roughly 15% complete. I am currently on the cusp of moving from working on the engine to working on the default game scripts.</div>
<div>
<br /></div>
<div>
I have spent a long while tweaking the visuals over the months, trying out different little tricks in an attempt to refine the overall appearance of the game into something stimulating and attention-grabbing. My entire philosophy is to make it so visually appealing that anybody who sees a screenshot will automatically find themselves looking for a video, and anybody who sees a video will want to play the game.</div>
<div>
<br /></div>
<div>
If BITPHORIA doesn't captivate, visually, through a screenshot, then something needs to change. I don't want to make a product that isn't good enough to sell itself. </div>
<div>
<br /></div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1fIaGWjdyRIvrDT2vsY1ieMSIZ9yzdoJdzDXgLd6E0nnQF_JpVs23CwaMgDXyNcDJsElND1vBQfJf2yPCh6buRnOyF0I2L372aD-ql5aTXLBhHOvI-FZ2O2WKLG0ZZpkyXRyS_gtm9Qg/s1600/screen012.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1fIaGWjdyRIvrDT2vsY1ieMSIZ9yzdoJdzDXgLd6E0nnQF_JpVs23CwaMgDXyNcDJsElND1vBQfJf2yPCh6buRnOyF0I2L372aD-ql5aTXLBhHOvI-FZ2O2WKLG0ZZpkyXRyS_gtm9Qg/s1600/screen012.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgM4POKaAG6sVc5cU28-jdirXv4M_6lDfXS1_GEWNsZXbgQWSk7oFcQIO89eHn8ax8YvHidU7KNPDNgsW3HbxZ7GGlrnsU0Nf1Z1d6aFjSBHIGrclrTAJ8NEMQ_HJOmD30Va7NmZFZsaKA/s1600/screen029.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgM4POKaAG6sVc5cU28-jdirXv4M_6lDfXS1_GEWNsZXbgQWSk7oFcQIO89eHn8ax8YvHidU7KNPDNgsW3HbxZ7GGlrnsU0Nf1Z1d6aFjSBHIGrclrTAJ8NEMQ_HJOmD30Va7NmZFZsaKA/s1600/screen029.jpg" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCyjRaAfbdiCZcxnxw6SZbZhN0GKHjetwP-FfIIPxIEtTdEUFF5LBaOhL0-zMdM-NUy9K-MNFEG3FgbmiO3tQe9VsbfKMxKZ1dohTBxGLK8WndHsSZp2h1GWlSaoO5xx3oTdJVJa6PRuA/s1600/screen033.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCyjRaAfbdiCZcxnxw6SZbZhN0GKHjetwP-FfIIPxIEtTdEUFF5LBaOhL0-zMdM-NUy9K-MNFEG3FgbmiO3tQe9VsbfKMxKZ1dohTBxGLK8WndHsSZp2h1GWlSaoO5xx3oTdJVJa6PRuA/s1600/screen033.jpg" width="640" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<div>
I think these screenshots portray the overall aesthetic and graphical design of what the final game will consist of. You'll probably notice the low frame rates my netbook achieves. The game is playable on it, but you will want something with a half-way decent GPU to perform the raymarching on the 3D-texture materials. There will be options to reduce the demand on the GPU so that the game runs smoother for players with budget or older hardware.</div>
<div>
<br /></div>
<div>
The scripting system is mostly in place. There are a handful of features I aim to implement to expand functionality further, but the majority of the commands for scripting each system are present and operational. There is still a lot of validation and verbose warning/error reporting that I need to go back in and write, to help aspiring mod developers along.</div>
<div>
<br /></div>
<div>
Documenting the sets of commands for each system is another needed task. I am not sure when this will happen, because I intend to do most of the game's scripting myself, so it isn't something I need until the game is released. Until then, I am really the only person who will be using it, and I'd like to finish BITPHORIA as soon as possible.</div>
<div>
<br /></div>
<div>
Netcode is operational. Players can start a server, and it can be joined from another machine on a LAN or over the internet. There is no server browsing in the menu yet, but that is on the todo list, along with other menu UI features I'd like to add for various things. One in particular is a sort of holographic preview of the world volume that would be generated from the current seed value: as a server admin adjusts a slider for the seed value, the preview of the world updates, so the user can get an idea of the type of layout their game will offer the players who join in.</div>
<div>
<br /></div>
<div>
I have a good number of sound effects already in there that I produced on my own, some of which have yet to find a use. There are 23 different sound effects, and only about half of them have found a place in the current scripts. I expect to use up the leftovers and still need to make some more sounds before all the sounds are done.</div>
</div>
<div>
<br /></div>
<div>
I have also produced several two-minute looping music tracks that suit the general lo-fi 8-bit aesthetic underlying BITPHORIA's graphical style. I'm not sure all of them will make the final cut, and I'm not sure whether server admins who start a game will be able to choose one themselves or whether one will be chosen randomly based on the seed value for their game.</div>
<div>
<br /></div>
<div>
Because of the way the scripting works, where the set of scripts defining one game 'mode' is kept in a folder with the name of that game mode, users will be able to duplicate a game-script folder and use it as a base for their own modded game mode. The scripts are relatively simple text files, and all that is needed is the documentation for the various commands each scripted system utilizes. Anybody will be able to edit their script files to customize their game modes, or create their own new one from scratch.</div>
<div>
<br /></div>
<div>
When someone has produced a BITPHORIA mod, there is no need for other people to manually download and install anything into their BITPHORIA installation. You simply see what game mode servers are playing in the server browser, and the mod is automatically downloaded and run when you join in. In fact, no scripts are loaded when you join a server; you only execute whatever scripts the server executes. This allows complete modding freedom. Anybody with BITPHORIA can play your mod, instantly.</div>
<div>
<br /></div>
<div>
If someone wanted to run their own server using a specific mod, they would need to manually download and install the mod's scripts. This could change: I may give servers an 'allow mod copying' option, at which point the server would let clients download the actual script text files and save them into their game for later use.</div>
<div>
<br />
All of this is working; the game is currently playable as a simple little deathmatch game. The UI is vastly incomplete: there are no options for setting up a game or joining a game. The menu system is started, but not currently 'fleshed out'. It is merely a framework with some minor functionality for traversing menu hierarchies using buttons. There are also nice little editboxes for editing configuration strings :)<br />
<br />
I started the code base for BITPHORIA exactly a year ago today. It has just over 16k lines of actual code (not counting comments or whitespace). I have never written 16k lines for one project in my life, nor have I ever worked for a year straight on one project. I have high hopes for BITPHORIA, not as something that will make me rich and famous (although one can hope), but as something that the gaming industry takes notice of. I figure, at the very least, it will serve as a good portfolio piece if I ever break down and decide to get a job working for someone else (ughh).<br />
<br />
I feel I have something valuable to contribute to gaming, as a whole, as well as anybody who aspires to make games or learn programming. I just want to be as valuable a resource as I possibly can, whether that means as a provider of fun and interesting games, or creative inspiration.<br />
<br />
I hope people find my ideas as intriguing and enjoyable as I do making them come to life.<br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-8564902565047825522015-04-19T21:13:00.001-07:002020-02-26T09:57:04.436-08:00Forays Into Entropy Coding<div>
<br /></div>
<div>
One of the many minutiae that concern me is bandwidth consumption. The fact of the matter is that the internet is not a particularly forgiving means of conveying data from one place to another. It is merely *the* means of conveying data. It is what we have to work with, everyone with a different connection.</div>
<div>
<br /></div>
<div>
Some poor souls are forced to use dial-up, way out there, in the middle of nowhere, and others are privileged with fiber optic connections (we could use a visit, Google). In the middle are the broadband users, with varying capability, via DSL or cable.</div>
<div>
<br /></div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.speedtest.net/result/4271840282.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://www.speedtest.net/result/4271840282.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A 'D-' for my perfectly usable connection. It's only near-failing<br />
if the application in question is failing the user.</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
You can see here that my home cable connection has roughly half a megabyte per second downstream and 100 KB/sec upstream. Nobody reads/writes/sends/receives anything in bits (except programmers), so I like to look at these things in terms of bytes, because they are infinitely more relevant to me (and you). You can see that my connection scores a 'D-'. I could see that if my priorities involved watching 1080p video. Instead, I'd give my connection a 'B+', because it is something I almost never have to think about; it is plenty fast for my needs. I'd give it an 'A' if it weren't for the random outage that occurs once every few months for an hour or two.</div>
<div>
<br /></div>
<div>
The reality is that it's not the connection that matters; it's how the application uses the connection, and what the end user's experience is. It makes no difference how I obtain the experience, via 56k or 1-Gbps fiber, as long as the experience is 'A'-worthy. Even the newest consumer GPUs are brought to a crawl by games made by those who have no idea what they are doing. This doesn't mean the GPU isn't up to snuff; it means the game's designers are doing gamers a disservice by not taking a realistic idea of common hardware configurations into consideration, especially if they took gamers' money for it.</div>
<div>
<br /></div>
<div>
My strategy with BITPHORIA is to make something new, and interesting, that takes advantage of newer hardware capabilities to perform novel rendering, without requiring the most up-to-date setup. Being a multiplayer game, this applies to a player's internet connection as well.</div>
<div>
<br /></div>
<div>
If I can support the vast majority of the existing configurations out there, then that maximizes the potential player base, which translates to customers. Primarily, though, I don't want to leave anybody out. I want the high-end gaming rig players to be happy with their investment, and I also want the newbies on netbooks to be able to enjoy a rousing session of BITPHORIA.</div>
<div>
<br /></div>
<div>
I don't want people to be forced to play on large fiber-connected servers. I want a newbie with a netbook on a wifi connection to be able to host a game server that can support at least a few players. Even a 'low-end' broadband connection like my own has only about 100 KB/sec upstream, which could easily be saturated if I were to host a server running any popular FPS game with 8 players. In order to make this possible there must be a minimal amount of data traversing the network connections between the server and player clients.</div>
<div>
<br /></div>
<div>
Naturally there are several strategies for minimizing bandwidth usage when conveying a game state across a network connection. Quantizing, or 'bit-packing', various data based on its type and behavior is one extremely important method. Typically, values for angles/orientation/etc. are represented and dealt with as floating-point values (or double-precision, if your application demands it). Floating-point values (aka 'floats') are 32 bits, and often only a small part of their representable range is actually used. For instance, in a game you may have objects whose velocities never exceed a certain speed. This knowledge can be used to effectively strip the unused bits from an object's velocity information.</div>
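As a sketch of the bit-packing idea: if a velocity component is known to stay within, say, [-128, 128) units per second, it can be sent as a 12-bit integer instead of a 32-bit float. The range and bit count here are made-up assumptions for illustration, not BITPHORIA's actual values.

```c
#include <stdint.h>

/* Hypothetical sketch: quantize a velocity component known to stay
 * within [-128, 128) units/sec into a 12-bit integer, rather than
 * sending a full 32-bit float over the wire. */
#define VEL_MIN   -128.0f
#define VEL_RANGE  256.0f
#define VEL_BITS   12
#define VEL_STEPS  (1 << VEL_BITS)            /* 4096 representable values */

static uint16_t quantize_vel(float v)
{
    float t = (v - VEL_MIN) / VEL_RANGE;      /* map to [0,1) */
    int q = (int)(t * VEL_STEPS + 0.5f);      /* round to nearest step */
    if (q < 0) q = 0;                         /* clamp out-of-range input */
    if (q > VEL_STEPS - 1) q = VEL_STEPS - 1;
    return (uint16_t)q;
}

static float dequantize_vel(uint16_t q)
{
    return VEL_MIN + ((float)q / VEL_STEPS) * VEL_RANGE;
}
```

The worst-case reconstruction error is half a step, 256/4096/2 ≈ 0.031 units/sec, which is well below anything a player would notice, for 20 fewer bits per component.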
<div>
<br /></div>
<div>
Another strategy is avoiding redundant data: only send certain properties when they change, instead of re-sending the same information over and over. This applies to things like an object's position in the game, and its orientation/angles. If the object is stationary, there is no need to keep sending that information about it.</div>
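One common way to sketch this is a per-entity change mask, where each update is prefixed with one byte of dirty flags and only the flagged fields follow. The field names and layout below are hypothetical, not BITPHORIA's actual wire format.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Sketch (assumed field layout): serialize only the fields whose
 * dirty bit is set, prefixing the update with a 1-byte change mask. */
enum { DIRTY_POS = 1 << 0, DIRTY_ANG = 1 << 1, DIRTY_VEL = 1 << 2 };

typedef struct {
    float pos[3], ang[3], vel[3];
    uint8_t dirty;                 /* which fields changed this frame */
} ent_state_t;

static size_t write_entity_update(const ent_state_t *e, uint8_t *buf)
{
    size_t n = 0;
    buf[n++] = e->dirty;                                    /* change mask */
    if (e->dirty & DIRTY_POS) { memcpy(buf + n, e->pos, 12); n += 12; }
    if (e->dirty & DIRTY_ANG) { memcpy(buf + n, e->ang, 12); n += 12; }
    if (e->dirty & DIRTY_VEL) { memcpy(buf + n, e->vel, 12); n += 12; }
    return n;                      /* a stationary entity costs 1 byte */
}
```

A stationary, unchanging entity costs a single byte per update instead of 37.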
<div>
<br /></div>
<div>
Another issue that comes up is the game's network update rate. The update rate, in most client/server games, must be as high as possible without putting too much strain on the server or client connections. With lower update rates the game can begin to feel a little sloppy, especially to gamers who have acquired a fine sense for such things. I've seen game servers with their update rates so high that some player connections couldn't keep up. This is just plain unacceptable. Some games keep their update rates really low because they are sending too much data per-update to be able to have it any higher without making the game unplayable for slower connections.</div>
<div>
<br /></div>
<div>
Keeping a low update rate is another possible strategy, and needs fine tuning alongside other important aspects of the networking that handles interpolation and extrapolation, and maintaining the game simulation's fidelity.</div>
<div>
<br /></div>
<div>
Compressing the network data on its way in/out, before actually sending it, is the strategy I am currently working to employ in BITPHORIA. My initial plan was to just follow suit with Quake3's use of a static-Huffman encoding, which breaks down to a simple method of re-assigning each byte value a new binary code, where more frequently appearing values are represented using smaller bit codes, and less frequent values use larger bit codes. This is a form of entropy coding.</div>
<div>
<br /></div>
<div>
With entropy encoding it's all about exploiting the known likelihood that a byte will be a certain value. This is orthogonal to dictionary encoders, which operate by exploiting the fact that there are usually repeating patterns in data. Entropy encoding doesn't care where the values are in the stream; they can be clumped together next to like values, or spread evenly, and the output will be the same size as long as there are the same number of each possible value. Dictionary encoders typically produce much better compression than entropy encoders, but are much slower. They also do not operate well on small pieces of data, and are better suited to compressing large data.</div>
<div>
<br /></div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.cs.princeton.edu/courses/archive/spring07/cos226/assignments/huffman.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="170" src="https://www.cs.princeton.edu/courses/archive/spring07/cos226/assignments/huffman.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td class="tr-caption" style="font-size: 12px;">Generating Huffman codes for symbols a1..a4 using their probability to<br />
build a tree from which the codes are derived (ie: 0 = left child, 1 = right child).</td></tr>
</tbody></table>
</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
There are two major entropy encoding algorithms: Huffman coding, and arithmetic/range coding. The deal here is that Huffman can be reduced to, as I mentioned above, a simple exchange of byte values for bit codes to be output. This works well as a simple array/table lookup in code. Arithmetic/range coding lends itself to a better compression ratio, because the resulting bit codes more closely suit the probabilities of each possible value, and therefore produce output that is closer to the actual informational content of a piece of data. The catch is that arithmetic/range encoding is more CPU intensive.</div>
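To sketch the 'table lookup' point: with the codes precomputed, a static-Huffman encoder is little more than shifting bit codes into an accumulator and flushing whole bytes. The 4-symbol table below matches the a1..a4 example pictured above (probabilities 0.5, 0.25, 0.125, 0.125); a real byte-oriented encoder would use 256 entries built from measured frequencies.

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch: static-Huffman encoding as a table lookup. codes[] and
 * lengths[] would be built offline from symbol frequencies; these
 * values match the classic 4-symbol example: 0, 10, 110, 111. */
static const uint32_t codes[4]   = { 0x0, 0x2, 0x6, 0x7 };
static const uint8_t  lengths[4] = { 1, 2, 3, 3 };

static size_t huff_encode(const uint8_t *in, size_t n, uint8_t *out)
{
    uint64_t acc = 0;              /* bit accumulator, MSB-first */
    int bits = 0;                  /* number of pending bits in acc */
    size_t len = 0;
    for (size_t i = 0; i < n; i++) {
        acc = (acc << lengths[in[i]]) | codes[in[i]];
        bits += lengths[in[i]];
        while (bits >= 8) {        /* flush whole bytes */
            bits -= 8;
            out[len++] = (uint8_t)(acc >> bits);
        }
    }
    if (bits)                      /* pad the final partial byte */
        out[len++] = (uint8_t)(acc << (8 - bits));
    return len;
}
```

The per-symbol work is one table lookup plus a few shifts, which is why static Huffman is so cheap on the CPU.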
<div>
<br /></div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmYEXXZjC2dAEhfnL-tc58uiZXgCJi6kBJZOVl_pAqRgYeoUjp6bM9RIcfu80w6jAsn9dA18vvSmp2BXaHPQShAQiEUE-7pIVOGn21rRkQxRy_BdRcrDlBpfmX_etqMXvtqdvXJCUkx68/s1600/rangeencoding.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="428" data-original-width="818" height="167" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmYEXXZjC2dAEhfnL-tc58uiZXgCJi6kBJZOVl_pAqRgYeoUjp6bM9RIcfu80w6jAsn9dA18vvSmp2BXaHPQShAQiEUE-7pIVOGn21rRkQxRy_BdRcrDlBpfmX_etqMXvtqdvXJCUkx68/s320/rangeencoding.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Range coding represents data as a single value generated<br />
by recursively narrowing down each symbol in its "range".</td></tr>
</tbody></table>
<br />
<br /></div>
<div>
Now, to be honest, I could probably get away with simply using either, and nobody would know the difference. This is where my neuroses come into play. If I can do better, I will do better. So after some research I saw potential in the idea of using arithmetic coding, specifically range encoding, which is the integer-based version of arithmetic coding.</div>
<div>
<br /></div>
<div>
After a day I came up with my very own entropy encoding, which was essentially a bastardized hybrid of Huffman and range encoding. Without an academic background in math, I was simply fumbling around, hoping to stumble across a discovery. The goal was to achieve the speed of Huffman encoding with the higher precision of range encoding. The end result, dubbed "binary-search encoding", has roughly the speed of Huffman, with the compression ratio of neither Huffman nor range encoding. So that was basically a failure. I was able to compress a 512 KB sample of BITPHORIA's networking data down to 405 KB, a compression ratio of ~1.26, whereas a simple Huffman encoder gets the same data down to 341 KB, a ratio of ~1.5. My binary-search encoding was not gonna fly, at least not in this situation.</div>
<div>
<br /></div>
<div>
Arithmetic coding does the same as, or better than, Huffman, because Huffman is essentially a special case of arithmetic coding where symbol probabilities are all powers of one-half. This is why Huffman cannot achieve an encoding that is closer to the actual informational content of a piece of data.</div>
<div>
<br /></div>
<div>
During my research to better understand range encoding, and why it works as well as it does, I was hoping to incorporate its principles into my little algorithm to get better compression than Huffman, even if it wasn't as good as true range encoding. This is when I stumbled across Asymmetric Numeral Systems (ANS) and Finite State Entropy: a newly developed algorithm, only recently made as fast as Huffman encoding, with the compression of range/arithmetic coding.</div>
<div>
<br /></div>
<div>
ANS captures the raw essence of arithmetic coding without the convoluted means of obtaining such an encoding. At the end of the day the system breaks encoding and decoding down into a table of bit codes for each possible byte value, just like an optimized Huffman implementation does. The end result, though, is a better choice of bit codes for byte values, achieved by maintaining an internal 'state': encoding a symbol into some bits yields a new 'state' for the next one.<br />
<br />
My initial attempt, which utilized a binary search, was flawed in that it had to 'start from scratch' with each symbol being encoded. There was no internal state being maintained; each symbol was treated as a lone, isolated incident without any context. ANS maintains this context, which allows it to use fewer bits when encoding/decoding.</div>
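The state-carrying idea can be sketched in a few lines. This is not FSE (the table-driven variant); it's a minimal rANS coder with no renormalization, so the state just grows inside a 64-bit integer and only short messages fit. The four-symbol alphabet and its frequencies (scaled to sum to M = 16) are made up for illustration.

```c
#include <stdint.h>

/* Minimal rANS sketch, no renormalization. Each encode step folds a
 * symbol into the running state x; decoding pops symbols back out in
 * the original order (hence encoding in reverse). Frequencies are
 * assumed values scaled so they sum to M = 16. */
#define M 16
static const uint32_t freq[4] = { 8, 4, 2, 2 };
static const uint32_t cum[4]  = { 0, 8, 12, 14 };

static uint64_t rans_encode(const uint8_t *sym, int n)
{
    uint64_t x = 1;
    for (int i = n - 1; i >= 0; i--) {           /* encode in reverse */
        uint8_t s = sym[i];
        x = (x / freq[s]) * M + (x % freq[s]) + cum[s];
    }
    return x;                                    /* final state = the code */
}

static void rans_decode(uint64_t x, uint8_t *sym, int n)
{
    for (int i = 0; i < n; i++) {
        uint32_t slot = (uint32_t)(x % M);       /* where in [0,M) are we */
        uint8_t s = 0;
        while (s < 3 && slot >= cum[s + 1]) s++; /* find symbol's bucket */
        sym[i] = s;
        x = freq[s] * (x / M) + slot - cum[s];   /* pop symbol from state */
    }
}
```

A likely symbol (freq 8 of 16) grows the state by roughly one bit, an unlikely one (freq 2 of 16) by roughly three, which is exactly the fractional-bit behavior Huffman can't achieve. A production coder would stream bytes out of the state as it grows instead of letting it accumulate.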
<div>
<br />
If you enjoy compression and information theory, please explore these links!<br />
<br /></div>
<div>
<br /></div>
<div>
Links:</div>
<div>
<br /></div>
<div>
<a href="http://fastcompression.blogspot.com/2013/12/finite-state-entropy-new-breed-of.html" target="_blank">RealTime Data Compression - Finite State Entropy</a> - FastCompression.blogspot.com</div>
<div>
<a href="http://www.ezcodesample.com/abs/abs_article.html" target="_blank">Asymmetric Numeral System</a> - EZCodeSample.com</div>
<div>
<a href="http://www.codeproject.com/Articles/9021/Simple-and-fast-Huffman-coding" target="_blank">Simple and fast Huffman Coding</a> - CodeProject.com</div>
Posted by deftware (noreply@blogger.com)<br />
<br />
<b>Game Logic Scripting and Networking</b> (2015-03-20)<br />
I've been very distracted from working on my project, and this blog, since the holiday season. Various circumstances are resolving themselves, finally, and work will resume. I've also been somewhat stalled out trying to wrap my brain around the topic of this post, and thought it wise to take something of a break from wracking my brain in pursuit of the 'ultimate solution'.<br />
<br />
One of the important features of the engine is that it should be easily moddable. My goal is to not only produce a game for people to play, but also a game they can manipulate and customize to further derive enjoyment from. This is also something that I feel affords me maximum engine re-usability insofar as creating and releasing another game is concerned. I have a serious aversion to hard-coding game-specific behavior and logic, because it always gets tangled up in the rest of the engine code, making it a mess to change certain aspects of the engine when trying to build a new game out of it.<br />
<br />
The top priority is allowing the people who host game servers online to customize the game in any way they like, without players being required to manually download and install anything external just to play. Server admins should be able to customize everything about the game that people experience when they join. Players should be able to see all game servers running on the same engine, and choose between the different games/mods that each server is running. Since virtually all of the assets and resources used to generate the game experience are procedurally scripted, clients quickly download these procedures and 'rules' upon connecting, and the entire game experience they encounter is dictated by the scripted configuration of the game on the server.<br />
<br />
Games that are almost entirely hard-coded into the engine usually feature customization of the constant values for things like weapon damage amounts, and other little nuanced values like this, but the behavior of the game itself is otherwise 'stuck' the way that it is. Typically they have some sort of text file where the configuration exists, delineating variables and their values for controlling physics and game behaviors. This is simple enough, and plenty sufficient for smaller projects of a less serious nature.<br />
<br />
Most games utilize some form of a scripting language to accomplish the de-coupling of game logic from the game engine itself. There are others which simply incorporate the use of an external compiled binary, e.g. a DLL file. Having a background in reverse engineering and 'hacking' games, I can say that using a DLL is probably the most insecure thing a programmer could do. Operating systems are equipped with all sorts of debugging APIs and features that enable hackers to have a field day with such games.<br />
<br />
Another top priority alongside game customization is the quality of multiplayer networking and the resultant online gameplay. It's pretty standard now to just design a server/client model using all the usual tricks that have been around for the past decade to mitigate internet latency and packet loss, to smooth out the appearance of the gameplay that is actually occurring on the host machine that is being simulated remotely. Everything you see on the screen is a virtual lie, and the typical bag of tricks are designed with the intent to please the player with promises that can't always be kept.<br />
<br />
It is my opinion that the existing techniques are sub-par and that it is time we begin to explore other options and come up with new ideas. For my project I am turning conventional networking strategy on its head. In my networking model the client has the same authority over the game state as the server and the other clients. The server merely maintains the game rules and authority over who can be connected and participating in the game. It also serves to route the game state between clients as it evolves. No single machine retains the absolute state of the game; all machines participate equally in the progress and simulation of the game state as it unfolds.<br />
<br />
To make all of this possible, combining a sort of peer-based game state simulation along with client/server networking model, as well as keeping the system for user-made mods in a manageable and user-friendly state, I have opted to use a console-scripted system that is made up of a handful of smaller 'systems' of commands. Everything in the engine is scripted using sets of commands in this fashion.<br />
<br />
There are three components to this setup. At the core we have entity 'types', which are a set of parameters that define a specific entity. Properties that don't change about a type of game object are represented as a 'type'. Things like a model, conditional logic, physics behaviors, etc. Properties that are consistent across all instances of an entity type are thus considered aspects of that type.<br />
<br />
Secondly, we have entity 'functions'. These are small sets of 'operations' to perform on a given entity: things like playing sounds, spawning particles or entities, inflicting damage, etc. These functions are referenced by an entity type's conditional logic definitions. Conditional logic is hard-coded into the engine; there is only a certain set of conditions which the engine detects about an entity and, in turn, executes any logic defined for those conditions in the entity's type. Conditions such as when an entity touches the world, or another entity, or gets damaged or killed, for example.<br />
<br />
Functions can perform a number of operations, but they cannot change anything about the original entity type it is executing upon. However, if something is to change about a specific entity's type, it can simply be changed to a new type with different static properties. If a player is supposed to go from a walking physics to a ragdoll physics, simply change the player entity's type from "player" to "deadplayer", where the physics settings differ accordingly.<br />
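The type-switch idea might be sketched like this (all names and fields are hypothetical, not the engine's actual structures): an entity's mutable state holds only a pointer to an immutable type, so "changing everything about" an entity is a single pointer assignment.

```c
/* Sketch of the type-switch idea: static properties live in the
 * immutable type; the entity merely points at one. */
typedef struct {
    const char *name;
    int physics_mode;          /* e.g. 0 = walking, 1 = ragdoll */
    float gravity_scale;
} enttype_t;

static const enttype_t T_PLAYER     = { "player",     0, 1.0f };
static const enttype_t T_DEADPLAYER = { "deadplayer", 1, 1.0f };

typedef struct {
    float pos[3];              /* mutable per-instance state */
    const enttype_t *type;     /* everything static comes from here */
} entity_t;

/* "Kill" a player by swapping its type; physics behavior follows. */
static void ent_kill(entity_t *e)
{
    e->type = &T_DEADPLAYER;
}
```

Over the network, the same swap is conveyed by sending just the new type's index, rather than a delta of every property that changed.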
<br />
This makes the process and/or job of designing entities pretty simple and painless. They can be edited in notepad, and reloaded in a snap. It becomes easy to create variations of the default game.<br />
<br />
It also simplifies the networking model. The typical setup most games use networks the game state by 'delta-compressing' entity updates: comparing an older state to the present state to determine which aspects or properties changed and need to be transmitted. This allows developers to define any number of entity changes over the evolution of the game state, and have everything reach from one machine to the other over a network connection.<br />
<br />
My implementation boils this same design down around the fact that many entities have properties that never change over the course of their existence. These properties can all be lumped together and conveyed in a minimal number of packet bytes by simply indicating which type an entity should be, at the moment it becomes that type.<br />
<br />
The actual networking system continuously relays 'events' to the other side. The game logic, in the form of entity functions, is responsible for invoking events which have a networked component to them. Events like particles, sounds, etc. all are 'networkable events' - in that they should be seen by other participants in the game. These are queued up to be transmitted in the next outgoing update. An entity's type being set is an example of an event that is serialized and queued up for network transmission.<br />
<br />
Not all entity function operations have a networked component. Some things are meant to only occur on the local machine, and even networked operations will stay local if the entity type is defined as being local-only (eg: client-side detail entities). If an entity changes into something completely different, and everything about it changes, this is not a large update. The local machine simply indicates which type the entity is now.<br />
<br />
Along with the events queue is a prioritized list of entity positions, velocities, angles, etc. All the location information about an entity gets tacked onto the update after the events. Positional updates are 'optional', in that they don't always need to reach the other side the way events do. Entity positions are prioritized by proximity to the client's player entity. Entities within a certain distance of the player have their positional information included with every update sent. Once entities are farther out, the number of updates per second they are included in lessens, down to a bare minimum.<br />
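The proximity prioritization could be sketched as a simple falloff of per-second positional updates with distance. The thresholds here are invented for illustration, not BITPHORIA's actual tuning.

```c
/* Sketch: how many outgoing updates per second include this entity's
 * position, based on distance to the viewing player. Thresholds and
 * divisors are made-up illustrative values. */
static int pos_updates_per_second(float dist, int base_rate)
{
    if (dist < 256.0f)  return base_rate;      /* nearby: every update */
    if (dist < 1024.0f) return base_rate / 2;
    if (dist < 4096.0f) return base_rate / 4;
    return 1;                                  /* distant: bare minimum */
}
```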
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmaI9rsOviQqMxJxkshECtYtgsbdqNiUU081rTJJPguP25p5OYntbQ933gWIshTFOW-PYFxlBbDhCiH3VW9jv3YaHmo758VDIZpx2LDzbxTd8HnbgtYFIbDTxDfjKSprXHd5Pie01nMTk/s1600/revolude4_07.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="384" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmaI9rsOviQqMxJxkshECtYtgsbdqNiUU081rTJJPguP25p5OYntbQ933gWIshTFOW-PYFxlBbDhCiH3VW9jv3YaHmo758VDIZpx2LDzbxTd8HnbgtYFIbDTxDfjKSprXHd5Pie01nMTk/s1600/revolude4_07.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This is a screenshot of the ill-fated Revolude game, circa 2010.</td></tr>
</tbody></table>
<br />
Now, one idea I had back in the Revolude days was to perform a similar network conveyance of entity properties and logic by sending game-logic function indices to clients, telling them what functions to execute to bring an entity's state "up to speed". This made sense in my head, but in practice there were issues with functions overlapping, or overwriting each other's changes.<br />
<br />
The solution was to divide up the game logic into a server-logic and client-logic. Sometimes the two had the same pieces of code but used it in different ways. The server's job was to control the actual state of the game, and direct how clients should be simulating their end, which entities are where, and what functions they are executing.<br />
<br />
It never worked out, fully. The Revolude build I still have is riddled with networking bugs. A poorly thought-out event networking system wasn't ensuring that all events made it across, in order. Objects can be seen turning topsy-turvy, appearing and disappearing, or never existing (but leaving evidence that they did). It's a nightmare I am happy to never return to.<br />
<br />
The networking in BITPHORIA is awesome, though. I am very happy with how the game is coming along.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-UTtLMuAodYiZKEsrY1YH6Tv2cvs8QG4axJ91NATOjtD2puh0M3ZxObE9Dhct0nfBSObiJjiDyIYEbaw4MykHkoZoGVkQtJBZk8EaB_5nqkhouk-GcEShLR3yGxrcPVOxoAYJTN6HSKM/s1600/screen009.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-UTtLMuAodYiZKEsrY1YH6Tv2cvs8QG4axJ91NATOjtD2puh0M3ZxObE9Dhct0nfBSObiJjiDyIYEbaw4MykHkoZoGVkQtJBZk8EaB_5nqkhouk-GcEShLR3yGxrcPVOxoAYJTN6HSKM/s1600/screen009.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">BITPHORIA, in its current form.</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-50678195040783307852014-11-12T20:42:00.000-08:002016-08-11T13:06:09.492-07:00Procedural Modeling and Animation<div>
<br />
I failed at maintaining at least one post per month; distractions abound! I've been trudging away nonetheless. The project is at a point where I am leaping from one milestone to the next, some days being spent refactoring smaller support code and/or adding functionality to various support systems.<br />
<br />
I have just added code for dealing with quaternions to represent object orientations, as Euler angles will just not suffice for the physics interactivity I am striving for. To represent rotational velocities I have also added code to handle axis-angle representations, because a quaternion can only unambiguously represent a rotation of up to 180 degrees (which, used as a per-second rotational velocity, caps it at 30 RPM).<br />
<br />
One feature of the engine that I was looking to implement is procedural model scripting. The goal here is to allow a user to easily script a game 'model'. A model consists of vertices defined for the three basic primitive types: points, lines, and triangles. Each vertex has an RGBA value for rendering a color.<br />
<br />
The primary advantage of having scripted models is that anybody can edit them, without learning modeling software. All you need is a text editor - which I aim to build into the engine in some capacity (after making a release) to eliminate the need for alt-tab madness. Scripting a model is a matter of cleverly putting together a series of commands that rotate and translate to reach the desired position of each vertex for the desired primitive type.<br />
<br />
Another advantage of using procedurally scripted models is that game server admins can run games with their own custom models, which are quickly and easily transferred over the network to player clients. Clients can then generate the actual models by executing the procedures defined therein. This is huge because it is another goal to allow server admins to run entirely customized games without requiring players to download and install anything manually.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRYUFBdHIQaZZt9byayTwPEDMwphXYtjIJAP-A7Wlr7wF0zFA7cYYUViekSeiH1rckVdeHFSLs1jbymV6SRiCT5-SwowcbZfZVbMVE54_A-rxYJowWS_z67HgmG_sMTBzaxGiJPmhr5e4/s1600/treescript.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRYUFBdHIQaZZt9byayTwPEDMwphXYtjIJAP-A7Wlr7wF0zFA7cYYUViekSeiH1rckVdeHFSLs1jbymV6SRiCT5-SwowcbZfZVbMVE54_A-rxYJowWS_z67HgmG_sMTBzaxGiJPmhr5e4/s1600/treescript.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A script for a 'spiral tree' model, demonstrating the use of nested loops.</td></tr>
</tbody></table>
<div>
<br /></div>
Scripts are also afforded the ability to 'randomize' various parameters. Things like translate, loop, rotate, etc. can all be randomized using a minimum value and a range value. Each time an entity requests a renderable model 'instance' the system checks if the model is invariant or not. If the model is not invariant, this means it uses randomized parameters in some way, and must be re-generated for each instance requested. This allows the system to take one model script, and generate many variations using different seed values to generate the randomized parameters for the operations that use them.<br />
<br />
Another feature of the modeling system is the ability to push/pop the current matrix and loop counters. This allows for recursive modeling of hierarchical things like trees, plants, and other fractal shapes.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqh2JUnI7RcZiN04kh0xfFgtzv-PD7FVIDksjbkuWXCjyL-6ekoH95z47rDc7G5ZWHhU8mOI6nflkrKLjKh4w143mDuYUmMtqre89eD3SPyecT6gmmLuZRvulsdvJx6Gtmm7KQUbS9QDk/s1600/121114.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqh2JUnI7RcZiN04kh0xfFgtzv-PD7FVIDksjbkuWXCjyL-6ekoH95z47rDc7G5ZWHhU8mOI6nflkrKLjKh4w143mDuYUmMtqre89eD3SPyecT6gmmLuZRvulsdvJx6Gtmm7KQUbS9QDk/s640/121114.JPG" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Some spiral trees, and dynamic mesh player model.</td></tr>
</tbody></table>
<br /></div>
<div>
Along with scripting individual procedural models I have implemented an animation system that I devised a long while back. This was yet another chance to embark on a journey that strayed from the norm. I love skeletal animation, and inverse kinematics for making a skeletal model interact seamlessly with the game world, but I do not love the mind-numbing rote implementation of features where all the creative work has already been done. To me, programming is about problem solving; the deeper and more abstract the problem, the more rewarding it is to me. In fact, making a game all by itself isn't that rewarding to me (earning a living is good, though). It's the process and the challenges of making something involved that I find rewarding, and I wish more programmers felt the same. At any rate...<br />
<br />
Conventional animation systems involve manipulating a mesh model using a virtual skeleton with virtual joints. Keyframed animation is stored as angles for each joint, and is easy to interpolate for smooth animation between keyframes. Getting any keyframed animation to smoothly and seamlessly interact with the world and external forces like wind, inertia, gravity, collisions, etc. is tricky in and of itself. It's a challenging problem. Some games only let the model/skeleton interact when the character is dead, allowing the body to flop around in response to external forces. This is known as 'rag-doll' physics. There are various solutions now for handling these sorts of things, for both animated and dead character models. There's even one solution that dynamically generates/modifies keyframed animations for things like walking, so that it looks as though the character is actually negotiating bumpy terrain with strategic footstep positions.<br />
<br />
I did not want to plug in a solution, and I did not want to pursue a solution that was too involved. This is where the dynamesh comes in. Dynamesh is just an abbreviation of the term 'dynamic mesh'. A dynamic mesh is just a spring mesh, where vertices are referred to as 'nodes' and the edges connecting two nodes are called 'links'. This is a simple system where each node is given a size (zero is valid) and a mass, and each link is given a rigidity value that dictates how well it retains its length.<br />
<br />
This system is simple enough to simulate. It consists of two parts - a particle simulation for the nodes themselves, and an iterative link-constraint system that pushes and pulls the nodes of each link in an attempt to 'settle' the mesh.<br />
<br />
So far, I have determined three uses for this system:<br />
<br />
The first use is entity collision volume representation. Along with using spheres, capsules, axis-aligned bounding-boxes, and the like, it's nice to allow for more detailed collision hulls for bigger more complicated entities.<br />
<br />
The second use of the dynamesh system, which operates in tandem with the first, is rigid-body physics. It is an automatic feature of this system that the nodes can be in any orientation, with no explicit 'orientation' anywhere in the code. Discerning something like an 'orientation' involves examining the relationships between node positions and comparing them to the original default orientation. This isn't too hard or expensive to do. Entities can use a dynamic mesh not just as their collision hull, but also to innately handle collision response and resolution. This enables highly interactive entity physics behaviors.<br />
<br />
The third use is animation. Not only can you define a dynamesh that is rigid, but you can define one for a character, or a vehicle, or anything with moving parts. With one pair of nodes you can have a ball and socket type joint. With three nodes you can have a hinge. Through clever use of nodes and links you can create just about anything, and the neat thing is that simulating the nodes as particles that are affected by external forces and collisions allows for highly dynamic interactivity automatically, without any special-case code at all.<br />
<br />
In this case I am using dynameshes for character animation, while allowing an entity to have one scripted procedural model permanently attached to it, as well as one dynamesh, which can have models attached to its links. This makes it simple enough to define a character dynamesh.<br />
<br />
Keyframed animation is a matter of storing node 'goals' for each keyframe, and pushing those nodes toward their goals when a specific animation is playing. In this case I am procedurally generating running animations through some simple manipulation of foot/knee nodes. Dynameshes must define anchor nodes, which are used to fix the mesh to the entity using them. Entities essentially drag the mesh around the world, unless the entity type specifies that it is to assume the physical state of the dynamesh, in which case the anchor nodes dictate the entity's position/orientation.<br />
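As a sketch, pushing nodes toward stored keyframe goals could look like this (the names and the per-tick blend factor are my own illustrative choices; the link-constraint pass is assumed to run afterwards to keep the mesh coherent):

```c
typedef struct { float x, y, z; } vec3;

/* Nudge each node a fraction of the way toward its keyframe goal.
   'strength' controls how aggressively the animation asserts itself
   against physics; external forces still perturb the nodes. */
void apply_keyframe_goals(vec3 *node_pos, const vec3 *goal, int count,
                          float strength, float dt)
{
    float t = strength * dt;   /* fraction of the gap closed this tick */
    if (t > 1.0f) t = 1.0f;
    for (int i = 0; i < count; i++) {
        node_pos[i].x += (goal[i].x - node_pos[i].x) * t;
        node_pos[i].y += (goal[i].y - node_pos[i].y) * t;
        node_pos[i].z += (goal[i].z - node_pos[i].z) * t;
    }
}
```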
<br />
<br />
Links:<br />
<a href="http://www.cs.northwestern.edu/~ian/Twig/TCIAIG.pdf" target="_blank">Lightweight Procedural Animation ...</a> by Ian Horswill<br />
<a href="http://nehe.gamedev.net/tutorial/rope_physics/17006/" target="_blank">NeHe Productions: Rope Physics</a><br />
<a href="http://www.quelsolaar.com/confuse/index.html" target="_blank">QUELSOLAAR.COM: Confuce - Procedural Character Animator</a><br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-44921416424472581432014-09-21T09:59:00.003-07:002019-12-09T08:18:25.541-08:00Collision Detection with Distance Fields<div>
<br />
One of the great things about doing a Minecraft-style voxel engine, where the entire world is made of cubes, is collision detection. It's very cheap to detect which voxels one should compare a game entity's collision hull against, and the math is very simple.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-2Cy5LcS_BecQS67iIDJSo_kKpcM5Hgca0qxlfQhvssZ079yWJ7gt6JWDJYPAsdlWtp88GL4zhshAO7GNh3yfJVhtQmF_nFNP7U6WCXP1sAI6Yt7OUIYwXG9GVzCl8pmsPvHtU1RESFA/s1600/minecraft.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="376" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-2Cy5LcS_BecQS67iIDJSo_kKpcM5Hgca0qxlfQhvssZ079yWJ7gt6JWDJYPAsdlWtp88GL4zhshAO7GNh3yfJVhtQmF_nFNP7U6WCXP1sAI6Yt7OUIYwXG9GVzCl8pmsPvHtU1RESFA/s640/minecraft.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Minecraft: The cubic-voxel world game of the century.</td></tr>
</tbody></table>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
One of the lame things about doing a Minecraft-style voxel engine is that Notch did it already (not to mention Minecraft's inspiration, Infiniminer). I am not making a cubic voxel world. I'm not really even making a voxel world. I'm using voxels as an intermediate representation in generating my world. I don't even plan on making it very big, or modifiable. I'm not using marching cubes to make smooth terrain, and I'm not using boxy voxels either.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh53sTyICui-ny8aEUZepFN-QQp36_boJ9YvRSP5lawVjD6nBmlAO1LyelXzDzzVu-83qgpOC3vQKNOaIR4mIjymHI27Zu14gw21Dj1KNV2TKrJYI_byaKnI9AgW-ks3WEYVa1vMj19ssY/s1600/180914.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh53sTyICui-ny8aEUZepFN-QQp36_boJ9YvRSP5lawVjD6nBmlAO1LyelXzDzzVu-83qgpOC3vQKNOaIR4mIjymHI27Zu14gw21Dj1KNV2TKrJYI_byaKnI9AgW-ks3WEYVa1vMj19ssY/s1600/180914.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">my voxel game thus far</td></tr>
</tbody></table>
<br />
After all the generation and stuff, the end product is rendered as polygons (triangles). I could handle collision detection between objects and the world in terms of spheres and triangles, or cylinders and triangles, or any mathematically simple shape and triangles. But triangles are yucky, and I don't like dealing with them. I especially do not enjoy the thought of detecting *which* triangles to test intersections on. Back in the days when I was a big fan of BSP trees, this would have been fun, but not anymore. I've moved on to bigger and better things (or, whatever).<br />
<br />
What's nice about cubic voxels is that you can pretty much just index an array using your object's position and see if there are any voxels intersecting. In a Wolfenstein 3D style raycaster this is great for collision detection; it's just so simple to do. That's great and all, but I'm not using cubes. I'm using octahedrons that behave like metaballs and 'merge' when neighboring each other. This complicates things.<br />
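The cubic case can be sketched in a few lines; the grid dimensions and flat array layout here are illustrative assumptions, not my engine's actual storage:

```c
/* Test a point against a flat cubic-voxel grid by direct indexing. */
#define WX 64
#define WY 64
#define WZ 64

unsigned char solid[WX * WY * WZ]; /* nonzero = solid voxel */

int voxel_at(float x, float y, float z)
{
    /* note: plain casts truncate toward zero, so a real engine
       would floor negative coordinates properly */
    int ix = (int)x, iy = (int)y, iz = (int)z;
    if (ix < 0 || iy < 0 || iz < 0 || ix >= WX || iy >= WY || iz >= WZ)
        return 1; /* treat out-of-bounds as solid */
    return solid[(iz * WY + iy) * WX + ix] != 0;
}
```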
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3z4iSxEM4EzbApQcivJUoS5eCPPDbfz0PCyFutRrIdfaYzHTYBxpS3EAnBYoKCp0W-IHe984sco5NsE-09Tr_xJOgfpRt9-lb_lWuPyx_0_Eoq4yQ_ZzDLMca1LZdpvyf7kgcsbBkk2Q/s1600/primitives.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3z4iSxEM4EzbApQcivJUoS5eCPPDbfz0PCyFutRrIdfaYzHTYBxpS3EAnBYoKCp0W-IHe984sco5NsE-09Tr_xJOgfpRt9-lb_lWuPyx_0_Eoq4yQ_ZzDLMca1LZdpvyf7kgcsbBkk2Q/s640/primitives.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Collision primitives from Cryengine.</td></tr>
</tbody></table>
<br /></div>
<div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
My first idea was to assume every object to be a sphere or pill shape, and do a quick (hah) spatial search for the closest voxels: detect which faces are facing the entity, which edges and planes it intersects, calculate where any 45-degree planes may be (corners/edges), and handle the collision accordingly by moving the position of the object and calculating a new velocity based on elasticity and friction values. This method seemed ideal for handling collisions using only the sparse volume representation in memory, and the sparse volume structure doesn't store information about the shape of the voxels; they are all just cubes as far as it is concerned. It was important to me that a slope of voxels could be run up/slid down/rolled across/etc. smoothly, without any stair-stepping. I wanted a 45-degree plane to behave like one.<br />
<br />
The most difficult part of this solution was handling large objects with multiple collision points with different groups of voxels around themselves. I quickly realized that a distance field representation of the whole scene would solve all of my problems - efficiently detecting collisions with all object sizes while behaving as though diagonal voxels were one continuous 45 degree surface.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/uiCpS1hBBT0?feature=player_embedded' frameborder='0'></iframe></div>
<br />
In this video of a 2D prototype I did last week you can see the green 'voxels' and the distance field rendered as a blue/red gradient, generated using a simple distance transform that propagates distances over the 'volume' in two passes. The circle is just a point in space used to index the distance field; the value returned is compared against the circle's radius. If the circle's radius is greater than the value at its center in the distance field, then it is intersecting. Then it's a matter of checking the surrounding distance field values and calculating the gradient at the area of intersection to determine a sufficient approximation of the surface 'normal', de-intersect the circle, and reflect its velocity with some amount of elasticity for a bounce effect.<br />
<br />
<br />
As a bonus, there are other uses for distance field representations of a 3D scene. Distance representations are very handy for any line/path tracing, so it lends itself well to calculating lighting and shadowing. It is also useful for AI pathfinding and obstacle avoidance. Fluid dynamics can also benefit from a distance field representation for properly drifting particle effects around the scene realistically. I've already uploaded the same distance field to the GPU as a 3D texture to perform some ambient occlusion in the vertex shader, which has increased the visual depth of the game scene. It will also make the job of illuminating game entities much simpler.<br />
<br />
The only downside to the distance field representation is memory usage. So far I'm just using a flat array, because I don't really see a very worthwhile means of compressing data as incoherent as a distance field. Conventional sparse structures, like octrees, will not be of much use. What would probably work best is a more continuous approach, like a cosine transform. Maybe dividing it up into 8x8x8 blocks and performing a discrete cosine transform (ala JPEG) would be a decent means of representing the data in memory? Each distance field query would then result in performing a bunch of cosine calls though, unless some quantization could negate most of them. Compression artifacts would yield bumpy surfaces, however.<br />
<br />
<br />
Links of interest:<br />
<a href="http://en.wikipedia.org/wiki/Distance_transform" target="_blank">Wikipedia: Distance Transform</a><br />
<a href="http://www.gamasutra.com/view/feature/131598/advanced_collision_detection_.php?page=1" target="_blank">Gamasutra: Advanced Collision Detection Techniques</a><br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-70826129980106083132014-08-09T12:07:00.000-07:002016-08-11T13:06:38.060-07:00World Representation<span style="font-family: inherit;"><br />It's been a while since my last post; a lot of brainstorming (as opposed to actual coding) has been going on. I've been exploring world generation and possible data structures for representing the world in memory. This all required that I make some important defining decisions about the world.</span><br />
<span style="font-family: inherit;"><br />At the time of this writing, I have yet to decide on making the world dynamic (modifiable). This is primarily because of the cost of updating the world, finding a balance between CPU/GPU loads, and doing so over the network between the game server and clients. Another issue involves when a client joins a server, they must download the current state of the world from the server as it exists after any modification that has occurred. There are tricks and techniques for minimizing this data flow, so that the entire world volume doesn't need to be uploaded to each connecting client. Right now this is not a priority, or even a necessary feature for what I am trying to accomplish with this current project. This could change!</span><br />
<span style="font-family: inherit;"><br />Being that the world is volumetric (voxels) it was clear that there needed to be a data structure to hold the volume in memory, not just for rendering, but also physical interactions with entities. Following with Revolude's original design - the world would be procedurally generated (as opposed to loaded) when a game was started, or joined. This data structure would require that it could perform efficient read/write access.</span><br />
<span style="font-family: inherit;"><br />I examined the possibilities of a few different algorithms and data structures for dealing with volumetric data. A lot of algorithms for storing compressed volume data rely on the fact that the data represents direct visual data, like the pixels of an image, in the form of red/green/blue/alpha color channels. This allows for all sorts of neat 'lossy' image-compression-style tricks to be used, which forsake exact reproduction of the original data in exchange for smaller size. For applications where the volumetric data represents relatively finely detailed data, these algorithms are great.</span><br />
<span style="font-family: inherit;"><br />For my project the worlds are not that large, or that finely detailed. Voxel size is on par with games like Minecraft and Infiniminer. The primary difference from these games is that voxels will not be cubes. Voxels are also represented using a single byte to store a material-type value, allowing for up to 255 different materials (material zero is empty space). Worlds are also not intended to be infinite. Instead of having worlds extend infinitely through procedural means, they will be a finite size, and wrap around seamlessly. There will be no edge to the map, nor will it go on forever into places that nobody will ever go.</span><br />
<span style="font-family: inherit;"><br />I'm still settling on a specific world size. With the sizes I'm considering, storing the raw voxel data becomes feasible, without the use of any sort of sparse data structure. For example, with a world size of 1024x1024x256 the size of the world data is then 256 megabytes. Each horizontal slice of the world is one megabyte. The only qualm I have with just leaving the data sitting in memory, when virtually every machine capable of running the game has enough memory, is <a href="http://igoro.com/archive/gallery-of-processor-cache-effects/" target="_blank">cache coherency</a>. The larger the volume, the further apart in memory two neighboring voxels could lie. This is not good for performance!</span><br />
<span style="font-family: inherit;"><br />It's arguable that using flat storage for a finite volume won't produce a significant slow-down when the volume is queried for things like collision detection and rendering. Personally, I just don't want to be using more memory than I need to, especially if a byproduct of reducing the size is gained speed. Above all else, I love the challenge implementing a sparse data structure ;)</span><br />
<span style="font-family: inherit;"><br />The first obvious option on everyone's mind is a <a href="http://lmgtfy.com/?q=sparse+voxel+octree" target="_blank">sparse voxel octree</a>. This is a wonderful structure, but can become computationally expensive to iterate as it deepens. One strategy I had a while ago to 'enhance' the octree is to allow each node to have more than 8 children. Instead of 2x2x2, it could be 4x4x4, for 64 child nodes. This would allow an octree with a dimension of 1024 (gigavoxel) and 10 tree levels to only have 5 levels to iterate through from root to leaf. The issue here is that there would be excessive 'empty' child nodes all over a volume. This would be the price for faster iteration.</span><br />
<span style="font-family: inherit;"><br />Another strategy, one which I became quite fond of since my last post, is to store volumes as run-length encoded columns. This is especially attractive because it effectively reduces a volume into a few 2D images, and would work extremely well where voxels represent a limited number of materials (as opposed to highly variable 32-bit RGBA values). Many areas of the volume would be one or two column 'segments'. This approach was almost the one that I finally settled on, but I was having implementation indecision issues, trying to find a 'sweet spot' balance between speed, size, and simplicity. My obsessiveness with finding the perfect solution became a hindrance to progress.</span><br />
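As a sketch of how such a column encoding might look: each vertical column of material bytes compresses into (count, material) run pairs. This pair format is a hypothetical illustration, not the implementation I was weighing:

```c
/* Encode one vertical column of material bytes as (count, material)
   run pairs. Returns the number of pairs written; 'runs' needs
   2*height bytes in the worst case (no two adjacent voxels equal). */
int rle_encode_column(const unsigned char *col, int height,
                      unsigned char *runs)
{
    int n = 0;
    for (int z = 0; z < height; ) {
        unsigned char mat = col[z];
        int count = 0;
        while (z < height && col[z] == mat && count < 255) { z++; count++; }
        runs[n * 2 + 0] = (unsigned char)count;
        runs[n * 2 + 1] = mat;
        n++;
    }
    return n;
}

/* Expand the run pairs back into a column; returns decoded height. */
int rle_decode_column(const unsigned char *runs, int npairs,
                      unsigned char *col)
{
    int z = 0;
    for (int i = 0; i < npairs; i++)
        for (int c = 0; c < runs[i * 2]; c++)
            col[z++] = runs[i * 2 + 1];
    return z;
}
```

A column of mostly air over a few meters of ground collapses to two or three pairs, which is where the big savings come from.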
<span style="font-family: inherit;"><br />Ultimately, this led me to fall back on an older idea, which is really an amalgam of a few ideas. At the coarsest level, I represent a volume using a sort of hash-table. This is just a flat array that divides the world up into fixed-size 'chunks'. The trick here, then, is to use a sparse voxel octree to represent the contents of each chunk. Using some efficient design, a chunk that is entirely one material (including empty) is stored as four bytes. The rest of the chunks, which I call 'boundary' chunks, are stored using a variable number of octree nodes. Each one has its own pool of octree nodes from which it builds the sparse octree representing the organization of voxels and their material indices. This node pool starts out small, and is reallocated as needed by doubling its size each time it fills up.</span><br />
<span style="font-family: inherit;"><br />Currently I am working with chunks of 32x32x32, which is 32768, or 32k, voxels. This seems like a nice round number, because I can fit an octree node index into 15 bits (the 16th bit flags leaf-node status). Now, if every voxel in a chunk were a different material, the tree couldn't collapse anywhere and the roughly 4680 inner nodes on top of the 32k leaves would overflow that 15-bit address space, but I am willing to bet that with fewer than 256 possible voxel materials this 'special' case chunk will never occur. Most chunks will never even see 10k of nodes.</span><br />
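A possible node layout matching that 15-bit scheme could look like this; the field names and the uniform-chunk bookkeeping are my own guesses at one way to do it, not the actual code:

```c
#include <stdint.h>

/* High bit of a 16-bit child field flags a leaf; for a leaf the low
   8 bits hold the material, otherwise the low 15 bits index into the
   chunk's node pool. */
#define NODE_LEAF 0x8000u

typedef struct {
    uint16_t child[8];
} oct_node_t;

typedef struct {
    oct_node_t *pool;      /* reallocated by doubling as it fills */
    int         used, cap;
    uint8_t     uniform_mat; /* material if the whole chunk is one material */
    uint8_t     is_uniform;  /* uniform chunks skip the octree entirely */
} chunk_t;

int node_is_leaf(uint16_t c)        { return (c & NODE_LEAF) != 0; }
uint8_t leaf_material(uint16_t c)   { return (uint8_t)(c & 0xFFu); }
uint16_t make_leaf(uint8_t mat)     { return (uint16_t)(NODE_LEAF | mat); }
uint16_t child_index(uint16_t c)    { return (uint16_t)(c & 0x7FFFu); }
```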
<span style="font-family: inherit;"><br />With chunks that are 32k voxels in size, a 64-gigavoxel volume (4096^3) would consist of 128^3, or roughly two million, chunks. The flat array of chunks for a 64-gigavoxel volume would then only be a few megabytes. The total size of the chunks themselves could vary, but would average less than 100 megabytes. This is really great for 64 gigavoxels. Now, I'm not going to be representing a world that is 64 gigavoxels, I don't think. I'm thinking smaller, because the goal here is a game that is more of a multiplayer online battle arena than some sort of vast expanse for an MMORPG.</span><br />
<span style="font-family: inherit;"><br />This is actually how all volumes will be represented in memory, in terms of materials, and out of chunks. Some game objects will be displayed using volumes, some using other graphical techniques. This is a global 'voxel' solution.<br /><br /><br />Links of interest:<br /><a href="http://raycast.org/powerup/publications/FITpaper2007.pdf" target="_blank">Visualization of Large RLE-Encoded Voxel Volumes</a></span><br />
<a href="http://0fps.net/2012/01/14/an-analysis-of-minecraft-like-engines/" target="_blank"><span style="font-family: inherit;">An Analysis of Minecraft-like Engines</span></a><br />
<span style="line-height: 20.799999237060547px;"><a href="http://advsys.net/ken/voxlap.htm" target="_blank"><span style="font-family: inherit;">Ken Silverman's Voxlap Page</span></a></span><br />
<br />
<div>
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-48648608469282920832014-07-21T04:01:00.000-07:002018-09-17T06:33:47.117-07:00Procedural Content, and Aiming Too High<br />
<div>
At the outset of this project my initial aim was to allow full control over the entirety of a scene, as far as level design and editing is concerned. After sitting down and carefully considering potential routes of implementing functionality to allow users to create custom worlds and other assets, I've come to the decision that the work required to provide an interface for editing such assets will only hinder my ability to actually complete this project.</div>
<div>
<br /></div>
<div>
With my previous project, <a href="http://www.sourceforge.net/projects/revolude" target="_blank">Revolude</a>, it seemed rather slick to procedurally generate the world when the game starts. Whoever was responsible for launching the game server had access to various sliders and coloration options they could play with to customize the 'style' of the world that people would experience while playing in the game they are hosting.<br />
<br />
<div>
</div>
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKDam0UOXZR0ktGX3d3yGLK0Z0UPQdDUA_uB8dGPfI-l9fwKUaO_KqZfgvrnYg9VSVp_f1AiZgyhLy5dF7BTPRIVU4zLx28Lif_nRu9qcbUI3vYkNSa1gkLAAzTRIJ0NkX0O3JAz6L9sQ/s1600/revolude4_03.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="768" data-original-width="1280" height="379" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKDam0UOXZR0ktGX3d3yGLK0Z0UPQdDUA_uB8dGPfI-l9fwKUaO_KqZfgvrnYg9VSVp_f1AiZgyhLy5dF7BTPRIVU4zLx28Lif_nRu9qcbUI3vYkNSa1gkLAAzTRIJ0NkX0O3JAz6L9sQ/s640/revolude4_03.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">the "start game" menu screen from my previous project 'Revolude'</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<div>
<div>
<br />
This functionality alone was simple enough to implement, and was (to my mind) the happy medium between designing your own level, and choosing an existing one from a list when launching a game. A world seed value (not exposed via the UI menu screen) along with all of the server's world-generation parameters are gathered up and sent off to connecting clients who are joining the game, so that they can generate their own local copy of the entire world for rendering and collision detection prediction.<br />
<br />
One of the primary advantages to this approach is that every server can be running a completely unique world without clients being required to download or store any map geometry from the server. The only thing transferred are some parameters for the procedural generation of the world. This is, in my mind, the great advantage to using procedural methods.</div>
</div>
<div>
<div style="-webkit-text-stroke-width: 0px; color: black; font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;">
<div>
<div style="margin: 0px;">
<br /></div>
</div>
</div>
</div>
<div>
What with my non-existent experience concerning floating point determinism, I did run into some serious bugs where trees would sometimes be planted, or not, in specific 'close call' spots that seemed too steep to some CPUs, and not too steep on others. This resulted in worlds that were somehow slightly different on two machines that were playing in the same server. These sorts of issues are resolvable, especially if you are aware of them before writing any code in the first place.</div>
<div>
<br /></div>
<div>
Nonetheless, I really like the idea of providing a user with procedural tools to create a base scene volume, from which they could construct their vision by hand. These worlds would be saved, in compressed form, to disk, and would be uploaded to connecting clients. I envisioned a full built-in editor for flying around sculpting worlds, placing entities and detail props, etc.. Along with a procedural materials editor, an entity voxel volume editor, and possibly a synthesized sound editor (Revolude actually featured console-scripted sound synthesis, with enveloping and a few effects, but required hand-editing sound parameters in an external text editor and alt-tabbing back and forth to listen to the resulting sound).</div>
<div>
<br /></div>
<div>
It just seems like too much work for what I'm trying to accomplish. What I would rather do is give a preview to a user that is starting a game. This would just be a more advanced version of what Revolude does. Storing to disk, and transmitting actual compressed volume data to clients just sounds too expensive, and defeats what I initially hoped to accomplish by utilizing procedural methods in the first place.<br />
<br />
<br /></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-23489237227414311632014-06-21T12:00:00.000-07:002016-08-11T13:07:08.967-07:00Networking Game State and Events<div>
Distributing the game state among multiple machines can be tricky business, depending on the complexity of your game state itself. Most games treat the game state as an array of entities, sometimes indexed using some sort of a scene graph or other hierarchical organization. At the end of the day, though, it's just a list.</div>
<div>
<br /></div>
<div>
To make designing the game more flexible (what with scripting, etc) usually these entities are represented as merely an entity index, and a chunk of data associated with that entity. This data 'blob' is then structured into meaningful values through the scripting system, which divides it up into entity state variables like origin, velocity, model index, etc.</div>
<div>
<br /></div>
<div>
On the networking side of things the engine discerns what state changes have been made to each entity that are relevant to the machine on the other end of the connection. Typically this consists of the server trying to figure out what information is relevant to each client about all the entities in the game, based on what the server knows the client has received so far, and what the client needs to know for its perspective of the game state.</div>
<div>
<br /></div>
<div>
This entails a complex system of backing up copies of the game state to generate a delta-compressed update unique to each client's situation, based on what the client has acknowledged having received as far as the game state is concerned. Fortunately, things are much simpler for the client, only being required to update the server concerning the player's activities in a much simpler scope.</div>
<div>
<br /></div>
<div>
The system I've been developing the past few days simplifies things, for the most part. It could also potentially simplify peer-to-peer type games, as an emergent property of its design. But for now I'm focusing on utilizing a server for simulating most of the game state, while allowing clients to submit their own simulation states. The goal, for simplicity's sake, is to utilize the same system on both ends for dealing with generating and sharing the game state.</div>
<div>
<br /></div>
<div>
This requires a system that works virtually like two servers talking to one-another. The only difference is that the client has a player manipulating the game state that it's responsible for relaying to the server, which in turn relays them to other clients.</div>
<div>
<br /></div>
<div>
One strategy for simplifying the description of entity types, while simultaneously simplifying serialization of entity states for network transmission, is to utilize entity type templates. Instead of having a scripting language that describes (in some sort of VM-executed code) setting the state of an entity, one variable at a time, why not just have static entities that are essentially copied to real entities based on specific triggers or events occurring?</div>
<div>
<br /></div>
<div>
Then, all that's needed to be transferred over the network when an entity's non-continuous state changes (eg: entity flags, boolean states, model index, physics type, etc) is an ID value for the specific entity template that the entity should become. This eliminates the need to submit any specific entity properties outside of the continuous state (like origin, velocity, etc). The server could send these templates to clients, so that servers can be running customized games.</div>
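A hypothetical sketch of the template idea: all non-continuous state lives in a template table, so changing it is one lookup and copy, and only the template ID needs to cross the wire. Every name here is illustrative:

```c
#include <stdint.h>

typedef struct { float x, y, z; } vec3;

/* Non-continuous state: everything that changes in discrete jumps. */
typedef struct {
    uint16_t flags;
    uint16_t model_index;
    uint8_t  physics_type;
} entity_template_t;

typedef struct {
    entity_template_t tmpl;  /* copied wholesale on template change */
    vec3 origin, velocity;   /* continuous state, sent separately */
} entity_t;

/* The network message for a state change reduces to this one ID. */
void entity_become(entity_t *e, const entity_template_t *table, int id)
{
    e->tmpl = table[id];
}
```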
<div>
<br /></div>
<div>
Utilizing an event-system, for things like player chat messages, player actions (jump, shoot, kill), entities changing their type/template, etc. allows the game state to be divided into discrete 'frames' that are not based on units of time, but units of actual state change. Each machine, upon generating an event, both executes the event and queues it to be distributed to remote systems.</div>
<div>
<br /></div>
<div>
Clients distribute their locally-generated events to the server (through a Quake3-style re-send-until-acknowledged method), which executes the events and queues them to be further distributed to all clients. One global queue can be used on the server, in a circular buffer, and each connected client will continuously provide the server with acknowledgement of the last received event.<br />
<br />
Right now I am implementing the events system which consists of an event execution mechanism (giant switch/case) and data parsing. Data comes in as a simple void* pointer, which is either forwarded from a network-parsed event, or from wrapper functions which allow engine code to invoke specific events with actual variable parameters by lumping them into data identical to what can be found in a network event.</div>
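A minimal sketch of that global circular queue and per-client acknowledgement might look like this; the queue size, payload size, and names are assumptions for illustration:

```c
#include <stdint.h>

#define QUEUE_SIZE 256  /* power of two, so masking wraps the index */

typedef struct {
    uint16_t type;      /* dispatched by the per-type switch/case */
    uint8_t  data[64];  /* opaque payload, parsed per event type */
} event_t;

typedef struct {
    event_t  events[QUEUE_SIZE];
    uint32_t head;      /* sequence number of the next event written */
} event_queue_t;

void queue_push(event_queue_t *q, const event_t *ev)
{
    q->events[q->head & (QUEUE_SIZE - 1)] = *ev;
    q->head++;
}

/* Everything past a client's last acknowledged sequence number gets
   re-sent each update until the ack catches up. */
int unacked_count(const event_queue_t *q, uint32_t client_ack)
{
    return (int)(q->head - client_ack);
}
```

The trade-off is that a client lagging by more than QUEUE_SIZE events overflows the ring and would have to be resynchronized or dropped.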
<div>
<br /></div>
<div>
Links of interest:</div>
<div>
<a href="http://trac.bookofhook.com/bookofhook/trac.cgi/wiki/Quake3Networking" target="_blank">Quake3Networking - bookofhook</a><br />
<a href="http://gafferongames.com/networking-for-game-programmers/" target="_blank">Game Networking | gafferongames.com</a></div>
deftwarehttp://www.blogger.com/profile/13361822983119836854noreply@blogger.com0tag:blogger.com,1999:blog-4395646461527310891.post-51535226078185616052014-06-19T00:26:00.000-07:002016-08-11T13:13:40.436-07:00Multiplayer, Netcode, Etcetera<div>
<br /></div>
<div>
A big giant portion of my game project is multiplayer. Since the days of Quake3 I have found it hard to imagine working on much besides a game that can be played online among friends, enemies, and total strangers. I've also been an avid fan of Counter-Strike since the beta days, and its ultra-competitive gameplay enthuses me to this very day.</div>
<div>
<br /></div>
<div>
Anybody who played QuakeWorld (TF, 2fort4 anyone?) surely remembers the requisite skill of being able to lead your targets in order to actually hit them. This was acceptable to us back then. It was considered a fact of life, and was simply unavoidable. It was also something that could be improved upon.</div>
<div>
<br /></div>
<div>
During what I like to think of as "The Counter-Strike Days" a programmer at Valve Software by the name of Yahn Bernier developed a new approach to networking multiplayer games in the Half-Life engine. It was a two-pronged strategy that consisted of client-side prediction and server-side lag compensation.</div>
<div>
<br /></div>
<div>
Client-side prediction is actually just a technique for disguising the fact that the server is doing all the authoritative simulation and that there is internet-induced latency in communicating the state of the simulation to the player clients which are interacting with it. It goes a long way toward making the game feel more responsive, without actually making the simulated interaction of game objects more responsive. Only superficial, aesthetic aspects of the game state can be predicted, and they are not necessarily an accurate depiction of the actual game state.</div>
<div>
<br /></div>
<div>
Server-side lag compensation is the closest thing to an actual solution which minimizes the effect network latency has on game play and game simulation responsiveness. More often than not, it works quite well. If you aim directly at an opponent, and fire, the hit will register almost as accurately (but not as immediately) as if the game were being run on the local machine. The server effectively rewinds the game state before performing physics and intersection tests.<br />
<br /></div>
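The rewind idea can be sketched roughly like this (a simplified 1D model with invented names; the real Source implementation keeps full hitbox histories and interpolates between entries rather than snapping to the nearest one):

```python
from bisect import bisect_right

class LagCompensatedTarget:
    """Server-side position history for one entity."""
    def __init__(self):
        self.history = []  # time-ordered list of (timestamp, position)

    def record(self, t, pos):
        self.history.append((t, pos))

    def position_at(self, t):
        # Find the latest recorded state at or before time t.
        i = bisect_right(self.history, (t, float('inf'))) - 1
        return self.history[max(i, 0)][1]

def hit_test(target, shot_time, shooter_latency, aim_pos, radius=0.5):
    # Rewind the target to where the shooter *saw* it when they fired,
    # then run the intersection test against that past position.
    rewound = target.position_at(shot_time - shooter_latency)
    return abs(rewound - aim_pos) <= radius
```

This is exactly why the compensation depends on an accurate latency estimate: if the shooter's ping fluctuates, the server rewinds to the wrong moment.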
<div>
On paper it sounds great. In practice, it isn't perfect at coping with latency jitter, i.e. latency variance (fluctuating ping) among players, which can throw off the compensation wildly. This is not well known, but it is at the heart of clips emptied point-blank to no avail. It is also a good reason to avoid playing over WiFi connections, which are prone to random interference and ping spikes.</div>
<div>
<br /></div>
<div>
One more strategy for smoothing out the game experience is interpolation. In most modern games this involves the client buffering multiple updates from the server so that it can play them back as smoothly and accurately as possible, by knowing both the starting point and following point of an object's motion. In Counter-Strike: Source, for example, this buffer is fixed at 100 milliseconds, so no matter what the actual network latency is, it is always compounded with an extra 100 milliseconds of delay for the sake of smoothing out object motion.<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjo52ekJ_x5JGqiZNLWH6xH8i9r8w9CRhsO5XmOt0mFoTkj9SXhkG1qmC5NPKMgxgY2Bsyh-BpOb74j7U9YcOdlbL7oMxw5tko4ZI5OcKHgaM2WDHi7wAd6XlttGSjRogRHA42v-JB2KHA/s1600/cstrike_hitboxes.jpeg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjo52ekJ_x5JGqiZNLWH6xH8i9r8w9CRhsO5XmOt0mFoTkj9SXhkG1qmC5NPKMgxgY2Bsyh-BpOb74j7U9YcOdlbL7oMxw5tko4ZI5OcKHgaM2WDHi7wAd6XlttGSjRogRHA42v-JB2KHA/s640/cstrike_hitboxes.jpeg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This image of Counter-Strike: Source with sv_showhitboxes enabled shows the last known position of an entity as received from the server, ahead of the 100 millisecond interpolation delay used to smooth movement.</td></tr>
</tbody></table>
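The buffering scheme described above can be sketched as follows (a simplified 1D model with hypothetical names, not actual engine code): the client renders every remote entity a fixed delay in the past, blending between the two buffered snapshots that bracket the render time.

```python
def interpolate(snapshots, now, delay=0.1):
    """Render an entity `delay` seconds in the past, blending between
    the two buffered server snapshots that bracket that time.
    `snapshots` is a time-ordered list of (timestamp, position)."""
    t = now - delay
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return p0 + a * (p1 - p0)
    return snapshots[-1][1]  # fell off the buffer: hold last known
```

The motion is perfectly smooth as long as the buffer holds two snapshots spanning the render time, but everything the player sees is the fixed delay behind the latest data, on top of the network latency itself.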
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
At the end of the day, this seems like a lot of work just to make the game as responsive as possible without actually reducing or eliminating the latency between clients and servers. As long as the server is the only 'true' simulation of the game, these techniques for minimizing the effects of latency on game play are as good as it's going to get, short of upgrading the internet itself.</div>
<div>
<br />
The reason that many games use the above techniques is that they are the best there is right now, and nobody believes anything better can be done, so nobody really explores different options. Server-side simulation authority is seen as imperative, because it's the only way to secure the game against cheaters hacking and hackers cheating. My strategy is to do everything differently from what is considered 'right' by many developers.<br />
<br />
Firstly, I believe in letting the player's simulation occur client-side. Game play can only be furthered by removing the lag component almost entirely from the player's interaction with the simulation. This is the equivalent of affording some game authority to the client-side prediction already being used.<br />
<br />
Of course, the issue of hacking and abuse is the first thought to cross the mind of virtually anybody who understands the difference between client-side and server-side game logic. This is alleviated using a simple array of sanity checks for various 'vulnerable' circumstances: there is always a requisite state, or group of states, which permits a particular following state.<br />
<br />
By closely examining the evolving state of a client's side of the game, on the server, it is easy to determine the likelihood that the game has been altered, hacked, etc. Each client will have a dynamic score indicating the probability of game tampering, and a threshold for this value which will invoke consequences (eg: kickban) once reached.<br />
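A rough sketch of what such a suspicion score might look like (entirely hypothetical names and thresholds, with a simplified 1D speed check standing in for the real battery of sanity tests):

```python
class TamperMonitor:
    """Server-side accumulator of failed sanity checks. Each failure
    raises a suspicion score; passing checks slowly decays it. Once the
    score crosses the threshold, consequences (e.g. kickban) follow."""
    def __init__(self, threshold=10.0, decay=0.1):
        self.score = 0.0
        self.threshold = threshold
        self.decay = decay

    def check(self, ok, weight=1.0):
        if ok:
            self.score = max(0.0, self.score - self.decay)
        else:
            self.score += weight
        return self.score < self.threshold  # False => take action

def speed_check(old_pos, new_pos, dt, max_speed):
    # One example sanity test: did the client claim to move faster than
    # the game rules allow? A small tolerance absorbs timing noise.
    return abs(new_pos - old_pos) / dt <= max_speed * 1.05
```

A probabilistic threshold like this tolerates the occasional false positive from packet loss or clock drift, while a consistent cheater accumulates failures faster than the decay can erase them.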
<br />
So, let's say we are simulating a player's influence on the game locally on the client, we have our hack detection on the server, and everything is peachy. There's still the issue of latency when the game state is sent to clients: clients will be interacting with an older state than other players are seeing. If two players are in a firefight and one tries to take cover, he could end up dropping dead after reaching a safe spot, simply because the other player could still shoot him where he was out in the open. This is not fun!<br />
<br />
What if we could just predict where other players could be all the time? Or at least guess, so that we're closer to seeing where they actually are, as opposed to where they were? This is called extrapolation. Most games only rely on extrapolation when there is a lag spike, or dropped update packet, and the client's simulation runs out of updates to interpolate between to keep things moving smoothly.<br />
<br />
I propose substituting extrapolation for interpolation entirely. When an update is received, it should be used to project where the object currently is (based on latency) and where it will be by the estimated time the next update arrives. In the interim, the engine can interpolate from wherever the object is at the moment (the end of the previous update's extrapolation) to this predicted position.<br />
<br />
This will not be nearly as accurate at showing the actual path as existing interpolation/delay methods, but since the server isn't the boss anymore this doesn't even matter! In fact, it will be extraordinarily inaccurate at higher pings.<br />
<br />
At a round-trip time of 50 milliseconds (ping), an object will be 25 milliseconds ahead of the position received. If updates arrive at 20Hz, we can add another 50 milliseconds (plus or minus, depending on whether the update is early or late). So all we need to do is project the received origin out by 75 milliseconds and start interpolating the position toward this new spot. A better approach is to estimate an interpolation vector, scaled so the position reaches the projected point at 75 milliseconds, but to continue moving the object with that same vector if the next update doesn't arrive in time. When a new update comes, it will cause a smooth correction, and the accuracy is no longer significant as far as manipulating the state of the object locally (shooting and killing it).<br />
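The arithmetic above can be captured in a small sketch (hypothetical helper names, 1D positions for brevity):

```python
def extrapolate_target(recv_pos, recv_vel, one_way_latency, update_interval):
    """Project where the object should be by the time the *next* update
    arrives: the received snapshot is already `one_way_latency` old, and
    the next snapshot is `update_interval` away."""
    lead = one_way_latency + update_interval
    return recv_pos + recv_vel * lead, lead

def correction_velocity(local_pos, target_pos, lead):
    # Velocity that carries the locally displayed object to the projected
    # position in exactly `lead` seconds; keep using it past the deadline
    # if the next update is late, then correct smoothly on arrival.
    return (target_pos - local_pos) / lead
```

With the numbers from the text (25ms one-way latency, 20Hz updates), an object moving at 100 units/second gets projected 7.5 units ahead of the received position, 75 milliseconds into the future.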
<br />
Now objects will be more closely where they really are for clients and the server. At least, they won't be far behind like existing methods force them to be. This allows for firefights and interactions to be far more engaging, because it will drastically reduce the take-cover-and-die phenomenon.<br />
<br />
The one side-effect of extrapolation is rubber-banding. Let's say we are viewing an object that is stationary, and our client has 200 ping (100ms one-way latency). The object begins moving, and 100ms later we receive the update that it has moved a certain amount so far and is moving at a certain velocity. Now we take our stationary local copy of the object and have to accelerate it to the position we predict it will occupy by the next update, moving it <i>faster</i> than it actually was. Once we receive the second update about its motion we should be pretty well synced up, so long as it keeps moving in a straight line, but there will be a noticeable drop in its movement speed, back to its real speed, when that occurs.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.mindcontrol.org/~hplus/misc/extrapolated_interpolation.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="400" src="https://www.mindcontrol.org/~hplus/misc/extrapolated_interpolation.png" width="243" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This image depicts a player path (moving upwards) and what it looks like linearly interpolated from one predicted position to the next. If one were to also extrapolate velocity this could be smoothed further.</td></tr>
</tbody></table>
<br />
<br />
Conversely, when the object stops moving, we will still be simulating it moving beyond its stopping point, and our simulation will be forced to bounce it back to its resting position. Objects will effectively be racing around to be where they really are, always drifting. This can be hidden by disallowing abrupt movement changes using low acceleration and friction values, but that is not always fun because it lowers the pace of the game. Further smoothing of positions and velocities can be applied, but ultimately the smoother the result, the less real-time it will be: you will be trading smoothness for delay.<br />
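The smoothness-for-delay trade-off can be demonstrated with the simplest smoother there is, an exponential filter (a generic illustration, not a scheme the engine necessarily uses):

```python
def smooth(prev_display, target, alpha):
    """Exponential smoothing toward `target` with gain alpha in (0, 1].
    alpha = 1 snaps instantly (no smoothing, no added delay); lower
    alpha gives smoother motion but lags further behind the target."""
    return prev_display + alpha * (target - prev_display)
```

With alpha at 0.5, a displayed position chasing a target covers half the remaining gap each step, so it converges smoothly but is always behind; that residual gap is exactly the extra perceived delay.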
<br /></div>
<div>
Links of interest:</div>
<div>
<div>
<a href="http://www.gamesurge.com/pc/interviews/netcode.shtml" target="_blank">Netcode Interview - with Yahn Bernier</a> (Gamesurge.com, Aug. 2000)</div>
<div>
<a href="https://developer.valvesoftware.com/wiki/Lag_compensation" target="_blank">Lag compensation - Valve Developer Community</a><br />
<br /></div>
</div>
<h3>
Voxel Polygonization and Game Engines (June 18, 2014):</h3>
Hi, and welcome to my blog. I wanted to have a place online for the world to see what I am doing, and something to keep me focused. Right now I am polishing an algorithm I conceived a few years ago for generating a triangular mesh from a voxel volume. From the algorithm's inception I've understood what I wanted it to output, but it wasn't until recently, while starting on my (latest) game engine project, that I finally had the insight to actually write it.<br />
<div>
<br /></div>
<div>
I was trying to nail down what sort of world I wanted my current game project to present to players. In my previous game engine project, <a href="http://www.sourceforge.net/projects/revolude/" target="_blank">Revolude</a>, the terrain was a simple heightmap, and a static ROAM-type algorithm performed a simple polygon reduction before dumping each terrain chunk to a display list of triangle strips and fans.</div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMIMDIL06Gk5-Kzk8OWLV5xTYZxKwFbKYPe6eoAwi8MlQqfVsIEkAOLbW8K-iYDKwU6jrCYFl8TZ5ASDQ5IVHgv0r3bSh-zpGspjUEDh6zv90JqzWyFlSqjjjAUsloUDk6j00Uu40RRtc/s1600/revolude4_08.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMIMDIL06Gk5-Kzk8OWLV5xTYZxKwFbKYPe6eoAwi8MlQqfVsIEkAOLbW8K-iYDKwU6jrCYFl8TZ5ASDQ5IVHgv0r3bSh-zpGspjUEDh6zv90JqzWyFlSqjjjAUsloUDk6j00Uu40RRtc/s1600/revolude4_08.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Revolude, with some pseudo-trees and buildings, and some missiles hitting the ground.</td></tr>
</tbody></table>
<br />
<div>
I was just about to opt for a much more involved version of this same type of terrain when I had a stroke of ingenuity. The algorithm is logically very simple, and produces the equivalent output of what I like to call the dual of a cuberille mesh. In other words, it takes a Minecraft-esque mesh and converts every face into a single point, and each point into a triangle. The dual of a cube is an octahedron, and the algorithm I wanted to create essentially produced an octahedral mesh from a voxel volume.</div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://upload.wikimedia.org/wikipedia/commons/e/e7/Dual_Cube-Octahedron.svg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="320" src="https://upload.wikimedia.org/wikipedia/commons/e/e7/Dual_Cube-Octahedron.svg" width="317" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A cube and its 'dual', an octahedron. Notice how each of the eight cube corners becomes a single triangle, and each of the six cube faces is collapsed into its center point.</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
From an outside standpoint this really seems very simple to do. One approach would be to first generate a simple boxy cube mesh, and then brute-force convert it into its dual, an octahedral mesh. This quickly became an appealing option, but I only approach problems with the intent of devising a novel solution, so that was out of the question.</div>
<div>
<br /></div>
<div>
Ideally, I was hoping to create an algorithm similar in some way to marching cubes, where each cube is inspected along with its neighbors to produce geometry. Something that achieved the desired output through such an immediate method was the goal. This approach, however, became much more involved than I had initially believed it could be.<br />
<br />
The end product works in multiple passes on the volume as a whole. The first pass examines 8 voxels at a time, determining whether vertices should be placed, and where. The second pass connects vertices to form edges; edges can only exist between vertices whose normals have a non-negative dot product, meaning the vertices must either face the same direction or be up to 90 degrees apart. The third pass detects triangles formed by edges, by searching for 'complement' edges at each vertex: two edges connected to the vertex which are themselves joined by a third edge.<br />
<br />
The last pass detects all the left-over quads formed by edges, with a search similar to the triangle-detection pass: instead of looking at the edges connected to a point, we look at the edges connected to an edge, and check whether an opposite edge joins the far ends of its connected edges.<br />
<br />
These operations, naively implemented, can be extremely slow. Fortunately, it is not too expensive to store extra connectivity information that can be used to quickly search edges, vertices, and triangles for related geometry. The end product can then be reduced to simple vertex data and vertex indices for triangles.<br />
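The edge rule from the second pass can be illustrated with a small sketch (hypothetical data layout; the real implementation operates on vertex records derived from the volume, not a hand-built dictionary):

```python
def dot(a, b):
    # Dot product of two 3-component vectors.
    return sum(x * y for x, y in zip(a, b))

def can_connect(n0, n1):
    """Edge rule from the second pass: two vertices may share an edge
    only if their (unit) normals are at most 90 degrees apart, i.e.
    the dot product of the normals is non-negative."""
    return dot(n0, n1) >= 0.0

def build_edges(verts, candidate_pairs):
    # verts: index -> (position, unit normal); candidate_pairs would come
    # from voxel adjacency found in the first pass (hypothetical here).
    edges = []
    for i, j in candidate_pairs:
        if can_connect(verts[i][1], verts[j][1]):
            edges.append((i, j))
    return edges
```

Normals exactly 90 degrees apart (dot product of zero) are still allowed to connect, which is what lets the octahedral faces wrap around right-angle features of the volume.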
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.youtube.com/embed/Q7n810k3O64?feature=player_embedded' frameborder='0'></iframe></div>
<br />
Here is a quick video of the algorithm at work on an evolving volume. Forgive the choppiness of the video; my little netbook isn't very awesome. The slowest part of this demo is generating the volume itself. The next step will be to use a compressed data structure as the data source, as opposed to raw volume data.</div>