This software is the result of a series of evolutions dating back to 2009, when I first released the EWMH (Eve Windowed Mode Helper) for EVE Online, and it still shares some of the concepts implemented there. For example, the windowed mode relocation option in the main tab is a modified version of the one the EWMH had.
Although I started building everything using DirectX 9, it soon became clear that I should not use any parts that were incompatible with the way things are done in the newer DirectX versions, so the engine was built with DX11 in mind: shaders for everything.
In January 2015 I had no idea how to make most of this, nor the will to do so.
Then an itch developed that made me play TESIV: Oblivion again, and playing means modding, so I went to the Nexus, where I found a mod that added shaders to the game: ambient occlusion and everything.
It wasn't too bad, but it had a huge problem with foggy scenes, so I took the shader code and fixed it in about five minutes, then gave the modified part to the original modder, and he added me to his list of perpetrators. Nothing too hard in there, but it was my first contact with HLSL (High Level Shading Language).
But the damage was already done.
I was hooked again. The language is so much like C that I could read and understand it easily, so I started investigating shaders, pixel shaders, vertex shaders, how they work and why they work the way they do, and so it all began.
Soon I made my first attempt to integrate ambient occlusion, a depth-based AO, using ArKano22's GLSL implementation. I toiled with it for about half a year and tried many things; I even built a program using the Sibenik model to test everything without having to rely on debugging an injected game. Believe me when I say that I tried everything, even martinsh's variation with the golden-ratio Fibonacci spiral sampling distribution.
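For the curious, the core idea of a depth-only AO can be sketched roughly like this. This is a toy numpy version with made-up radius, sample count, and falloff parameters, not ArKano22's actual GLSL; it just shows the principle of darkening a pixel when nearby pixels sit closer to the camera:

```python
import numpy as np

def depth_only_ao(depth, radius=4, samples=8, strength=1.0):
    """Toy depth-buffer-only AO: a pixel is darkened when pixels in a
    small disk around it are closer to the camera than it is.
    All parameters here are illustrative placeholders."""
    ao = np.zeros_like(depth)
    # fixed sampling disk: a ring of offsets around each pixel
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    offsets = [(int(round(radius * np.sin(a))),
                int(round(radius * np.cos(a)))) for a in angles]
    for dy, dx in offsets:
        # neighbour depth at (y - dy, x - dx), wrapping at the borders
        shifted = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        diff = depth - shifted                    # positive: neighbour is closer
        occ = np.clip(diff, 0.0, None) / (1.0 + diff * diff)  # crude falloff
        ao += occ
    return np.clip(1.0 - strength * ao / samples, 0.0, 1.0)
```

Running it on a depth map with a step shows the characteristic dark halo on the far side of the edge, and also why flat, distant geometry ends up as those uniform gray washes.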
It wasn't good enough. This approach had no way of compensating for the curvature of the camera's projection, and it had the 'defect' of producing huge gray areas that only served to change the overall illumination of any particular scene. Then there were the huge triangles where mesh details on the ground would become very visible, especially in the snowy terrain near the Sea of Ghosts in Skyrim; the high-contrast triangles on the ground were just unacceptable.
So the next phase involved looking at how this particular problem was solved in other implementations, and the only real solution I could find was to take the scene's mesh normal vectors into consideration. That alone took me a month to get right. I'm not going to delve into the details, but as you can see in the picture, in the end it was a complete success.
All that time making tests, learning, investigating, and searching for information gave birth to what I call SRBAO, a new ambient occlusion pixel shader, one that only needs a depth buffer to work: the normal vectors are calculated in real time, which saves a whole lot of GPU bandwidth, especially at high resolutions.
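The trick of recovering normals from nothing but depth can be illustrated in a few lines. This is only a sketch of the general technique, not SRBAO's actual code: take the depth gradient along x and y, treat the two gradients as tangent vectors, and cross them. A real pixel shader would work on view-space positions with ddx/ddy instead of numpy gradients:

```python
import numpy as np

def normals_from_depth(depth):
    """Reconstruct per-pixel surface normals from a depth buffer alone.
    On a unit-spaced pixel grid the tangents are tx = (1, 0, dz/dx) and
    ty = (0, 1, dz/dy), so tx x ty = (-dz/dx, -dz/dy, 1), normalized."""
    dzdy, dzdx = np.gradient(depth)   # np.gradient returns axis-0 then axis-1
    n = np.stack([-dzdx, -dzdy, np.ones_like(depth)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return n
```

A flat depth map yields normals pointing straight at the camera, and a depth ramp tilts them the way a sloped surface should, which is exactly the information the depth-only AO above was missing.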
While integrating a global illumination shader into the injector, I grew tired of the excessive mathematical approximations, the virtual constructs that define shaders like HBAO or SAO, for example. So I made my own.
Some shaders try to make the occlusion solution scale with distance, as if doing so would help convey more depth to the scene. In my opinion that task belongs to the camera projection, and trying to do it in the occlusion shader just adds another layer of complexity, one with a set of problems of its own.
And this is only the part that concerns the simulation of ambient occlusion itself, as the pixel's luminance, its surrounding pixels, and the way each game processes fog are also taken into account.
All of this combined affects the final solution so much that it can make it disappear completely, which makes too many compromises about detection and sampling quality moot, or irrelevant. So processing speed becomes the primary focus of the whole implementation; this is not something you would want to spend half of your GPU's processing power on.
So, as you can see, after becoming able to make all of this, adding a refactored version of SMAA, far depth of field, or color correction became way too easy. Those parts are there; the color correction by itself can have a much bigger impact on a game's look and feel than the ambient occlusion, ingenious as the latter might be.
That too is a personal take on the matter, one that I feel proud of, and one that took many iterations to get right in my eyes.
You might have noticed how most of this story so far revolves around the ambient occlusion shader. That is because it was a novelty to me, something that caught my eye from the very first time I saw it and fostered my will to code things I had never done before: to learn how to make programs using DirectX, how to tame the raw processing power of new GPUs.
And so I didn't stop there. I wanted more, and building a 3D engine from nothing provided quite the challenge. Granted, the engine that runs inside the counter is not a full-blown engine, but it is a good approximation of a complete parametric 3D windowing system, not unlike how Windows itself is built; even the screen where the overlays appear is a special window in itself, very much like the Windows desktop.
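The "desktop is just another window" idea boils down to a tree of windows where each one stores its position relative to its parent. A minimal sketch, with illustrative names rather than the counter's actual classes:

```python
class Window:
    """A node in a windowing tree: position is stored relative to the
    parent, and the root window plays the role of the desktop that
    overlays are drawn onto."""
    def __init__(self, x, y, w, h, parent=None):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def absolute_pos(self):
        # walk up to the root, accumulating each parent's offset
        if self.parent is None:
            return (self.x, self.y)
        px, py = self.parent.absolute_pos()
        return (px + self.x, py + self.y)

desktop = Window(0, 0, 1920, 1080)          # the special root window
panel = Window(100, 50, 400, 300, desktop)  # an overlay panel on the desktop
label = Window(10, 10, 80, 20, panel)       # a widget inside the panel
```

Because every position is parametric relative to the parent, moving the panel moves the label for free, which is what makes this kind of system pleasant to build overlays on.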
I even made a font texture baker based on FreeType, with autokerning, outlines, and shadows, a piece of software that took a tremendous amount of work to make it do what I wanted; it even writes all the C++ class code for the engine. Maybe at some point I'll release it, once it actually has an interface of its own. I just couldn't make do with any of the solutions floating around the internet, so I built my own.
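To give an idea of what a baker like that has to do, here is a sketch of just the atlas-packing step, using a simple shelf packer. The glyph sizes are placeholder numbers, and the FreeType side of the job (rasterizing glyphs, querying kerning pairs) is deliberately left out:

```python
def pack_glyphs(sizes, atlas_w):
    """Shelf-pack (w, h) glyph rectangles into rows of an atlas of
    width atlas_w. Returns the (x, y) position of each glyph and the
    total atlas height used. Packing only; rasterization and kerning
    (FreeType's job in a real baker) are omitted."""
    x = y = shelf_h = 0
    positions = []
    for w, h in sizes:
        if x + w > atlas_w:          # row is full: start a new shelf below
            x, y = 0, y + shelf_h
            shelf_h = 0
        positions.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)    # the tallest glyph sets the row height
    return positions, y + shelf_h
```

A real baker would also sort glyphs by height before packing and emit the UV coordinates alongside the metrics, but the shelf idea is the heart of it.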
There was only one thing I really missed, something I wanted from the very beginning but was unable to integrate before: realtime video and audio recording. I made several attempts now and then during those months of development, always trying to stay away from stuff like FFmpeg or AVC and AAC, and at the very last moment I made it: after finally understanding how things worked, I succeeded at natively integrating libvpx and Opus.
And then there's the proverbial elephant in the room: all this talk about the engine, its effects, and everything else ignores the fact that we were always talking about a piece of software that is, in the end, an injector, a program that has to be embedded inside another program, the game.
This is the hardest part: the integration of the counter's characteristics into a game's graphics pipeline.
Chances are good that most of the counter will work inside a non-recognized game; FPS counting, antialiasing, color correction, and video recording are always available. The depth-based shaders will not be: they are disabled by default because they need to be adjusted differently for each game.
Of course, none of this would have been possible without Microsoft's help, as everything has been built using Visual Studio Community 2013.
I've used Visual Studio since the old days of VC++ 6.0 (yes, that's 1998); in fact, I never upgraded until 2008. The newer versions are even better; the debugger integration never had any equal, and it only got better with time, like a good wine. If you are looking to learn how to make your own C/C++ programs, it is the very best, without contest. Just remember that the language in itself is only a tiny part of the whole programming equation.
I hope you enjoyed reading this. I think it's a nice story that maybe tells more about me than about the counter itself, but I just wanted to let all the people using it know how it all happened.