Initial commit

kayomn 2023-08-21 23:29:52 +01:00
commit 789500aead
26 changed files with 706 additions and 0 deletions

3
.gitattributes vendored Normal file

@@ -0,0 +1,3 @@
*.png filter=lfs diff=lfs merge=lfs -text
*.ico filter=lfs diff=lfs merge=lfs -text
*.ttf filter=lfs diff=lfs merge=lfs -text

1
.gitignore vendored Normal file

@@ -0,0 +1 @@
public/

17
config.toml Normal file

@@ -0,0 +1,17 @@
# The URL the site will be built for
base_url = "https://kayomn.net"
taxonomies = [
{name = "programming", feed = true},
{name = "technologies", feed = true},
]
# Features
compile_sass = false
build_search_index = false
[markdown]
highlight_code = true
[extra]

3
content/_index.md Normal file

@@ -0,0 +1,3 @@
+++
+++

3
content/blog/_index.md Normal file

@@ -0,0 +1,3 @@
+++
title = "Blog"
+++


@@ -0,0 +1,258 @@
+++
title = "Hacking Generator Functions in C++"
date = 2022-05-22
description = "Taking a look at all the dark corners regarding lambdas in C++."
[taxonomies]
programming = ["c++"]
+++
# Preface
Since their introduction in the C++11 revision of the standard, lambdas have been a massive success story for C++ due to how well they interact with existing, pre-lambda code - both inside and outside of standard library implementations. However, features of their specification make them far more powerful than is initially obvious, as this article explores.
# History
As mentioned in the preface, C++ lambdas integrate well with existing standard and third-party code, to the point where picking them up over the previous "functor `struct`s" approach comes with little to no friction.
```cpp
#include <iostream>
#include <string>

struct MyFunctor {
    std::string name;

    MyFunctor(std::string name) {
        this->name = name;
    }

    void operator()() {
        std::cout << "Hello, " << this->name << "!";
    }
};

void beforeLambdas() {
    MyFunctor myFunctor = MyFunctor("Functors");

    myFunctor();
}
```
Achieving "lambda-like" functionality in a pre-C++11 world was feasible; primitive operator overloading and the object model provided by C++ from the get-go allowed for making structs that somewhat visibly behaved like functions at the call-site, although with a lot of associated boilerplate.
Practically, this was no different than writing a highly-specialized and, arguably redundant single-method class. In many cases, this made type-erasing callback interfaces like C's [`qsort`](https://en.cppreference.com/w/c/algorithm/) preferable as a lower-boilerplate solution.
Irrespective of individual developer sentiment, the C++ standard library adopted functors for much of its [algorithm](https://en.cppreference.com/w/cpp/algorithm) and [container](https://en.cppreference.com/w/cpp/container) libraries, where a type-safe callback for customizable logic was necessary.
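To make the kind of callback the standard library expects concrete, here is a minimal sketch of a comparator functor handed to `std::sort`; `ByLength` and `sortNames` are illustrative names of my own, not anything from the standard.
```cpp
#include <algorithm>
#include <string>
#include <vector>

// A comparator functor of the kind the standard algorithms and ordered
// containers accept.
struct ByLength {
    bool operator()(std::string const & a, std::string const & b) const {
        return a.size() < b.size();
    }
};

void sortNames(std::vector<std::string> & names) {
    // The callback is fully typed and trivially inlinable - no void *
    // indirection as with qsort.
    std::sort(names.begin(), names.end(), ByLength{});
}
```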
```cpp
void afterLambdas() {
    std::string name = "Lambdas";

    auto myLambda = [name]() -> void {
        std::cout << "Hello, " << name << "!";
    };

    myLambda();
}
```
Lambdas bridged the two worlds of ease-of-use and type-safety by introducing new syntax into the language for expressing a standalone function and its data captures as a single lambda expression. Typically, these lambdas are compiled down into anonymous structs, with the data captures as members and the function body as a member function overloading the call operator.
Alongside the introduction of general-purpose type erasure in [`std::function`](https://en.cppreference.com/w/cpp/utility/functional/function), this made lambdas compatible with, and often preferable to, code that was built with functor objects in mind.
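As a rough sketch of that desugaring - the real closure type is unnamed and compiler-specific, so `HelloClosure` below is only an approximation of my own - and of how `std::function` erases the difference between a functor and a lambda:
```cpp
#include <functional>
#include <iostream>
#include <string>

// A rough approximation of the unnamed closure type the compiler generates
// for [name]() -> void { std::cout << "Hello, " << name << "!"; }.
struct HelloClosure {
    std::string name;

    void operator()() const {
        std::cout << "Hello, " << name << "!";
    }
};

void typeErasure() {
    std::string name = "Lambdas";

    // Both the hand-written functor and the real lambda fit behind the same
    // type-erased wrapper, which is what lets lambdas flow through APIs
    // designed around functor objects.
    std::function<void()> a = HelloClosure{name};
    std::function<void()> b = [name]() -> void { std::cout << "Hello, " << name << "!"; };

    a();
    b();
}
```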
# C++ Lambda Characteristics
```cpp
~HashTable() {
    auto destroyChain = [this](Bucket * bucket) {
        while (bucket) {
            Bucket * nextBucket = bucket->next;

            bucket->item.key.~KeyType();
            bucket->item.value.~ValueType();
            this->allocator->Deallocate(bucket);

            bucket = nextBucket;
        }
    };

    destroyChain(this->freeBuckets);

    uint32_t count = this->count;

    for (uint32_t i = 0; count != 0; i += 1) {
        Bucket * bucket = this->buckets.At(i);

        if (bucket) {
            destroyChain(bucket);

            count -= 1;
        }
    }

    this->allocator->Deallocate(this->buckets.pointer);
}
```
Through personal practice, I've found them to be incredibly powerful beyond their role as a substitute for [nested functions](https://dlang.org/spec/function.html#nested) and [local functions](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/classes-and-structs/local-functions) in the D and C# programming languages respectively.
The above code demonstrates their use in that manner as a simple cleanup function, which removes code duplication between the two units of logic calling `destroyChain`. The same result could have been achieved with a secondary method in the `HashTable` class; however, that would require the reader to go elsewhere to find its declaration and would further pollute the class namespace with otherwise unnecessary symbols. With that said, it's important not to lose sight of overhead considerations when using lambdas.
Programmers coming from languages like C# or Java may have a pre-conceived notion that lambdas are inherently more costly than methods; however, this is not necessarily the case. In part, this is because C++ can reason about the size of a lambda at compile-time, as captured values are explicit rather than implicit. In contrast, a Java lambda or C# delegate does not necessarily know the size of its captured state at compile-time and must allocate it dynamically during execution - introducing the potential for more overhead if the allocation cannot be hoisted onto the stack via [escape analysis](https://en.wikipedia.org/wiki/Escape_analysis).
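As a brief aside before returning to the destructor's disassembly below, here is a minimal sketch (the function and variable names are mine) of what it means for a closure's size to be known at compile-time:
```cpp
#include <cstdint>
#include <cstdio>

void closureSizes() {
    int32_t counter = 0;
    int64_t total = 0;

    auto empty = []() { return 0; };
    auto byValue = [counter]() { return counter; };
    auto byReference = [&total]() { return total; };

    // Closure objects are ordinary stack objects whose layout is fixed at
    // compile-time: typically one byte for an empty capture list, the size of
    // the captured value for a copy, and the size of a pointer for a reference.
    std::printf("%zu %zu %zu\n", sizeof(empty), sizeof(byValue), sizeof(byReference));

    // A capture-less lambda is so lightweight that it even converts to a
    // plain function pointer.
    int (*fn)() = empty;
    std::printf("%d\n", fn());
}
```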
```
HashTable::~HashTable() [complete object destructor]:
push r13
push r12
push rbp
push rbx
sub rsp, 8
mov rbp, rdi
mov rbx, QWORD PTR [rdi+32]
test rbx, rbx
je .L8
.L9:
mov rsi, rbx
mov rbx, QWORD PTR [rbx+16]
mov rdi, QWORD PTR [rbp+0]
mov rax, QWORD PTR [rdi]
call [QWORD PTR [rax+8]] ; this->allocator->Deallocate(bucket)
test rbx, rbx
jne .L9
.L8:
mov r13d, DWORD PTR [rbp+8]
test r13d, r13d
je .L10
mov r12d, 0
jmp .L14
.L19:
mov ecx, OFFSET FLAT:.LC0
mov edx, 43
mov esi, OFFSET FLAT:.LC1
mov edi, OFFSET FLAT:.LC2
call __assert_fail
.L20:
sub r13d, 1
.L12:
add r12d, 1
test r13d, r13d
je .L10
.L14:
mov eax, r12d
cmp rax, QWORD PTR [rbp+16]
jnb .L19
mov rdx, QWORD PTR [rbp+24]
mov rbx, QWORD PTR [rdx+rax*8]
test rbx, rbx
je .L12
.L13:
mov rsi, rbx
mov rbx, QWORD PTR [rbx+16]
mov rdi, QWORD PTR [rbp+0]
mov rax, QWORD PTR [rdi]
call [QWORD PTR [rax+8]] ; this->allocator->Deallocate(bucket)
test rbx, rbx
jne .L13
jmp .L20
.L10:
mov rdi, QWORD PTR [rbp+0]
mov rsi, QWORD PTR [rbp+24]
mov rax, QWORD PTR [rdi]
call [QWORD PTR [rax+8]] ; this->allocator->Deallocate(bucket)
add rsp, 8
pop rbx
pop rbp
pop r12
pop r13
ret
```
The lack of dynamic allocation overhead for lambdas in C++ can be observed directly in the disassembly of the destructor shown earlier, wherein there are no calls to `malloc`, C++'s `new` implementation, or any other allocation routine. In fact, closer inspection of the above Clang 11 x86-64 disassembly reveals that the `destroyChain` lambda has been completely elided, with its operations inlined at each of its two call-sites.
This leaves the C runtime's `__assert_fail` and the user-defined `Allocator::Deallocate(void *)` functions as the only remaining `call` instructions in the generated output. The latter is a virtual function and is therefore significantly harder to statically inline than a statically allocated lambda functor.
However, the specification and internals of lambdas in C++ aren't really the concern of this article. Rather, I'm more interested in looking at how lambdas can be broken to support *functionality* they were never intended for.
# Generative Functions
For those with no experience of them from other languages, or no formal background in computer science, generative functions - also referred to as *"generator"* or *"generating"* functions - are a sub-category of functions that appear in both computing and mathematics. However, much like the term "function" itself, generators are defined differently between the two domains.
Because functions in lambda calculus are defined as always producing the same output for any given input, most applications of generator functions in computing do not match that definition. Iterators - one of the most popular applications of generative logic across computer science domains and languages - are a perfect example of this, as they are designed to move their way through a set of data, producing whatever value sits at the current point of iteration.
```js
function* factorial(n) {
    let total = 1

    while (n > 1) {
        total *= n
        n -= 1

        yield total
    }
}

for (let value of factorial(10)) {
    console.log(value)
}
```
Even the above example of factorial value generation is flawed by the standards of lambda calculus, as it produces a new result on each successive invocation. However, this article is looking at C++, not mathematics, so I digress.
```cpp
int main(int argc, char ** argv) {
    auto doThing = [i = (size_t)0]() -> void {
        i += 1; // error: 'i' is read-only inside a non-mutable lambda
    };
}
```
If an attempt were made to compile the above source code, a compiler error would be raised telling the programmer that `i` cannot be assigned to because it is read-only. Indeed, this is true for any capture, as the call operator of the anonymous `struct` generated by the compiler is always `const`-qualified - *seemingly with no exceptions*.
```cpp
int main(int argc, char ** argv) {
    auto doThing = [i = (size_t)0]() mutable -> void {
        i += 1;
    };
}
```
Or rather, this is the case until the lambda declaration is annotated with the `mutable` keyword, after which the source code compiles and the desired behavior is produced.
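With `mutable` in place, the smallest useful "generator" is just a lambda that owns its own state. Below is a minimal sketch, using a hypothetical `nextId` counter of my own rather than anything from the article:
```cpp
#include <cstddef>
#include <cstdio>

int main(int argc, char ** argv) {
    // The init-capture is the generator's persistent state; `mutable` lets the
    // call operator modify it, so each invocation resumes where the last one
    // left off.
    auto nextId = [i = (size_t)0]() mutable -> size_t {
        i += 1;

        return i;
    };

    std::printf("%zu\n", nextId()); // 1
    std::printf("%zu\n", nextId()); // 2
    std::printf("%zu\n", nextId()); // 3
}
```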
# Mutable C++ Functors
It's hard to determine what the committee's overall rationale was for making lambdas `const` by default; however, I can speculate based on my own experience as a programmer in general, and with C++ specifically.
## Capture Syntax Ambiguity
The difference between capturing a variable by value versus by reference is one character: an ampersand. Considering this, the general formatting rules people apply when writing lambdas, and the speed at which your average professional scan-reads source code, it is believable that the committee considered potential typos to be a massive source of human error.
```cpp
int32_t i = 0;

auto modify = [i]() {
    i += 1;
};
```
Were `i` to be `mutable` by default in the above example, any invocation of `modify` would write to the captured copy of `i` rather than the outer variable that the capture shadows. This would result in `i += 1` having no observable side-effects in the compiled program and a hidden bug that is hard to spot.
## Parallelism Concerns with Shared Functors
The previous concern is complicated further if lambda instances are shared between parallel units of computation, like threads, as the captured copy could be modified by several of them simultaneously. This would result in unforeseen race conditions in a lambda that could otherwise have been considered [mathematically pure](https://en.wikipedia.org/wiki/Pure_function), in the sense that its inputs were immutable.
## Functional Influences
A catch-all for both of the above, without fully understanding the domain of concerns: the committee may have settled on making captures `const` by default simply because "functional languages like `const` a lot". While this is a somewhat naive and very cynical view of the thought-process of hundreds of intelligent individuals, it is common for the sum of a group to be a lot dumber than the people that compose it. Furthermore, in a language with as many moving parts as C++, oversights have been and continue to be frequent. Therefore, cordoning off an entire area of potential problems would serve as an effective stop-gap until more experimentation with lambdas could be undertaken.
# Concluding
Herb Sutter looked at these three concerns and more in a [whitepaper produced as part of the evolution working group for C++ back in 2012](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2012/n3424.pdf), arguing that - while immutable lambda inputs make sense in a lot of cases - there are other unintended side-effects of these decisions that could not have been foreseen beforehand. The paper is very much worth a read for a more in-depth look at the hidden rough edges around lambdas.
```cpp
std::function<Token()> tokenize(std::string const & str) {
    return [str, cursor = (size_t)0]() mutable -> Token {
        while (cursor < str.size()) {
            // ...
        }

        return Token{};
    };
}
```
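To close the loop, here is a sketch of how such a tokenizer might be driven at the call site. Since `Token`'s definition is elided above, the `isEnd()` member and `text` field used below are purely hypothetical placeholders:
```cpp
#include <iostream>
#include <string>

void printTokens(std::string const & source) {
    auto next = tokenize(source);

    // Each call advances the cursor captured inside the closure, so the whole
    // tokenizer state lives in `next`, and several independent tokenizers can
    // run over different strings at the same time.
    for (Token token = next(); !token.isEnd(); token = next()) { // isEnd() is hypothetical,
        std::cout << token.text << '\n';                         // as is the text member
    }
}
```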
Whatever the case may be, for the time being this is a solution that works, and continues to work well, in a language as close to the metal as C++. I have successfully used it myself in many projects that required things like simple tokenization and more. I find that, for cases where I would otherwise reach for a single-use class, it can be preferable as it avoids creating more nominal types.

147
content/blog/serious-godot.md Executable file

@@ -0,0 +1,147 @@
+++
title = "Serious Godot"
date = 2022-07-08
description = "Using the Godot game engine for a serious project."
[taxonomies]
programming = ["gdscript"]
technologies = ["godot"]
+++
Recently, my days at work have been focused on leading production of research-backed software engineering projects for external clients.
One such project, a prototype virtual reality (VR) therapy simulation for the Meta (formerly Oculus) Quest 2, has started to wrap up as we move into the final patching phase, and I felt it was worth writing about since we used the Godot game engine to power it.
While the project has not been entirely smooth sailing, we identified a lot of new workflows unique to Godot and found it to be an overall pleasant experience compared to alternative tools targeting the Oculus Quest 2.
# The State of Unity in 2022
Initially, we intended to use the Unity game engine for production of the project, as anecdotal evidence from various VR communities suggested it had the most momentum compared to alternatives like Unreal Engine. Combined with the XR Interaction Toolkit, a virtual reality software development kit (SDK) that I had previously used and judged to be leaps ahead of others, Unity was very quickly pinned as the toolkit of choice.
Unfortunately, development with Unity neither started out smoothly nor improved, as within the first two weeks we identified numerous roadblocks to getting work done.
* Conflicts between the Android Debug Bridge provided by the Android Software Development Kit and Unity itself.
* Failure to connect to the Meta Quest 2 over the debug bridge during debugging via its SDK alone.
* Intermittent failure to deploy to the device or for Unity to even recognize that it is connected via its SDK alone.
Beyond these practical issues, we also faced problems with dependency hell. Since I last used the XR Interaction Toolkit, Unity has migrated everything to its new Input Management System - [a system which has been panned in user communities](https://www.reddit.com/r/gamedev/comments/het6br/unity_has_become_an_absolute_nightmare/) due to the immense amount of boilerplate it requires for the sake of generality across redundant input devices. Irrespective of where you stand on generalizing specific problems, the reality is that our set of target platforms had exactly one member: the Meta Quest 2.
Further dependency hell issues cropped up due to the Meta Quest 2 going through somewhat of an awkward phase as it migrated from the Meta VrApi to the OpenXR standard. By the time we started production, Meta was actively discouraging use of its proprietary API, as [it is deprecated and will be unsupported come August 31st 2022](https://developer.oculus.com/blog/oculus-all-in-on-openxr-deprecates-proprietary-apis/). That being said, switching the backend to OpenXR was not a silver bullet, as there were Meta-specific features we needed which, while supported, were not enabled in the OpenXR library build that Unity deploys to the device.
# Looking Elsewhere
While I had no doubt that we could brute-force our way through the Unity workflow issues, I did not believe that the fundamental hardware-to-software reliability problems we were experiencing would go away without significant work between ourselves and our IT support team. Consequently, this opened up an important question.
> Would the time cost of sticking with Unity be lower than the cost of migrating to an entirely different engine?
This was the inspiration I needed to start exploring alternatives to Unity before the project gained too much momentum and we had completely left the planning phase behind. Unfortunately, many of the popular alternatives were quickly ruled out due to our inexperience with their virtual reality offerings and their smaller active VR communities where advice could be found - putting us back where we started with Unity.
Feeling burnt out by the state of the project, I spent the weekend looking for an escape in personal projects I had been working on in the Godot game engine - an engine that I am personally very fond of but had never used on a serious project. After realizing that the engine had an `ARVRCamera` (now `XRCamera` in Godot 4) class, I started looking into what Godot could do with cross-reality.
To my surprise, it already had far smoother integration of the OpenXR and VrApi backends. Furthermore, the open-source nature of both the engine and its OpenXR plugin asset meant that enabling the necessary Meta OpenXR extensions was a case of modifying the available source code to load them and recompiling the shared object for Android. This was enough to pique my interest in using Godot on a serious project for the first time.
The final consideration made was visual quality. There is no question that - between Godot 3, Unity 2021 LTS, and Unreal Engine 4.27 - Godot has some of the worst quality visuals. That being said, the application being developed for an Android device meant that we were already constrained by how far we could take the quality of visuals regardless. For reference, the Meta Quest supports GLES 3.1 and an implementation of Vulkan that still has performance issues [based on the Unity Vulkan known issues page](https://developer.oculus.com/documentation/unity/unity-vulkan/).
# Migrating to Godot
Migration was neither difficult nor time-consuming, as there were only a handful of finished assets and very little code had been written by this point. The nature of the Unity XR Interaction Toolkit is that most of the setup work happens in-editor by creating game objects, attaching components, and linking their events together.
Comparatively, the Godot OpenXR plugin does not come with this level of pre-setup; there are no provided VR ray or area interaction components. Instead, it provides primitives like `ARVRController` (now `XRController` in Godot 4) and expects the programmer to implement their own logic for handling spatial interactions. Personally, I prefer this approach as it gave us far more granular control over how the interactions worked and under what conditions they were triggered. However, it is clear to me that this can make setup tedious for hobbyists that want to get something simple going quickly.
After migrating, the next step was re-evaluating all of our best practices. I mentioned that I have used Godot before in hobby projects, but never for something at the scale of this project. Obviously, the same workflows I had employed in Unity would not work here, as the two engines' architectures are intrinsically different.
## Asset Workflow
For example, Godot 3 uses a single thread for importing assets, which locks the entire editor user interface while it processes the resources [^1]. While this is not so much of an issue for individual files, it becomes a major source of pain when importing eight 4K atlas textures - something we did for every environment in the project.
The import time was only compounded by our workstations not supporting roaming profiles. This meant that people who worked under a hot-desk arrangement had to wait upwards of 15 minutes at the start of every day while a newly cloned repository generated an import cache for the editor.
That being said, import speeds were not the biggest issue we had - that distinction goes to working with Autodesk FBX files. Being an MIT-licensed project, Godot is constrained by the kinds of third-party tools it can depend on. One such utility is the Autodesk FBX SDK - a closed-source, and arguably the only reliable, FBX importer and exporter implementation.
Godot 3 currently uses a reverse-engineered FBX importer that _usually_ behaves appropriately. Even so, erroneous import data in things like rigged models happened often enough to be a significant impediment to work.
Fortunately, I already had experience dealing with issues like this in both Godot _and_ Unity and knew a few workarounds. Our first attempt at solving the problem was to [disable segment scale compensate](https://knowledge.autodesk.com/search-result/caas/simplecontent/content/turning-segment-scale-compensate-maya-how-to-make-maya-rigs-play-nice-unity.html); however, this only resolved the rigging errors in models.
Eventually, we decided to add an additional step to the import workflow to convert all of our FBX files to GLB via the (now-abandonware) [`FBX2GLTF`](https://github.com/facebookincubator/FBX2glTF) converter created under Meta Incubator (formerly Facebook Incubator) [^2]. This side-stepped our issues to the extent that we decided to blanket-ban FBX files from the project codebase in favor of converting everything to GLB scenes or OBJ meshes.
There are Maya plugins that allow direct export to GLTF/GLB; however, we did not consider the time investment of deploying one to every art machine and teaching our artists an entirely new export workflow to be worthwhile. Overall, the art workflow was the hardest thing to get right, but once we did, it felt very efficient compared to where we had started.
## Project File Structure
Conversely, I felt project structure was the easiest thing to get right out of the gate. With any project based in a game engine, I have typically followed the approach of organizing by file purpose rather than file type; however, defining what "purpose" means can sometimes be difficult.
* characters
  * actors
  * player
* props
  * prop1
    * model.glb
    * material.res
    * albedo_map.png
    * normal_map.png
* environments
  * environment1
    * model.glb
    * material.res
    * albedo_map.png
    * normal_map.png
    * roughness_map.png
In the solution we had, each folder at the root of the project was its own game "system" or feature; actors served as the folder containing NPC data and assets, props contained items that could be picked up by the player character, and environments held the 3D environment assets.
There were more systems than this, but this example communicates the ideas without getting unnecessarily specific. Common shaders, sound effects, and other miscellaneous, shared assets either went into the root directory or were filtered into a `base` folder to be shared between scenes that inherited from a common "base" scene.
## The Scene Tree
On the note of the scene hierarchy and Godot scenes in general, they were consistently a big win for the project. Godot uses a scene graph system closer to an XML-style document object model, or the 3D scene hierarchies supported by tools like Blender or Autodesk Maya. Scenes are composed from hierarchies of single-purpose nodes (i.e. cameras, rigid bodies, meshes, etc.) rather than the more complicated game object / actor models of Unity, Unreal Engine, and Open 3D Engine, which are assembled from individual components.
While this streamlines modelling simple systems, it can quickly become nightmarish to establish complex cross-talk between nodes through scripts alone - as you would either need to hardcode paths between nodes or expose path properties in the editor to be manually set.
To prevent this from becoming too much of a pain-point, Godot provides a highly dynamic, event-based callback system on every object type called "signals" - similar to event dispatchers in Unreal Engine. Through signals, an object can easily send events that trigger logic in others, which encourages code reuse.
Finally, just like individual nodes, scenes can be inherited and composed within each other. This is by no means a revolutionary feature - Unity has an equivalent in prefabs - but the editor ergonomics Godot provides make it trivial to instantiate a scene and then tweak its properties for further individual control.
This feature was used extensively for assembling our "Actor" non-player characters in the project. Each actor derives from the base actor scene, which contains the `actor.gd` base functionality script and provides all of the common nodes for voice audio playback, visual feedback cues, and a speech bubble for displaying dialogue text.
## Programming Interface
Sharing the `actor.gd` script between many derived scenes quickly turned it into a monolithic class containing functionality that did not need to be shared between all actors. This is a common issue in object-oriented programming, which encourages composition through inheritance hierarchies.
However, the actor hierarchy was only one level deep, as we never had an actor inheriting from anything but the base actor. This helped us approach the problem as one of interface generality more than anything else, and we quickly realized the solution was to lift much of this implementation out of the base actor class and instead extend it through built-in scripts within the individual deriving actor scenes.
Godot has the notion of "built-in scripts" - scripts that are embedded inside another resource rather than kept externally in the file system. While this has obvious drawbacks for diffing changes to gameplay system logic, for simple single-use scripts like gameplay-level logic we found it to be great for compartmentalizing the less important systems from the more important ones.
Actors were one example where we used built-in scripts to inherit from a custom node type that we created; however, we also used built-in scripts for scene-specific event sequences that embedded implementation details like dialogue. At this level of coupling, we thought it reasonable to make use of the engine-specific ergonomics offered by GDScript.
```gdscript
var actor := $MyActor as Actor
yield(actor.say(HELLO_WORLD_DIALOGUE), "dismissed")
actor.hide()
```
The ability to reference other nodes in a script from within the scene it is instantiated in was immensely useful in avoiding the inspector soup problem where many references to other game objects and components must each be manually assigned one-by-one to various serialized properties.
It also bears mentioning that Godot has a separately released version of the engine that currently ships with C# support through Mono. While it serves the purpose of providing a C# programming interface, its integration with the engine is nowhere near as clean as GDScript's.
For this project, we exclusively kept to the "vanilla" variant of Godot and used a mostly GDScript-based approach. Certainly, shipping everything in C# would likely have yielded better general performance, but C# in Godot also suffers from a significantly higher memory footprint, because it ships with its own standard library and every Godot-native object reference is wrapped in its own C#-allocated class.
## Autoloads and Global State
The final feature worth mentioning from our adaptation to Godot is how the engine handles global state. A script or scene instance that needs to persist across many root scenes may be defined as an "Autoload" - something loaded and instantiated by the engine automatically when the game launches.
We used Autoloads sparingly, handling two scenarios with them: player progress and game resource management. The former came out of the need to transfer player state between scenes easily without making the player scene persistent, while the resource manager was a pure necessity, as we needed a way to easily and safely load scenes in the background asynchronously while working around the single-threaded, synchronous asset loading pipeline used by Godot 3.
# Retrospecting
The effort to identify scalable practices took a lot of trial and error in the first week, and continued in a more minor capacity throughout the rest of the project through incremental changes that fixed mistakes made early on. However, I believe the experience provided important lessons and has better prepared us for using Godot again, which we are currently reviewing for our next project.
My main takeaway from this project was that, if the gripes with Godot remain as minor as they are now, it has a bright future in further releases of the engine. The technical execution of the project went well and was completed to specification; beyond that, we have also inspired interest in other internal groups currently evaluating their options for Oculus Quest development.
[^1]: As of writing, Godot 4 alpha uses a multi-threaded import design that significantly decreases import times, as files can be processed in parallel - however, the user interface is still locked while imports are running.
[^2]: As of writing, Godot 4 alpha no longer ships a built-in FBX importer, instead requiring the user to provide a path to an external FBX2glTF installation.

92
static/base.css Normal file

@@ -0,0 +1,92 @@
@font-face {
    font-family: "Lato Regular";
    src: url("/lato/regular.ttf");
}

@font-face {
    font-family: "Lato Light";
    src: url("/lato/light.ttf");
}

:root {
    font-family: "Lato Light", sans-serif;
    font-size: 1.1em;
}

a {
    text-decoration: none;
    font-family: "Lato Regular", sans-serif;
    color: #ac5d48;
}

a:hover {
    color: #a5391c;
    padding-bottom: 0;
    text-decoration: underline;
    transition: color 0.15s linear;
}

body {
    color: #111;
    background: white;
    margin: 0;
    overflow-y: scroll;
}

body header {
    display: flex;
    background: #EEE;
    padding: 5px 30px;
}

body footer {
    margin: 10px 0;
}

body footer {
    text-align: center;
    font-size: 75%;
}

body main {
    margin: 15px auto;
    max-width: 70%;
}

body pre {
    overflow-x: scroll;
    padding: 15px;
    border-radius: 4px;
}

body .banner {
    background-color: #ac5d48;
    border-style: solid;
    border-width: 0 0 0 25px;
    border-color: #a5391c;
    padding: 5px;
    border-radius: 0 10px 10px 0;
    font-size: 1.5em;
    font-family: "Lato Regular";
}

body .banner > *:first-child {
    font-size: 1.5em;
}

@media (prefers-color-scheme: dark) {
    body {
        color: white;
        background: #111;
    }

    body header {
        background: #0C0C0C;
    }
}

@media (orientation: portrait), (max-width: 900px) {
    body header {
        justify-content: center;
    }
}

BIN
static/favicon.ico (Stored with Git LFS) Normal file

Binary file not shown.

93
static/lato/OFL.txt Normal file

@@ -0,0 +1,93 @@
Copyright (c) 2010-2014 by tyPoland Lukasz Dziedzic (team@latofonts.com) with Reserved Font Name "Lato"
This Font Software is licensed under the SIL Open Font License, Version 1.1.
This license is copied below, and is also available with a FAQ at:
http://scripts.sil.org/OFL
-----------------------------------------------------------
SIL OPEN FONT LICENSE Version 1.1 - 26 February 2007
-----------------------------------------------------------
PREAMBLE
The goals of the Open Font License (OFL) are to stimulate worldwide
development of collaborative font projects, to support the font creation
efforts of academic and linguistic communities, and to provide a free and
open framework in which fonts may be shared and improved in partnership
with others.
The OFL allows the licensed fonts to be used, studied, modified and
redistributed freely as long as they are not sold by themselves. The
fonts, including any derivative works, can be bundled, embedded,
redistributed and/or sold with any software provided that any reserved
names are not used by derivative works. The fonts and derivatives,
however, cannot be released under any other type of license. The
requirement for fonts to remain under this license does not apply
to any document created using the fonts or their derivatives.
DEFINITIONS
"Font Software" refers to the set of files released by the Copyright
Holder(s) under this license and clearly marked as such. This may
include source files, build scripts and documentation.
"Reserved Font Name" refers to any names specified as such after the
copyright statement(s).
"Original Version" refers to the collection of Font Software components as
distributed by the Copyright Holder(s).
"Modified Version" refers to any derivative made by adding to, deleting,
or substituting -- in part or in whole -- any of the components of the
Original Version, by changing formats or by porting the Font Software to a
new environment.
"Author" refers to any designer, engineer, programmer, technical
writer or other person who contributed to the Font Software.
PERMISSION & CONDITIONS
Permission is hereby granted, free of charge, to any person obtaining
a copy of the Font Software, to use, study, copy, merge, embed, modify,
redistribute, and sell modified and unmodified copies of the Font
Software, subject to the following conditions:
1) Neither the Font Software nor any of its individual components,
in Original or Modified Versions, may be sold by itself.
2) Original or Modified Versions of the Font Software may be bundled,
redistributed and/or sold with any software, provided that each copy
contains the above copyright notice and this license. These can be
included either as stand-alone text files, human-readable headers or
in the appropriate machine-readable metadata fields within text or
binary files as long as those fields can be easily viewed by the user.
3) No Modified Version of the Font Software may use the Reserved Font
Name(s) unless explicit written permission is granted by the corresponding
Copyright Holder. This restriction only applies to the primary font name as
presented to the users.
4) The name(s) of the Copyright Holder(s) or the Author(s) of the Font
Software shall not be used to promote, endorse or advertise any
Modified Version, except to acknowledge the contribution(s) of the
Copyright Holder(s) and the Author(s) or with their explicit written
permission.
5) The Font Software, modified or unmodified, in part or in whole,
must be distributed entirely under this license, and must not be
distributed under any other license. The requirement for fonts to
remain under this license does not apply to any document created
using the Font Software.
TERMINATION
This license becomes null and void if any of the above conditions are
not met.
DISCLAIMER
THE FONT SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT
OF COPYRIGHT, PATENT, TRADEMARK, OR OTHER RIGHT. IN NO EVENT SHALL THE
COPYRIGHT HOLDER BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
INCLUDING ANY GENERAL, SPECIAL, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL
DAMAGES, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF THE USE OR INABILITY TO USE THE FONT SOFTWARE OR FROM
OTHER DEALINGS IN THE FONT SOFTWARE.

BIN
static/lato/light.ttf (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/lato/regular.ttf (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/lato/thin.ttf (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/logo.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/afterglow-menu.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/afterglow.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/my-energy-game-activity.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/my-energy-game-intro.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/protectorate-exterior-editor.png (Stored with Git LFS) Normal file

Binary file not shown.

BIN
static/showreel/protectorate-interior-editor.png (Stored with Git LFS) Normal file

Binary file not shown.

22
templates/base.html Normal file

@@ -0,0 +1,22 @@
<!doctype html>

<html lang="en">
    <head>
        <meta charset="utf-8" />
        <link rel="stylesheet" type="text/css" href="/base.css" />
        <meta name="viewport" content="width=device-width,initial-scale=1" />

        <title>[ Kayomn ]</title>
    </head>

    <body>
        <header><a href="/"><img src="/logo.png" /></a></header>

        <main>
            {% block content %}{% endblock content %}
        </main>

        <footer>
            <span>Website designed and built by myself using the <a href="https://www.getzola.org/">Zola</a> site generator.</span>
        </footer>
    </body>
</html>

5
templates/index.html Normal file

@@ -0,0 +1,5 @@
{% extends "base.html" %}
{% block content %}
hello world
{% endblock content %}

10
templates/page.html Normal file

@@ -0,0 +1,10 @@
{% extends "base.html" %}

{% block content %}
<div class="banner">
    <div>{{ page.title }}</div>
    <div>{{ page.description }}</div>
</div>

<article>{{ page.content | safe }}</article>
{% endblock content %}

14
templates/section.html Normal file

@@ -0,0 +1,14 @@
{% extends "base.html" %}

{% block content %}
{% for page in section.pages %}
<a href="{{ page.permalink }}">
    <div>
        <h2>{{ page.title }}</h2>
        <h3>{{ page.description }}</h3>
    </div>
</a>
{% endfor %}
{% endblock content %}


@@ -0,0 +1,5 @@
{% extends "base.html" %}
{% block content %}
hello world
{% endblock content %}
