Compare commits
9 Commits
drone-ci
...
stable-dif
Author | SHA1 | Date |
---|---|---|
ktyl | 5f8a0cefe5 | |
ktyl | 89f56103f3 | |
ktyl | b7c2193ba5 | |
ktyl | 9ca4dd6b62 | |
ktyl | d42897624f | |
ktyl | c306093453 | |
ktyl | 40fbdd93c0 | |
ktyl | ef586f654d | |
ktyl | f31ebc88f4 |

@@ -0,0 +1 @@
*.png filter=lfs diff=lfs merge=lfs -text

@@ -0,0 +1,121 @@
# Drone CI

When it comes to automation, [GitLab CI](https://gitlab.com) has been my go-to for running builds, tests and deployments of projects from static websites to 3D open-world games.
This has generally been on a self-hosted installation, and often makes use of physical runners.
However, I have some gripes: I mostly only use it for the CI, but it comes with an issue tracker and Git hosting solution too - great for some cases, but overkill in so many others.
Because it's such a complete solution, GitLab is a bit of a resource hog, and can often run frustratingly slowly.

Recently I've been playing with a friend's self-hosted instance of [Drone CI](https://drone.io/) as a lightweight alternative, and I much prefer it.
I didn't set up the instance, so that part is out of scope for this post, but in case it's relevant, we're using a self-hosted [Gitea](https://gitea.io/) instance to host the source.
You can find out about configuring Drone with Gitea [here](https://docs.drone.io/server/provider/gitea/).

## Yet Another Yaml Config

Like GitLab, Drone is configured via a YAML file at the project root, called `.drone.yml`.
A Drone pipeline is configured by adding 'steps', where GitLab uses 'jobs'.

My first project's automation requirements were small - all I needed for a deployment was to copy all the files in a directory on every push to the `main` branch.
This meant I needed secure access to the host, and the ability to copy files to it.
I didn't want to dedicate any permanent resources to such a small project, so opted for the `docker` pipeline option.

My pipeline would contain a single `deploy` step which would configure SSH access to the host, and then use it to copy the relevant files from the checked-out version of the project.
I decided to use `ubuntu` as the Docker image for familiarity and accessibility - there are probably better options.
Drone supports a wide range of Docker image registries; I haven't used Docker much, but would like to get more experience with it.

```yml
kind: pipeline
type: docker
name: deploy

steps:
- name: deploy
  image: ubuntu
  when:
    branch:
    - main

  commands:
  - echo hello world
```

## Secrets

A hugely important aspect of automation is ensuring the security of one's pipelines.
Automated access between pipelines is a big risk, and should be locked down as much as possible.
For passing around sensitive values such as passwords and SSH keys, Drone has a concept of secrets.
I created a private key on my local machine for the runner's access to the remote host, and added a [per-repository secret](https://docs.drone.io/secret/repository/) to contain the value.
This is a named string value which can be accessed from within the context of a single pipeline step.

I also created secrets to contain values for the remote host address and the user to log in as.
These are less of a security concern than the private SSH key, but we should obfuscate them anyway.
It's also a useful step towards generalising the pipeline for other projects: I can use the same set of commands in multiple CI configurations, and just update the secrets from the project page.

This block was placed in the same step definition as above, below the `image:` entry:

```yml
environment:
  HOST:
    from_secret: host
  USER:
    from_secret: user
  SSH_KEY:
    from_secret: ssh_key
```

## Connecting

To use the SSH key, we need to spin up `ssh-agent` and load our key into it.
Since the key is passed into the job as an environment variable, this involves first writing it to a file.
We also need to disable host key checking (the bit that asks if you're sure you want to connect to a new host), as we're making an automated SSH connection and therefore won't be there to type 'yes'.

```yml
# configure ssh
- eval $(ssh-agent -s)
- mkdir -p ~/.ssh
- echo "$SSH_KEY" > ~/.ssh/id_rsa
- chmod 600 ~/.ssh/id_rsa
- ssh-add
- echo "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config
```

Finally, it's time to run some SSH commands.
I had a bit of trouble getting the hang of variable templating here - it took some trial and error to figure out which variables would get expanded, and when.
Since my `HOST` and `USER` values are defined in secrets, I had to get them from my environment variables and into a correctly formatted string for the SSH target.
As I would be running multiple commands, I also wanted to store this in a variable to keep the SSH commands short in the Drone config.

What ended up working for me was this:

```yml
# environment variables get expanded (twice?)
- host="$${USER}@$${HOST}"
# running 'hostname' on the deploy target
- ssh $host "hostname"
```
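
From there, copying the built files across is just another command in the same step. As a rough sketch - the `public/` directory and the remote web root here are placeholders for whatever your project actually builds and serves:

```yml
# copy the built files to the web root on the deploy target
- scp -r public/* $host:/var/www/html/
```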

## Images

It's pretty cool to be able to pass a repository through several different Docker images over the course of a pipeline.
I have my website's Makefile set up to build off my local machine, which is on Arch.
It therefore depends on Arch-specific package names.
I didn't want to have to hack around my existing build configuration just to build it automatically, but I also found that the deploy steps I'd already written worked best on Ubuntu.

For Drone, this is no problem - I can simply specify `image: archlinux` in the build step, and `image: ubuntu` for the deploy step.
My Makefile and local workflow require no changes at all, but I can still use the more robust deploy steps from Ubuntu.
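
Concretely, that means a pipeline shaped something like the following - the package names and build command are placeholders rather than my actual configuration:

```yml
kind: pipeline
type: docker
name: default

steps:
- name: build
  image: archlinux
  commands:
  # install Arch-specific build dependencies, then build the site
  - pacman -Syu --noconfirm make
  - make

- name: deploy
  image: ubuntu
  when:
    branch:
    - main
  commands:
  # configure ssh and copy files, as in the earlier deploy step
  - echo deploy goes here
```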

## Final thoughts

I like Drone's minimalist approach to CI.
There isn't much in terms of configuration, and the interface is much snappier than GitLab's.
It will take a bit more work to get a full workflow - GitLab basically has one out of the box - but working with more separate components should provide flexibility and resilience in the long run.

I'd like to explore some more features, like [templates](https://docs.drone.io/template/yaml/) for steps shared between repositories, and spend more time tuning exactly when pipelines run.
I also want to try building some more complex projects, such as those using game engines like Godot, and those targeting multiple platforms.
Those are adventures for another day, though.

That's all for now, thanks for reading and see you next time!

## References

* [GitLab CI config to deploy via SSH](https://medium.com/@hfally/a-gitlab-ci-config-to-deploy-to-your-server-via-ssh-43bf3cf93775)

@@ -0,0 +1,30 @@
# The Prince of Milk

The Prince of Milk is a science fiction novel by Exurb1a of YouTube fame.

It follows the story of a fictional village in southern England named Wilthail, which ends up the unwilling venue for the settling of an ancient grudge.
Deities ("Etherics") exist alongside the mundanity of 21st century Wilthail, and engage in absurdity, sodomy and violence with its quaint population.
The book makes reference to a number of popular philosophical debates, and takes inspiration from several classical sci-fi authors.

A common theme is the idea that power is relative.
The Etherics are immortal - their grudge has played out across hundreds of 'Corporic' incarnations - and have power and abilities far beyond the comprehension of their human counterparts.
However, they do not necessarily view themselves as gods.
This is particularly true of the character Beomus, who frequently plays down their immortality and returns fire with questions about modern humans' relationship to their primitive ancestors, or to ants.
This relativity of power recurs plenty, and is reminiscent of Arthur C. Clarke's assertion that sufficiently advanced technology is indistinguishable from magic.
As characters in a book, the Etherics are understandably cagey about how any of their abilities work - but broadly refuse to classify them as either magic or technology.

Reincarnation is viewed as a fundamental way of the world - Chalmers' panpsychism, and the Hard Problem of Consciousness.
This goes further than suggesting that people are simply reincarnated as others when they die, instead suggesting that consciousness is a fundamental force of the universe, in just the way electromagnetism is.
It's a recursive thing, from the lowliest atom up through rocks, mice, snakes, cats, people, stars and gods.
It's a neat and satisfying view, and one that has yet to be disproven by neuroscience.

The human characters are invariably damaged - mental health issues, broken relationships, toxic parentage, drug use, suicide, difficult histories.
This paints PoM's world as realistic, and grounds it through the fantastical happenings in the middle act.
It grips the reader with its variety of characters, and follows them all as they confront not only their own personal hells, but the one they now find themselves sharing, in a twisted take on country bumpkinism.

Overall, I thoroughly enjoyed this book, and am looking forward to reading more of Exurb1a's writing.
I am a little biased, as I have already enjoyed the YouTube channel for a number of years.

There is a short glossary at the end naming and examining some of the particular concepts explored in the novel, which prompts the reader to explore further.
Top marks!

@@ -0,0 +1,11 @@
# Un Cafe Dans l'Espace

I bought this book when I visited the Cité de l'Espace in Toulouse. It was written by Michel Tognini, a French astronaut who has been to space twice. He worked on the Mir space station, and flew on the Space Shuttle to launch CHANDRA, an observatory in low orbit. Since then, he has selected and trained new European astronauts.

This book covers several subjects relating to space: from the author's training at Star City in Russia, to failures and challenges in space, to the achievements of private companies like SpaceX, Blue Origin and Virgin Galactic. Like other astronauts, Tognini trained as a fighter pilot, and then as a test pilot. He joined the French space agency CNES before the formation of ESA, and it still exists today.

I found that I already knew many of the stories in this book, because I have always had an adoration for space, and it is written for a general audience. My main reason for reading it was that it was my first book in French! It took me a few months, but it was an enjoyable experience. At the start I needed to look up several words on every page, but by the end I found I could read with much more ease.

I recommend this book to French speakers who are interested in space, but who, like me, are perhaps less familiar with the jargon.

Again, thank you to my dear Ethel for helping me with my French! <3

Binary file not shown.

@@ -0,0 +1,128 @@
# Local Stable Diffusion

![astronaut rides horse](astronaut_rides_horse.png)

Stable Diffusion (SD) is an AI technique for generating images from text prompts.
Similar to DALL-E, which drives the popular [craiyon](https://www.craiyon.com/), SD is available as an [online tool](https://huggingface.co/spaces/stabilityai/stable-diffusion).
These web tools are amazing and easy to use, but they can be frustrating - they're often under high load, and impose long waiting times.
They use a good chunk of computational resources, specifically GPUs, and so have generally been out of reach even for people with powerful personal machines.

Now, however, SD has reached the point where it can be run on (admittedly high-end) consumer video cards.
Stability AI - the model's developers - recently [published a blog post](https://stability.ai/blog/stable-diffusion-v2-release) open-sourcing SD 2.
There's a README for getting started [here](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/README.md), but it has a couple of gotchas and assumptions which will catch out plenty of people (like myself) who aren't already familiar with the technologies in use, such as Python and CUDA.

This post describes my experience setting up SD 2 on my local workstation.
For hardware, I have an i7-6700k, an RTX 2080 Super and 48GB of RAM.
If you have an AMD video card, you won't be able to use CUDA, but you may be able to use GPU acceleration regardless using something like ROCm.
In this post I'm using Arch Linux, but I have successfully set it up on Windows too.
Python is an exceedingly portable language, so it should work wherever you're able to get a Python installation.

This post assumes that you already have a working Python installation.

## Install CUDA

CUDA needs to be installed separately from the Python dependencies.
It is quite large and, as with all NVIDIA driver installations, can be a bit confusing.
On Linux, it's straightforward to install from your distribution's package manager.

```bash
sudo pacman -Syu
sudo pacman -S cuda
```

On Windows, you will need to go to NVIDIA's site to download the correct version of CUDA.
At the time of writing, the SD 2 script expects CUDA 11.7, and will not work if you install the latest 12.0 version.
To get older versions, go to their [download archive](https://developer.nvidia.com/cuda-toolkit-archive) and select the appropriate one.

## Set up a virtual environment and PyTorch

Python packages can be installed at a system level, but it's usually a good idea to set up a virtual environment for your project.
This isolates the project dependencies from the wider system, and makes your setup reproducible.
I will use [`pipenv`](https://pipenv.pypa.io/en/latest/index.html) as it's what I'm familiar with.

PyTorch is a deep-learning framework, used to put together machine learning pipelines.

To get a command to install the relevant dependencies, go to [PyTorch's site](https://pytorch.org/get-started/locally/) and choose the options for your setup.
In my case, I replaced `pip3` with `pipenv` as I want to install dependencies to a new virtual environment instead of to the system.

```bash
mkdir stable-diffusion && cd stable-diffusion
pipenv install torch torchvision torchaudio
```
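
Before going further, it's worth checking that PyTorch can actually see your GPU - a quick sanity check from the project directory:

```bash
# should print True if CUDA and the NVIDIA driver are set up correctly
pipenv run python -c "import torch; print(torch.cuda.is_available())"
```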

## Install Stable Diffusion

SD 2 is provided by the `diffusers` package.
We can install it in our virtual environment as follows:

```bash
pipenv shell
pip3 install git+https://github.com/huggingface/diffusers.git transformers accelerate scipy
exit
```

We use `pipenv shell` to enter a shell using the virtual environment, before using the `pip3` command described in their README.
After installing the dependencies, we can leave the virtual environment shell and return to our original one.
`transformers` and `accelerate` are optional, but they reduce memory usage and so are recommended.

## Create a Python script

Python does have an interactive environment, but to save our fingers let's use a `stable-diffusion.py` script to contain and run our Python code.
Here I'll mostly copy the Python included in their README:

```python
import torch
from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler

model_id = "stabilityai/stable-diffusion-2"

# Use the Euler scheduler here instead
scheduler = EulerDiscreteScheduler.from_pretrained(model_id, subfolder="scheduler")
pipe = StableDiffusionPipeline.from_pretrained(model_id, scheduler=scheduler, revision="fp16", torch_dtype=torch.float16)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt, height=768, width=768).images[0]

image.save("astronaut_rides_horse.png")
```

I've made two additions here.
First, I've added `import torch` at the top - I'm not sure why the code in the README omits this, but it's needed for the script to work.

I've also added `pipe.enable_attention_slicing()` - this switches to a more memory-efficient running mode, which is less intensive at the cost of taking longer.
If you have a monster video card, this may not be necessary.

At this point, we're done - after running the script successfully, you should have a new picture of an astronaut riding a horse on Mars.

## Some nice-to-haves

In this basic script we only have the one, hardcoded prompt.
To change it, we need to update the file itself.
Instead, we can change how `prompt` is set, and have it read from command-line arguments.

```python
# at the top of the file
import sys

...

prompt = " ".join(sys.argv[1:])
```

While we're at it, we can also base the filename on the input prompt:

```python
image.save(f"{prompt.replace(' ', '_')}.png")
```
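
With those changes, generating an image is just a matter of running the script with a prompt - for example, from the project directory:

```bash
# spaces in the prompt become underscores in the output filename
pipenv run python stable-diffusion.py a psychedelic city floating in space
```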

## Wrapping up

And that's it!
Enjoy making some generative art.
My favourites so far have been prefixing "psychedelic" to things.
I've also been enjoying generating descriptions with [ChatGPT](https://chat.openai.com/chat) and plugging them into SD, for some zero-effort creativity.
As always, if anything's out of place or if you'd like to get in touch, please [send me an email!](mailto:me@ktyl.dev).