Learn Creative Coding (#20) - Recording and Exporting: GIF, PNG Sequence, Video
You've built some beautiful things over the last ten episodes. A particle galaxy that reacts to your mouse. Physics simulations where springs bounce and boids flock. Sound-reactive particle fields that pulse with bass and shimmer with treble. But right now, all of those things only exist inside your browser tab. Close the tab, they're gone. Want to post a clip on social media? Want a hi-res image for a print? Want a video for your portfolio? You need to get your work OUT of the browser and into a file.
This episode isn't glamorous. It's plumbing. But it's essential plumbing, because creative coding that nobody sees is creative coding that doesn't exist. We'll cover single-frame PNG export, frame-by-frame capture for perfect animation recording, real-time video capture with the MediaRecorder API, GIF creation, high-resolution export for prints, SVG output for vector work, and how to design animations that loop seamlessly. By the end you'll be able to export anything you build in whatever format you need :-)
The simplest export: a single PNG
Every canvas element can export itself as an image. One function call, one download:
function saveFrame() {
  let link = document.createElement('a');
  link.download = 'frame.png';
  link.href = canvas.toDataURL('image/png');
  link.click();
}

// trigger on keypress
window.addEventListener('keydown', (e) => {
  if (e.key === 's') saveFrame();
});
canvas.toDataURL() converts the current canvas contents into a base64-encoded PNG data URL. We stuff that into a temporary link element, set its download attribute, and click it programmatically. The browser triggers a file download. Done.
In p5.js it's even simpler:
function keyPressed() {
  if (key === 's') {
    saveCanvas('my-sketch', 'png');
  }
}
Press S, get a PNG. This is great for grabbing a still frame of something that looks good. But for animation -- a running sketch, a looping particle system, a sound-reactive piece -- a single screenshot doesn't cut it. You need to capture multiple frames over time.
Frame-by-frame capture: the reliable method
The most reliable way to export animations is frame by frame. Save each frame as a numbered PNG, then stitch them together into a video afterwards with ffmpeg. It's slower than real-time recording but gives you perfect quality with zero dropped frames.
In p5.js, use saveCanvas() with a frame counter:
let recording = false;
let frameNum = 0;

function draw() {
  background(20);

  // your beautiful sketch code here
  for (let i = 0; i < 50; i++) {
    let angle = frameCount * 0.02 + i * 0.4;
    let r = 100 + i * 3;
    let x = width/2 + cos(angle) * r;
    let y = height/2 + sin(angle) * r;
    fill(100 + i * 3, 200, 255, 150);
    noStroke();
    ellipse(x, y, 8);
  }

  if (recording) {
    saveCanvas('frame_' + nf(frameNum, 4), 'png');
    frameNum++;
    if (frameNum >= 180) { // 3 seconds at 60fps
      recording = false;
      console.log('Done! ' + frameNum + ' frames saved.');
    }
  }
}

function keyPressed() {
  if (key === 'r') {
    recording = true;
    frameNum = 0;
    console.log('Recording started...');
  }
}
Press R to start recording. Each frame saves as a numbered PNG -- frame_0000.png, frame_0001.png, frame_0002.png, all the way up to frame_0179.png. The sketch will run slowly during recording because saving PNGs to disk takes time. That's fine and expected. The output video will play at full speed regardless.
The nf(frameNum, 4) function pads the number with leading zeros (0001, 0002, etc). This is important because ffmpeg needs sequentially numbered files to assemble them into video.
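In vanilla JS, String.prototype.padStart() does the same job as nf(). A small helper (the names here are mine, not from any library) that produces filenames matching ffmpeg's %04d input pattern:

```javascript
// Zero-pad a frame number so filenames match ffmpeg's %04d pattern.
// Vanilla-JS equivalent of p5's nf(n, 4).
function frameName(n, digits = 4, prefix = 'frame_', ext = '.png') {
  return prefix + String(n).padStart(digits, '0') + ext;
}

console.log(frameName(0));   // frame_0000.png
console.log(frameName(179)); // frame_0179.png
```

Four digits covers 9999 frames -- over five minutes at 30fps -- so you rarely need more.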
For vanilla Canvas, use canvas.toBlob() instead of toDataURL():
let recording = false;
let frameNum = 0;

function captureFrame() {
  if (!recording) return;
  const n = frameNum; // capture the number now -- toBlob's callback runs later
  canvas.toBlob(blob => {
    let a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = 'frame_' + String(n).padStart(4, '0') + '.png';
    a.click();
    setTimeout(() => URL.revokeObjectURL(a.href), 1000); // revoke after the download starts
  }, 'image/png');
  frameNum++;
  if (frameNum >= 180) {
    recording = false;
    console.log('Recording done: ' + frameNum + ' frames');
  }
}

// call captureFrame() at the end of your draw loop
// call captureFrame() at the end of your draw loop
One thing to watch out for: browsers may throttle rapid downloads. Chrome starts asking for permission after a bunch of quick downloads. For short clips (under ~200 frames), direct download works fine. For longer sequences, you'll want to use CCapture.js which bundles everything into a single .tar file, or consider using the File System Access API to write directly to a folder. But for most creative coding export needs, the direct download approach is more than enough.
CCapture.js: frame-perfect recording
CCapture.js is a library specifically designed for this. It hijacks requestAnimationFrame so that each frame gets exactly the right timestamp, regardless of how fast or slow your sketch actually renders. No dropped frames, perfect timing:
<script src="https://cdn.jsdelivr.net/npm/ccapture.js@1.1.0/build/CCapture.all.min.js"></script>
let capturer = null;

function startRecording() {
  capturer = new CCapture({
    format: 'png', // or 'webm', 'gif'
    framerate: 30,
    verbose: true
  });
  capturer.start();
}

function stopRecording() {
  capturer.stop();
  capturer.save(); // downloads a .tar with all PNGs
  capturer = null;
}

function draw() {
  background(20);
  // ... your animation code ...
  if (capturer) capturer.capture(canvas);
}

function keyPressed() {
  if (key === 'r') startRecording();
  if (key === 'e') {
    stopRecording();
    console.log('Saved!');
  }
}
In p5.js, the canvas element is document.getElementById('defaultCanvas0'), so the capture call becomes capturer.capture(document.getElementById('defaultCanvas0')).
The key advantage of CCapture is that it controls time. If your sketch is too heavy to run at 30fps, CCapture doesn't care -- it waits for each frame to finish rendering before advancing the timestamp. The result is a perfectly smooth recording regardless of actual render speed. This is the approach I use when I want pixel-perfect output from anything computationally heavy -- like that galaxy from episode 15 with its 3000 particles.
Assembling frames into video with ffmpeg
Once you have a folder full of numbered PNGs, ffmpeg turns them into video. ffmpeg is a command-line tool -- if you haven't installed it yet, it's one of those "install once, use forever" tools that every creative coder should have. It handles basically every audio and video format that exists.
# basic MP4 (H.264, widely compatible)
ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p output.mp4
# higher quality (CRF 18 = near lossless)
ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264 -crf 18 -pix_fmt yuv420p output.mp4
# with transparency (WebM + VP9)
ffmpeg -framerate 30 -i frame_%04d.png -c:v libvpx-vp9 -pix_fmt yuva420p output.webm
The -framerate 30 tells ffmpeg to interpret the PNGs as 30fps. The -pix_fmt yuv420p flag ensures compatibility with most video players -- without it, some devices and social media platforms can't play the file. The -crf 18 controls quality: lower numbers = better quality = bigger file. 18 is visually lossless for most content. 23 is the default and still looks great. 28 starts to get noticeable.
You can also add audio to your video:
ffmpeg -framerate 30 -i frame_%04d.png -i music.mp3 \
-c:v libx264 -c:a aac -shortest -pix_fmt yuv420p output.mp4
The -shortest flag stops encoding when the shorter of the two inputs (video or audio) ends. Useful for pairing a 5-second animation loop with a longer music track.
If you built a sound-reactive sketch like we did in episode 19, recording the audio alongside the visual frames lets you create a combined video where the visuals and music are perfectly synchronized. The approach: record audio separately, capture frames with CCapture, then merge with ffmpeg. The timing matches because CCapture uses the exact framerate you specify.
Real-time recording with MediaRecorder
For quick captures where absolute perfection isn't critical, the browser's built-in MediaRecorder API records the canvas in real-time. No external libraries, no post-processing:
function startRecording(canvas, durationMs) {
  let stream = canvas.captureStream(60); // 60fps capture
  let recorder = new MediaRecorder(stream, {
    mimeType: 'video/webm;codecs=vp9'
  });
  let chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = () => {
    let blob = new Blob(chunks, { type: 'video/webm' });
    let url = URL.createObjectURL(blob);
    let a = document.createElement('a');
    a.href = url;
    a.download = 'recording.webm';
    a.click();
    URL.revokeObjectURL(url);
  };
  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
}

// usage: click to record 5 seconds
function mousePressed() {
  startRecording(document.querySelector('canvas'), 5000);
}
Call it and you get a 5-second WebM clip. The tradeoff: it records in real-time, so your sketch needs to actually run at full frame rate during capture. If your sketch is too heavy for 60fps, frames will be dropped in the recording. For heavy sketches -- anything with thousands of particles or complex noise calculations -- use the frame-by-frame method instead.
WebM is great for web playback but some platforms want MP4. Quick conversion with ffmpeg: ffmpeg -i recording.webm -c:v libx264 -pix_fmt yuv420p output.mp4. Two seconds and you're done.
Creating GIFs
GIFs are everywhere. Limited (256 colors, often large file sizes) but universally supported. Every social platform, every messaging app, every forum. For short creative coding loops, a GIF is often the best way to share your work.
Method 1: gif.js (in-browser)
let gif = new GIF({
  workers: 2,
  quality: 10,
  width: canvas.width,
  height: canvas.height
});

// during your animation loop, add frames:
gif.addFrame(canvas, { delay: 66, copy: true }); // 66ms = ~15fps

// when your loop is complete:
gif.on('finished', function(blob) {
  let url = URL.createObjectURL(blob);
  let a = document.createElement('a');
  a.download = 'animation.gif';
  a.href = url;
  a.click();
});
gif.render();
The quality parameter (1 = best, 30 = worst) controls how much time the encoder spends optimizing the color palette per frame. 10 is a good balance. The delay in milliseconds controls frame duration -- 66ms gives roughly 15fps which is standard for GIFs. Going higher than 20fps in a GIF usually just makes the file enormous without visible improvement.
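That 66ms figure is just 1000/15 rounded down. A one-liner (the helper name is mine) to go from a target frame rate to the delay gif.js expects -- and since the GIF format stores delays in hundredths of a second, there's no point chasing sub-10ms precision anyway:

```javascript
// gif.js takes a per-frame delay in ms, not an fps value.
// fpsToDelay is a hypothetical helper name.
function fpsToDelay(fps) {
  return Math.floor(1000 / fps);
}

console.log(fpsToDelay(15)); // 66 -- the delay used above
console.log(fpsToDelay(10)); // 100
```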
Method 2: ffmpeg two-pass palette (best quality)
If you've already captured PNG frames, ffmpeg can create much better GIFs than any browser-based tool. The trick is the two-pass palette approach:
# pass 1: generate an optimal palette from your frames
ffmpeg -framerate 15 -i frame_%04d.png \
-vf "fps=15,scale=400:-1:flags=lanczos,palettegen" palette.png
# pass 2: use that palette to create the GIF
ffmpeg -framerate 15 -i frame_%04d.png -i palette.png \
-filter_complex "fps=15,scale=400:-1:flags=lanczos[x];[x][1:v]paletteuse" \
output.gif
The difference is dramatic. Without the palette pass, ffmpeg uses a generic 256-color palette and you get ugly banding and dithering, especially in gradients and dark tones -- exactly the kind of subtle color work our creative coding tends to produce. The two-pass method analyzes your actual frames and picks the 256 colors that best represent YOUR content. The result looks orders of magnitude better.
The scale=400:-1 resizes to 400px wide (height proportional). Smaller GIFs = smaller files = faster loading. For social media sharing, 400-500px wide is usually plenty.
Designing looping animations
GIFs and social media posts look best when the animation loops seamlessly. This means the last frame needs to connect smoothly to the first frame. Design for it from the start:
let totalFrames = 120; // 4 seconds at 30fps
let frame = 0;

function draw() {
  let t = frame / totalFrames; // 0 to 1 over one complete loop
  background(20);

  // use t * TWO_PI with sin/cos -- completes exactly one cycle
  let x = width/2 + sin(t * TWO_PI) * 150;
  let y = height/2 + cos(t * TWO_PI) * 100;
  fill(100, 200, 255);
  noStroke();
  ellipse(x, y, 30);

  frame++;
  if (frame >= totalFrames) frame = 0;
}
The key: t * TWO_PI with sin/cos. When t goes from 0 to 1, the trig functions complete exactly one full cycle. Frame 0 and frame 119 are adjacent in the loop, and because sin(0) equals sin(TWO_PI), the motion is seamless. No jump, no stutter. This is the same trig from episode 13 -- circular motion is inherently loopable because circles have no beginning or end.
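You can verify the seam numerically without rendering anything. A quick check (assuming the 600x400 canvas from earlier, so width/2 = 300 and height/2 = 200) comparing the ball's position at t = 0 and t = 1:

```javascript
// Confirm the loop closes: positions at t = 0 and t = 1 (one full loop)
// should match to floating-point precision.
const TWO_PI = Math.PI * 2;
function pos(t) {
  return {
    x: 300 + Math.sin(t * TWO_PI) * 150, // width/2 = 300
    y: 200 + Math.cos(t * TWO_PI) * 100, // height/2 = 200
  };
}

const a = pos(0), b = pos(1);
console.log(Math.abs(a.x - b.x) < 1e-9, Math.abs(a.y - b.y) < 1e-9); // true true
```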
For multiple elements with different speeds, use multiples of TWO_PI:
let t = frame / totalFrames;
// slow orbit (1 revolution per loop)
let x1 = width/2 + cos(t * TWO_PI) * 100;
let y1 = height/2 + sin(t * TWO_PI) * 100;
// fast orbit (3 revolutions per loop)
let x2 = x1 + cos(t * TWO_PI * 3) * 30;
let y2 = y1 + sin(t * TWO_PI * 3) * 30;
Both complete whole-number multiples of a full revolution, so both loop perfectly. Mix speeds freely as long as they're integer multiples.
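The same kind of numeric check shows why the integer rule matters -- a whole number of revolutions closes the loop, a fractional one leaves a visible jump (orbitX is a made-up helper for the x-offset of the fast orbit):

```javascript
const TWO_PI = Math.PI * 2;
// x-offset of an orbiting point at loop-phase t, with `speed` revolutions per loop
const orbitX = (t, speed) => Math.cos(t * TWO_PI * speed) * 30;

// Integer speed: start and end coincide -- seamless
console.log(Math.abs(orbitX(0, 3) - orbitX(1, 3)) < 1e-9); // true
// Non-integer speed: start and end are far apart -- the loop jumps
console.log(Math.abs(orbitX(0, 2.5) - orbitX(1, 2.5)) > 1); // true
```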
Looping noise
Noise-based animations are trickier to loop. Regular noise has no inherent periodicity -- calling noise(t) at t=0 and t=1 gives you completely different values. The trick: sample noise along a circle in noise-space, so the start and end meet:
let t = frame / totalFrames;
// walk a circle through noise-space
let noiseX = cos(t * TWO_PI) * 0.5 + 0.5;
let noiseY = sin(t * TWO_PI) * 0.5 + 0.5;
let value = noise(noiseX, noiseY);
Instead of walking a straight line through noise (which never returns to its starting point), we walk a circle. When t goes from 0 back to 0, our noise coordinates trace a complete loop. The noise values at the start and end are identical because they're the same point on the circle. Seamless. This was harder to figure out than it should have been -- I spent way too long trying to blend the last few frames before I realized the circle trick. Sometimes the elegant solution is hiding right there in the math :-)
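The closure is guaranteed by construction, whatever noise implementation you plug in: the coordinates sampled at t = 0 and t = 1 are literally the same point, so any deterministic noise(x, y) returns the same value there. A quick check of the coordinates themselves:

```javascript
// Coordinates on the noise-space circle at loop-phase t.
const TWO_PI = Math.PI * 2;
function noiseCoords(t, radius = 0.5) {
  return [
    Math.cos(t * TWO_PI) * radius + 0.5,
    Math.sin(t * TWO_PI) * radius + 0.5,
  ];
}

const [x0, y0] = noiseCoords(0);
const [x1, y1] = noiseCoords(1);
console.log(Math.abs(x0 - x1) < 1e-9 && Math.abs(y0 - y1) < 1e-9); // true
```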
Remember the Perlin noise we built from scratch in episode 12? Same concept applies there. Your custom noise function works the same way -- just feed it coordinates that trace a circle and the output loops perfectly.
High-resolution export for prints
Screen resolution and print resolution are completely different worlds. A 600x400 canvas looks fine on screen but would print as a blurry mess at any reasonable size. A 30x40cm print at 300 DPI needs about 3500x4700 pixels. The approach: render to a bigger canvas, save that, done.
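The cm-to-pixels arithmetic is easy to fumble, so here's a tiny converter (a hypothetical helper; the only facts it relies on are 1 inch = 2.54 cm and dots = inches × DPI):

```javascript
// Pixels needed for a print dimension: cm -> inches -> dots.
function printPixels(cm, dpi = 300) {
  return Math.round((cm / 2.54) * dpi);
}

console.log(printPixels(30), printPixels(40)); // 3543 4724
```

That's where the "about 3500x4700" figure for a 30x40cm print comes from.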
In p5.js, use createGraphics() for an offscreen buffer:
function exportHighRes(scaleFactor) {
  let pg = createGraphics(width * scaleFactor, height * scaleFactor);
  pg.scale(scaleFactor);

  // draw your scene onto pg instead of the main canvas
  pg.background(20);
  pg.noStroke();

  // ... all your drawing code, prefixed with pg.
  for (let i = 0; i < 50; i++) {
    let angle = frameCount * 0.02 + i * 0.4;
    let r = 100 + i * 3;
    let x = width/2 + cos(angle) * r;
    let y = height/2 + sin(angle) * r;
    pg.fill(100 + i * 3, 200, 255);
    pg.ellipse(x, y, 8);
  }

  save(pg, 'highres.png'); // the global save() accepts a p5.Graphics
  pg.remove(); // clean up
}

function keyPressed() {
  if (key === 'h') {
    exportHighRes(4); // 4x resolution: 600x400 becomes 2400x1600
  }
}
A 600x400 canvas at 4x scale becomes 2400x1600 -- print-ready at reasonable sizes. At 6x (3600x2400) you're getting into large-format territory.
In vanilla Canvas, create a temporary offscreen canvas:
function exportHighRes(scale) {
  let tempCanvas = document.createElement('canvas');
  tempCanvas.width = canvas.width * scale;
  tempCanvas.height = canvas.height * scale;
  let tempCtx = tempCanvas.getContext('2d');
  tempCtx.scale(scale, scale);

  // redraw your entire scene on tempCtx
  tempCtx.fillStyle = '#141414';
  tempCtx.fillRect(0, 0, canvas.width, canvas.height);
  // ... your drawing code using tempCtx ...

  tempCanvas.toBlob(blob => {
    let a = document.createElement('a');
    a.href = URL.createObjectURL(blob);
    a.download = 'highres.png';
    a.click();
  }, 'image/png');
}
One important thing: make sure your sketch uses relative positioning (percentages of width and height, or calculated proportions) rather than hardcoded pixel values. If you wrote ellipse(300, 200, 50) with fixed coordinates, the high-res render will draw that circle in the exact same position regardless of scale -- stuck in the top-left corner of your now-massive canvas. Use ellipse(width/2, height/2, width * 0.08) instead and it scales correctly.
Also watch your stroke weights. A 1px stroke that looks crisp at screen resolution becomes a barely-visible hairline at 4K. Scale your strokes proportionally: strokeWeight(1 * scaleFactor) inside the high-res render. Same for text sizes. This catches people off guard the first time they try print export -- everything is positioned right but the lines are invisibly thin.
SVG export: infinite resolution
For vector output, you draw to SVG instead of canvas. SVG files are resolution-independent -- they scale infinitely without pixelation. Perfect for pen plotters, laser cutters, vinyl cutters, and high-quality print production.
p5.js supports SVG output with the p5.svg library:
function setup() {
  createCanvas(600, 400, SVG); // requires p5.svg.js
}

function draw() {
  background(240);
  stroke(0);
  strokeWeight(1);
  noFill();
  for (let i = 0; i < 50; i++) {
    let x = width/2 + cos(i * 0.3) * (50 + i * 4);
    let y = height/2 + sin(i * 0.3) * (50 + i * 4);
    ellipse(x, y, 10 + i * 0.5);
  }
}

function keyPressed() {
  if (key === 's') {
    save('output.svg');
  }
}
SVG output only works for vector content -- lines, shapes, curves, text. If your sketch uses pixel manipulation (like the image processing from episode 10) or relies on semi-transparent overlaps for trail effects (like our galaxy from episode 15), SVG won't capture that correctly. Those effects depend on raster rendering. For SVG export, think in strokes and fills, not pixels and alpha blending.
There's a whole creative coding community built around pen plotters -- machines that draw your SVG output with actual pens on real paper. Physical output from digital algorithms. We'll touch on this more later when we get into building output for different mediums, but for now just know that SVG export opens the door to getting your code off the screen and onto physical media.
Choosing your export format
Quick reference for when you're staring at your sketch wondering "how do I share this":
PNG -- single frames, portfolio stills, print production. Lossless, perfect quality. Use saveCanvas() in p5 or canvas.toDataURL() in vanilla JS. For print, render at 3x-6x your screen resolution.
GIF -- social media, short loops, messaging. 256 color limit means gradients suffer, but GIFs play everywhere without a video player. Best for simple palettes and short sequences (2-5 seconds). Keep it under 500px wide to manage file size.
Video (MP4/WebM) -- portfolio reels, YouTube, longer animations. Full color depth, smooth playback, reasonable file sizes. Use frame-by-frame + ffmpeg for quality, or MediaRecorder for quick captures.
SVG -- pen plotters, laser cutters, scalable graphics. Infinite resolution. Only works for vector content (lines, shapes, curves), not pixel effects or transparency tricks.
Think about the output format before you start coding. If you're targeting SVG for a plotter, you need to think in strokes from the start. If you're targeting a seamless GIF loop, build the animation around t * TWO_PI from the beginning. If you're making a 4K print, design with scalable coordinates. Retrofitting an export format onto an existing sketch is doable but annoying -- forward planning saves you from refactoring.
My typical export workflow
For what it's worth, here's how I usually handle export:
- Develop at a comfortable screen size (600x400 or 800x600)
- Test the loop -- does the animation seamlessly repeat? Use the t * TWO_PI pattern and verify the last frame connects to the first
- Record with CCapture at the target framerate (30fps for video, 15fps for GIF)
- Convert to MP4 with ffmpeg for portfolio and video platforms, GIF for quick social sharing
- High-res export at 4x-6x for any still frames I want to print or use in a portfolio
Keep your recording code ready but commented out during development. Uncomment when you're happy with the sketch. I like to wire it to keyboard shortcuts -- R to start recording, E to stop, H for high-res export, S for single frame. Having these always available means you never miss that perfect moment when the generative output looks just right.
One more thing that'll become important down the line: if you're building generative art that produces different output each time (think: random seeds generating unique compositions), save your random seed alongside every export. Write it to the filename or log it to console. When someone wants a print of "that one version," you need to regenerate the exact same output. randomSeed() and noiseSeed() in p5, or seed-based random number generators in vanilla JS, make this possible. Deterministic output isn't just a nice-to-have -- it's a professional requirement for anyone selling editions or prints. We'll dig much deeper into seed-based reproducibility soon, but start thinking about it now.
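In vanilla JS, one common choice for a seedable generator is mulberry32, a tiny well-known PRNG (in p5, randomSeed() and noiseSeed() give you the same guarantee without extra code). Same seed in, same sequence out, every run:

```javascript
// mulberry32: a tiny seedable PRNG -- same seed, same sequence, every run.
function mulberry32(seed) {
  return function () {
    let t = (seed += 0x6D2B79F5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296; // in [0, 1)
  };
}

// Two generators with the same seed produce identical output,
// so "that one version" can always be regenerated for a print.
const a = mulberry32(1234), b = mulberry32(1234);
console.log(a() === b() && a() === b()); // true
```

Replace every Math.random() call in your sketch with a call to one of these, log the seed with each export, and any composition becomes reproducible.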
Putting it together: a recordable looping sketch
Alright, let's build a complete example. A looping animation with recording built in -- press R to record, E to stop, H for high-res still, S for a single frame:
let capturer = null;
let totalFrames = 120; // 4 seconds at 30fps
let frame = 0;

function setup() {
  createCanvas(600, 400);
  colorMode(HSB, 360, 100, 100, 100);
}

function draw() {
  let t = frame / totalFrames;
  background(220, 30, 10);

  for (let i = 0; i < 60; i++) {
    let angle = t * TWO_PI + i * 0.15;
    let r = 50 + i * 2.5;
    let x = width/2 + cos(angle) * r;
    // 2 revolutions per loop (a whole number) so the vertical motion also closes
    let y = height/2 + sin(angle * 2) * r * 0.6;
    let hue = (t * 360 + i * 6) % 360;
    let size = 4 + sin(t * TWO_PI * 2 + i * 0.3) * 3;
    fill(hue, 70, 90, 80);
    noStroke();
    ellipse(x, y, size);
  }

  // draw connecting lines between nearby dots
  // (skipping for brevity -- you get the idea)

  frame = (frame + 1) % totalFrames;
  if (capturer) capturer.capture(document.getElementById('defaultCanvas0'));
}

function keyPressed() {
  if (key === 'r') {
    capturer = new CCapture({ format: 'png', framerate: 30 });
    capturer.start();
    frame = 0;
    console.log('Recording...');
  }
  if (key === 'e' && capturer) {
    capturer.stop();
    capturer.save();
    capturer = null;
    console.log('Saved!');
  }
  if (key === 'h') {
    exportHighRes(4);
  }
  if (key === 's') {
    saveCanvas('still_' + frame, 'png');
  }
}
Everything loops on t * TWO_PI. The hue rotates through the full spectrum over one loop. The particle sizes pulse with sin(t * TWO_PI * 2) -- two complete oscillations per loop, both starting and ending at the same value. Every animated property returns to its starting state when frame wraps back to 0. Record 120 frames, feed them to ffmpeg, and you get a perfectly looping 4-second video.
This is the kind of skeleton I start with for any piece I know I'll want to export. The export infrastructure is baked in from the start, not bolted on afterwards. The recording code adds maybe 15 lines to your sketch and saves you from scrambling later.
A note on performance during recording
Frame-by-frame recording slows your sketch down. Sometimes a lot. A sketch that runs at 60fps might chug at 2fps while saving PNGs. This is normal and fine -- the output video will be smooth because each frame is captured perfectly regardless of how long rendering took.
But there's a subtlety. If your animation uses deltaTime for frame-rate-independent motion (like we discussed briefly in episode 16), the slow render speed will mess up your physics. Objects that should move a few pixels per frame will move huge distances because deltaTime reports a massive gap between frames. The fix: during recording, use fixed timesteps instead of deltaTime. CCapture handles this automatically -- it overrides the clock so your sketch thinks it's running at the target framerate even when it isn't. Another reason CCapture is the go-to for serious export work.
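A minimal sketch of that fixed-timestep idea, in case you're rolling your own recorder instead of using CCapture -- recording and realDeltaMs are assumed to come from your own sketch loop:

```javascript
// During recording, ignore the real (possibly huge) deltaTime and advance
// the simulation by a fixed step matched to the target output framerate.
const FIXED_STEP_MS = 1000 / 30; // target 30fps output

function stepSize(recording, realDeltaMs) {
  return recording ? FIXED_STEP_MS : realDeltaMs;
}

// A frame that took 500ms to render still advances physics by ~33.3ms:
console.log(stepSize(true, 500).toFixed(1)); // 33.3
console.log(stepSize(false, 16.7)); // 16.7
```

Feed stepSize() into your physics update instead of raw deltaTime and the recorded motion matches what viewers will see at playback speed.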
Where this matters
Export isn't just about sharing on social media. It's about your creative coding practice having output. A portfolio. Print editions. Video documentation. Without export skills, your work only exists as a live browser demo that nobody else will ever run.
The tools we covered today are the bridge between "I made something cool" and "here, look at this thing I made." That bridge matters more than you'd think. I've seen incredible creative coding work that basically doesn't exist because the artist never exported it -- it lived and died in a browser tab during development. Don't let that happen to your work.
And once you can reliably export your sketches, interesting questions start appearing. What makes a generative piece reproducible? If you export the same sketch twice, do you get the same result? Should you? How do you create editions -- variations of the same algorithm with different seeds? How do you build output that's deterministic enough to sell as a unique digital artifact? Those questions are where we're heading in the next phase of this series. But first -- we've got one more topic in Phase 3, and it's going to change how you think about rendering entirely. The GPU has been sitting there the whole time, waiting for us to talk to it :-)
What it comes down to...
- canvas.toDataURL('image/png') exports a single frame -- good for screenshots
- Frame-by-frame recording (numbered PNGs + ffmpeg) gives perfect quality with no dropped frames
- CCapture.js controls time so every frame is captured perfectly, even if rendering is slow
- MediaRecorder API records in real-time -- quick and easy but drops frames if your sketch is heavy
- ffmpeg converts PNGs to MP4: ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p output.mp4
- Two-pass palette generation makes GIFs look dramatically better than default encoding
- Design looping animations with t * TWO_PI -- trig functions complete exact cycles so frame 0 and frame N connect seamlessly
- Loop noise by sampling along a circle in noise-space instead of a straight line
- High-res export: create a larger offscreen canvas, scale up, redraw, save. Don't forget to scale stroke weights too
- SVG export gives infinite resolution for vector content -- perfect for plotters and physical output
- Save your random seeds with every export. Reproducibility is a professional requirement.
Phase 3 wraps up here. We started with smooth motion and easing (episode 16), added structured behavior with state machines (17), gave things physical weight with springs, friction, and flocking (18), wired visuals to sound (19), and now we can capture and share all of it. Our sketches move naturally, respond to input, feel alive -- and we can export them in any format we need. Next up: shaders. We're going to write code that runs directly on the GPU, and it's a completely different way of thinking about drawing. Instead of saying "draw a circle at x, y" you say "for every pixel on the screen, what color should it be?" It's weird, it's powerful, and it's beautiful.
Cheers! Thanks for reading.
X