All right, folks, how about I follow through on what I said a month ago, and post something about #WebGL?
WebGL is not only verbose, it's also designed the way people thought graphics ought to be drawn twenty years ago; using it is hard! So most people who make 3D web content build on top of a library, like ThreeJS. I did! Zero shame in it, it's perfectly sensible to slingshot your project past the "wtf is gltexsubimage2d" phase of 3D web graphics, and into the cool zone.
For instance, my Matrix demo was originally a ThreeJS project. But as we'll soon see, it's changed quite a bit, and so have I 😁
When I first made this project in '18, I tested it in a few Mac browsers and then set it aside. My recent switch to Android got me checking my projects in various browsers across devices, and I realized something was going wrong in Mobile Safari.
This might surprise uh NO ONE, but even in a Khronos spec based on a 16-year-old industry standard, there is enough wiggle room for human error to creep in and cause browser implementation differences. In this case, it's due to floating point textures.
So, let's say you want to apply special effects to your graphics, like a badass green glow. Like most digital graphics techniques, special effects boil down to doing math on pixels. And if you want your special effects to look good, you need your pixels to store precise information.
By default, a pixel in a WebGL texture is just four bytes— one byte per channel of RGBA. That's a non-starter.
But WebGL has optional extensions for textures that are four single or half precision floats per pixel!
The difference between a byte and a single precision float is huge. You can represent almost any number you can think of as a float.
Whereas a byte can only store a WHOLE NUMBER between 0 and 255.
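Here's a rough illustration of that gap in plain JS (not WebGL itself, just the arithmetic): store the same value as a byte channel and as a 32-bit float, and compare what comes back.

```javascript
// Byte channels quantize the range [0, 1] into just 256 steps.
const value = 0.123456789;

// Byte storage: round to the nearest of 256 levels, then read back.
const asByte = Math.round(value * 255);
const byteReadback = asByte / 255;
const byteError = Math.abs(value - byteReadback);

// Float storage: a Float32Array keeps roughly 7 significant digits.
const floatReadback = new Float32Array([value])[0];
const floatError = Math.abs(value - floatReadback);

console.log(byteError > 0.001);  // true — visible-banding territory
console.log(floatError < 1e-7);  // true — effectively exact for our purposes
```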
Screw that, let's go with the floats, right? Support for floating point textures is nearly universal on graphics hardware nowadays, and it's reasonable to expect it in WebGL too.
But even the oldest WebGL/OpenGL extensions are *optional*— and if a browser vendor doesn't like one, it's out.
There are two routes to enable floating point textures in WebGL 1: single precision float (the OES_texture_float extension), and half precision float (OES_texture_half_float). While half float textures are widely supported, float textures aren't. Why is this?
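In code, picking a route looks roughly like this — a sketch, written as a function so you can hand it any WebGL 1 context. The extension names are the real Khronos ones; the shape of the return value is just my own convention here.

```javascript
// Try single precision first, fall back to half, then give up and use bytes.
function pickFloatTextureSupport(gl) {
  if (gl.getExtension("OES_texture_float")) {
    return { type: gl.FLOAT, precision: "single" };
  }
  const halfExt = gl.getExtension("OES_texture_half_float");
  if (halfExt) {
    // Half floats use a constant off the extension object, not the context.
    return { type: halfExt.HALF_FLOAT_OES, precision: "half" };
  }
  return { type: gl.UNSIGNED_BYTE, precision: "byte" }; // last resort
}
```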
Reason #1: mobile device hardware has wider support for half floats. They're lower precision, but they save tons of space.
Reason #2: they FUCKED UP the rollout. 😳 By mistake, the float textures extension spec was written too ambiguously, so they patched it with other specs and made a mess.
I'm not kidding. Due to differences between the OpenGL ES and WebGL spec processes, crucial definitions of floating point texture support were left out of the extension spec by mistake, and had to be monkey patched with weird followup extensions. I've read followup threads in the WebGL mailing lists, and people are cursing each other out over this.
At least they sorted all this out in WebGL 2.0, so, we can just wait for that to roll out, right?
Wrong. Apple won't enable WebGL 2.0 yet in any form of Safari.
Why? Well maybe they're revolted by what happened with WebGL 1.0. Maybe they'd rather work on the upcoming replacement API, WebGPU, which is a better fit for modern graphics problems. Maybe they want to limit WebGL's successful use cases, to bolster their native libraries, the jerks!
At least half precision floats are supported on every browser I can find. But they forgot to specify UPLOADING them! So Safari doesn't support THAT!
The good news is, we rarely have to upload float textures to the GPU; normally we just draw things to them. It's just, wow, what a mess, right?
Let's get back to the good stuff. With half float support, we can represent a wide range of data in a texture. To produce special effects, we basically use half float textures to store our in-between steps, and copy the last one to the screen.
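For the curious, here's roughly what creating one of those intermediate render targets looks like in raw WebGL 1 — a sketch, assuming OES_texture_half_float is available. Note the `null` data: we only ever draw into this texture, which is why Safari's missing upload path usually doesn't bite us.

```javascript
// Create a half float texture we can render into, plus its framebuffer.
function createHalfFloatTarget(gl, halfExt, width, height) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // null data: this texture is drawn into, never uploaded to.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, halfExt.HALF_FLOAT_OES, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);
  // Renderability of half float attachments varies by device, so check it.
  const complete =
    gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  return { texture, framebuffer, complete };
}
```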
But what if we choose to never draw to the screen? What if we draw from A to B, then back to A, and so on?
This feedback loop of textures is a mode of general purpose GPU computation. We can configure WebGL to do plain old math for us!
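The feedback loop in miniature, with the WebGL stripped away: two render targets, swapped every frame. `step` here is a stand-in for whatever draw call advances the simulation.

```javascript
// "Ping-pong" between two targets: read from one, write to the other, swap.
function makePingPong(targetA, targetB) {
  let read = targetA;
  let write = targetB;
  return function tick(step) {
    step(read, write);             // sample from `read`, render into `write`
    [read, write] = [write, read]; // swap roles for the next frame
    return read;                   // the freshest texture, ready to display
  };
}

const tick = makePingPong("A", "B");
tick(() => {}); // → "B"
tick(() => {}); // → "A"
```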
But why would we do this— send our homework to the video card, just to crunch some numbers and send them back? Well, some math is very GPU friendly. But also, some math is ONLY used to produce graphics. So we might as well get it all done in one place.
Surprise surprise, my Matrix demo computes the entire effect on the GPU. It's just a picture that redraws itself. 🔄
ThreeJS has implementations of special effects passes (the "EffectComposer" example) and general purpose GPU computation (the "GPUComputationRenderer" example), but as far as I can tell, EffectComposer doesn't enforce floating point precision, and GPUComputationRenderer expects the browser to support SINGLE precision floats, which, as we've discussed, aren't as widely supported as half floats.
We can stick with ThreeJS, or we can try and do better. No offense! But what else can we try?
REGL is a WebGL wrapper that takes a functional approach to drawing things. While ThreeJS and A-Frame's strengths lie in getting your 3D scene/cameras up and running, REGL's strength is in its flexible drawing paradigm.
For instance, my Matrix code has no scene or camera; ThreeJS insists I make both. EffectComposer hides one of each inside it! But REGL only insists that I have a vertex shader, a fragment shader, and the data to feed them with.
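To give you a feel for it, here's roughly what a minimal REGL draw command looks like — illustrative shaders, not my actual Matrix code. The command is just a plain config object; REGL turns it into a draw function.

```javascript
// No scene, no camera: just shaders and the data to feed them.
const fullScreenTriangle = {
  vert: `
    attribute vec2 position;
    void main() { gl_Position = vec4(position, 0.0, 1.0); }
  `,
  frag: `
    precision mediump float;
    void main() { gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); } // matrix green
  `,
  attributes: {
    // One oversized triangle that covers the whole viewport.
    position: [[-4, -4], [4, -4], [0, 4]],
  },
  count: 3,
};
// In a browser:
//   const regl = require("regl")();
//   const draw = regl(fullScreenTriangle);
//   draw();
```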
There's lots more going on in the matrix demo— MSDFs, a homemade bloom pass, and color dithering— but I've already rambled quite a bit. So if you've got any questions for me about anything, I'd be happy to answer them 😁, but I'll save the other topics for future threads