
Learning OpenGL Shaders and Shader Language

30 September 2024 · Career

It’s been a wild ride into the world of shaders recently. I’ve been working on expanding my skills beyond the standard materials in three.js, getting my hands dirty with GLSL (OpenGL Shading Language) and shader materials. While I’ve already got a bit of shader knowledge under my belt from Unity’s Shader Graph and Blender’s Shader Nodes, it’s been an entirely different ball game working directly in shader code. It’s been challenging, but also incredibly rewarding!

Screenshot of a 3D three.js project that utilises GLSL shader materials to displace heights and animate arc lines

My Shader Journey So Far

If you’ve dabbled with Unity’s Shader Graph or Blender’s Shader Nodes before, you’ll know that they provide a fairly intuitive visual way to work with shaders. You can piece together nodes to manipulate UV coordinates, modify colours, or distort geometry. But GLSL? It’s pure code. That’s where things got difficult, but in a good way.

In my case, a good chunk of my foundational shader knowledge comes from game development, specifically when I started working on my own open-world game project during lockdown. The game is based on an expansive ocean filled with randomly generated islands. To make the ocean feel dynamic and alive, I needed to dive into displacement techniques, landmass generation and wave simulations.

I went from generating procedural land meshes at runtime, using Perlin noise to create height maps for islands, to simulating Gerstner waves for “realistic” ocean water movement. These were my first experiences using vertex displacement to manipulate geometry.
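The height-map idea is simple enough to sketch in plain JavaScript. This is a minimal value-noise stand-in for true Perlin noise, with illustrative function names and values throughout, not code from my actual game:

```javascript
// Deterministic pseudo-random value in [0, 1) from integer grid coordinates.
// The sin-based hash is a classic shader trick, used here for brevity.
function hash2(x, y) {
  const s = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return s - Math.floor(s);
}

// Hermite smoothing so the blend eases in and out between lattice points.
function smoothstep(t) {
  return t * t * (3 - 2 * t);
}

// Bilinearly blend the four surrounding lattice values for a smooth field.
function valueNoise(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const u = smoothstep(x - xi), v = smoothstep(y - yi);
  const top = hash2(xi, yi) * (1 - u) + hash2(xi + 1, yi) * u;
  const bot = hash2(xi, yi + 1) * (1 - u) + hash2(xi + 1, yi + 1) * u;
  return top * (1 - v) + bot * v;
}

// Sample a size-by-size grid of heights from the noise field.
function heightMap(size, scale) {
  const heights = [];
  for (let y = 0; y < size; y++) {
    const row = [];
    for (let x = 0; x < size; x++) {
      row.push(valueNoise(x * scale, y * scale));
    }
    heights.push(row);
  }
  return heights;
}
```

A real island generator would typically layer several octaves of this at different frequencies and apply a radial falloff to pull the edges down to sea level.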

What I’ve Learned So Far

Here are a few things I’ve picked up as I’ve ventured into shaders in both Unity and GLSL:

1. UV Coordinates

Working with UV coordinates was one of the first hurdles. You might remember from 3D modelling that UVs are used to map 2D textures onto a 3D surface. In shaders, UV coordinates become crucial when manipulating textures, lighting, or other effects across a surface. I found this particularly useful when creating procedural textures or when animating textures to simulate water movement.
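Animating UVs turned out to be one of the most direct applications. Here's a minimal fragment shader sketch, assuming the application supplies a time uniform and a tiling water texture; the uniform and varying names are illustrative:

```glsl
uniform float uTime;
uniform sampler2D uWaterTexture;
varying vec2 vUv; // interpolated UVs handed over from the vertex shader

void main() {
  // Scroll the UVs along x over time; fract() keeps them tiling in [0, 1].
  vec2 scrolled = fract(vUv + vec2(uTime * 0.05, 0.0));
  gl_FragColor = texture2D(uWaterTexture, scrolled);
}
```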

2. Fragment Shaders for Colour Manipulation

Fragment shaders control the colour of each pixel on the screen, and they run in parallel on the GPU, handling huge numbers of calculations at once. This became invaluable when working with materials like water. In Unity, I learned how to manipulate colour to simulate effects like refraction and reflection, and that knowledge transferred nicely into GLSL.
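As a flavour of what that colour work looks like in code, here's a sketch of a depth-tinted water fragment. The uniforms and the vDepth varying are assumed to be set up by the application, not taken from any real project:

```glsl
uniform vec3 uShallowColor;
uniform vec3 uDeepColor;
varying float vDepth; // assumed: 0.0 at the surface, 1.0 at maximum depth

void main() {
  // Blend between the two water colours based on depth.
  vec3 color = mix(uShallowColor, uDeepColor, clamp(vDepth, 0.0, 1.0));
  gl_FragColor = vec4(color, 1.0);
}
```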

3. Vertex Shaders for Displacement

As I mentioned, one of the most exciting things was learning how to displace vertices to create wave simulations. Unity’s Shader Graph gives you an easy visual way to do this, but when coding it directly in GLSL, you have more control and can fine-tune every aspect. I started off using basic sine waves for wave motion, and then explored Gerstner waves to get a more natural, less predictable flow of the water.
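Unlike a plain sine wave, a Gerstner wave also pushes points horizontally, which bunches them up into sharper crests. The core formula is compact enough to sketch in plain JavaScript; the parameter names and sample values are illustrative, not my actual wave settings:

```javascript
// Displace a point (x, z) on a flat plane by a single Gerstner wave at time t.
// Horizontal as well as vertical displacement is what sharpens the crests.
function gerstnerWave(x, z, t, { dirX, dirZ, steepness, wavelength, speed }) {
  const k = (2 * Math.PI) / wavelength;        // wavenumber
  const len = Math.hypot(dirX, dirZ);
  const dx = dirX / len, dz = dirZ / len;      // normalised wave direction
  const f = k * (dx * x + dz * z) - speed * t; // wave phase at this point
  const a = steepness / k;                     // amplitude derived from steepness
  return {
    x: x + dx * a * Math.cos(f),
    y: a * Math.sin(f),
    z: z + dz * a * Math.cos(f),
  };
}
```

In a shader this runs per vertex in GLSL, usually summing several such waves with different directions and wavelengths to break up the repetition.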

4. Shaders in Three.js

Transitioning into GLSL for three.js was a bit like stepping off a guided path into the wilderness. While Shader Graph shields you from a lot of the low-level complexity, GLSL forces you to think deeply about things like matrix transformations, normals, and lighting models. On the flip side, this also means I now have a lot more flexibility to create custom effects beyond what Unity’s node-based system offers. In three.js, it’s straightforward to integrate custom shaders into the framework, giving me far more creative freedom to design visuals that fit my project’s needs.

What’s Been Hard

It’s not all smooth sailing though—there have been some hard lessons along the way.

1. The Maths (Honestly, who knew there was so much of it!)

I think anyone who’s tackled shaders has run into this one. There’s a lot of linear algebra involved, especially when you start moving vertices around in 3D space. Things like dot products, cross products, and matrix multiplications are part of everyday shader work. While I’ve had some experience with these in Unity, dealing with it all in raw GLSL has definitely been a challenge.
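For reference, here are the two operations I reach for most, written out in plain JavaScript as illustrative helpers rather than code from any particular library:

```javascript
// Dot product of two 3-component vectors: a measure of how aligned they are.
// In shaders this shows up everywhere, e.g. diffuse lighting is dot(normal, lightDir).
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Cross product: a vector perpendicular to both inputs. Crossing two edge
// vectors of a triangle is how you compute its face normal.
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}
```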

2. Debugging Shaders

Debugging OpenGL shaders is… awkward. In Unity, you have some tools to help you visualise what’s going on. But in GLSL, it’s a different story. You don’t have the luxury of print statements, and finding out where things go wrong often means staring at a black screen or a garbled mess of pixels until you figure out why. I’ve found myself going through far more trial and error than I’m used to.
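The workaround I lean on most is to treat the output colour itself as the debugger: write the value you want to inspect straight into gl_FragColor. For example, to sanity-check normals (vNormal here is an assumed varying from the vertex shader):

```glsl
varying vec3 vNormal;

void main() {
  // Remap the normal from [-1, 1] into [0, 1] and view it as RGB.
  // Faces pointing up should look green-ish, right-facing ones red-ish, etc.
  gl_FragColor = vec4(vNormal * 0.5 + 0.5, 1.0);
}
```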

3. Performance Considerations

Writing shaders that look cool is one thing, but writing shaders that perform well on the web is a whole other story. I’ve learned the hard way that overcomplicating shaders can tank performance, especially on lower-end devices. Managing things like how many calculations you’re doing per frame, and optimising the number of texture lookups, has been a steep learning curve. It’s a balancing act between achieving the visual effect I’m going for, all while making sure there isn’t noticeable lag.
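One trick that has paid off is moving work from the fragment shader up to the vertex shader, since there are usually far fewer vertices than fragments. A sketch of the idea, using the matrices and position attribute that three.js injects into shader materials (uCameraPos and the fog maths are illustrative):

```glsl
varying float vFog;
uniform vec3 uCameraPos;

void main() {
  vec4 worldPos = modelMatrix * vec4(position, 1.0);
  // distance() runs once per vertex here; the rasteriser interpolates vFog,
  // so the fragment shader gets the value essentially for free.
  vFog = clamp(distance(worldPos.xyz, uCameraPos) / 100.0, 0.0, 1.0);
  gl_Position = projectionMatrix * viewMatrix * worldPos;
}
```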

What’s Next?

As I continue down this shader rabbit hole, my next step is to refine my understanding of blending modes and post-processing effects. I’ve only scratched the surface of how colours can mix in GLSL shaders. While Unity does a lot of the heavy lifting for you in terms of post-processing, in GLSL it’s entirely up to me to implement my own processing from scratch. I want bloom!
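Bloom is usually built as a chain of post-processing passes, and the first step is simple enough to sketch: a "bright pass" that keeps only the pixels above a luminance threshold, ready to be blurred and added back over the scene. Here uScene and the 0.8 threshold are illustrative:

```glsl
uniform sampler2D uScene; // the rendered frame, supplied by the application
varying vec2 vUv;

void main() {
  vec3 color = texture2D(uScene, vUv).rgb;
  // Perceptual luminance (Rec. 709 weights).
  float luma = dot(color, vec3(0.2126, 0.7152, 0.0722));
  // Keep only the bright parts; everything else goes to black.
  gl_FragColor = vec4(luma > 0.8 ? color : vec3(0.0), 1.0);
}
```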
