Is "shaders" a name that doesn't work anymore? I always perceived them as after-effect functions you apply to a scene, i.e. shading. But they seem to be little programs you can run on your GPU? I see more and more examples of shaders being used for things that really aren't about taking an output scene and making it pretty.
Or did shaders always have this extended role?
There was never a time they weren't Turing complete (e.g. you could always run the same NAND shader iteratively on the same pixel buffer and compute whatever you like, using pixels as binary storage), but once GPUs moved away from fixed pipelines it became easier to do whatever you wanted with them. Nowadays there are also compute shaders, which are what you're describing: little functions that run on the GPU with no intention of actually shading anything, or even working with graphical entities at all.
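To make the NAND argument concrete, here's a minimal CPU sketch in Python (not actual shader code, and the pass/slot layout is just an illustration): treat a pixel buffer as binary storage and run the same per-pixel NAND "shader" repeatedly; since NAND is functionally complete, chaining passes computes any boolean function. Here four passes build XOR.

```python
# Sketch of the "NAND shader" idea: a pixel buffer holds bits, and each
# pass applies the same NAND operation to chosen pixels. Since NAND is
# functionally complete, chaining passes computes any boolean function.

def nand(a, b):
    return 1 - (a & b)

def nand_pass(buf, writes):
    """One 'shader pass': write NAND of two source pixels into a target pixel."""
    out = list(buf)
    for dst, (i, j) in writes.items():
        out[dst] = nand(buf[i], buf[j])
    return out

def xor_via_nand(a, b):
    # Pixel buffer as binary storage: slots 0,1 hold inputs, 2-4 are scratch.
    buf = [a, b, 0, 0, 0]
    buf = nand_pass(buf, {2: (0, 1)})   # t = NAND(a, b)
    buf = nand_pass(buf, {3: (0, 2)})   # u = NAND(a, t)
    buf = nand_pass(buf, {4: (1, 2)})   # v = NAND(b, t)
    buf = nand_pass(buf, {0: (3, 4)})   # NAND(u, v) = a XOR b
    return buf[0]

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_via_nand(a, b))  # truth table of XOR
```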
Shaders have always been little programs that run on your GPU; it's just that their original purpose was to operate on and then output a texture/set of pixels, which you would render into your scene. A texture/pixel set is just bits at the end of the day, though, so it can represent data other than colors/alpha as well.
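A quick CPU-side illustration of "a texture is just bits" (a sketch, not any particular graphics API): pack arbitrary 32-bit integers into 4-byte RGBA texels and recover them; the "colors" are only one interpretation of the bytes.

```python
# A texture is just bytes: pack arbitrary 32-bit unsigned ints into
# RGBA texels, then recover them. The "colors" are only an interpretation.
import struct

def to_rgba(value):
    """Encode a 32-bit unsigned int as one RGBA texel (4 bytes)."""
    return tuple(struct.pack("<I", value))

def from_rgba(texel):
    """Decode an RGBA texel back into the stored integer."""
    return struct.unpack("<I", bytes(texel))[0]

data = [7, 1_000_000, 0xDEADBEEF]
texture = [to_rgba(v) for v in data]      # a 3x1 "texture" of texels
assert [from_rgba(t) for t in texture] == data
```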
Shaders are the highly parallel part of graphics programming, so they are where non-graphics programs do their highly parallel work on GPUs. GPUs were traditionally designed and used for graphics work, but have also been put to other uses like neural nets and cryptocurrency mining; when those got popular, GPU vendors started supporting non-graphics, non-shader parallel programming with models like CUDA.
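The programming model those frameworks expose can be sketched on the CPU (in Python, with a serial loop standing in for the parallel launch): a small "kernel" function runs once per index, each invocation independent of the others, with no graphics anywhere. The SAXPY kernel below is a standard example, not CUDA code.

```python
# CPU sketch of the data-parallel model GPUs expose: a small "kernel"
# runs once per index, and every invocation is independent. On a real
# GPU, thousands of these run at once; here we just loop over indices.

def saxpy_kernel(i, a, x, y, out):
    """One 'thread': compute a single element of a*x + y."""
    out[i] = a * x[i] + y[i]

n = 8
a = 2.0
x = [float(i) for i in range(n)]
y = [1.0] * n
out = [0.0] * n
for i in range(n):              # the GPU would launch all i in parallel
    saxpy_kernel(i, a, x, y, out)
print(out)  # [1.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 15.0]
```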
https://gamedev.stackexchange.com/questions/136029/whats-the...