
You are currently reading a thread in /v/ - Video Games

Thread replies: 55
Thread images: 2
How do WebGL and GLSL interact with each other?
Like, does WebGL tell GLSL "draw these pixels with your shader", or does it tell GLSL "Here's some information now draw whatever pixels you want"?
>>
By the way, just to make things more confusing, people appear to be rendering 3D scenes using nothing but GLSL: http://glslsandbox.com/e#32799.3

Speaking of which, is "uniform float time;" something that only works for that particular website, or is it fundamental in GLSL?
>>
>>338334460
>people appear to be rendering 3D scenes using nothing but GLSL
Wouldn't that be really inefficient?
>>
>>338334460
>is "uniform float time;" something that only works for that particular website, or is it fundamental in GLSL?
It's GLSL code. uniform is a storage qualifier, and that line declares a float variable called 'time': https://www.opengl.org/wiki/Uniform_%28GLSL%29
>>
>>338335086
http://glslsandbox.com/e#32783.0
I don't even have a great graphics card, yet I can get this to run smoothly at 1-pixel-per-pixel at 1080p.

>>338335243
Does that mean that when I write "uniform float time" I reference the same value that the clock uses, some absurdly large number that increases by 1 per second?
>>
>>338333847
gl interfaces with program-lifetime data. glsl does per-frame things with that data.

for example >>338334460 you take program-lifetime data (a float), label it "time" in gl, and tell gl to use it as uniform data for all shaders. gl then takes the shader and runs it for a frame with that input data.
>>
>>338335584
Lol I just happened to be playing around with that one. It's pretty amazing. There are limits to doing it with just GLSL code though. I might be wrong, but I think texturing is impossible with only GLSL.

>>338335584
I think you might be a bit confused with it. If you have experience with C, it's sorta just like defining a variable:
float time;
So it's not actually getting the time or anything there. How this is defined is purely up to the creator of the program. You could just say
time = 1.0 and leave it at that
Or you could get it as an input from C or something (I have zero experience with WebGL so idk what you'd use to pass in variables to shaders).
>>
>>338336032
That doesn't explain very much about how WebGL and GLSL communicate.
Like, is a polygon declared by WebGL the window in which GLSL operates and is limited to, or does WebGL describe the polygon and GLSL is written to draw it?

>>338336087
So I can do "float time" and time will not change, but by adding "uniform" it will check to see if there's an inbuilt time variable?
>>
>>338336548
>So I can do "float time" and time will not change, but by adding "uniform" it will check to see if there's an inbuilt time variable?

I'm not entirely sure. Maybe time is a WebGL thing, but if that were plain OpenGL you would be passing time in as input from your C/C++ code if you wanted to use it. You would have to calculate the time yourself in the C++ code and then use that.
>>
>>338337140
Basically the uniform thing is pretty much saying that you can't set that value from inside the shader. If you look at that GLSL code for the sick ass ocean thingy you'll see that they are also defining a variable in their main called "time", but that one is just local to the main function. If they had tried to re-assign the global uniform time variable there, it wouldn't have compiled.
>>
>>338336548
there is no such thing. its just data and arrays/streams of data

you do with it what you want in shaders. glsl shader stages are really just conventions over general gpu programs (think opencl kernels). minimally vertex then fragment, by convention. in reality its just shader->shader->shader->shader->shader->etc.

concepts of polys are just virtual concepts. that being said, data can be conceptualized either in gl or in the shader. you can throw an array of floats into gl, tell gl to conceptualize them in chunk lengths of 3/4, then in the shader think of them as vectors, or use convenience/context labels i.e. vec3. it will feed in each chunk then run the shader on it.
or you can feed the entire array to a shader that consumes 3/4 chunks, then feed that to the next shader and call it a geometry shader. etc. its really up to you.
>>
>>338337457
I noticed that, but does that mean that if they didn't have "float time" in main they wouldn't be able to reference the uniform time in main either? Or would they be referencing the uniform time, and what's happening is that the non-uniform time defined locally takes priority?
>>
>>338337823
If instead of:
float time = time * 0.3 + mouse.x*0.5;
they had:
time = time * 0.3 + mouse.x*0.5;
Then they would be attempting to assign to the global uniform variable time. GLSL does not allow this and would result in a compiler error. Try it in that ocean thingy if you want.
Sidenote: I'm still trying to figure out what exactly that line does, changing it doesn't seem to do anything.
>>
>>338337683
So the GLSL script runs for every single pixel on the canvas every frame, and it's up to GLSL to look at the data WebGL gave it and figure out when the pixel corresponds to a polygon?
Sounds like it would be difficult to efficiently figure out if a pixel has a corresponding polygon.
>>
>>338338130
only in the fragment (last) shader. at that point its just a color/out_color/gl_FragColor label (whatever label; the latter is the builtin one)
and thats assuming the fragment uvs == the resolution uvs
>>
>>338338130
In GLSL there are a few types of shaders. The two basic ones are the Vertex Shader:
https://www.opengl.org/wiki/Vertex_Shader
and the Fragment Shader (known as the pixel shader in Direct3D):
https://www.opengl.org/wiki/Fragment_Shader

You can see the others here (https://www.opengl.org/wiki/Shader#Stages), but those two are what you need for a basic understanding of how it works.

A vertex shader is run per vertex from the polygon that you input into it. A fragment shader is run for each pixel on that polygon.

The type of shader being used in that ocean example is a fragment shader. There are no polygon definitions at all in that example.
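For reference, the boilerplate on that site is essentially just a fragment shader like this, run across the whole canvas (a made-up sketch, not the actual ocean code; the time/resolution uniforms are the ones the site supplies):

const fragSrc = `
precision mediump float;
uniform float time;       // supplied by the site every frame
uniform vec2 resolution;  // canvas size in pixels

void main() {
    vec2 uv = gl_FragCoord.xy / resolution; // where this pixel is, 0..1
    // any function of (uv, time) works; this one just animates bands
    gl_FragColor = vec4(vec3(0.5 + 0.5 * sin(uv.y * 40.0 + time)), 1.0);
}`;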
>>
>>338338596
That's useful info, but it doesn't answer what I asked!
>>
>>338339186
Sorry yeah, I got carried away. All the fragment shader does is compute a colour for that pixel; it just sort of 'knows' where it is on screen/which polygon it belongs to. From what I can tell, in the ocean thingy that's only dependent on the time, the resolution, and the mouse position.

gl_FragColor is required to be written by the fragment shader and corresponds to the returned colour value.

Pretty sure I've gone off on a tangent again but there have some shit cunt.
Also I just realised I'm on /v/, the fak. Since when was I here.
>>
>>338339678
Ah, so the vertex shader is if you want something simple that has a colour gradient between 3 or 4 vertices (is it only 3 or 4?), and the fragment shader is if you want something slower but more complex?

In which case, is glslsandbox a thing that just does a big gl_FragColor polygon across the entire screen?
Also, if one polygon is completely hidden behind another, will that be handled automatically too, or will both render (only showing the one in front) thus slowing performance?
>>
>>338339186
there is no 1:1 correlation between shaders and hardware until you bind them.

you can feed in 4 vec2s and a uniform color. thats all they will be. absolutely meaningless. only once you feed in more data, like resolution, can they mean anything on screen. at that point you can bind them to 4 uvs on the resolution, and then they exist on the screen.
or you can calculate them against time and have time decide which 4 fragments of color happen to be certain pixels.
or you can throw the output data into another shader that turns each of the 4 vec2s and a color into 12 vec2s of a certain color for each of the original 4 vec2s (or polygons)
>>
>>338333847
WebGL is meant to be a way to tell the browser how to create and handle a graphics context. This includes loading model information, textures, and other control items like uploading uniforms and shaders. If your browser were the car, then WebGL is everything in the cabin (the steering wheel, ignition slot, shifter, radio controls, etc.).

GLSL is like a scripting language. You feed WebGL GLSL shader microprograms. Those shaders are then compiled and sent to the graphics hardware.

As mentioned earlier in the thread, GLSL provides fine-tuned control over different parts of the rendering pipeline. Tessellation shaders will modify raw meshes. Vertex shaders will manipulate the geometry of a mesh before it is rendered, to do stuff like scale the mesh, rotate it in space, and translate its location. Often, the model-view matrix is applied in a vertex shader to do the above.

Finally, fragment shaders perform the actual rendering of an object. You can think of fragments as pixels. Each individual pixel has its value computed in some way or form; lighting, for instance, applies a per-pixel adjustment to the brightness of a pixel based on some calculation.

In this way, GLSL is the engine in your car. When WebGL steps on the gas, GLSL starts to rev. The movement of the car is the pixels being displayed on screen.

To summarize, WebGL is a graphics-resource control system, and GLSL is a rendering control system. The two communicate with one another, but WebGL and GLSL operate on different pieces of hardware (CPU vs. GPU), and so don't have direct access to one another's resources.

>Like, does WebGL tell GLSL "draw these pixels with your shader", or does it tell GLSL "Here's some information now draw whatever pixels you want"?

Both are correct assumptions.

WebGL will tell the graphics hardware when to render. WebGL will also send information about the things to render, including the shaders and resources themselves.
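Concretely, the "communication" boils down to a handful of WebGL calls. A bare-bones TypeScript sketch (error checking omitted; vertSrc and fragSrc stand in for your GLSL source strings):

const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const gl = canvas.getContext("webgl") as WebGLRenderingContext;

function compile(type: number, src: string): WebGLShader {
    const shader = gl.createShader(type) as WebGLShader;
    gl.shaderSource(shader, src); // hand the GLSL text over to WebGL
    gl.compileShader(shader);     // the driver compiles it for the GPU
    return shader;
}

const program = gl.createProgram() as WebGLProgram;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vertSrc));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragSrc));
gl.linkProgram(program);

gl.useProgram(program);            // "use these shaders..."
gl.drawArrays(gl.TRIANGLES, 0, 3); // "...and render now"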
>>
>>338340053
A vertex shader is often called the "geometry shader." It processes the actual points that make up a mesh.

When a mesh is rendered without any input from a vertex shader, the mesh will be rendered without any rotation or size-scaling, in the exact center of the rendering space -- "all zeros". If the camera isn't looking at the origin, then the mesh will not appear on screen.

The vertex shader looks at each and every vertex -- every point -- and moves it based on information sent to it from the CPU-side control software, WebGL in this case. So, if you have a simple three sided pyramid (four vertices), the vertex shader will run four times for that mesh.

Vertex Shaders and Fragment/Pixel Shaders are often used at the same time. Some games will use a common Vertex Shader for most objects, so you'll often see people just writing fragment shaders and letting the game engine handle everything else.
>>
>>338340960
>A vertex shader is often called the "geometry shader
Nope, the vertex shader and the geometry shader are two different types of shaders.
>>
>>338340960
As far as I can tell, vertex shaders and fragment shaders have no difference between them aside from what is coded by the programmer.
I'm trying to learn from this unfortunately rather uninformative tutorial: http://www.webglacademy.com/courses.php?courses=0|1|20|2|3|4|23|5|6|7|10#1
>>
>>338333847
what game is this?
>>
File: what is ____.png (47 KB, 400x287)
>>338341523
>>
>>338341358
https://aerotwist.com/tutorials/an-introduction-to-shaders-part-1/

look at one and two. that should be able to break the misconceptions you have about how shaders work. a change in perspective will make things clearer. then try looking at opencl. its the broader view of how shaders work
>>
>>338333847
The latter

>>338336548
A uniform is a value that is the same for every invocation of a shader program within a glDraw call (at least in OpenGL). The website is passing a value to this uniform variable using a glUniform1f call.
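In WebGL the equivalent looks something like this (a rough sketch; it assumes gl is your WebGL context and program your already-linked shader program):

const timeLoc = gl.getUniformLocation(program, "time");
const start = performance.now();

function frame(): void {
    gl.useProgram(program);
    // one value, shared by every vertex/fragment this draw call touches
    gl.uniform1f(timeLoc, (performance.now() - start) / 1000);
    gl.drawArrays(gl.TRIANGLES, 0, 6); // redraw with the new time
    requestAnimationFrame(frame);
}
requestAnimationFrame(frame);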
>>
>>338339678
>it just sort of 'knows' where it is on screen/which polygon it belongs to

Remember that modern computers process information with several different layers of memory. The mesh currently being rendered is replicated in common cache, transformed, then fragments are interpolated for a given triangle and the cores are set loose to run pixel shaders.

>>338341174
Good call. In any case, vertex shaders can be used to manipulate mesh geometry. Skinning and animation can be done in a vertex shader.

>>338341358
There's a discernible difference between the two.

Vertex shaders prepare a naked mesh for rendering. The mesh is scaled to the desired size, rotated, and moved around the scene. Other things can be done. If you want a mesh to be rendered far away in the distance, the Vertex Shader handles that.

Fragment Shaders determine what color a given on-screen dot should be. If you wanted a mesh to be rendered in flat red, the Fragment Shader will do that.

It's important to note that shaders are typically expected to return an output. Vertex shaders are expected to return a 4-dimensional vector (X,Y,Z,W), while fragment shaders are expected to return a four-dimensional vector (R,G,B,Alpha).

That's not entirely correct; vertex shaders typically return a struct with the four-dimensional position and a couple of other parameters that nobody really cares about. Some other shader systems let you return more complex or customized structs so that it's easier to move information around.
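For example, a typical vertex shader of that kind (a sketch; the attribute/uniform names here are just conventions, not anything GLSL requires):

const vertSrc = `
attribute vec3 position;          // one vertex of the mesh, pulled from a buffer
uniform mat4 modelViewProjection; // scale/rotate/translate + camera, from the host

void main() {
    // the required vec4 output: where this vertex lands in clip space
    gl_Position = modelViewProjection * vec4(position, 1.0);
}`;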
>>
>>338341358

This is entirely incorrect.

A vertex shader runs once per vertex.

A fragment shader runs once per every mapping of some element to an individual pixel in the frame buffer.
>>
>>338342548
Do vertex shaders return a 4-dimensional vector for every pixel on the screen?
And what is done with those vectors?
>>
>>338341815
Bloodsport.tv
>>
>/v/ is actually full of people who know graphics programming yet every thread about graphics is full of terrible misinformed opinions

What the fuck?
>>
>>338342715

>Do vertex shaders return a 4-dimensional vector for every pixel on the screen?

No, this is what the fragment shader does. The vertex shader runs once for every vertex you are rendering.
>>
>>338342767
You get some anomalies like that.
Like, if you make a Ring Runner thread, you've got a 10% chance that you'll find another Ring Runner player, but once I had a Ring Runner thread reach 5 unique IPs.

>>338341523
Ring Runner.

>>338342741
I tried going to that URL, got nothing.

>>338342874
Alright, but what do you do with the output vector?
Does it just tell WebGL what section of the screen to apply the fragment shader to?
>>
>>338343075

>Does it just tell WebGL what section of the screen to apply the fragment shader to?

Pretty much. There are actually steps in between the vertex and fragment shaders: primitive assembly constructs the triangles (on desktop GL an optional geometry shader can run here too), and the rasterizer then fills those triangles in with pixels, so to speak, which the fragment shader then colours.
>>
>>338342767
thread's full of comedy
>>338343075
>Does it just tell WebGL what section of the screen to apply the fragment shader to?
its actually in the IO structs between the shaders. thats where the uniforms and varyings come from, and where the output data you set goes. also why out labels must match in labels
>>
>>338343275
>>338343532
In that case, where do other properties of the triangle/quad come from?

Say that I try to render a red triangle and a green triangle; if the vertex shader says to the fragment shader "render inside these two sets of three vertices", where does the colour of the triangles come in?
>>
>>338344073
You pass that as a uniform to the vertex shader which passes it along to the fragment shader.
>>
>>338344173
(assuming you want the same color for all fragments in that mesh)

Really I'd just recommend going through https://open.gl once. It won't make you a wizard or anything, but it should help you get the fundamental concepts of OpenGL, which in turn mostly translate to WebGL. It's kind of a pain in the ass, but that's simply what graphics are.
>>
>>338344073
assuming in webgl you sent them as 2 sets of 3 verts, each set with a separate color.

if you dont tell webgl that they are triangles, the shader will run once per set, and in the shader you will have to output 3 verts and their color.

if you tell webgl that these are triangles, the vert shader runs once per vert. you output a position vec2, and you calculate and pass along the color.
this passes to the fragment shader. assuming the resolution matches the framebuffer, the fragment shader runs on every fragment/pixel inside the declared triangle (or your own diy shader calculations) and outputs gl_FragColor. this may or may not match a specific pixel, but lets just say we want to stop here and each output fragment is predefined to match one pixel.

if you did something in the frag shader like color = color * (uv.y/in_pos.y);
each fragment closer to renderbuffer.y = 0 will be darker. top or bottom depends on how you have it set up.
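roughly, in webgl1 glsl that hand-off looks like this (just a sketch, the names are arbitrary):

const vert = `
attribute vec2 position;
attribute vec3 color;
varying vec3 vColor;    // out label, must match the in label below

void main() {
    vColor = color;     // pass the per-vert color along
    gl_Position = vec4(position, 0.0, 1.0);
}`;

const frag = `
precision mediump float;
varying vec3 vColor;    // arrives interpolated across the triangle

void main() {
    gl_FragColor = vec4(vColor, 1.0);
}`;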
>>
Are there any examples of advanced WebGL usage in games?
I've never seen anything impressive in games, but GLSLsandbox truly is a wonder.
>>
>>338346174
Never seen any "real" games targeting WebGL directly, but Bastion can/could run in your browser using it.
>>
>>338342767
>/v/ is 8 people

>>338344073
Here's a secret; how your mesh is represented in memory is entirely up to you. You can put anything into graphics memory that you want, so long as you feed the hardware what it's looking for -- which is just three coordinates for each vertex. Everything beyond that is techno-wizardry. You can arrange your mesh information as a big string of characters and choose only to feed it some small section of that string as your "triangle information."

For instance, meshes these days are often handled as a sort of "complex vertex." A simple triangle might look like this in memory (the brackets are there to separate vertices and are entirely for readability):

[0, 1, 2, 0.5, 0.7, 0.5, 0.00, 0.00], [1, 2, 0, 0.5, 0.7, 0.5, 0.00, 0.00], [2, 0, 1, 0.5, 0.7, 0.5, 0.00, 0.00]

What the programmer is actually thinking is this:

[X, Y, Z, R, G, B, U, V], [X, Y, Z, R, G, B, U, V], [X, Y, Z, R, G, B, U, V]

The programmer then feeds the graphics card a series of indices that indicate which points are used in a certain triangle. This is called a "buffer data" array. It'll look like this:

[0,1,2],[8,9,10],[16,17,18]

These are the indices for the different values in memory. They line up with the X,Y,Z components in our complex vertices.
[0, 1, 2, 3, 4, 5, 6, 7],[8, 9, 10, 11, 12, 13, 14, 15],[16, 17, 18, 19, 20, 21, 22, 23]

Since all this is still located in graphics memory, we can still access all that other information. We just need to tell WebGL to keep track of it by defining another "buffer data" array, which we can then summon and access inside of a shader.

Look up Vertex Buffer Objects (VBO).
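In actual WebGL calls, the layout above comes together something like this (a sketch; it assumes gl and a linked program already exist, and the attribute names are made up -- UVs left out for brevity):

// one triangle, interleaved [X,Y,Z, R,G,B, U,V] per vertex, as above
const data = new Float32Array([
    0, 1, 2, 0.5, 0.7, 0.5, 0.0, 0.0,
    1, 2, 0, 0.5, 0.7, 0.5, 0.0, 0.0,
    2, 0, 1, 0.5, 0.7, 0.5, 0.0, 0.0,
]);

const vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);

const stride = 8 * 4; // 8 floats per vertex, 4 bytes each
const posLoc = gl.getAttribLocation(program, "position");
const colLoc = gl.getAttribLocation(program, "color");
gl.enableVertexAttribArray(posLoc);
gl.enableVertexAttribArray(colLoc);
gl.vertexAttribPointer(posLoc, 3, gl.FLOAT, false, stride, 0);     // X,Y,Z
gl.vertexAttribPointer(colLoc, 3, gl.FLOAT, false, stride, 3 * 4); // R,G,B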
>>
>>338346450
>mfw I forgot normals, tangents, and secondary uv coordinates
>>
I'd be fucked if I ever had to explain this stuff in an interview.

But /v/, you're different. I went to school to learn this. You're the only one who understands.
>>
>>338346174
in games, no. games' visual trickery, yes. but still based on old techniques.
the problem is getting the data back out of the card to do something with it. its very heavy.

in the future that will change as more people use opencl. they can leave the data in the card and futz with it there.
>>
>>338344073
I reread your question; the answer is a bit of a hydra. Lots of ways to do it.

>>338344173
This is one way. It's the most flexible method.

Essentially, a Uniform is some form of data that you're sending to the graphics card. A fragment shader might have a section where it requests that Uniform by name, which then allows it to receive that information from the CPU. Uniforms can include textures, colors, magic numbers that describe how shiny an object is, whatever.

Textures are a very interesting way to pass around information. Look up toon ramps and normal mapping. You can effectively pass per-pixel 4D vectors to a pixel shader through a texture, the actual data being whatever the hell you want it to be.
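On the GLSL side that's just a sampler lookup (a sketch; 'ramp' and 'vUv' are made-up names):

const frag = `
precision mediump float;
uniform sampler2D ramp; // whatever data the host packed into a texture
varying vec2 vUv;       // texture coordinate from the vertex shader

void main() {
    // a texel is just 4 numbers; here we read them straight out as a color
    gl_FragColor = texture2D(ramp, vUv);
}`;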

>>338346450
This is another method, but it is generally reliant on conventions in how files are exported from modelling programs.

A third method, and one that is terrible, is to just tailor-craft special fragment shaders to use certain colors. You can write a "red shader," a "green shader," and a "blue shader" and interchange them as you desire. If you want more colors, you write more shaders.

Not a good method, but would have better performance and memory usage than a shader that relies on uniforms or information supplied with a data buffer.
>>
>>338347582
Was OpenCL a typo, or is it something I haven't heard of?
>>
>>338348261
OpenCL is a parallel computing thing, basically allowing you to write regular programs to be executed on your GPU. So you might use it for AI or computer vision stuff or physics, etc.
>>
>>338348261
OpenCL is a thing, but he probably meant OpenGL.

If you want to have a bad time, start reading about Cg shaders.

The Unity reference has some decent material on generic shader writing. However, Unity uses a custom shader compilation system that's compatible with Cg shaders or raw GLSL/HLSL shaders, so it can get a little confusing as a learning tool.

http://docs.unity3d.com/Manual/SL-SurfaceShaderExamples.html
>>
>>338348261
Not the guy you're talking to, but OpenCL is essentially a library that allows you to access the general purpose computing capabilities of your GPU more easily in combination with other processing units in your computer.
>>
>>338347582
I'm kind of confused, you say that it's hard to apply to games, but I don't see why it should work well on GLSLsandbox but not in an actual application.
>>
>>338348792
Most people don't actually like doing manual draw calls. That's why everyone's moving to engines; let someone else handle that complexity.

WebGL is sort of like a peacock in a chicken coop. Very few people who write websites are experienced graphics programmers, so the power and flexibility that WebGL affords is sort of wasted, because it's mostly in front of people who don't really want to deal with anything that ornery and low-level.

Again, most people just stick to engines.
>>
>>338348792
because its all visual. theres no data backing it. no collision, no physics, no data. none of it touches the backing data it was built from.