ARB_vertex_program and EXT_vertex_shader differences

Ostsol

New member
I've finally managed to get my fragment program working properly. However, I'm still using EXT_vertex_shader and I want to try to implement ARB_vertex_program. The problem is that it is quite different. . . With EXT_vertex_shader and ATI_vertex_array_object I am able to include two non-standard vertex attributes: the binormal and tangent vectors, which are used to find the light vector in tangent space. The problem is that I'm not sure if there's an easy way to do this with ARB_vertex_program. Any suggestions?
 
Argh. . . It looks like VertexAttribPointerARB(..) is involved in this, but there's next to no information on what to do with it. :( Looks like I'll be using the same method as you. . .
 
Well I got it working, but with one issue:

I was thinking that since there already is functionality in ATI_vertex_array_object to create an array for normal vectors, I would use that. Unfortunately it gave me nothing but errors when I tried this. :( While it is really no problem to have the normal vector as a texture coordinate, it seems rather strange I can't create a normal array.
 
I'm not sure I fully understand the problem, but this piece of code from a demo I'm working on might perhaps help:

Code:
struct TexVertex {
	Vertex vertex;
	Vertex normal;
	float s, t;

	Vertex sVec; // Tangent
	float ws;

	Vertex tVec; // Binormal
	float wt;
};

...

<snip>

...

	if (GL_ATI_vertex_array_object && GL_ATI_vertex_attrib_array_object){
		glArrayObjectATI(GL_VERTEX_ARRAY,        3, GL_FLOAT, sizeof(TexVertex), vaoBuffer, 0);
		glArrayObjectATI(GL_NORMAL_ARRAY,        3, GL_FLOAT, sizeof(TexVertex), vaoBuffer, sizeof(Vertex));
		glArrayObjectATI(GL_TEXTURE_COORD_ARRAY, 2, GL_FLOAT, sizeof(TexVertex), vaoBuffer, 2 * sizeof(Vertex));
		
		glVertexAttribArrayObjectATI(9,  4, GL_FLOAT, GL_FALSE, sizeof(TexVertex), vaoBuffer, 2 * sizeof(Vertex) + 2 * sizeof(float));
		glVertexAttribArrayObjectATI(10, 4, GL_FLOAT, GL_FALSE, sizeof(TexVertex), vaoBuffer, 3 * sizeof(Vertex) + 3 * sizeof(float));

	} else {
		glVertexPointer(3,   GL_FLOAT, sizeof(TexVertex), vertices);		
		glNormalPointer(     GL_FLOAT, sizeof(TexVertex), ((char *) vertices) + sizeof(Vertex));
		glTexCoordPointer(2, GL_FLOAT, sizeof(TexVertex), ((char *) vertices) + 2 * sizeof(Vertex));

		glVertexAttribPointerARB(9,  4, GL_FLOAT, GL_FALSE, sizeof(TexVertex), ((char *) vertices) + 2 * sizeof(Vertex) + 2 * sizeof(float));
		glVertexAttribPointerARB(10, 4, GL_FLOAT, GL_FALSE, sizeof(TexVertex), ((char *) vertices) + 3 * sizeof(Vertex) + 3 * sizeof(float));

	}
	glEnableClientState(GL_VERTEX_ARRAY);
	glEnableClientState(GL_NORMAL_ARRAY);
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glEnableVertexAttribArrayARB(9);
	glEnableVertexAttribArrayARB(10);
 
Doh. . . nevermind about the normals. . . It looks like it doesn't like the normal being declared as a four-component vector. :hmm: I switched it to three and now everything's working properly.

Ok. . . now what about that glVertexAttribArrayObjectATI(..) method? I have never seen that one before. . . Is it something new? Also, how exactly do you use glVertexAttribPointerARB(..)? I found the specs rather vague on that. . .
 
It's the new GL_ATI_vertex_attrib_array_object extension. It's not documented yet (at least I don't have any docs), but it's not too hard to figure out. I used some guesswork and got it working on the first try :) I just compared it to GL_ATI_vertex_array_object and glVertexAttribPointerARB().
 
Hmm. . . well there's a problem: I have no idea what goes in the last parameter of glVertexAttribPointerARB(..). . .

EDIT: Okay, I understand glVertexAttribArrayObjectATI(..) -- it's not much different than using Variants. However, glVertexAttribPointerARB(..) still looks strange. . . Is ((char *) vertices) simply a reference to the start of a vertex array?
 
Yes, it's the start of the vertex array. I typecast it to a char * simply because of the way C++ does pointer arithmetic. "vertices" is a TexVertex *, so doing vertices + 1 would add a whole sizeof(TexVertex) to the pointer. So for adding byte offsets I typecast it to a char *.
 
Okay. . . I think I understand that. . . So this is meant for EXT_vertex_array, right?

If so, it doesn't look like I'll be using it much since I don't use EXT_vertex_array in my program, yet. Right now I have only ATI_vertex_array_object and manual rendering using glVertex(..) and such. . . Looks like glVertexAttribArrayObjectATI(..) is the way to go, for now. Thanks for the info on it! :)
 
NitroGL said:
Why not use multitexture arrays? It's a lot easier than managing attribs.

True, and I've got it working with multitexture arrays, but I'd like to learn other methods as well. :)
 