Hardcore engine modding! Textured 2D & 3D drawing with Graphics

The Graphics component that ships with Cocos Creator provides a series of drawing interfaces, but sometimes we have special needs it can't meet out of the box. Today I'd like to share some simple features I added by inheriting from Graphics and modding it. After the mod, it can:

  • draw paths and other effects with a texture in 2D;

  • draw paths and shapes in 3D, programmatically, with a high degree of freedom;

  • In practice this can be used for: drawing character path lines in real time, a magic brush that paints with custom textures, generating the 3D objects players want in real time in a 3D game / generating 3D objects procedurally according to gameplay, and so on.


2D effect preview


3D effect preview

The engine version used here is Cocos Creator 3.4.1. Below is my modding approach; the same idea can be applied when modding other components.

1. Textured 2D drawing

Although the engine source code is concise and tidy, it can still feel like wading through fog at first. What to do? There's no shortcut: just try to chew through it — it might work! And once I browsed the folders by category, it really wasn't as hard as I imagined (I had previously read a super-tightly-coupled project codebase that was hell-level reading difficulty, so by comparison this code logic didn't feel too strenuous).

Reading the engine source reveals how Graphics draws:

  • graphics.ts implements the drawing component and collects drawing information through its various interfaces;

  • graphics-assembler.ts implements the vertex render-data assembler;

  • impl.ts implements the storage and processing of path points.

So, how do we mod it? I went with inheriting the graphics component and overriding its methods:

  • The graphics component's _flushAssembler method is where the vertex render-data assembler is obtained, so we can override it there to substitute our own assembler.

  • To apply a texture, the shader needs the length along the line; line length and line width together form the uv coordinates used to sample the texture's pixels.

  • In the onLoad method, replace the path-point storage processor from impl.ts with one implemented by ourselves.

With the plan in place, let's get to work!

First inherit the graphics component, then override the _flushAssembler method following the source code. Since in v3.x the assembler is a plain object rather than a class we could inherit from, we simply create a new object — shamelessly named superGraphicsAssembler — and copy all of the original assembler's methods onto it.
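A minimal sketch of that setup (the internal names Graphics.Assembler and _assembler follow the 3.4 source; treat this as an outline, not the exact implementation):

import { _decorator, Graphics } from 'cc';
const { ccclass } = _decorator;

// New assembler object: every method of the built-in assembler is copied
// onto it once, after which individual methods can be freely replaced.
const superGraphicsAssembler: Record<string, any> = {};

@ccclass('SuperGraphics')
export class SuperGraphics extends Graphics {
    protected _flushAssembler() {
        const assembler = (Graphics as any).Assembler.getAssembler(this);
        if (Object.keys(superGraphicsAssembler).length === 0) {
            // Copy all original methods, then overwrite the ones we rewrite
            // later (_flattenPaths, _expandStroke, _vSet, ...).
            Object.assign(superGraphicsAssembler, assembler);
        }
        if ((this as any)._assembler !== superGraphicsAssembler) {
            (this as any)._assembler = superGraphicsAssembler;
        }
        // ...mirror the rest of the engine's _flushAssembler here
        // (render-data creation etc.), copied from the source.
    }
}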

Since our goal is to add a line-length value to the component's vertex data, we need to do our work in _flattenPaths, the assembler method that flattens and normalizes the path data.

Rewrite it first (in practice, copy this method's source and modify it). Wherever errors appear, add the imports that should be imported; if something can't be imported, declare its type with the likes of __private._cocos_2d_assembler_graphics_webgl_impl__Impl, and if that still doesn't work, fall back to the any type.

If a class you need to new can't be imported from the engine — such as Point in the line const dPos = new Point(p1.x, p1.y); — copy the engine's Point class, rename it to something like Point2, and while you're at it add our own lineLength member to it. Then use assignments like pts[0]["lineLength"] = lineLength; to accumulate the line length from the start point to each point and store it on the path-point data, so it can be read back the same way when it's time to assemble the vertex data.
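Here is a sketch of both pieces — the copied point class and the length accumulation (Point2 and accumulateLineLength are names I made up; the engine's Point has more fields than shown):

// Copy of the engine's internal Point, renamed, with a lineLength member.
class Point2 {
    x = 0;
    y = 0;
    lineLength = 0; // accumulated distance from the path's first point
    // ...dx/dy/dmx/dmy/len/flags fields copied from the engine's Point
    constructor(x: number, y: number) {
        this.x = x;
        this.y = y;
    }
}

// Inside the copied _flattenPaths: accumulate the length point by point.
function accumulateLineLength(pts: Point2[]) {
    let lineLength = 0;
    for (let i = 1; i < pts.length; i++) {
        const dx = pts[i].x - pts[i - 1].x;
        const dy = pts[i].y - pts[i - 1].y;
        lineLength += Math.sqrt(dx * dx + dy * dy);
        pts[i].lineLength = lineLength;
    }
    if (pts.length > 0) pts[0].lineLength = 0;
}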

At this point all our path points carry line-length data, but path points alone are useless: the data has to go into the vertex data and on to the shader. So we focus on _expandStroke, the method that assembles the stroke's vertex render data. Copy it, modify it, and pass an extra lineLength argument everywhere the _vSet method is called to set vertex data — yes, exactly the line length we just pulled off the path-point object.
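The change itself is mechanical — each call site just gains one argument (sketch; p1 stands for whichever path point that spot in the engine code is handling):

// before (engine source):
//   this._vSet!(p1.x, p1.y, 1);
// after (our copy):
//   this._vSet!(p1.x, p1.y, 1, (p1 as Point2).lineLength);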

But then we find that _vSet stores its data by writing to the corresponding offsets of a buffer array, so next we must modify the vertex data format, so that the buffer with its newly added member can still be read by the shader downstream in the rendering pipeline. Looking for it, the vertex data format is defined in the graphics.ts file:

const attributes = vfmtPosColor.concat([
    new Attribute('a_dist', Format.R32F),
]);

Jumping into vfmtPosColor, it turns out to be:

export const vfmtPosColor = [
    new Attribute(AttributeName.ATTR_POSITION, Format.RGB32F),
    new Attribute(AttributeName.ATTR_COLOR, Format.RGBA32F),
];

Each attribute here describes one piece of per-vertex data in the buffer array: a_position takes three 32-bit float elements, a_color takes four, and the a_dist newly added in the graphics file takes one. I believe some of you have already spotted the pattern (there's a reason — read on).

Let's copy it over and add one more piece of data to it:

const attributes2 = UIVertexFormat.vfmtPosColor.concat([
    new gfx.Attribute('a_dist', gfx.Format.R32F),
    new gfx.Attribute('a_line', gfx.Format.R32F),
]);

Yes, it's the line length — one 32-bit float element is enough; no need to waste any more. Then change all the code in the source that uses attributes to use our own attributes2, and do the same for the code that uses these two:

const componentPerVertex = getComponentPerVertex(attributes);
const stride = getAttributeStride(attributes);

As for what these two are: jump into their implementations in the source and you'll see they compute the total number of elements and the total byte length (the stride) of a single vertex.
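For our extended attributes2 the numbers work out like this (a worked example; each 32-bit float is 4 bytes):

// attributes2: a_position (RGB32F) + a_color (RGBA32F) + a_dist (R32F) + a_line (R32F)
// componentPerVertex = 3 + 4 + 1 + 1 = 9 float elements per vertex
// stride             = 9 floats * 4 bytes = 36 bytes per vertex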

Now we go back to the _vSet function. With the vertex data format modified, there is now room in the buffer for the line-length data, so under vData[dataOffset++] = distance; add another line: vData[dataOffset++] = lineLong;.
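As a standalone rendering of what the copied _vSet ends up writing (a sketch — the real method also pulls the color and buffer offsets from the assembler's state):

// One vertex = 9 floats under the attributes2 format above.
function vSet(vData: Float32Array, dataOffset: number,
              x: number, y: number, color: [number, number, number, number],
              distance: number, lineLong: number): number {
    vData[dataOffset++] = x;        // a_position.x
    vData[dataOffset++] = y;        // a_position.y
    vData[dataOffset++] = 0;        // a_position.z (always 0 in 2D)
    vData[dataOffset++] = color[0]; // a_color r
    vData[dataOffset++] = color[1]; // g
    vData[dataOffset++] = color[2]; // b
    vData[dataOffset++] = color[3]; // a
    vData[dataOffset++] = distance; // a_dist
    vData[dataOffset++] = lineLong; // a_line (our new member)
    return dataOffset;
}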

In addition, since _vSet's signature has changed, every place that calls _vSet must also be changed to pass the line-length data, so we copy all the methods in the source that call _vSet and add the line-length parameter to them.

It’s perfect this time!

Can we try the effect now? Not yet — so far I've only thickened the upstream of the rendering pipeline; the downstream pipe isn't compatible yet and would burst. So, in the spirit of doing things properly, copy graphics' default shader to create a new effect (and a material for it), name it whatever you like — say pathLine — and in the shader's vertex function, next to the existing:

in float a_dist;
out float v_dist;

Also add a matching pair for the line length:

in float a_line;
out float v_line;

This a_line is the shader-side pipe that receives the a_line line-length data from the upstream render-data assembler (think of water pipes), and out lets it flow on into the next pipe (the fragment shading function). Of course there are two pipes in between (vertices are connected into triangles, then rasterization slices each triangle into countless pixel cells), but you can ignore those as long as you know what they do. Then, in the fragment shading pipe, combine the line width and line length into uv coordinates to fetch the texture's pixels:

vec2 uv0 = vec2(v_line, (v_dist + 1.) / 2.);
uv0.x = fract(uv0.x);
uv0.y = fract(uv0.y);
o *= CCSampleWithAlphaSeparated(texture1, uv0);

Where does this texture1 come from? Add it now:

properties:
  texture1: { value: white }

Add uniform sampler2D texture1; to the fragment shading pipe, then set the material and texture in our own SuperGraphics:

import { _decorator, Graphics, Material, Texture2D } from 'cc';
const { ccclass, property } = _decorator;

// uv scale factor derived from the texture size; declared at module level
// here so the excerpt compiles (the assembler code reads it as well)
let lineC = 0;

@ccclass('SuperGraphics')
export class SuperGraphics extends Graphics {
    @property(Texture2D)
    lineTexture: Texture2D | null = null;

    @property(Material)
    myMat: Material | null = null;

    onLoad() {
        if (this.lineTexture) {
            // match the line width to the texture height
            this.lineWidth = this.lineTexture.height;
            lineC = this.lineWidth / (this.lineTexture.height * 2 * this.lineTexture.width);
        }
        if (this.myMat) {
            this.setMaterial(this.myMat, 0);
            if (this.lineTexture)
                this.getMaterial(0)?.setProperty('texture1', this.lineTexture);
        }
        super.onLoad();
    }

    onEnable() {
        if (this.myMat) {
            this.setMaterial(this.myMat, 0);
            if (this.lineTexture)
                this.getMaterial(0)?.setProperty('texture1', this.lineTexture);
        }
    }
}
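A minimal usage sketch (assign myMat and lineTexture in the editor first, then draw exactly as with the normal Graphics):

import { Node } from 'cc';

function drawTexturedPath(node: Node) {
    const g = node.getComponent(SuperGraphics)!;
    g.moveTo(0, 0);
    g.lineTo(200, 0);
    g.lineTo(200, 150);
    g.stroke(); // the stroke comes out textured with lineTexture along its length
}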
  • Final Effect


Note: with the current code, drawing paths with close will render incorrectly; the lazy workaround is simply to avoid close.

2. 3D drawing, with or without texture

With the previous experience under our belt, let's upgrade the experiment and mod graphics into 3D.

We need to add a z coordinate to it: building on the previous work, add moveTo3d, lineTo3d, and similar interfaces to graphics, then, imitating the source, copy the path-point storage/processing class impl.ts and rewrite it, adding a z coordinate everywhere there are 2D coordinates.

In the onLoad of our Graphics3D component, hand the original impl object's data over to the new G3DImpl object, then copy all the code in the source that uses the impl object and change it to use our own G3DImpl object, as sketched below.
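A sketch of the component's skeleton (G3DImpl stands for our full copy of impl.ts; only the members this sketch touches are stubbed out, and the impl field name follows the 3.4 source):

import { _decorator, Graphics } from 'cc';
const { ccclass } = _decorator;

// Stand-in for our copied-and-rewritten impl.ts that stores (x, y, z) per point.
class G3DImpl {
    constructor(src: any) { /* carry over state from the original impl */ }
    moveTo(x: number, y: number, z: number) { /* store a 3D path point */ }
    lineTo(x: number, y: number, z: number) { /* store a 3D path point */ }
}

@ccclass('Graphics3D')
export class Graphics3D extends Graphics {
    // Our own path-point store; the copied engine code is edited to read
    // from this instead of the original impl.
    public g3dImpl: G3DImpl | null = null;

    onLoad() {
        super.onLoad(); // let the engine create its original impl first
        this.g3dImpl = new G3DImpl((this as any).impl);
    }

    // 3D variants of the drawing interfaces, forwarding a z coordinate.
    moveTo3d(x: number, y: number, z: number) {
        this.g3dImpl?.moveTo(x, y, z);
    }

    lineTo3d(x: number, y: number, z: number) {
        this.g3dImpl?.lineTo(x, y, z);
    }
}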

Since a_position in the vertex data structure always reserves a slot for the z coordinate, the vertex format with the line length added above can be reused as-is — the assembler just writes each point's real z there instead of 0. Finally, you get the joy of free-form, programmatic drawing in 3D!
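Drawing in 3D then looks like this (usage sketch; the coordinates are arbitrary):

import { Node } from 'cc';

function drawPath3d(node: Node) {
    const g = node.getComponent(Graphics3D)!;
    g.moveTo3d(0, 0, 0);
    g.lineTo3d(1, 0, 1);
    g.lineTo3d(1, 2, 0);
    g.stroke();
}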

  • For the material attached to the 3D render component, enabling depth write and depth test gives a better result.


  • The 3D drawing component works with or without a texture.


  • Final Effect


The complete project source code is in the open-source repository below — feel free to download it, and come discuss in the forum post. I hope it helps!

Full source code

https://gitee.com/XiGeSiBoSeZi/study.git

Forum Post

https://forum.cocos.org/t/topic/131608
