Directory
Geometry vertex position data
Point and line models defined from geometry vertex data
Mesh model defined from geometry vertex data
Vertex normal data
Implementing an array of cubes and camera adaptation
Introduction to common geometries
Geometry rotation, scaling, and translation methods
Geometry vertex position data
This article mainly explains the vertex concept of geometry. It is relatively low-level, but once you master it you will have a deeper understanding of three.js geometry and model objects. Before looking at vertex data, we need to know that three.js geometries such as the cuboid BoxGeometry and the sphere SphereGeometry are all built on the BufferGeometry class. BufferGeometry is an empty geometry without any shape; we can create any custom geometry through it, specifically by defining its vertex data.
Point and line models defined from geometry vertex data
Create a set of xyz coordinate data through the JS typed array Float32Array to represent the vertex coordinates of the geometry.
```javascript
// Create an empty geometry object
const geometry = new THREE.BufferGeometry()
// Add vertex data
const vertices = new Float32Array([
  // Write vertex coordinate data in the array
  0, 0, 0,   // Vertex 1
  50, 0, 0,  // Vertex 2
  0, 100, 0, // Vertex 3
  0, 0, 10,  // Vertex 4
  0, 0, 100, // Vertex 5
  50, 0, 10  // Vertex 6
])
```
In three.js, geometry vertex data is represented by the attribute buffer object BufferAttribute.
```javascript
// A BufferAttribute represents the vertex data; every 3 values form one vertex
const attribute = new THREE.BufferAttribute(vertices, 3)
```
Set the value of the geometry vertex position property:
```javascript
// Set the position attribute of the geometry
geometry.attributes.position = attribute
```
Set a point material to define the point model, then add the point model to the scene:
```javascript
// Set the point material
const material = new THREE.PointsMaterial({
  color: 0xffff00,
  size: 0.2,
})
// Define the point model
const points = new THREE.Points(geometry, material)
scene.add(points)
```
Set a line material to define the line model, then add the line model to the scene (there are also the following line types, which you can try yourself):
```javascript
// Set the line material
const material = new THREE.LineBasicMaterial({
  color: 0xffff00, // yellow lines
})
const line = new THREE.Line(geometry, material) // open polyline
// const line = new THREE.LineLoop(geometry, material) // closed loop
// const line = new THREE.LineSegments(geometry, material) // disconnected segments
scene.add(line)
```
Mesh model defined from geometry vertex data
The mesh model Mesh renders the vertex coordinates of a custom BufferGeometry, and through it you will understand the concept of triangular faces. A Mesh is essentially a stitching of triangles (faces): when a mesh model renders a geometry, the geometry's vertex coordinates are grouped three at a time, and each group forms one triangle. Multiple groups of vertices form multiple triangles, which together approximate the surface of an object.
```javascript
// Render the geometry with a mesh model
const material = new THREE.MeshBasicMaterial({
  color: 0x00ffff,
  // side: THREE.FrontSide, // only the front side is visible (default)
  // side: THREE.BackSide,  // only the back side is visible
  side: THREE.DoubleSide    // both sides are visible
})
const mesh = new THREE.Mesh(geometry, material)
scene.add(mesh)
```
For example, I want to make a rectangular plane. Just modify the vertex data:
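A minimal sketch of that idea: a rectangle can be stitched from two triangles, so six vertices are enough, with two positions repeated (the variable name `rectVertices` is illustrative; the `BufferAttribute` assignment follows the same pattern as above).

```javascript
// Rectangle built from two triangles: 6 vertices, 2 positions repeated
const rectVertices = new Float32Array([
  0, 0, 0, // triangle 1: vertex A
  1, 0, 0, // triangle 1: vertex B
  1, 1, 0, // triangle 1: vertex C
  0, 0, 0, // triangle 2: vertex A (repeated)
  1, 1, 0, // triangle 2: vertex C (repeated)
  0, 1, 0  // triangle 2: vertex D
])
// Then, as above: geometry.attributes.position = new THREE.BufferAttribute(rectVertices, 3)
```

Note that vertices A and C each appear twice; the vertex index data below removes exactly this duplication.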
Geometry vertex index data: after the BufferGeometry of a mesh model Mesh is split into multiple triangles, many adjacent triangles may share the same vertex position coordinates. To reduce the amount of vertex coordinate data, we can use the geometry vertex index geometry.index:
```javascript
// Create an empty geometry object
const geometry = new THREE.BufferGeometry()
// Add vertex data
const vertices = new Float32Array([
  // Write vertex coordinate data in the array
  0, 0, 0, // Vertex 1
  1, 0, 0, // Vertex 2
  1, 1, 0, // Vertex 3
  0, 1, 0, // Vertex 4
])
// A BufferAttribute represents the vertex data; every 3 values form one vertex
const attribute = new THREE.BufferAttribute(vertices, 3)
// Set the position attribute of the geometry
geometry.attributes.position = attribute
// Typed array holding the vertex indices
const indices = new Uint16Array([
  0, 1, 2, 0, 2, 3
])
// Define the geometry vertex index
geometry.index = new THREE.BufferAttribute(indices, 1)
```
Vertex normal data
The concept of a normal in mathematics: for a plane, the normal is the line perpendicular to the plane; for a smooth surface, the normal at a point is the normal of the tangent plane at that point. The normal in three.js is similar to the mathematical one, but more flexible: it can be adjusted as needed when defined:
```javascript
// The normal data of each vertex corresponds one-to-one to the vertex position data
const normals = new Float32Array([
  0, 0, 1, // normal of vertex 1
  0, 0, 1,
  0, 0, 1,
  0, 0, 1,
])
// Set the geometry's vertex normal data
geometry.attributes.normal = new THREE.BufferAttribute(normals, 3)
```
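To see why [0, 0, 1] is the right normal for this rectangle in the XY plane, here is a plain-JS sketch that computes a triangle's face normal with a cross product (`faceNormal` is an illustrative helper, not part of the three.js API):

```javascript
// Compute the unit face normal of a triangle from its three corners.
function faceNormal(a, b, c) {
  const u = [b[0] - a[0], b[1] - a[1], b[2] - a[2]] // edge A->B
  const v = [c[0] - a[0], c[1] - a[1], c[2] - a[2]] // edge A->C
  const n = [ // cross product u x v is perpendicular to the triangle
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0],
  ]
  const len = Math.hypot(n[0], n[1], n[2])
  return n.map(x => x / len) // normalize to unit length
}
faceNormal([0, 0, 0], [1, 0, 0], [1, 1, 0]) // → [0, 0, 1]
```

The first triangle of the rectangle yields exactly [0, 0, 1], which is why every vertex in `normals` above carries that value.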
Implementing an array of cubes and camera adaptation
Create a row of models with a for loop, then array the cubes with a nested for loop; the code is as follows:
```javascript
// Add objects: create the geometry
const cubeGeometry = new THREE.BoxGeometry(1, 1, 1) // set the geometry size
const cubeMaterial = new THREE.MeshLambertMaterial({ color: 0xff0000 }) // set the material
// Array multiple cube mesh models
for (let i = 0; i < 10; i++) {
  for (let j = 0; j < 10; j++) {
    // Create an object from the geometry and material
    const cube = new THREE.Mesh(cubeGeometry, cubeMaterial)
    cube.position.set(i * 2, 0, j * 2) // array along the xz axes
    cube.position.y = 2
    scene.add(cube)
  }
}
```
Next, we can see a larger observation range by moving the camera farther away.
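As a rough sketch of how far to pull the camera back (the helper `fitDistance` and the numbers are illustrative, not three.js API): with a vertical field of view `fov`, an extent of `size` world units fits fully at roughly `(size / 2) / tan(fov / 2)` from the camera.

```javascript
// Estimate the camera distance needed to fit `size` world units
// in a perspective view with vertical field of view fovDeg (degrees).
function fitDistance(size, fovDeg) {
  const halfFov = (fovDeg * Math.PI / 180) / 2
  return (size / 2) / Math.tan(halfFov)
}
fitDistance(20, 90) // ≈ 10: the 10x10 grid above spans about 20 units
// e.g. something like camera.position.set(10, 10, 25) and camera.lookAt(10, 0, 10)
```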
Introduction to common geometries
three.js provides many geometry APIs, as follows:
```javascript
const boxGeometry = new THREE.BoxGeometry(1, 1, 1)           // cube
const sphereGeometry = new THREE.SphereGeometry(1)           // sphere
const cylinderGeometry = new THREE.CylinderGeometry(1, 1, 1) // cylinder
const planeGeometry = new THREE.PlaneGeometry(2, 1)          // rectangular plane
const circleGeometry = new THREE.CircleGeometry(1)           // circular plane
```
When we create a rectangular plane or a circular plane, by default only the front side is visible and the back side is not. To make the back side visible, you need to additionally set `side: THREE.DoubleSide` on the material. The geometry can also be displayed in wireframe form:
```javascript
const geometry = new THREE.SphereGeometry(1)
// Render the geometry with a mesh model
const material = new THREE.MeshBasicMaterial({
  color: 0x00ffff,
  wireframe: true, // display as a wireframe
  // side: THREE.FrontSide, // only the front side is visible (default)
  // side: THREE.BackSide,  // only the back side is visible
  side: THREE.DoubleSide    // both sides are visible
})
const mesh = new THREE.Mesh(geometry, material)
scene.add(mesh)
```
Rotation, scaling, and translation methods of geometry
BufferGeometry provides methods such as .rotateX(), .rotateY(), .rotateZ(), .scale(), and .translate() that rotate, scale, and translate the geometry itself; they essentially change the vertex data of the geometry.
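For instance, `geometry.translate(tx, ty, tz)` effectively offsets every coordinate in the position attribute. A plain-JS sketch of that effect (`translateVertices` is an illustrative helper, not three.js API):

```javascript
// What geometry.translate(tx, ty, tz) effectively does to the
// underlying position data: offset every vertex coordinate.
function translateVertices(vertices, tx, ty, tz) {
  const out = Float32Array.from(vertices) // copy; three.js mutates in place
  for (let i = 0; i < out.length; i += 3) {
    out[i] += tx
    out[i + 1] += ty
    out[i + 2] += tz
  }
  return out
}
const moved = translateVertices(new Float32Array([0, 0, 0, 1, 0, 0]), 5, 0, 0)
// moved contains [5, 0, 0, 6, 0, 0]
```

This is why these methods differ from changing mesh.position: they rewrite the vertex data once, rather than applying a transform at render time.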
The specific practical usage has been touched on more or less in the previous article; for more details, you can refer to the official documentation.
That wraps up the relevant knowledge. Here is the complete note code for this article:
```javascript
import * as THREE from 'three';
// Import the orbit controls
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls'

// 1. Create a scene
const scene = new THREE.Scene();

// 2. Create a camera
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000)
// Set the x, y, z coordinates, i.e. the camera position
camera.position.set(0, 0, 10)
// Add the camera to the scene
scene.add(camera)

// 3. Add objects and create geometry
// // Create an empty geometry object
// const geometry = new THREE.BufferGeometry()
// // Add vertex data
// const vertices = new Float32Array([
//   // Write vertex coordinate data in the array
//   0, 0, 0, // Vertex 1
//   1, 0, 0, // Vertex 2
//   1, 1, 0, // Vertex 3
//   0, 1, 0, // Vertex 4
// ])
// // A BufferAttribute represents the vertex data; every 3 values form one vertex
// const attribute = new THREE.BufferAttribute(vertices, 3)
// // Set the position attribute of the geometry
// geometry.attributes.position = attribute
// // Typed array holding the vertex indices
// const indices = new Uint16Array([
//   0, 1, 2, 0, 2, 3
// ])
// // Define the geometry vertex index
// geometry.index = new THREE.BufferAttribute(indices, 1)
// // The normal data of each vertex corresponds one-to-one to the vertex position data
// const normals = new Float32Array([
//   0, 0, 1, // normal of vertex 1
//   0, 0, 1,
//   0, 0, 1,
//   0, 0, 1,
// ])
// // Set the geometry's vertex normal data
// geometry.attributes.normal = new THREE.BufferAttribute(normals, 3)

// // Set the point material
// const material = new THREE.PointsMaterial({
//   color: 0xffff00,
//   size: 0.2,
// })
// // Define the point model
// const points = new THREE.Points(geometry, material)
// scene.add(points)

// // Set the line material
// const material = new THREE.LineBasicMaterial({
//   color: 0xffff00, // yellow lines
// })
// const line = new THREE.Line(geometry, material) // open polyline
// // const line = new THREE.LineLoop(geometry, material) // closed loop
// // const line = new THREE.LineSegments(geometry, material) // disconnected segments
// scene.add(line)

const geometry = new THREE.SphereGeometry(1)
// Render the geometry with a mesh model
const material = new THREE.MeshBasicMaterial({
  color: 0x00ffff,
  wireframe: true, // display as a wireframe
  // side: THREE.FrontSide, // only the front side is visible (default)
  // side: THREE.BackSide,  // only the back side is visible
  side: THREE.DoubleSide    // both sides are visible
})
const mesh = new THREE.Mesh(geometry, material)
scene.add(mesh)

// 4. Initialize the renderer
const renderer = new THREE.WebGLRenderer()
renderer.setSize(window.innerWidth, window.innerHeight)
document.body.appendChild(renderer.domElement)

// Add an ambient light
const ambient = new THREE.AmbientLight(0xffffff, 0.9)
scene.add(ambient)

// Add an axes helper; the argument is the length of each axis line
const axesHelper = new THREE.AxesHelper(5)
scene.add(axesHelper) // add it to the scene

// Create the orbit controls
const controls = new OrbitControls(camera, renderer.domElement)
// Damping makes the controls feel more realistic, but .update() must then be called in the animation loop
controls.enableDamping = true

export function render() {
  // Refresh the controls on every rendered frame
  controls.update()
  renderer.render(scene, camera) // update the content on the canvas
  requestAnimationFrame(render) // call render again when the next frame is due
}
// Start rendering
render()
```