Author: Nikola Lukić 📧 zlatnaspirala@gmail.com 📅 Version: 1.8.6 2026
Logo includes the official WebGPU logo. The WebGPU logo is by W3C, licensed under Creative Commons Attribution 4.0.
This project is a work-in-progress WebGPU engine inspired by the original matrix-engine for WebGL.
It uses the wgpu-matrix npm package as a replacement for gl-matrix to handle model-view-projection matrices.
Published on npm as: matrix-engine-wgpu
- ✔️ Support for 3D objects and scene transformations
- ✔️ Ammo.js physics integration
- ✔️ Networking with Kurento / OpenVidu / your own Node.js middleware -> frontend
- ✔️ Bloom post processing
- 📦 Based on the shadowMapping sample from webgpu-samples
- ✔️ Web GUI (online) editor [app exec graph] with Visual Scripting (named FluxCodexVertex)
- ✔️ Web GUI (online) editor [shader graph] with Visual Scripting (named FluxCodexShader)
- ✔️ Dynamic shadow casting (also works for skinned meshes)
- ✔️ Vertex shader displacement (also works for skinned meshes), nice for water effects
- [Preparing API docs](https://github.com/zlatnaspirala/matrix-engine-wgpu/wiki/Visual-Scripting-API)
- 🎯 Test on Linux OS -> the editor creates and manages files internally (tested on Windows only!)
- 🎯 Add editor nav arrows in editor mode
- 🎯 Test other physics libraries [same interface/drive system]
- 🎯 Sync the npm version and make the editor possible from `npm i matrix-engine-wgpu`
- 🎯 Sync the npm version for the matrix-engine-wgpu wrapper [me-webgpu-react](https://github.com/zlatnaspirala/me-webgpu-react)
EditorX has two main parts:
- Frontend (`./src/tools/editor`)
- Backend (`./src/tools/editor/backend`)
Before running anything, install dependencies with `npm i`:
- in the root folder
- and also inside `./src/tools/editor/backend`
The backend is built using Node.js 🟢
- The editor creates and manages files internally (tested on Windows only!).
- Scene container [adding objects -> auto save]
- SceneObject property container [selected object] [auto save]
- Assets toolbar added (bottom panel)
- Add GLB or OBJ files (also mp3) from the asset toolbox by selecting them.
- Top menu for adding primitives (Cube / Sphere) with or without physics ⚙️
- Integrated Visual Scripting system 🧠 FluxCodexVertex
- Add Math nodes, events / custom methods, variable popup, SceneObject access
- Get SceneObject → set position → bind onTargetReach events
- SetTexture, setMaterial
- Fetch, GetArray, forEach, Print, IF, Math, compare, string operations, etc.
- Custom function editor - Function Manager; after creating a function, use it from Visual Scripting.
- Generator for physics bodies placed in sequence in a chosen geometry at world positions (pyramid, wall, in place).
- onDraw Event node - called on every frame. Can be used multiple times, with a configurable skip value (more skip, fewer calls).
- Audio-reactive node - maps audio to position, rotation, scale, geometry, or anything else. Outputs low, mid, high, energy and beat.
- Run the graph ▶️
- Stop the graph - basic for now: clears dynamically created objects and stops onDraw calls.
- Save graph
- Saved directly to file and also cached in LocalStorage
- For final builds, it becomes a real JS object injected into the app flow. [DONE]
- Export graph to JSON
- Import graph from JSON
Visual Scripting is only available when running the engine from source
(not from npm i matrix-engine-wgpu).
You must clone or download the engine source from the GitHub repository.
- Run the editor with `npm run editorx` from the engine root directory.
- EditorX is an alias for FluxCodexVertex (three words were needed to keep the name unique).
- Run the scene by pressing F6 or by clicking Run in the left panel.
- If you delete all objects from the scene, you must refresh the page and add at least one object again.
- Before importing a graph, delete all nodes from the FluxCodexVertex graph.
- Saving is still based on LocalStorage.
- After deleting everything, click Save to store an empty [] array.
- All graph changes must be saved manually by clicking for now 💾 (no autosave for graphs).
- Canvas is dynamically created in JavaScript; no <canvas> element is needed in HTML.
- Access the main scene objects:
app.mainRenderBundle[0];
or by name:
app.getSceneObjectByName("Sphere1");
- Add meshes with .addMeshObj(), supporting .obj loading, unlit textures, cubes, spheres, etc.
- Destroy a sceneObj:
app.removeSceneObjectByName("Sphere1");
- Also of interest for clearing the physics body and render part:
app.destroyByPrefix("towers");
// OR
app.destroyBySufix("_001");
// Object destruction takes effect after calling the flushDestroyQueue function.
app.flushDestroyQueue();
- Cleanly destroy the scene:
app.destroyProgram();
For now, translation is only via the WASD keyboard keys.
Supported camera types: WASD, arcball
WASD also uses 'c' and 'v' for moving the camera up and down.
mainCameraParams: {
type: 'WASD',
responseCoef: 1000
}
The best way to access a physics body object is app.matrixAmmo.getBodyByName(name); the reverse lookup is app.matrixAmmo.getNameByBody.
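For example, a minimal sketch (assuming a physics-enabled object named "CubePhysics", as in the full example further below):

```js
// Look up the Ammo.js rigid body that belongs to a scene object by its name.
const body = app.matrixAmmo.getBodyByName("CubePhysics");
if (body) {
  // Physics-enabled objects are driven through Ammo.js, not .position.
  body.setLinearVelocity(new Ammo.btVector3(0, 5, 0));
  // Reverse lookup: get the scene object name back from the body.
  console.log(app.matrixAmmo.getNameByBody(body));
}
```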
Control object position:
app.mainRenderBundle[0].position.translateByX(12);
Teleport / set directly:
app.mainRenderBundle[0].position.SetX(-2);
Adjust movement speed:
app.mainRenderBundle[0].position.thrust = 0.1;
⚠️ For physics-enabled objects, use Ammo.js functions; .position and .rotation are not visually applied but can be read.
Example:
app.matrixAmmo.rigidBodies[0].setAngularVelocity(new Ammo.btVector3(0, 2, 0));
app.matrixAmmo.rigidBodies[0].setLinearVelocity(new Ammo.btVector3(0, 7, 0));
Manual rotation:
app.mainRenderBundle[0].rotation.x = 45;
Auto-rotate:
app.mainRenderBundle[0].rotation.rotationSpeed.y = 10;
Stop rotation:
app.mainRenderBundle[0].rotation.rotationSpeed.y = 0;
⚠️ For physics-enabled objects, use Ammo.js methods (e.g., .setLinearVelocity()).
Manipulate the WASD camera:
app.cameras.WASD.pitch = 0.2;

💡 Lighting System
Matrix Engine WGPU now supports independent light entities, meaning lights are no longer tied to the camera. You can freely place and configure lights in the scene, and they will affect objects based on their type and parameters.
Supported Light Types
SpotLight – Emits light in a cone shape with configurable cutoff angles.
(Planned: PointLight, DirectionalLight, AmbientLight)
Features
✅ Supports multiple lights (4 max; ~20 planned for the next update)
✅ Shadow-ready (spotlight0 shadows implemented, extendable to others)
## Important
Lights must be added manually:
engine.addLight();
Access lights through the lightContainer array:
app.lightContainer[0];
Each light has a small behavior object.
- For now there is just one osc0 object. Every time it is called, it updates (light.position[0] = light.behavior.setPath0()).
behavior.setOsc0(min, max, step);
app.lightContainer[0].behavior.osc0.on_maximum_value = function() {/* whatever */};
app.lightContainer[0].behavior.osc0.on_minimum_value = function() {/* whatever */};
Make the light move along x:
loadObjFile.addLight();
loadObjFile.lightContainer[0].behavior.setOsc0(-1, 1, 0.01);
loadObjFile.lightContainer[0].behavior.value_ = -1;
loadObjFile.lightContainer[0].updater.push(light => {
light.position[0] = light.behavior.setPath0();
});

With the latest GLB feature, materials became part of the engine as well.
material: {type: 'standard'}
material: {type: 'pong'}
material: {type: 'power'}
material: {type: 'water'}
material: {type: 'metal'}
- Standard is fully supported with lights and shadow casting (not for animations yet)
- Pong
- Power - no shadows cast
// Also for addMeshObj
TEST_ANIM.addGlbObj({
material: {type: 'power'},
...
}, null, glbFile);

Change only textures (no recreation of the pipeline):
await app.mainRenderBundle[0].loadTex0(["res/icons/editor/chatgpt-gen-bg.png"]);
app.mainRenderBundle[0].changeTexture(app.mainRenderBundle[0].texture0);

Set up blend (for the water material, use blend!):
app.mainRenderBundle[0].setBlend(0.5);

Examples for setting up water params:
app.mainRenderBundle[0].updateWaterParams(
[0.0, 0.1, 0.3], // Deep: navy
[0.2, 0.6, 0.9], // Shallow: blue
2.0, // Wave speed: fast continuous
1.8, // Wave scale: rolling waves
0.5, // Wave height: tall active waves
1.5, // Fresnel: strong
50.0 // Specular: soft highlights
);
app.mainRenderBundle[0].updateWaterParams(
[0.0, 0.3, 0.5], // Deep: medium blue
[0.3, 0.8, 1.0], // Shallow: bright cyan
1.2, // Wave speed: gentle continuous (changed from 0.6)
2.5, // Wave scale: smooth ripples (changed from 5.0)
0.3, // Wave height: visible movement
2.5, // Fresnel: moderate reflection
100.0 // Specular: sharp sparkles
);

Activate with:
app.activateBloomEffect();

Manipulate with app.bloomPass:
setKnee
setIntensity
setThreshold
setBlurRadius
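For example, a quick sketch; the values here are illustrative only, not recommended defaults:

```js
// Enable bloom once, then tune it at runtime through app.bloomPass.
app.activateBloomEffect();
app.bloomPass.setThreshold(0.8);  // brightness level where bloom starts (illustrative value)
app.bloomPass.setKnee(0.2);       // softness of the transition around the threshold
app.bloomPass.setIntensity(1.5);  // overall bloom strength
app.bloomPass.setBlurRadius(4);   // larger radius gives a wider, softer glow
```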
The raycast returns:
{
rayOrigin: [x, y, z],
rayDirection: [x, y, z] // normalized
}

Manual raycast example:
window.addEventListener("click", event => {
let canvas = document.querySelector("canvas");
let camera = app.cameras.WASD;
const {rayOrigin, rayDirection} = getRayFromMouse(event, canvas, camera);
for (const object of app.mainRenderBundle) {
if (
rayIntersectsSphere(
rayOrigin,
rayDirection,
object.position,
object.raycast.radius
)
) {
console.log("Object clicked:", object.name);
}
}
});

Automatic raycast listener:
addRaycastListener();
// Must be app.canvas or [Program name].canvas
app.canvas.addEventListener("ray.hit.event", event => {
console.log("Ray hit:", event.detail.hitObject);
});

The engine also exports (box):
- addRaycastsAABBListener
- rayIntersectsAABB
- computeAABB
- computeWorldVertsAndAABB
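A minimal sketch of a manual AABB test, by analogy with the sphere example above; the exact signatures of computeAABB and rayIntersectsAABB are assumptions here, so check the engine source before relying on them. The full application example below then shows a complete scene setup.

```js
// Hypothetical usage, mirroring the sphere-based raycast example above.
window.addEventListener("click", event => {
  const canvas = document.querySelector("canvas");
  const camera = app.cameras.WASD;
  const {rayOrigin, rayDirection} = getRayFromMouse(event, canvas, camera);
  for (const object of app.mainRenderBundle) {
    const aabb = computeAABB(object); // assumed: returns a world-space bounding box
    if (rayIntersectsAABB(rayOrigin, rayDirection, aabb)) { // assumed signature
      console.log("AABB hit:", object.name);
    }
  }
});
```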
import MatrixEngineWGPU from "./src/world.js";
import {downloadMeshes} from "./src/engine/loader-obj.js";
import {LOG_MATRIX} from "./src/engine/utils.js";
export let application = new MatrixEngineWGPU(
{
useSingleRenderPass: true,
canvasSize: "fullscreen",
mainCameraParams: {
type: "WASD",
responseCoef: 1000,
},
},
() => {
addEventListener("AmmoReady", () => {
downloadMeshes(
{
welcomeText: "./res/meshes/blender/piramyd.obj",
armor: "./res/meshes/obj/armor.obj",
sphere: "./res/meshes/blender/sphere.obj",
cube: "./res/meshes/blender/cube.obj",
},
onLoadObj
);
});
function onLoadObj(meshes) {
application.myLoadedMeshes = meshes;
for (const key in meshes) {
console.log(`%c Loaded obj: ${key} `, LOG_MATRIX);
}
application.addMeshObj({
position: {x: 0, y: 2, z: -10},
rotation: {x: 0, y: 0, z: 0},
rotationSpeed: {x: 0, y: 0, z: 0},
texturesPaths: ["./res/meshes/blender/cube.png"],
name: "CubePhysics",
mesh: meshes.cube,
physics: {
enabled: true,
geometry: "Cube",
},
});
application.addMeshObj({
position: {x: 0, y: 2, z: -10},
rotation: {x: 0, y: 0, z: 0},
rotationSpeed: {x: 0, y: 0, z: 0},
texturesPaths: ["./res/meshes/blender/cube.png"],
name: "SpherePhysics",
mesh: meshes.sphere,
physics: {
enabled: true,
geometry: "Sphere",
},
});
}
}
);
window.app = application;

This example shows how to load and animate a sequence of .obj files to simulate mesh-based animation (e.g. a walking character).
import MatrixEngineWGPU from "../src/world.js";
import {downloadMeshes, makeObjSeqArg} from "../src/engine/loader-obj.js";
import {LOG_MATRIX} from "../src/engine/utils.js";
export var loadObjsSequence = function () {
let loadObjFile = new MatrixEngineWGPU(
{
useSingleRenderPass: true,
canvasSize: "fullscreen",
mainCameraParams: {
type: "WASD",
responseCoef: 1000,
},
},
() => {
addEventListener("AmmoReady", () => {
downloadMeshes(
makeObjSeqArg({
id: "swat-walk-pistol",
path: "res/meshes/objs-sequence/swat-walk-pistol",
from: 1,
to: 20,
}),
onLoadObj,
{scale: [10, 10, 10]}
);
});
function onLoadObj(m) {
console.log(`%c Loaded objs: ${m} `, LOG_MATRIX);
var objAnim = {
id: "swat-walk-pistol",
meshList: m,
currentAni: 1,
animations: {
active: "walk",
walk: {from: 1, to: 20, speed: 3},
walkPistol: {from: 36, to: 60, speed: 3},
},
};
loadObjFile.addMeshObj({
position: {x: 0, y: 2, z: -10},
rotation: {x: 0, y: 0, z: 0},
rotationSpeed: {x: 0, y: 0, z: 0},
scale: [100, 100, 100],
texturesPaths: ["./res/meshes/blender/cube.png"],
name: "swat",
mesh: m["swat-walk-pistol"],
physics: {
enabled: false,
geometry: "Cube",
},
objAnim: objAnim,
});
app.mainRenderBundle[0].objAnim.play("walk");
}
}
);
window.app = loadObjFile;
};

💡 GLB binary loading with bvh (rig) animations.
- See the example glb-loader.js (build with npm run glb-loader)
- Material improvements are coming in the next update!
- Light affects only the first frame or T-pose.
- For the npm package, import uploadGLBModel.
From 1.6.0, GLB supports multi-skinned-mesh and multi-primitive cases.
Limitation: the GLB loader does not handle the non-animated case. Use the OBJ loader for static meshes.
The most powerful call is the new class MEMeshObjInstances. MEMeshObj is now optimized (fewer conditionals). You can add instanced draws and modify the base color and transformation for each instance individually, which is good for fantasy or any game dev.
Example:
var glbFile01 = await fetch(p).then(res =>
res.arrayBuffer().then(buf => uploadGLBModel(buf, this.core.device))
);
this.core.addGlbObjInctance(
{
material: {type: "standard", useTextureFromGlb: true},
scale: [20, 20, 20],
position: {x: 0, y: -4, z: -220},
name: this.name,
texturesPaths: ["./res/meshes/glb/textures/mutant_origin.png"],
raycast: {enabled: true, radius: 1.5},
pointerEffect: {enabled: true},
},
null,
glbFile01
);
// access - index 0 is the BASE MESH! I added maxLimit = 5; you can change this in the engine source.
// added lerp smooth translate, also color+.
app.mainRenderBundle[1].instanceTargets[1].position[2] = 10;
// This recreates the buffer; it is not meant to be called in a per-frame loop.
app.mainRenderBundle[1].updateInstances(5);

Load a video texture:
TEST.loadVideoTexture({
type: "video", // video , camera //not tested canvas2d, canvas2dinline
src: "res/videos/tunel.mp4",
});

For canvas2dinline, attach this to the argument (an example of drawing directly on a canvas2d and passing it into the WebGPU pipeline):
canvaInlineProgram: (ctx, canvas) => {
ctx.fillStyle = "black";
ctx.fillRect(0, 0, canvas.width, canvas.height);
ctx.fillStyle = "white";
ctx.font = "20px Orbitron";
ctx.fillText(`FPS: ${Math.round(performance.now() % 60)}`, 10, 30); // placeholder readout, not a real FPS measurement
};

| Scenario | Best Approach |
| ------------------------------ | ---------------------------------- |
| Dynamic 2D canvas animation | `canvas.captureStream()` → `video` |
| Static canvas snapshot | `createImageBitmap(canvas)` |
| Replaying real video or webcam | Direct `video` element |
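The first row of the table can be wired up with standard browser APIs; a minimal sketch follows. Handing the resulting video element to loadVideoTexture is an assumption here (the documented call above takes a src path), so treat that last step as unverified.

```js
// Draw something animated onto an offscreen 2D canvas.
const srcCanvas = document.createElement("canvas");
srcCanvas.width = 512;
srcCanvas.height = 512;
const ctx2d = srcCanvas.getContext("2d");
function draw() {
  ctx2d.fillStyle = "black";
  ctx2d.fillRect(0, 0, srcCanvas.width, srcCanvas.height);
  ctx2d.fillStyle = "white";
  ctx2d.font = "20px Orbitron";
  ctx2d.fillText(new Date().toLocaleTimeString(), 10, 30);
  requestAnimationFrame(draw);
}
draw();

// Capture the canvas at 30 fps and play it through a <video> element.
const stream = srcCanvas.captureStream(30);
const video = document.createElement("video");
video.srcObject = stream;
video.muted = true;
video.play();

// Hypothetical hand-off to the engine (unverified parameter name):
// TEST.loadVideoTexture({type: "video", videoElement: video});
```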
If this happens fewer than ~15 times (during the loading process), it is probably fine:
Draw func (err): TypeError: Failed to execute 'beginRenderPass' on 'GPUCommandEncoder': The provided value is not of type 'GPURenderPassDescriptor'.
One or two warnings are also possible in the meantime while a mesh switches to the videoTexture. This will be fixed in the next update.
Dimension (TextureViewDimension::e2DArray) of [TextureView of Texture "shadowTextureArray[GLOBAL] num of light 1"] doesn't match the expected dimension (TextureViewDimension::e2D).

From 1.7.0 the engine is powered by networking. A Kurento / OpenVidu server is used for the backend. Very good for handling streams, channels, etc.
See example code at ./examples/games/rpg/
Built-in basic net sync: let's say app is the engine root object and net is the networking object. WebRTC tech with an OpenVidu server as the middleware server.
app.net = new MatrixStream({
active: true,
domain: "maximumroulette.com",
port: 2020,
sessionName: "forestOfHollowBlood-free-for-all",
resolution: "160x240",
isDataOnly: urlQuery.camera || urlQuery.audio ? false : true,
customData: forestOfHollowBlood.player.data,
});
// `customData`: use it if you want to pass some extra metadata on the connection-created event
// Camera and mic are not always needed; it can be used for data signaling only.

How to use the built-in network operations:
// Activate emitting the remote position; on the remote side it is applied to the scene object with the same name
sceneObject.position.netObject = sceneObject[0].name;
// For now the net view for rotation is axis-separated - the cost is acceptable for passing remote orientation
sceneObject.rotation.emitY = sceneObject.name;
// If you need the opposite remote/local situation. For example:
// your friendly object is an enemy object on the remote machine; just set up another flag
sceneObject.position.netObject = sceneObject[0].name; // we still need this setup!
sceneObject.position.remoteName = sceneObjecOposite[0].name;

Use the toRemote arg prop in the send pass; if it is not set, the default is [] which emits to all.
Intelligent emit for teams (two teams implemented). Position will be emitted for teams[0] and received for the unique scene object name. Position will be emitted for teams[1] and received for the opposite (e.g. enemy) unique scene object name. In this case toRemote is overridden (don't pass it). Used for the RPGMOG project.
mesh.position.teams[0] = [connId0, connId1];
mesh.position.teams[1] = [connId2, connId3];
// emiter in core engine file
if (this.teams.length > 0)
if (this.teams[0].length > 0)
app.net.send({
toRemote: this.teams[0], // default null remote conns
sceneName: this.netObject, // origin scene name to receive
netPos: {x: this.x, y: this.y, z: this.z},
});
if (this.teams.length > 0)
if (this.teams[1].length > 0)
app.net.send({
toRemote: this.teams[1], // default null remote conns
remoteName: this.remoteName, // to enemy players
sceneName: this.netObject, // now not important
netPos: {x: this.x, y: this.y, z: this.z},
});
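Putting the flags together for a team-based sync; a minimal sketch where the connection ids, the enemy-side object name, and the "swat" object (from the obj-sequence example earlier) are placeholders:

```js
// Sync the "swat" object's position to both teams (placeholder names and connection ids).
const hero = app.getSceneObjectByName("swat");
hero.position.netObject = hero.name;     // emit our position under our own name
hero.position.remoteName = "swatEnemy";  // hypothetical name the enemy side applies it to
hero.position.teams[0] = ["connId0", "connId1"]; // friendly connections -> received under netObject
hero.position.teams[1] = ["connId2", "connId3"]; // enemy connections -> received under remoteName
```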
Built-in URL param check for multiLang; the multiLang feature is also a built-in option. Load multilang JSON file data.
- ?lang=en
Access from code:
urlQuery.lang;
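A minimal sketch of picking translated strings based on the URL parameter; the file path and JSON shape are assumptions here, since the built-in loader handles the actual loading:

```js
// Hypothetical layout: one JSON file per language, e.g. res/multilang/en.json.
const lang = urlQuery.lang || "en";  // built-in URL query parser (see ?lang=en above)
fetch(`res/multilang/${lang}.json`)  // hypothetical path
  .then(r => r.json())
  .then(strings => console.log(strings["start"]));  // hypothetical key
```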
main.js is the main instance for the Jamb 3D Deluxe game template. It contains the game context, e.g., the dice.
Whatever you find under main.js is the open source part. The next level of upgrade is the commercial part.
For a clean startup without extra logic, use empty.js.
This minimal build is ideal for online editors like CodePen or StackOverflow snippets.
Uses watchify to bundle JavaScript.
"main-worker": "watchify app-worker.js -p [esmify --noImplicitAny] -o public/app-worker.js",
"examples": "watchify examples.js -p [esmify --noImplicitAny] -o public/examples.js",
"main": "watchify main.js -p [esmify --noImplicitAny] -o public/app.js",
"empty": "watchify empty.js -p [esmify --noImplicitAny] -o public/empty.js",
"build-all": "npm run main-worker && npm run examples && npm run main && npm run build-empty"All resources and output go into the ./public folder — everything you need in one place.
This is static file storage.
🎲 The first full app example will be a WebGPU-powered Jamb 3d deluxe game.
Features done:
- Navigation mesh
- Hero class
- GLB animations
- Automatic team selection (South vs North)
- Homebase stone (tron / enemytron)
- Inventory (construct a new upgraded item from 2 or 3 items)
Install it on your desktop with one click.

Invest in Forest Of Hollow Blood: $9,660. See more details at the FOHB Wiki.
- 💲💲💲 Support this project, JAMB 3D, on itch.io
- MOBA Forest Of Hollow Blood Live
- CodePen Demo → uses the empty.js build from: https://maximumroulette.com/apps/megpu/empty.js
Performance for Jamb game:
Special licence for MOBA example:
Creative Commons Attribution 4.0 International (CC BY 4.0)
You are free to share and adapt this project, provided that you give appropriate credit.
Attribution requirement:
Include the following notice (with working link) in any distributed version or about page:
"Forest Of Hollow Blood — an MOBA example made with MatrixEngineWGPU (https://github.com/zlatnaspirala/matrix-engine-wgpu)"
Run the editor:
npm run editorx
Navigate to matrix-engine.html; it is the landing page for the editor.
After creating a new project or loading a project, the page will be redirected to
./public/<PROJECT_NAME>.html
Source location: ./projects/<PROJECT_NAME>
Features:
- Create new project / load project - only on the landing page
- Create cubeMesh
- Properties box for the selected sceneObj
- Events system (create a function and attach it to a sceneObj)
- Resource navigation
- Visual Scripting
@Note: The license for fluxCodexVertex.js (MPL 2.0) affects only this file! Just leave the license comment in the header of the file.
YT video promotion :
About the 'in fly' regime: the editor can be activated even without the backend Node server, but in that case there are no saves.
You may use, modify, and sell projects based on this code — just keep this notice and included references intact (whole licence paragraph).
- Engine design and scene structure inspired by: WebGPU Samples
- OBJ Loader adapted from: http://math.hws.edu/graphicsbook/source/webgl/cube-camera.html
- Dice roll sound roll1.wav sourced from: https://wavbvkery.com/dice-rolling-sound/
- Raycasting logic and glb loader assisted by ChatGPT.
- GLTF Loader: https://github.com/Twinklebear/webgpu-gltf, improved with ChatGPT.
- Music by Mykola Sosin from Pixabay
- Characters used from the great mixamo.com
  - ✅ What you can do: you can use Mixamo characters and animations royalty-free in commercial, personal, or non-profit projects (games, films, prints, etc.). You own your creations and how you use them. There is no requirement to credit Adobe / Mixamo (though it is allowed).
  - 🚫 What you cannot do: you cannot redistribute or sell the raw Mixamo character or animation files "as is" (i.e. as standalone assets) to others. You can't use Mixamo content to create a competing library of characters / animations (i.e. you can't just package them and sell them to others). You can't use Mixamo's content (or outputs) to train AI / machine learning models.
- Music used: BLACK FLY by Audionautix | http://audionautix.com | Music promoted by https://www.free-stock-music.com | Creative Commons Attribution-ShareAlike 3.0 Unported
- Free assets from the great https://craftpix.net. Magic icons: https://craftpix.net/freebies/free-rpg-splash-game-512x512-icons/
- Background music in the RPG template: Music by Dvir Silverstone from Pixabay, Sound Effect by Crunchpix Studio from Pixabay, Music by Emmraan from Pixabay
- 'Ruined rock fence' (https://skfb.ly/6RLwN) by VladNeko is licensed under Creative Commons Attribution (http://creativecommons.org/licenses/by/4.0/).
- (test) "fantasy rock" (https://skfb.ly/oHXAz) by duckcracker02 is licensed under Creative Commons Attribution (http://creativecommons.org/licenses/by/4.0/).
- "Fantasy Rock" (https://skfb.ly/oHZSq) by lalune is licensed under Creative Commons Attribution (http://creativecommons.org/licenses/by/4.0/).
- Inventory images: https://djinnbestiary.itch.io/ancient-oddities-vol-1-13-free-potions
Top level main.js instance (Jamb 3d deluxe)
WebGPU Ray Tracing: ChatGPT, Claude AI