We have been asked quite a lot about the Oriented Particles approach to physically based simulation, where, in addition to position and velocity, particles also have an ellipsoid shape, an orientation, and an angular velocity. So we decided to make a small C++ demo in 2D of how it can be done in practice.

The demo shows a somewhat simplified version of our Oriented Particles Christmas Card. It is a standard GLUT/OpenGL application, but it can also be cross-compiled to JavaScript/WebGL using emscripten, the result of which can be seen here. In the following we will assume that the reader has read the original paper, so we will just give a brief introduction to the demo code.

In the demo we want to be able to toss this guy around:

We also want to make it look like the guy has bones in his body. For this purpose we have drawn a “skeleton” and used OpenCV to identify the individual bones and fit ellipses to them. The result can be seen below with the ellipses shown in red. Our little OpenCV program outputs C++ code directly, which has been inserted into the example code.
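At its core, this kind of ellipse fit boils down to the mean and covariance of the bone's pixels: the eigenvectors of the 2x2 covariance give the ellipse axes and the eigenvalues their lengths. A minimal sketch of that idea in plain C++ (this is our illustration of a moments-based fit, not the demo's actual OpenCV code; all names are ours):

```cpp
#include <cmath>
#include <vector>

struct Ellipse { double cx, cy, a, b, angle; }; // center, semi-axes, orientation

// Fit an ellipse to a 2D point cloud from its mean and covariance.
// For a filled ellipse the variance along an axis is (semi-axis)^2 / 4,
// so the semi-axes are 2 * sqrt(eigenvalue).
Ellipse fitEllipseFromPoints(const std::vector<double>& xs,
                             const std::vector<double>& ys) {
    const double n = static_cast<double>(xs.size());
    double mx = 0, my = 0;
    for (size_t i = 0; i < xs.size(); ++i) { mx += xs[i]; my += ys[i]; }
    mx /= n; my /= n;

    double cxx = 0, cxy = 0, cyy = 0; // covariance entries
    for (size_t i = 0; i < xs.size(); ++i) {
        const double dx = xs[i] - mx, dy = ys[i] - my;
        cxx += dx * dx; cxy += dx * dy; cyy += dy * dy;
    }
    cxx /= n; cxy /= n; cyy /= n;

    // Closed-form eigen-decomposition of the symmetric 2x2 covariance.
    const double tr   = cxx + cyy;
    const double det  = cxx * cyy - cxy * cxy;
    const double disc = std::sqrt(std::max(0.0, tr * tr / 4.0 - det));
    const double l1 = tr / 2.0 + disc; // major eigenvalue
    const double l2 = tr / 2.0 - disc; // minor eigenvalue
    const double angle = 0.5 * std::atan2(2.0 * cxy, cxx - cyy);
    return { mx, my, 2.0 * std::sqrt(l1), 2.0 * std::sqrt(l2), angle };
}
```

OpenCV's own `cv::fitEllipse` does a least-squares fit on a contour rather than a moments fit, but for roughly convex bone blobs the results are similar.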


Because of some technicalities with emscripten we wanted to avoid loading files at runtime, so the shaders have been inlined as strings in C headers. Also, the image of the guy has been saved to a C header using GIMP and included directly in the example. Source code for the example can be found here.
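The inlining itself is nothing fancy; the shader source simply lives as a string constant in a header, so no file I/O is needed in the browser build. A sketch of the pattern (the header and variable names here are illustrative, not the demo's actual ones):

```cpp
// vertex_shader.h -- illustrative example of a shader inlined as a
// C string so that the emscripten build needs no runtime file loading.
static const char* kVertexShaderSource =
    "attribute vec2 position;\n"
    "attribute vec3 texcoord;\n"
    "void main() {\n"
    "    gl_Position = vec4(position, 0.0, 1.0);\n"
    "}\n";
```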

The class PositionBasedDynamics encapsulates a basic generalized Position Based Dynamics simulation loop, and two constraints, StayAboveLineConstraint and GeneralizedShapeMatchingConstraint, have also been included in the project. In main.cpp particles are attached to nearby particles using implicit (generalized) shape matching constraints in the function CreateObject, and the object is inserted into the physics system in the function UploadOrientedParticlesObjectToPhysicsSystem.
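The core of a shape matching constraint is finding the rotation that best maps the rest configuration onto the current particle positions. In 2D this reduces to a single atan2. A minimal sketch of that step, assuming equal particle masses (the function and type names are ours, not the demo's):

```cpp
#include <cmath>
#include <vector>

struct Vec2 { double x, y; };

// Optimal 2D shape-matching rotation: maximize tr(R^T A) with
// A = sum_i p_i q_i^T, where q_i / p_i are the rest / current positions
// relative to their respective centers of mass.
double optimalRotation(const std::vector<Vec2>& rest,
                       const std::vector<Vec2>& current) {
    const double n = static_cast<double>(rest.size());
    Vec2 c0{0, 0}, c{0, 0};
    for (size_t i = 0; i < rest.size(); ++i) {
        c0.x += rest[i].x;    c0.y += rest[i].y;
        c.x  += current[i].x; c.y  += current[i].y;
    }
    c0.x /= n; c0.y /= n; c.x /= n; c.y /= n;

    double a11 = 0, a12 = 0, a21 = 0, a22 = 0; // A = sum_i p_i q_i^T
    for (size_t i = 0; i < rest.size(); ++i) {
        const double qx = rest[i].x - c0.x,   qy = rest[i].y - c0.y;
        const double px = current[i].x - c.x, py = current[i].y - c.y;
        a11 += px * qx; a12 += px * qy;
        a21 += py * qx; a22 += py * qy;
    }
    // With R = [[cos t, -sin t], [sin t, cos t]],
    // tr(R^T A) = cos t (a11 + a22) + sin t (a21 - a12), maximized at:
    return std::atan2(a21 - a12, a11 + a22);
}
```

The goal position for particle i is then g_i = R q_i + c, and the constraint moves each particle some stiffness-weighted fraction toward its goal.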

For visualization a grid of vertices is created covering the area of the simulated particles. The vertices are triangulated, supplied with texture coordinates, and used for rendering. As the particles each have an orientation, each one constitutes a two-dimensional coordinate system and can thus be used for skinning the grid mesh. For this purpose a parametrisation of the vertices is made in the function GenerateGrid2D. Each vertex can be skinned from up to 3 particles, and the parametrisation information is stored as 3D texture coordinates where the integer part of each component describes which particle to skin from and the fractional part describes the weighting to use for this particle. Skinning matrices are computed and uploaded to a float RGBA texture, and the actual skinning is done in the vertex shader.
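The packing of skinning information into texture coordinate components can be sketched like this (the helper names are ours; the demo's actual code may differ). Note that the weight must stay strictly below 1.0, or it would bleed into the integer part:

```cpp
#include <cmath>

// Pack a particle index and a blend weight into one float component:
// integer part = particle index, fractional part = weight.
// Clamp the weight slightly below 1.0 so it cannot spill into the index.
float packSkinWeight(int particleIndex, float weight) {
    const float w = std::fmin(std::fmax(weight, 0.0f), 0.999f);
    return static_cast<float>(particleIndex) + w;
}

// The vertex shader (or a CPU skinning path) unpacks it again.
void unpackSkinWeight(float packed, int& particleIndex, float& weight) {
    particleIndex = static_cast<int>(std::floor(packed));
    weight = packed - static_cast<float>(particleIndex);
}
```

A caveat of this scheme is float precision: with large particle indices the fractional part loses resolution, but for a demo-sized particle count it is not a problem.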

To get some motion in the system, the little guy's bag is moved around the scene. The example is not optimal – for instance the grid mesh could be based on index buffers, which would save some calculations. Another problem with the 2D example is that for shape matching involving only 2 particles we sometimes find an optimal rotation that mirrors the particle. A quick fix to remedy this would be to check for mirroring matrices in the 2D matrix class. Also, as you might know, there are some issues with floating point textures in WebGL – especially when accessing the texture from the vertex shader. So to improve compatibility for WebGL/GLES it might be a good idea to do the skinning on the CPU instead of in the vertex program – this was the solution we used when porting the system to iOS.
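The mirroring check mentioned above amounts to looking at the determinant of the extracted 2x2 matrix: a proper rotation has determinant +1, while a reflection has -1, and flipping one axis repairs it. A sketch of such a check (names are ours, not the demo's matrix class):

```cpp
struct Mat2 { double m00, m01, m10, m11; };

double det(const Mat2& m) { return m.m00 * m.m11 - m.m01 * m.m10; }

// A proper rotation has det = +1; a mirroring has det = -1.
// Negate the second column to turn a reflection into a rotation.
Mat2 removeMirroring(Mat2 m) {
    if (det(m) < 0.0) {
        m.m01 = -m.m01;
        m.m11 = -m.m11;
    }
    return m;
}
```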

Hope you’ll have fun with the example!


