Neat stuff, but so far I don't see anything that hasn't already been demo'd with PhysX or OpenCL. Also, notice that the particle counts are a bit skinny, which makes me wonder whether it's GPU-accelerated. Smoke, fire, and fluids look increasingly realistic as you ratchet up the particle counts, and seeing high and low particle counts side by side helps one appreciate the difference. (It's kinda like HD versus standard resolution.) For instance, compare the dust in the turbulence demo below to similar sequences from the Lagoa vid.

[pullquote_right]The challenge, as with 3D interfaces, is to make the physics component truly useful and usable, something more than a one-time wow factor.[/pullquote_right]

Actually, I may be comparing apples to oranges, and unfavorably so for Lagoa. (The rendering credits on the Lagoa vid suggest that it isn't real-time.) In any case, OpenCL, PhysX, and the like will bring interactive million-particle simulations to life. So how will interaction designers leverage that power? The low-hanging fruit is the hypnotic visualizations that video DJs produce. Beyond that, I want to see how we can build information landscapes and fluid user interfaces that take advantage of real-time physics. The Expression folks have already hinted at how physics behaviors can enhance an interface, and information boids are another technique that would benefit from real-time physics. The challenge, as with 3D interfaces, is to make the physical dynamics truly useful and usable, something more than a one-time wow factor.
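For readers unfamiliar with the boids idea mentioned above: it's Craig Reynolds' classic flocking model, where each agent steers by three local rules (cohesion, alignment, separation). Here's a minimal sketch of one update step; the function name, parameters, and weights are my own illustrative choices, not from any particular library.

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, neighbor_radius=2.0,
               cohesion=0.01, alignment=0.05, separation=0.05):
    """One update of the classic boids rules for N agents in 2D.

    pos, vel: (N, 2) arrays of positions and velocities.
    Returns the updated (pos, vel) pair.
    """
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        mask = (dist > 0) & (dist < neighbor_radius)
        if mask.any():
            # Cohesion: steer toward the neighbors' center of mass.
            new_vel[i] += cohesion * (pos[mask].mean(axis=0) - pos[i])
            # Alignment: match the neighbors' average velocity.
            new_vel[i] += alignment * (vel[mask].mean(axis=0) - vel[i])
            # Separation: push away from nearby neighbors.
            new_vel[i] -= separation * offsets[mask].sum(axis=0)
    return pos + dt * new_vel, new_vel

# Usage: a small random flock, advanced a few steps.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(50, 2))
vel = rng.uniform(-0.5, 0.5, size=(50, 2))
for _ in range(10):
    pos, vel = boids_step(pos, vel)
```

The O(N²) neighbor search here is the naive version; a real-time million-agent variant would need spatial hashing and a GPU, which is exactly where PhysX- or OpenCL-class hardware comes in.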
What are your thoughts on applications of real-time physics to everyday user experiences?