Faking It: Inverse Force Feedback for Satisfying VR Interactions

Photo by Giu Vicente on Unsplash

One of the most well-known problems with physically-inspired interactions in virtual reality is that the virtual world provides no physical feedback. When you hammer a nail in the real world, the hammer bounces off the nail with a satisfying bang; but in VR, even if we play bang.mp3, the hammer doesn’t bounce, and it doesn’t really feel like you hit something. Likewise, when you fire a laser pistol in VR, there’s no recoil; the game may show an effect and play a sound, but the lack of force feedback detracts from the impact, and the whole interaction can often feel rather hollow. VR experience designers commonly work around this problem by either ignoring it (trusting to audio-visual feedback and haptics to make interactions satisfying) or by avoiding it altogether (designing the experience to steer users away from scenarios where force feedback would be expected and missed).

But what if there were another way? What if there were something we could do to evoke in VR some portion of a physically-inspired action’s inherent satisfaction? You may have heard, as I often have, that smiling can help cause you to feel happier and that many studies explore the ways our minds are influenced by our bodies. With such things in mind, I recently began to wonder whether basic VR interactions might feel more satisfying if they required users to “fake” the corresponding force feedback. In other words, if we force the motion in VR to mimic the real motion — if a hammer doesn’t hammer unless it bounces back, or if a laser pistol doesn’t shoot properly unless it recoils — does that make the resulting interaction feel better?

I hadn’t the faintest idea. But, with Babylon.js, I reckoned it wouldn’t be too difficult to find out; so, arming myself with a rather clumsy name for the idea — inverse force feedback — I decided to try it out. To the Playground!

Whack.

Hammering Virtual Nails

Hammering a nail is a very familiar physical experience, and as such I’ve found it to be one of the motions that, in virtual reality, feels most awkward. Generally speaking, hitting anything with anything in VR tends to feel disassociated and weird, but because I know exactly how a hammer hitting a nail is supposed to feel, the VR representation feels all the stranger. Given these considerations, I never really hoped that inverse force feedback would be able to make this interaction feel right; I did, however, want to see whether I could make it feel less wrong.

For this, I made a (VR-only) Playground representing the desired scenario using recycled assets and the most glorious dev art you can ever hope to see. (Why yes, the comically-oversized rubber mallet does make gunshot sounds when it strikes the nail. What’s your question?) The core function of the Playground is to detect when the hammerhead is sufficiently near to the nail, then play a sound and move the nail into the block as though driven. This is the behavior if the REQUIRE_RECOIL flag at the top of the Playground is set to false.
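That non-recoil path can be sketched in a few lines. Everything here is a hypothetical stand-in for the Playground's actual code — the name checkStrike, the plain {x, y, z} vector type (in place of BABYLON.Vector3), and both tolerance constants are assumptions of mine:

```typescript
type Vec3 = { x: number; y: number; z: number };

const distance = (a: Vec3, b: Vec3): number =>
    Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

const STRIKE_RADIUS = 0.05; // hypothetical: strike tolerance, in meters
const DRIVE_DEPTH = 0.02;   // hypothetical: how far the nail sinks per strike

// Called once per frame with the current hammerhead position; returns the
// nail's new depth if a strike happened this frame, or null otherwise.
// (In the Playground, this is also the moment the gunshot sound plays.)
function checkStrike(hammerHead: Vec3, nailHead: Vec3, nailDepth: number): number | null {
    if (distance(hammerHead, nailHead) > STRIKE_RADIUS) {
        return null;
    }
    return nailDepth + DRIVE_DEPTH;
}
```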

However, if REQUIRE_RECOIL = true, a new set of constraints is enabled which checks not only that the hammerhead is close to the nail, but also that the motion which brought it there resembled the motion of swinging a real hammer, including the hammer’s follow-up bounce. In effect, this flag “turns on” inverse force feedback as an interaction requirement. The relevant code sample is below; let’s walk through the logic.
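(The actual sample lives in the Playground; as a stand-in, here is a minimal sketch of the same logic. The name isHammerStroke, the plain {x, y, z} vector type replacing BABYLON.Vector3, and the tolerance constants are my assumptions — only the 20cm swing threshold, the three-frame rebound requirement, and the cross-product coplanarity test come from the description below.)

```typescript
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const cross = (a: Vec3, b: Vec3): Vec3 => ({
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
});
const length = (a: Vec3): number => Math.hypot(a.x, a.y, a.z);
const normalize = (a: Vec3): Vec3 => {
    const l = length(a);
    return { x: a.x / l, y: a.y / l, z: a.z / l };
};

const STRIKE_RADIUS = 0.05;     // hypothetical: how close the low point must get to the nail (meters)
const MIN_STROKE_HEIGHT = 0.2;  // the 20cm minimum swing size
const COPLANAR_TOLERANCE = 0.3; // hypothetical tolerance for the coplanarity check

function isHammerStroke(positions: Vec3[], nail: Vec3, requireRecoil: boolean): boolean {
    // Find the low point, high point, and most recent point of the motion.
    let lowIdx = 0;
    let highIdx = 0;
    positions.forEach((p, idx) => {
        if (p.y < positions[lowIdx].y) lowIdx = idx;
        if (p.y > positions[highIdx].y) highIdx = idx;
    });
    const positionIdx = positions.length - 1;

    // The low point must come close enough to the nail to count as a strike;
    // this is the only check needed when REQUIRE_RECOIL is false.
    if (length(sub(positions[lowIdx], nail)) > STRIKE_RADIUS) {
        return false;
    }
    if (!requireRecoil) {
        return true;
    }

    // The low point must be at least three frames old: we're into the rebound.
    if (positionIdx - lowIdx < 3) {
        return false;
    }

    // The hammerhead must have risen at least 20cm above the low point.
    if (positions[positionIdx].y - positions[lowIdx].y < MIN_STROKE_HEIGHT) {
        return false;
    }

    // Cross-product coplanarity check (this formulation is an assumption):
    // the up-stroke should lie near the vertical plane swept by the down-stroke.
    const downDir = normalize(sub(positions[lowIdx], positions[highIdx]));
    const upDir = normalize(sub(positions[positionIdx], positions[lowIdx]));
    const planeNormal = normalize(cross(downDir, { x: 0, y: 1, z: 0 }));
    return Math.abs(dot(upDir, planeNormal)) < COPLANAR_TOLERANCE;
}
```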

The opening portion of this code sample finds the low point, the high point, and the most recent point (at positionIdx) of the hammer’s motion in the past few frames (stored in positions). What we want to detect is that the hammer started high, dropped to roughly the position where it needs to be to strike the nail, then immediately started rising again along roughly the same arc it used to descend; this motion is what we consider to be a hammer stroke. The first check confirms that the low point of the hammer’s motion was sufficiently close to the target position, which is all that’s needed if REQUIRE_RECOIL = false. If REQUIRE_RECOIL = true, however, three further checks are enabled. First, we check that the low point of the stroke occurred at least three frames ago, meaning that if positions does contain a hammer stroke, we’re at least two frames into the rebound. Second, we check that the current height of the hammerhead is at least 20cm above its height at the low point, which sets a minimum on the real-world size of a hammer stroke: you must swing the hammer at least 20cm for the motion to count. Finally, we use cross products to check that the down-stroke and the up-stroke are roughly coplanar, which is characteristic of the motion of hammering a nail. If all these checks pass, the motion in positions is recognized and accepted as a hammer stroke, so we play a sound and drive the nail.

If you have a VR device that can do so, I definitely recommend giving this demo a try, both with and without the REQUIRE_RECOIL flag enabled. (You might need to unmute the Playground as the Oculus browser may auto-mute it when it plays a sound.) For me, the resulting experience was quite a bit more compelling with inverse force feedback than without, though neither experience is very similar to real hammering. Interestingly, despite the fact that manually creating your own hammer-bounce is actually less realistic than simply swinging the hammer down, the knowledge that I needed to move as though I was really hammering somehow made the experience feel more substantial. This was a pattern I also observed when trying out my second experiment involving…

Pew!

Dual-Stage Laser Pistol Recoil

No, I don’t know why a laser pistol would have recoil. Maybe it’s pushing on the quantum vacuum.

But in terms of interactions, a VR laser pistol without recoil feels peculiarly unsatisfying to use. As I was trying to figure out how to incorporate inverse force feedback into a laser pistol demo/experiment, I came upon the idea of a two-stage laser mechanism that instantly fires a weak laser, then turns that into a second, much stronger laser if a recoil motion is detected. The metaphor being explored here is that, if your hand moved back “due” to inverse force feedback, then the shot must have been powerful; and if not, then the shot must not have been powerful. Coded up, this became another Playground with a different — and much simpler — motion detection mechanism. (Also, fun fact: every single object in this scene was made from Babylon MeshBuilder cubes. #DevArt.)

Unlike the hammer stroke detector, this detector doesn’t need to check motion that goes both down and up. All that’s required for recoil is to check that, a short interval after firing (arbitrarily chosen to be 100ms), the laser pistol has moved in a way reminiscent of a recoil motion. For this demo, I chose to emulate a highly stylized and exaggerated recoil of the kind seen in old Western films: the laser pistol is expected both to tilt upward and to kick back by at least 10cm. If this motion is detected, the demo waits an additional 400ms for dramatic effect, then shows a much larger laser and plays a louder sound effect than those originally presented.
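A sketch of that recoil check follows. The names, the pitch convention (increasing pitch means tilting up here), and the tilt tolerance are my assumptions; only the 100ms sampling delay, the 10cm kick-back, and the 400ms dramatic pause come from the description above.

```typescript
type Vec3 = { x: number; y: number; z: number };
type Pose = { position: Vec3; pitch: number }; // pitch: upward tilt, in radians

const RECOIL_CHECK_DELAY_MS = 100; // how long after firing to sample the pose
const BIG_LASER_DELAY_MS = 400;    // dramatic pause before the second stage
const KICK_BACK_DISTANCE = 0.1;    // the 10cm kick-back minimum
const MIN_TILT_UP = 0.3;           // hypothetical tilt threshold, in radians

// Compare the pose at trigger-pull time with the pose sampled ~100ms later.
// forward is the unit vector the pistol was aiming along when it fired.
function isRecoil(atFire: Pose, after: Pose, forward: Vec3): boolean {
    // The pistol must tilt upward, old-Western style...
    if (after.pitch - atFire.pitch < MIN_TILT_UP) {
        return false;
    }
    // ...and kick back at least 10cm against the aim direction, measured by
    // projecting the displacement onto the reversed forward vector.
    const d = {
        x: after.position.x - atFire.position.x,
        y: after.position.y - atFire.position.y,
        z: after.position.z - atFire.position.z,
    };
    const kickBack = -(d.x * forward.x + d.y * forward.y + d.z * forward.z);
    return kickBack >= KICK_BACK_DISTANCE;
}
```

In the demo’s flow, the weak laser fires immediately; a check like isRecoil would then run RECOIL_CHECK_DELAY_MS after the trigger pull, and a positive result would schedule the big laser and louder sound BIG_LASER_DELAY_MS later (for instance, with setTimeout).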

Again, if you have a VR device that can do so, I definitely recommend giving this demo a try (it, too, features a REQUIRE_RECOIL flag to disable inverse force feedback) and making your own observations about it. As before, I found the interaction with inverse force feedback a good bit more satisfying than without, though the difference was again not transformative; and, as before, the unrealistic nature of the “faked” motion did nothing to worsen how the interaction felt. However, along with the improved feel, I noticed a few other changes to the interaction that I’d seen hints of in the hammer demo as well.

  1. The interaction was much more tiring. Whereas firing a VR laser pistol before was as easy as pointing at something, having to yank my arm back for the inverse force feedback became tiring surprisingly quickly.
  2. The interaction was much more difficult. Specifically, it became much more difficult to hit my target with inverse force feedback enabled than it was without. This was partly due to the fact that making sure I pulled the trigger before I started moving required careful timing. More interestingly, though, it was also due to the fact that the movement between every shot made it much more difficult to improve accuracy in “follow-up” shots by moving to compensate for misses I observed.
  3. Inverse force feedback completely changed my perspective on each individual interaction. Without inverse force feedback, laser shots were so cheap and easy that I didn’t really think of them individually; if I wanted to hit the cube, I would just take a shot, then adjust and take more shots until, very soon, I was hitting the cube every time while pulling the trigger as fast as I could. However, when I tried that again with inverse force feedback, that changed completely. Each shot became individual because (1) each shot required effort, (2) each shot was almost completely disconnected from prior shots, and (3) I was firing far fewer shots because the motion required precluded “spamming the trigger.”
If all it takes is a button press, why wouldn’t you press it as fast as possible?

What This All Implies

In brief: I haven’t the faintest idea. I’m excited to find out, though.

I always love learning about a new way to use motion and interaction, even when the new learning is relatively small. I don’t think inverse force feedback is particularly novel, and some of its attributes (particularly its physically tiring nature) could serve as drawbacks that would preclude its use in some applications. But for specific scenarios, and particularly for actions in an XR experience that you want to make significant and special, I think inverse force feedback could be powerful. I’m especially intrigued by the way my perspective on the individual interactions shifted; perhaps such effects might have an interesting application as game mechanics.

And there may be applications beyond pure entertainment that I wouldn’t even know how to explore. Perhaps there are places where the advantages of inverse force feedback are transformatively important, or perhaps there are places where its disadvantages are advantages. Perhaps if a VR activity can be made sufficiently satisfying, it might be useful for therapy, providing a low- or variable-impact way to emulate activities that might otherwise be inaccessible. Perhaps if a simulated interaction is physically tiring, it could play a role in empathy experiences, helping people glimpse what it would be like to mine by hand, or to subsistence farm, half a world away. There are so many places VR development and technology could take us in the near future. Might inverse force feedback play a part in that future?

I haven’t the faintest idea!

Photo by insung yoon on Unsplash

Justin Murray — Babylon.js Team

https://twitter.com/syntheticmagus

Babylon.js: Powerful, Beautiful, Simple, Open — Web-Based 3D At Its Best. https://www.babylonjs.com/