Taking user inputs from Keyboard / Mouse / Gamepad


I am starting out with BabylonJS and can't find an input manager in the documentation.

Let's say I have a main character that can jump, and I want my game to be playable on PC and mobile, with or without a controller.

I would like the main character to jump whenever the player presses the Xbox controller's "A" button, presses the keyboard's space key, or taps the screen.

Is it possible to register these events under the same name "Jump" and write something like the following?

if (inputManager.getButtonJustPressed("Jump")) {
    jump();
}



Thank you very much for the help :D 


If I understand correctly, you would manage the keyboard with the ActionManager and the gamepad with the GamepadManager/Observable?

The documentation says:


There are two features of Babylon.js that handle events: actions and observables. It is also possible to use plain JavaScript.

It seems to me that:

- there are actions in the ActionManager related to the keyboard, but none related to the gamepad;

- there is an observable related to the gamepad, but none related to the keyboard.


Maybe it could make sense to have actions/observables for both?

Furthermore, it could be nice to have actions/observables that do not care where the event is triggered from.

For instance:

I would attach an event "Jump" to the Button A of my Gamepad and to the space key of my keyboard.

Then, I would either use an Observable or the ActionManager to trigger some code when the "Jump" event is triggered.


That way you could easily change your input settings by changing which inputs (e.g. key press, button press, click) trigger the "Jump" event.
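A minimal sketch of that relay idea in plain JavaScript. None of this is real Babylon.js API; the `InputManager` class, its methods, and the raw input ids are all made up for illustration:

```javascript
// Hypothetical relay: several raw inputs map onto one named action.
class InputManager {
  constructor() {
    this.bindings = new Map();   // raw input id -> action name
    this.listeners = new Map();  // action name -> list of callbacks
  }
  bind(rawInput, action) {
    this.bindings.set(rawInput, action);
  }
  on(action, callback) {
    if (!this.listeners.has(action)) this.listeners.set(action, []);
    this.listeners.get(action).push(callback);
  }
  // This would be called by the keyboard/gamepad/pointer observers.
  trigger(rawInput) {
    const action = this.bindings.get(rawInput);
    if (!action) return;
    for (const cb of this.listeners.get(action) || []) cb(action);
  }
}

const input = new InputManager();
input.bind("Keyboard:Space", "Jump");
input.bind("Gamepad:ButtonA", "Jump");
input.on("Jump", () => console.log("jumping"));
input.trigger("Gamepad:ButtonA"); // logs "jumping"
```

The game code only ever sees "Jump"; which raw inputs feed it is pure configuration.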



Thank you JohnK. However, if gamepads have the right to their own BABYLON.GamepadManager(), wouldn't it be more consistent to move the keyboard observable to a BABYLON.KeyboardManager()? And why not do the same for the pointer and make a PointerManager()?


Thank you Deltakosh, I am happy to try :D Where do I start (and what about the above)?


Besides that, do you think it could be nice to have events/observables that aren't specific to the keyboard/gamepad/mouse?

We could have an InputManager creating a layer of abstraction with the input sources.

The InputManager could be just a relay that registers with many Observables and is itself an Observable.


For instance, BabylonJS's users would register:

- "Keyboard KeyDown Space" as "Jump"

- "TouchScreen PointerDown" as "Jump"

- "Gamepad ButtonDown A" as "Jump"

Then, when users write their jump mechanics, they wouldn't need to check all these input sources; they could just test:

if (inputManager.wasJustTriggered("Jump")) { doJump(); }


inputManager.onEvent.add((eventName, state) => {
    // event was triggered
    if (eventName === "Jump") {
        doJump();
    }
});

(Note that there could even be a JSON import/export to store the input settings.)
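That JSON round-trip could be as simple as serialising the bindings table. Again a hypothetical sketch, assuming bindings are stored as a map from a raw input id to an action name:

```javascript
// Hypothetical (de)serialisation of an input-bindings table.
function exportBindings(bindings) {
  // Map -> plain object -> JSON string
  return JSON.stringify(Object.fromEntries(bindings));
}

function importBindings(json) {
  // JSON string -> plain object -> Map
  return new Map(Object.entries(JSON.parse(json)));
}

const bindings = new Map([
  ["Keyboard:Space", "Jump"],
  ["Gamepad:ButtonA", "Jump"],
]);
const saved = exportBindings(bindings);
const restored = importBindings(saved);
// restored.get("Keyboard:Space") === "Jump"
```

This is what would let players save and share custom control schemes.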


We could also have a common interface for directional input (e.g. controlling up/down with a gamepad's analog stick vs keyboard keys).
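To illustrate what a common directional interface could look like: both sources are normalised into one axis value in [-1, 1]. The function names and the dead-zone value are assumptions, not existing API:

```javascript
// Hypothetical axis normalisation: keys give -1/0/+1,
// an analog stick gives a float in [-1, 1].
function axisFromKeys(upPressed, downPressed) {
  return (upPressed ? 1 : 0) - (downPressed ? 1 : 0);
}

function axisFromStick(rawY, deadZone = 0.15) {
  // Ignore small stick drift around the centre.
  return Math.abs(rawY) < deadZone ? 0 : rawY;
}

// Game code only sees one "vertical axis" value, whatever the source:
const vertical = axisFromKeys(true, false); // 1
```

The movement code then reads a single axis and never needs to know which device produced it.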

I am happy to try doing a PR :)


I'm not sure I understand the need; it seems like reinventing the wheel, no?
Observers can do everything you want, as you write the logic.
Observers simply tell you "this key was pressed", "this key was released",

and with a "keyIndex" you can map space = "jump", w = "forward".

The JSON idea is nice, but a simple extension based on keyboard observers seems more reasonable... I might be misunderstanding something.


I think he's asking for more than key observables: a way to map and integrate the various input sources.
ie: space bar, trigger button or left mouse click triggers a "jump" event.  I am not aware of a way to do a clean mapping from different sources.  The WebXR spec looks like it has provided space for 'gestures', so it's only getting more complicated.  It doesn't feel to me like reinventing the wheel, but adding an abstraction layer as nodragem said.  Although I'm not sure it belongs in the core; it seems not too hard to write a mapping.
// setup.  I just made up variables.
inputManager.addAction(PointerEvent.Down, MouseButton.Left | MouseClick.Single, "jump");
inputManager.addAction(KeyboardEvent.KeyDown, 32 /* space */, "jump");
inputManager.addAction(GamePad.TriggerButton, "jump");

// add listener
inputManager.eventObservables.add("jump", (evt) => {
    doJump();
});
All the needed code is in Observable&lt;T&gt; already.  There is a mask there that kind of works the same, as a 'filter' on what is triggered, but here it would be for strings.  I think a simple object dictionary mapping names to lists would suffice.  The filter could also be pattern matching (ie: "jum*") or a function.
inputManager.eventObservables.add((evt) => evt.name.startsWith("jum"), () => {
    doJump(); // ie: will also be triggered on "jumparound"
});
I would start here - you need something like a 'mask'.
Edit: this input manager also needs to listen for gamepad (de)registrations; that code is not too much either.


  • 2 weeks later...

@Nodragem If you do end up building something for this - there is an XR concept called "Action Sets" and I think that is a useful concept to include in any implementation.  Basically, depending on whether you are playing the game or navigating the menu (the context), the bindings are different.  ie: the trigger button is "jump" during gameplay, and "select" when navigating menus.  If you watch this video for a couple of minutes you can see it being explained with kittens (yes, kittens) :)
Also, there is a question at 57:30 about action sets, where he discusses the global action set, and I think that is key as well.  I am researching XR, so I wouldn't expect you to make it that far!
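The action-set idea can be sketched as context-keyed binding tables, where the same raw input resolves to different actions depending on the active set. All names here are hypothetical:

```javascript
// Hypothetical action sets: the same raw input maps to different
// actions depending on the active context (gameplay vs menu).
const actionSets = {
  gameplay: { "Gamepad:Trigger": "Jump" },
  menu:     { "Gamepad:Trigger": "Select" },
};
let activeSet = "gameplay";

function resolve(rawInput) {
  // A global/fallback action set could also be consulted here.
  return actionSets[activeSet][rawInput];
}

resolve("Gamepad:Trigger"); // "Jump"
activeSet = "menu";
resolve("Gamepad:Trigger"); // "Select"
```

Switching context is then a one-line state change instead of rebinding every input.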



  • 4 weeks later...

I just had a look at the Unity toolkit (https://doc.babylonjs.com/resources/intro) and the Scene Manager extension that it uses (https://github.com/BabylonJS/Extensions/blob/master/SceneManager/src/babylon.scenemanager.ts).

Unity does have a high-level/flexible input manager as described above (see https://docs.unity3d.com/Manual/class-InputManager.html):

Hence I was thinking that maybe the Unity toolkit / exporter already deals with the Unity Input Manager, which would mean that somewhere in the Scene Manager extension there is already a high-level/flexible input manager.

The question is: Do we need to program an input manager if it was already implemented in the Scene Manager?

However, I tried to read through the source code and nothing looks like a hashmap or similar structure, which I would have expected to see.


7 hours ago, Nodragem said:

Hence I was thinking that maybe the Unity toolkit / exporter already deals with the Unity Input Manager,

I don't see that as a supported feature.
I can see in the Scene Manager that they are being read from metadata into UserInputOptions; I didn't look further, but that's probably the structure you are after.

