kryton9
30-11-2006, 23:31
I am not quite sure where to place this post, but thought I would try here. Please move it if there is a more appropriate place.
In developing for games, DirectX provides the interface to everything: graphics, all sorts of input, and of course sound.
We have TBGL for graphics and BASS for sound, but how do we handle all of the input devices? For now I am thinking about first-person type movement using the keyboard and mouse, but eventually gamepad and joystick support would be nice. A friend and I were talking last night about how cool it would be to have gloves you wore that could sense your movements. Could you imagine going into thinEdge and sculpting your model as if it were made from clay? Maybe in coming years, but for now, do we need to tie into an input library of some sort for this kind of stuff?