WebGL Lesson 6 – keyboard input and texture filters

<< Lesson 5 | Lesson 7 >>

Welcome to number six in my series of WebGL tutorials, based on part of number 7 in the NeHe OpenGL tutorials. In it, we’ll go over how you can make your WebGL page accept keyboard input, and we’ll use that to change the rate and direction of spin of a texture-mapped cube, and also to change the kind of filtering used on the texture to get lower-quality but faster, or higher-quality but slower, rendering. (NeHe’s lesson seven covers not only this but also lighting; because lighting is more work in WebGL than it is in OpenGL, I’ve left it out of this lesson — we’ll look at that next time.)

Here’s what the lesson looks like when run on a browser that supports WebGL, best viewed on the YouTube site so that you can see the annotations saying what I’m doing to make the scene change:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t. Once you have it loaded, use the Page Up and Page Down keys to zoom in and out, and use the cursor keys to make the cube rotate (the longer you hold down a cursor key, the more it accelerates). You can also use the F key to toggle through three different kinds of texture filters, an effect that you will see best when you’re zoomed quite close up to the cube or when you’re quite far away; more about exactly what’s going on there later.

More on how it all works below…

The usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on in the code, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the differences between the code for lesson 5 and the new code.

There may be bugs and misconceptions in this tutorial. If you spot anything wrong, let me know in the comments and I’ll correct it ASAP.

There are two ways you can get the code for this example; just “View Source” while you’re looking at the live version, or if you use GitHub, you can clone it (and the other lessons) from the repository there. Either way, once you have the code, load it up in your favourite text editor and take a look.

The biggest change between this lesson and the last is that we’re looking at the keyboard, but it’s easier to explain how that works by starting with the code that it affects. If you start off by scrolling halfway down through the code, you’ll see a number of global variables defined:

  var xRot = 0;
  var xSpeed = 0;

  var yRot = 0;
  var ySpeed = 0;

  var z = -5.0;

  var filter = 0;

xRot and yRot will be familiar from lesson 5 — they represent the current rotation of the cube around the X and Y axes. xSpeed and ySpeed should be fairly obvious; now that we’re allowing the user to vary the speed of rotation of the cube using the cursor keys, these are where we keep the rates of change of xRot and yRot. z is, of course, the Z-coordinate of the cube — that is, how close it is to the viewer — and will be controlled by the Page Up and Page Down keys. And finally, filter is an integer ranging from 0 to 2, which specifies which of three filters is used on the texture that we’re mapping on to the cube, and thus how nice it looks.

Let’s take a look at the code that drives the filter now. The first changes are in the code to load the texture, a bit further up and about a third of the way from the top of the page. The code is so changed from last time that I won’t highlight anything in red. However, it should look pretty familiar in form if not in the details:

  function handleLoadedTexture(textures) {
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);

    gl.bindTexture(gl.TEXTURE_2D, textures[0]);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, textures[0].image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);

    gl.bindTexture(gl.TEXTURE_2D, textures[1]);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, textures[1].image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

    gl.bindTexture(gl.TEXTURE_2D, textures[2]);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, textures[2].image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
    gl.generateMipmap(gl.TEXTURE_2D);

    gl.bindTexture(gl.TEXTURE_2D, null);
  }

  var crateTextures = Array();

  function initTexture() {
    var crateImage = new Image();

    for (var i=0; i < 3; i++) {
      var texture = gl.createTexture();
      texture.image = crateImage;
      crateTextures.push(texture);
    }

    crateImage.onload = function() {
      handleLoadedTexture(crateTextures);
    }
    crateImage.src = "crate.gif";
  }

Looking first at the function initTexture and the global variable crateTextures, it should be clear that although the code is changed, the only real underlying difference is that we're creating three WebGL texture objects in an array rather than one, and we're passing that array over to handleLoadedTexture in the callback function when the image is loaded. And, of course, we're loading a different image from last time, crate.gif instead of nehe.gif.

handleLoadedTexture has also not changed in any complicated way; previously we were just initialising a single WebGL texture object with the image data, and setting two parameters on it: gl.TEXTURE_MAG_FILTER and gl.TEXTURE_MIN_FILTER, both to gl.NEAREST. Now, we're initialising all three textures in our array with the same image, but we're setting different parameters on each, and there's an extra bit of code for the last one. Here's how the different textures differ in more detail:

Nearest filtering

The first texture has gl.TEXTURE_MAG_FILTER and gl.TEXTURE_MIN_FILTER both set to gl.NEAREST. This is our original set-up, and it means that both when the texture is being scaled up and when it's being scaled down, WebGL should use a filter that determines the colour of a given point just by looking for the nearest point in the original image. This will look just fine if the texture is not scaled at all, and will look OKish if it's scaled down (but see the discussion of aliasing below). However, when it's scaled up, it will look "blocky", as this algorithm effectively scales the pixels in the original image up.

Linear filtering

For the second texture, gl.TEXTURE_MAG_FILTER and gl.TEXTURE_MIN_FILTER are both gl.LINEAR. Here we're once again using the same filter for both scaling up and scaling down. However, the linear algorithm can work better for scaled-up textures; basically, it just uses linear interpolation between the pixels of the original texture image — roughly speaking, a pixel that is half-way between a black one and a white one comes out grey. This gives a much smoother effect, though (of course) sharp edges get a bit blurred. (To be fair, when you scale an image up it's never going to look perfect — you can't get detail that isn't there.)


Mipmap filtering

For the third texture, gl.TEXTURE_MAG_FILTER is gl.LINEAR and gl.TEXTURE_MIN_FILTER is gl.LINEAR_MIPMAP_NEAREST. This is the most complex of the three options.

Linear filtering gives reasonable results when you scale the texture up, but it's no better than nearest filtering when scaling down; in fact, both filters can cause ugly aliasing effects. To see what these look like, load up the sample again so that it's using nearest filtering (or hit the refresh button to get it back to its initial state), and hold down the Page Up key for a few seconds to zoom out. As the cube moves away, at some point you'll see it start "twinkling", with vertical lines seeming to appear and disappear. Once you see this, stop and try zooming in and out a bit, watching the twinkling, then press F once to switch to linear filtering, move it back and forward a bit more, and note that you get pretty much the same effect. Now press F once more to use mipmap filtering, zoom in and out again, and you should see this effect eliminated or at least very much reduced.

Now, while the cube is quite far away — say, 10% of the width/height of the overall canvas — try cycling through the filters without moving it. With nearest or linear filtering, you will notice that in some places the dark lines that make up the grain of the wood in the texture are very clear, whereas in others they have disappeared; the cube looks a bit "splotchy". This is really bad with nearest filtering, but not much better with linear. Only mipmapped filtering works well.

What's happening with nearest and linear filtering is that when the texture is scaled down to (say) one-tenth size, the filter uses every tenth pixel in the original image to make up the scaled-down version. The texture has a wooden "grain" pattern, which means that most of it is light brown but there are thin vertical dark lines; let's imagine that the grain is ten pixels wide, or that in other words there is a dark brown pixel every ten pixels horizontally. If the image is scaled down to one tenth, then there is a one-in-ten chance of any given pixel being dark brown, nine-in-ten of it being light. Or to put it another way, one in ten of the dark lines in the original image are shown just as clearly as they were when the image was full-sized, and the others are completely hidden. This causes the splotchy effect, and also adds the twinkling when the scale is changing, because the specific dark lines that are chosen might be completely different at scaling factors of 9.9, 10.0 and 10.1.

What we'd really like to do is be in a situation where when the image is scaled to one tenth of its original size, each pixel is coloured based on the average of the ten-by-ten pixel square that it is a scaled-down version of. Doing this smoothly is too computationally expensive for real-time graphics, and this is where mipmap filtering comes in.
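To make that concrete, here's a toy one-dimensional sketch (my addition, not part of the lesson's code) of a "texture" with a dark grain line every ten pixels, scaled down to one-tenth size first by nearest sampling and then by averaging each ten-pixel block:

```javascript
// Toy 1-D "texture": 0 is a dark grain line, 1 is light wood.
// A dark line appears every tenth pixel.
var texture = [];
for (var i = 0; i < 100; i++) {
  texture.push(i % 10 == 0 ? 0 : 1);
}

// Nearest filtering at one-tenth scale: just keep every tenth pixel.
// Shifting the sample offset by a single pixel makes every dark line
// appear or vanish -- the "twinkling" effect.
function nearestTenth(tex, offset) {
  var out = [];
  for (var i = offset; i < tex.length; i += 10) {
    out.push(tex[i]);
  }
  return out;
}

// Averaging each ten-pixel block instead: every block contains exactly
// one dark line, so the result is a stable, uniform light grey.
function averagedTenth(tex) {
  var out = [];
  for (var i = 0; i < tex.length; i += 10) {
    var sum = 0;
    for (var j = 0; j < 10; j++) {
      sum += tex[i + j];
    }
    out.push(sum / 10);
  }
  return out;
}
```

With offset 0 the nearest-sampled result is all dark lines; with offset 1 it has none at all; the averaged version is a steady 0.9 regardless of offset — which is exactly the splotchy-versus-smooth difference you see on the cube.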

Mipmap filtering solves the problem by generating for the texture a number of subsidiary images (called mip levels), at half, one-quarter, one-eighth, and so on of the original size, all the way down to a one-by-one pixel version. The set of all of these mip levels is called a mipmap. Each mip level is a smoothly averaged version of the next-largest one, so the appropriate version can be chosen for the current level of scaling; the algorithm for choosing depends on the value used for gl.TEXTURE_MIN_FILTER. The one we chose basically means "find the closest mip level and do a linear filter on that to get the pixel".
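The chain of mip levels is easy to picture in code. Here's a little sketch (mine, not the lesson's) that lists the level sizes for a square power-of-two texture — WebGL needs power-of-two dimensions for mipmapping — halving each time down to one pixel:

```javascript
// List the sizes of all mip levels for a square texture of the given
// side length, halving until we reach the 1x1 version.
function mipLevelSizes(size) {
  var sizes = [size];
  while (size > 1) {
    size = Math.floor(size / 2);
    sizes.push(size);
  }
  return sizes;
}
```

For a 256×256 texture this gives nine levels: 256, 128, 64, 32, 16, 8, 4, 2, 1 — and gl.generateMipmap builds all of them for us in one call.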

Now that I've explained all that, it should be pretty clear that the extra line we had to add for this texture:

    gl.generateMipmap(gl.TEXTURE_2D);

...is the line required to tell WebGL to generate the mipmap.

Right, that was significantly more than I'd planned to write on mipmaps, but I think it should be reasonably clear :-) Let me know in the comments if anything's still unexplained.

Back to the remainder of the code. So far, we've looked at the global variables and seen how the textures are loaded and set up. Now let's see how the globals and the textures are used when we actually come to draw the scene.

drawScene is about two-thirds of the way through the page, and has just three changes. The first is that when we are positioning ourselves to draw the cube, instead of using a fixed point, we use the global variable z:

    mvTranslate([0.0, 0.0, z]);

The next is actually a line that we've removed from the code for lesson 5; now, we don't rotate around the Z axis at all, and there are just rotations around X and Y:

    mvRotate(xRot, [1, 0, 0]);
    mvRotate(yRot, [0, 1, 0]);

Finally, when we are about to draw the cube, we have to specify which of our three textures we want to use:

    gl.bindTexture(gl.TEXTURE_2D, crateTextures[filter]);

That's all for the changes in drawScene. There are also some minor changes in animate; instead of changing xRot and yRot at constant rates, we now use our new xSpeed and ySpeed variables:

      xRot += (xSpeed * elapsed) / 1000.0;
      yRot += (ySpeed * elapsed) / 1000.0;
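Because the update is scaled by the elapsed milliseconds, the spin rate is frame-rate independent. A quick sketch (my addition — assuming, as the division by 1000 suggests, that the speeds are in degrees per second) shows that many small ticks add up to the same rotation as one big one:

```javascript
// Sketch: rot += speed * elapsed / 1000 scales with elapsed time,
// not with the number of frames rendered.
function rotateBy(rot, speed, elapsedMs) {
  return rot + (speed * elapsedMs) / 1000.0;
}

// One 1000ms step and a hundred 10ms steps both give 90 degrees
// when the speed is 90 degrees per second.
var oneStep = rotateBy(0.0, 90, 1000);
var manySteps = 0.0;
for (var i = 0; i < 100; i++) {
  manySteps = rotateBy(manySteps, 90, 10);
}
```

So a machine ticking at 30fps and one ticking at 100fps both spin the cube at the same speed, just more or less smoothly.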

And that's all of the changes to the code except for those that actually handle the user's keypresses and update our various globals based on them; let's move on to those now.

The first relevant change is right down at the bottom, in webGLStart, where we've added two new lines (in red below):

  function webGLStart() {
    var canvas = document.getElementById("lesson06-canvas");
    initGL(canvas);
    initShaders();
    initTexture();

    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.enable(gl.DEPTH_TEST);

    document.onkeydown = handleKeyDown;
    document.onkeyup = handleKeyUp;

    setInterval(tick, 15);
  }

Fairly obviously, all we're doing here is telling the JavaScript runtime that when a key is pressed (with the focus on the web page) we want our function called handleKeyDown to be called, and when a key is released, it should call handleKeyUp.

Let's take a look at those functions next. They're about halfway through the page, just below the global variables we looked at earlier, and look like this:

  var currentlyPressedKeys = Object();

  function handleKeyDown(event) {
    currentlyPressedKeys[event.keyCode] = true;

    if (String.fromCharCode(event.keyCode) == "F") {
      filter += 1;
      if (filter == 3) {
        filter = 0;
      }
    }
  }

  function handleKeyUp(event) {
    currentlyPressedKeys[event.keyCode] = false;
  }

What we're doing here is maintaining a dictionary (which you might also know by the name "hashtable" or "associative array"), which, given a key code — that is, one of JavaScript's numeric identifiers for the keys on the keyboard — can tell us whether that key is currently being pressed by the user or not. If you're not familiar with the way JavaScript works, you may find it interesting to note that any object can be used as a dictionary like this, so we just use a base Object instance.
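If the "any object can be a dictionary" point is new to you, here's a standalone sketch (not part of the lesson's code) of exactly the pattern we're relying on:

```javascript
// Any JavaScript object can act as a dictionary: property names are
// the keys. Numeric keys like keyCodes are coerced to strings.
var pressed = Object();

pressed[37] = true;   // pretend the left cursor key went down
pressed[39] = true;   // ...and the right one too
pressed[37] = false;  // left key released

// Reading a key that was never set gives undefined, which is falsy --
// handy, because untouched keys count as "not pressed".
var leftDown = !!pressed[37];   // false
var rightDown = !!pressed[39];  // true
var pageUpDown = !!pressed[33]; // false: never touched
```

This is why handleKeys, which we'll see in a moment, can simply test currentlyPressedKeys[33] and friends without worrying about keys the user has never touched.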

In addition, we're handling the key-down event for the "F" key separately, by cycling the filter global variable through the values 0, 1, and 2 each time it's pressed.
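Incidentally, that increment-and-wrap logic can be written more tersely with the remainder operator; a tiny equivalent sketch (the function name is mine):

```javascript
// Equivalent to "filter += 1; if (filter == 3) { filter = 0; }":
// step through 0, 1, 2 and wrap back round to 0.
function nextFilter(filter) {
  return (filter + 1) % 3;
}
```

Either form works; the if-based version in the lesson is arguably clearer about the intent of "three filters, cycling".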

It's worth taking time now to explain why we're handling different keys in two different ways. In a computer game, or almost any other similar 3D system, key-presses can work in one of two ways:

  1. They can take an immediate action: "fire the laser". Keypresses like this might auto-repeat at some kind of fixed rate, say twice per second.
  2. They can take an effect that depends on how long you hold them down. For example, when you press the key to walk forward, you expect to keep moving forward for as long as you hold it.

Importantly, with the second kind of key-press, you want to be able to press other keys while the action is in progress, so that you can (for example) start running forward, then turn a corner and shoot without stopping running. This is a fundamentally different way of reading the keyboard to the normal "text-processing" way; if you hold down the "A" key in a word processor, you'll get a stream of "A"s, but if you press "B" while you're holding down "A" then you'll get a "B" but the stream of "A"s will stop. The equivalent in a game would be for you to stop running every time you turned a corner, which would be extremely irritating.

So, in the code we just looked at, the "F" key is handled as the first kind of key-press. The dictionary is used by the code that handles the second kind; it keeps track of all of the keys that are currently being held down, not just the last one to be pressed.

The dictionary is actually used in a different function, handleKeys, which comes next in the page. Before we go through that, jump briefly down to the bottom of the code and you'll see that it's called by the tick function, just like drawScene and animate:

  function tick() {
    handleKeys();
    drawScene();
    animate();
  }

Here's what handleKeys looks like:

  function handleKeys() {
    if (currentlyPressedKeys[33]) {
      // Page Up
      z -= 0.05;
    }
    if (currentlyPressedKeys[34]) {
      // Page Down
      z += 0.05;
    }
    if (currentlyPressedKeys[37]) {
      // Left cursor key
      ySpeed -= 1;
    }
    if (currentlyPressedKeys[39]) {
      // Right cursor key
      ySpeed += 1;
    }
    if (currentlyPressedKeys[38]) {
      // Up cursor key
      xSpeed -= 1;
    }
    if (currentlyPressedKeys[40]) {
      // Down cursor key
      xSpeed += 1;
    }
  }

It's another long but very simple function; all it does is check whether various keys are currently pressed, and update our global variables appropriately. Most importantly, if (say) the "Up" and the "Right" cursor keys are both being pressed, it will update both xSpeed and ySpeed, so these act the way we want.
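One way to see why simultaneous keys "just work" is to factor the checks into a pure function of the pressed-keys dictionary; a refactoring sketch (the names here are mine, not the lesson's):

```javascript
// Map the pressed-keys dictionary to the change it causes per tick.
// Holding several keys at once simply accumulates several deltas.
function keyDeltas(pressed) {
  var d = { z: 0, xSpeed: 0, ySpeed: 0 };
  if (pressed[33]) d.z -= 0.05;    // Page Up
  if (pressed[34]) d.z += 0.05;    // Page Down
  if (pressed[37]) d.ySpeed -= 1;  // Left cursor key
  if (pressed[39]) d.ySpeed += 1;  // Right cursor key
  if (pressed[38]) d.xSpeed -= 1;  // Up cursor key
  if (pressed[40]) d.xSpeed += 1;  // Down cursor key
  return d;
}
```

With Up (38) and Right (39) both held, keyDeltas returns changes to both xSpeed and ySpeed in the same tick — exactly the "run forward while turning" behaviour we wanted.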

And that's it for this time! Now you know all there is to learn from this lesson: you should have a pretty good understanding of how different filters affect the way textures look at different scaling factors, and you know how to read user input from the keyboard in a way that works well with 3D animations.

If you have any questions, comments, or corrections, please do leave a comment below!

In the next lesson, we make a start on lighting.

<< Lesson 5 | Lesson 7 >>

Acknowledgments: The OpenGL ES Programming Guide was an invaluable resource for information about textures and mipmapping. Matthew Casperson's post at Bright Hub was a good resource for hints on getting keyboard input working, and this JavaScript Kata told me how to code something like a dictionary. As always, I'm deeply in debt to NeHe for his OpenGL tutorial for the script for this lesson.


25 Responses to “WebGL Lesson 6 – keyboard input and texture filters”

  1. murphy says:

    I think you have swapped the headings for the “Nearest filtering” and “Linear filtering” sections.

  2. giles says:

    Argh, you’re right! Fixed it.

  3. titan says:

    I like your lessons so much! Did you play with greater models too? (10000+ triangles) I think there is a memory leak in the “painting-loop”. Every time a new array. I have some problems with that construction. I need more time to look at this.

    greets titan

  4. giles says:


    Glad you like the lessons! There probably are a few memory leaks in the tutorials right now; the array probably will get garbage-collected but the WebGL buffer might not. I’ll see what I can do to fix it.



  5. titan says:

    Hi me again,

    I do not know the correct place for my problem on firefox nightly build. I read only 1 pixel from framebuffer, all was fine until I build textures in. Somebody know why readpixels stopped working with textures in WebGL (gl.texImage2D). The Error is readpixels not allowed. The Code of the Error is “if(mCanvasElement->IsWriteOnly() && !nsContentUtils::IsCallerTrustedForRead()). I do not know, why it is WriteOnly. What can I disable for the moment of readpixel?

    greets Titan

  6. giles says:

    Hi titan,

    The best place for questions like that is probably the Khronos forums: http://www.khronos.org/message_boards/viewforum.php?f=34 — there’s a forum for browser-specific issues which is probably best.



  7. Liam says:

    Just pointing out a typo: handling the key-down even

  8. Liam says:

    Some readers may be interested in using constants to refer to keys instead of magic numbers. This shows you how:


  9. giles says:

    Thanks Liam — corrected the typo, and I’ll look into using the constants. It’s odd that there’s nothing built into all browsers, though!

  10. Shy says:

    On a small laptop screen the result of this lesson has a problem. When I’m pressing the down key or page down, the page scrolls, moving the canvas.
    Is there a way to prevent this from happening in javascript?

  11. Shy says:

    Ok so in firefox and chromium this can be prevented by returning false from handleKeyDown().

  12. giles says:

    @Shy — thanks! That’s a useful trick, I’ll update the lesson (with attribution, natch :-)

  13. Ozzy says:


    I am having problems with mipmap filtering. It cant load the texture when I use “LINEAR_MIPMAP_NEAREST” on my PC and it displays just a white object. To see the texture, I have to use LINEAR filtering.
    It happens not only in this Lesson, but also in all examples with mipmap filtering. I work with Chromium and Minefield, both give me the same result.
    Do you have any idea?

    Thanks in advance

  14. Justin Bailey says:

    Thanks (as always) for the great lessons!

    I’ve been a long time follower, even back in the old days of Win32 NeHe, but this is my first time commenting. Unfortunately I’m stuck with this HP nx9420 and it has a built in ATi Radeon Mobility x1600. I know that ATi has some pretty terrible driver support but to my astonishment I cannot seem to generate mip maps in WebGL. I ran the real-tech glView tool and passed all of the tests with flying colors (I’ll have to copy and paste the results when I get home) but I still do not see the mip mapped textures in your tutorials or my own projects. I’ve also had the same problem with an older Pentium 4 machine running an ATi Radeon 9800 series card. Any suggestions?

    Thanks in advance! :D

  15. giles says:

    Hi Justin — glad you like the lessons! (Just in case it’s not clear, I’m not the one who wrote the original Win32 NeHe lessons, though.)

    It sounds like both you and Ozzy are seeing a similar problem — mipmaps not working — so perhaps it’s a common cause. Are you using Chromium or Minefield?

  16. msirin says:

    hi giles! you should publish a book..^^

    i have a question concerning rotating an object. when i do rotate the model around the x-axis via mvRotate(angle_var, [1, 0, 0]); which for example means that i dragged the mouse cursor from top to down (say 90°) everything is fine and my model is rotated as i wished. now i want to rotate the object “to the right”. but then something unexpected happens: the object rotates counterclockwise.

    i guess this phenomenon is a result of rotation the whole world-coordinate-system.

    maybe you have an idea how everytime to spin the model in direction which my curser is dragged to?


  17. msirin says:

    …”to the right” causes a counterclockwise spinning after I first rotated it around the x-axis.

  18. giles says:

    Hi msirin, glad you like the tutorials! I wish I had time to write a book :-)

    You’re exactly right that the effect you’re seeing is a result of the coordinates being rotated. Your first rotation around the X axis puts the Y axis where the Z axis used to be, so when you try to rotate around the Y axis, it rotates around Z instead.

    Lesson 11 explains what’s going on in a bit more detail, and shows how you can use rotation matrices to work around the problem.

  19. msirin says:

    aah, now that’s a perfect answer^^
    thank you!

  20. Justin Bailey says:


    Thanks for the reply. I’m currently on Chromium (the nightly builds). As previously mentioned, I am running a Radeon x1600 Mobility and I know that ATi has problems with software so I went ahead and tried Omega drivers to see if that was the problem. No dice. I haven’t previously had problems with mip maps in XP, at least not when GL was part of a native C++, Java or C# app. Still looking around the web for an answer but it seems that a lot of older ATi cards are having this same issue. I’ve tried playing with the Catalyst Control Center settings (useless as usual) as well. So far I have confirmed that it’s isolated to ATi devices and no nVidia cards had the same problem in my app or yours.

    Sorry it took me so long to reply. The starter on my car decided to go out so I’ve been up to other things! ^_^

  21. Justin Bailey says:

    In the words of the Koolaid Man, “OH YEAH~”


    Finally got the anisotropic filtering working under Windows 7 64bit with the above link. x1600 users, this works. Anyone else will just have to try it out for themselves!

    Thanks again, Giles ^_^

  22. giles says:

    @Justin — great news — thanks for posting the link!

  23. Vik says:

    @Giles- Awesome job Giles. But i have a concern. say I have a 3D file on my system and I want to upload it on my facebook account(just an example could be any other website). now if I want this file to open directly in my browser showing that 3D content(say rotating cube), is it possible??? and if yes what 3D file format it should be in???

    any ideas???


  24. giles says:

    Hi Vik — that would depend on the WebGL code that the people who run the site have put on it. So, for example, Facebook might decide that they support the COLLADA format, while some other site might support .OBJ and another 3DS. It would all depend on the site owner.

  25. Vik says:

    thanks Giles…..
