[Tutorial] - Efficiently Load Large Amounts of Game Data Into Memory


WombatTurkey

So, I found this pretty sleek script called pako.js. It basically lets you unpack zlib-compressed data using JavaScript. Package here: https://github.com/nodeca/pako

 

I have a table called rpg_items which holds thousands of different base item types and their properties. I exported this DB table into a file so I can load its contents into my gameserver and client for synchronization. It's quite a large file. You can use any file for this, but you need to be able to parse it within JavaScript.

I used Adminer and its JSON export plugin to export that table into one large JSON file:

[Screenshot: the rpg_items table exported to JSON in Adminer]

There are currently over 500 items in that 232 KB of JSON, acting as a mini database (see my rpg_items screenshot above for the db row format).
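
For reference, the export boils down to an array of row objects, one per item. The field names below are made up purely for illustration; the real columns are whatever your rpg_items table has (see the screenshot above):

// Hypothetical sketch of the export's shape -- field names are invented for illustration only.
var exampleExport = [
    { "id": 1, "name": "Short Sword", "type": "weapon", "min_dmg": 2, "max_dmg": 5 },
    { "id": 2, "name": "Leather Cap", "type": "helm", "armor": 3 }
];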

Then I packed the uncompressed file using http://php.net/manual/en/function.gzcompress.php with compression level 9. You don't have to use PHP to do this; you can pack your file however you want, just make sure you're producing zlib-format (deflate) output, since that's what pako's inflate expects.
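
If you'd rather not touch PHP for the packing step, here's a minimal Node.js sketch that does the same thing. The 'rpg_items.json' input path is just a placeholder for wherever you saved the export; zlib.deflateSync produces the same zlib-format output as gzcompress:

// Packing sketch in Node.js -- an alternative to PHP's gzcompress.
// 'rpg_items.json' is a placeholder input path; 'itemdata.nexus' matches the file loaded in Step 3.
var fs = require('fs');
var zlib = require('zlib');

var json = fs.readFileSync('rpg_items.json');
var packed = zlib.deflateSync(json, { level: 9 }); // compression level 9, same as the PHP call

fs.writeFileSync('itemdata.nexus', packed);
console.log('Packed ' + json.length + ' bytes down to ' + packed.length + ' bytes');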

Now, you may be wondering: "Why even do this when your file is gzipped by nginx anyway? It will go down to 16 KB before it even goes over the pipe!" Well, my reason is that once I have 600 or 1000+ items in this database, this file will be quite large: well over 1 MB / 2 MB / 3 MB and beyond. And making poor nginx eat up resources compressing that data on every request is kind of a dick thing to do. Why not let the client do the work for you? You also get smaller file sizes on disk, which isn't really an issue with storage nowadays, but it still counts as a positive! Also note that even with nginx's gzip compression level set to a low value, it's still taking a performance hit.

Just to add: you don't need to do this when loading the file into Node.js for your gameserver; just use the regular uncompressed JSON file. I mean, you could probably do this whole pako thing in Node.js too, but it's kind of pointless.
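
On the gameserver side that just means reading the plain export straight in, something like this (the path is an example, point it at wherever your uncompressed JSON lives):

// Gameserver (Node.js): load the uncompressed JSON export directly, no pako needed.
var fs = require('fs');
var Items = JSON.parse(fs.readFileSync('./rpg_items.json', 'utf8'));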

Step 1:

-- Download pako from the link above, and stick pako_inflate.min.js at the top of your page.

Step 2:

-- Add this snippet somewhere in your page:

function zlibDecompress(url, callback){
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.responseType = 'blob';

    xhr.onload = function(oEvent) {
        // Read the downloaded blob back out as a base64 data URL
        var reader = new window.FileReader();
        reader.readAsDataURL(xhr.response);
        reader.onloadend = function() {
            var base64data = reader.result;

            // Strip the "data:...;base64," prefix
            var base64 = base64data.split(',')[1];

            // Decode base64 (convert ascii to a binary string)
            var strData = atob(base64);

            // Convert binary string to character-number array
            var charData = strData.split('').map(function(x){ return x.charCodeAt(0); });

            // Turn number array into byte array
            var binData = new Uint8Array(charData);

            // Pako inflate
            var data = pako.inflate(binData, { to: 'string' });

            callback(data);
        };
    };

    xhr.send();
}
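
Side note: you can skip the FileReader/base64 round trip entirely by requesting an arraybuffer instead of a blob. Just a sketch of the shorter route, not what I'm using above:

// Alternative sketch: same result, without the FileReader/base64 detour.
function zlibDecompressArrayBuffer(url, callback){
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.responseType = 'arraybuffer';

    xhr.onload = function() {
        // The response is already raw bytes, so hand it straight to pako
        var binData = new Uint8Array(xhr.response);
        callback(pako.inflate(binData, { to: 'string' }));
    };

    xhr.send();
}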

 

Step 3:

Use this beast:

zlibDecompress('../itemdata.nexus', function(data){
    Items = JSON.parse(data);
});

 

Essentially, it acts like you're unzipping a file and reading its contents. In this case the contents are JSON, so I parsed them with JSON.parse and stuck that baby in my global Items variable.
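
Once it's parsed, Items is just a normal in-memory structure you can query. For example, assuming the export is an array of rows with an id and name column (hypothetical field names, use whatever your table actually has):

// Hypothetical lookup -- assumes each exported row has "id" and "name" fields.
var sword = Items.find(function(item){ return item.id === 1; });
if (sword) console.log('Loaded base item:', sword.name);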

Now, the magic:

[Screenshot]

 

I haven't noticed ANY delays or hiccups while unpacking either. If you have any questions, shoot away! I know it MIGHT come off as premature optimization, but having a 16 KB file with thousands of items' data inside, which nginx doesn't even have to gzip once? Letting the client do all the work is a win in my books!


How long does it take to uncompress that 16 KB of data?

It sounds interesting, but is it actually more performant than just sending the, say, 232 KB uncompressed, or having the browser unpack the 16 KB gzipped version?

I guess the proper question is, why is this better than letting the browser unzip it? Surely the browser is always quicker?

