Serving JavaScript libraries from a CDN instead of your own server comes with tremendous advantages: less work for your server, the possibility that the CDN has a copy geographically closer to the user than your server is, and, most importantly, a good chance that the user's browser already has the file cached from that URL. That last one means less total work for everybody, so it's clearly a win all around, and it becomes more likely the more often we (developers) rely on the same CDNs to serve our JavaScript.
But the popular JavaScript CDNs (Google, Microsoft, others?) only host a small number of files. For the rest we have the choice of hosting them ourselves, or ... using the source-control server as a kind of CDN. It's unlikely that GitHub or a similar service runs a geographically distributed cache of files optimized for serving globally, but if the practice is common enough, there is still a decent chance that the user's browser will already have the file cached. The argument for off-loading work from our servers onto GitHub, though, is only valid if GitHub has willingly volunteered to take it on.
So, is it common practice? Should we encourage each other to do this? Does GitHub mind? Have they stated an official policy?
You should not do that for JavaScript files if you care about performance or IE9 compatibility.
GitHub doesn't serve its "raw" files with a far-future Expires header. Without the possibility of cross-site caching, you lose the biggest benefit of using a public CDN to host your JavaScript. In fact, after each user's first request for the file, using GitHub as a CDN will be slower than simply hosting the file on your own server (assuming you configure caching correctly on your server).
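You can verify the headers yourself. Here's a quick Node sketch (my own illustration, assuming Node 18+ for the built-in fetch; the raw URL is just an example, substitute whichever file you're considering):

```js
// check-headers.mjs: compare the caching headers sent by a real CDN
// and by a GitHub "raw" URL. Requires Node 18+ (built-in fetch);
// run with: node check-headers.mjs
const urls = [
  "https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js",
  // Illustrative raw URL; substitute the file you have in mind.
  "https://raw.githubusercontent.com/malsup/blockui/master/jquery.blockUI.js",
];

for (const url of urls) {
  const res = await fetch(url, { method: "HEAD" });
  console.log(url);
  console.log("  Content-Type: ", res.headers.get("content-type"));
  console.log("  Cache-Control:", res.headers.get("cache-control"));
  console.log("  Expires:      ", res.headers.get("expires"));
}
```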
Another problem is that GitHub doesn't serve "raw" files with a Content-Type header that matches the file's actual MIME type. In IE9 (and perhaps other browsers, proxies, firewalls, etc.), JavaScript files that aren't served with the correct Content-Type are blocked by default. You can see that in action on the BlockUI demo page, for example.
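If you host the file on your own server instead, you control both headers. A minimal sketch using Express (the framework, port, route, and directory names are my assumptions, not part of anything above):

```js
// server.js: serve vendored libraries with a far-future max-age and
// an explicit JavaScript MIME type. Assumes `npm install express`;
// the /js route and vendor/ directory are placeholders.
const express = require("express");
const path = require("path");

const app = express();

app.use(
  "/js",
  express.static(path.join(__dirname, "vendor"), {
    maxAge: "365d", // far-future caching, addressing the Expires point above
    setHeaders(res, filePath) {
      // express.static already infers this from the .js extension;
      // setting it explicitly makes the IE9 fix visible.
      if (filePath.endsWith(".js")) {
        res.set("Content-Type", "application/javascript");
      }
    },
  })
);

app.listen(3000);
```

With headers like these, repeat visitors load the file straight from their cache instead of re-requesting it.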