String length differs between JavaScript and Java code

François P. · Jan 20, 2009 · Viewed 11.2k times

I've got a JSP page with a piece of JavaScript validation code that limits the input to a certain number of characters on submit. I'm using a <textarea>, so I can't simply use a maxlength attribute as on an <input type="text">.

I use document.getElementById("text").value.length to get the string length. I'm running Firefox 3.0 on Windows (I've also reproduced this behavior with IE 6). The form gets submitted to a J2EE servlet, and in the servlet the length of the same parameter is larger than 2000!
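For reference, the on-submit check looks roughly like this (a minimal sketch of what I described; the function name and alert wording are made up, while the id "text" and the 2000-character limit match my setup):

function validateLength() {
    var text = document.getElementById("text").value;
    if (text.length > 2000) {
        alert("Please limit the text to 2000 characters.");
        return false; // cancels the submit
    }
    return true;
}
// wired up as <form onsubmit="return validateLength();">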

I've noticed that this can easily be reproduced by adding carriage returns in the <textarea>. I've used Firebug to assert the length of the <textarea>, and it really is 2000 characters long. On the Java side, though, the line breaks have been converted to Windows style (\r\n instead of \n), so the string length differs!
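A quick illustration (not from the original post): the submitted value gains one character per line break, so

"line1\nline2".length      // 11, what JavaScript measures in the browser
"line1\r\nline2".length    // 12, what the servlet receives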

Am I missing something obvious here, or what? If not, how would you reliably (cross-platform/browser) make sure that the <textarea> is limited?

Answer

levik · Jan 20, 2009

This isn't really a JavaScript (or Java) problem - both layers report an accurate length for the string they are dealing with. The problem in your case is that the string gets transformed during the HTTP transmission: the HTML spec requires line breaks in submitted form data to be represented as CRLF ("\r\n") pairs.

If you absolutely must ensure that the string doesn't exceed a certain length, you can mimic this transformation on the client by replacing every instance of "\n" with "\r\n" - but only for length-verification purposes:

textarea.value.replace(/\n/g, "\r\n").length
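To be defensive about browsers that might already report "\r\n" inside the value (in which case the replace above would produce "\r\r\n" and over-count), you can normalize the line breaks first - a sketch building on the line above, with the 2000-character limit taken from the question:

var value = document.getElementById("text").value;
// collapse any existing "\r\n" to "\n", then expand the way the HTTP transmission will
var wireLength = value.replace(/\r\n/g, "\n").replace(/\n/g, "\r\n").length;
if (wireLength > 2000) {
    // reject, trim, or warn before submitting
}

And whatever the client reports, re-check the length on the server as well - client-side validation can always be bypassed.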