I just got knocked down after our server was updated from Debian 4 to 5. We switched to a UTF-8 environment, and now we have problems getting text printed correctly in the browser, because all our files are in non-UTF-8 encodings like ISO-8859-1, ASCII, etc.
I tried many different scripts.
The first one I tried is iconv. That one doesn't work: it changes the content, but the file's encoding is still non-UTF-8.
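For what it's worth, iconv never edits a file in place: it writes the converted bytes to stdout (or to the file named with GNU iconv's `-o` flag), so redirecting onto the input file truncates it before it is read. A minimal sketch with hypothetical demo files `in.txt` and `out.txt`:

```shell
# iconv converts to a NEW file; it does not rewrite the input in place.
printf 'caf\xe9\n' > in.txt                     # ISO-8859-1 sample byte for "é"
iconv -f ISO-8859-1 -t UTF-8 in.txt -o out.txt  # GNU iconv's -o names the output
file -b --mime-encoding out.txt                 # should now report utf-8
```

Replacing the original afterwards (`mv out.txt in.txt`) gives the in-place effect.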
Same problem with enca, encamv, convmv and some other tools i installed via apt-get.
Then I found some Python code that uses the chardet UniversalDetector module to detect the encoding of a file (which works fine), but using the unicode class or the codecs module to save it as UTF-8 doesn't work either, without raising any errors.
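The same detect-then-convert idea can also be scripted directly in the shell; here is a minimal sketch assuming GNU file(1) and iconv are available, with a hypothetical sample.php created just for illustration:

```shell
#!/bin/sh
# Detect each file's charset with file(1), then convert anything that is
# not already UTF-8 (or plain ASCII, which is a UTF-8 subset) via iconv.
printf '<?php echo "caf\xe9"; ?>\n' > sample.php     # ISO-8859-1 demo content
for f in *.php; do
    enc=$(file -b --mime-encoding "$f")
    case "$enc" in
        utf-8|us-ascii) continue ;;                  # nothing to do
    esac
    iconv -f "$enc" -t UTF-8 "$f" -o "$f.tmp" && mv "$f.tmp" "$f"
done
```

Note this will not add a BOM, so it does not fully replace the vi recipe below if the BOM is what your setup actually needs.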
The only way I found to get the file and its content converted to UTF-8 is vi.
These are the steps I do for one file:
vi filename.php
:set bomb
:set fileencoding=utf-8
:wq
That's it. That one works perfectly. But how can I get this running via a script? I would like to write a (Linux shell) script that traverses a directory, takes all PHP files and converts them using vi with the commands above. As I need to start the vi app, I do not know how to do something like this:
"vi --run-command=':set bomb, :set fileencoding=utf-8' filename.php"
Hope someone can help me.
This is the simplest way I know of to do this easily from the command line:
vim +"argdo se bomb | se fileencoding=utf-8 | w" $(find . -type f -name '*.php')
Or better yet, if the number of files is expected to be pretty large:
find . -type f -name '*.php' | xargs vim +"argdo se bomb | se fileencoding=utf-8 | w"
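(Note the quotes around `'*.php'` in both variants, so the shell doesn't expand the glob before find sees it.) If piping into vim fights with your terminal, a per-file loop in silent Ex mode (`-es`) avoids opening the UI at all; a sketch, with a hypothetical sample.php created only to demonstrate:

```shell
#!/bin/sh
# Batch-convert each .php file with vim in silent Ex mode: set the BOM and
# fileencoding options, then write and quit. Stdin is redirected to
# /dev/null so vim does not consume the filenames from the pipe.
printf '<?php echo "caf\xe9"; ?>\n' > sample.php    # ISO-8859-1 demo content
find . -type f -name '*.php' | while IFS= read -r f; do
    vim -es -c 'set bomb fileencoding=utf-8' -c 'wq' "$f" < /dev/null
done
```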