I am making a PHP tool that connects to a repository, downloads the list of commits and stores them locally. However, because some repositories are HUGE and grabbing their whole log results in long waits and possible time-outs/errors, I would like to download each commit message using async requests.
So, I have a start and an end point in revision history, and I can grab all logs like this:
svn log -r <from_revision>:<to_revision> <REPO_URL>
... and I will possibly end up with an XML file so huge that it takes a long time to download, a long time and a lot of resources to parse, and a long time to store.
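For reference, this is roughly what parsing that XML would look like with SimpleXML. The `<log>`/`<logentry>` shape follows Subversion's `--xml` output format; the function name is just an illustration:

```php
<?php
// Sketch: turn `svn log --xml` output into an array of commits.
// (Hypothetical helper name; the XML shape is svn's documented --xml format.)
function parseSvnLogXml(string $xml): array
{
    $log = simplexml_load_string($xml);
    $entries = [];
    foreach ($log->logentry as $entry) {
        $entries[] = [
            'revision' => (int) $entry['revision'], // revision attribute
            'message'  => trim((string) $entry->msg),
        ];
    }
    return $entries;
}

// Small sample of what `svn log --xml` emits:
$sample = <<<XML
<?xml version="1.0"?>
<log>
  <logentry revision="120"><msg>Fix build</msg></logentry>
  <logentry revision="118"><msg>Initial import</msg></logentry>
</log>
XML;

$entries = parseSvnLogXml($sample);
```

The parsing itself is cheap per entry; the problem is purely the size of one monolithic response, which is why I want to split the work up.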
If I know the start and the end point, I can create a for() loop to grab revisions one by one:
svn log -r <revision> ...
BUT, since I don't know which specific revisions exist for a given path, I will receive errors. I can make the application ignore those errors during the update, but that's a nasty hack, and it would still send the requests and wait for the responses anyway, which is not good at all.
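To make the problem concrete, the loop I mean would look something like the sketch below. The repository URL is a placeholder, and the command-builder helper is made up; the comment marks where the error/waste happens:

```php
<?php
// Sketch: one `svn log` call per revision number (hypothetical helper).
function svnLogCommand(int $revision, string $repoUrl): string
{
    // escapeshellarg() keeps the URL safe for the shell
    return sprintf('svn log --xml -r %d %s', $revision, escapeshellarg($repoUrl));
}

$from = 100;
$to   = 110;
$repo = 'https://svn.example.org/repo/trunk'; // placeholder URL

for ($rev = $from; $rev <= $to; $rev++) {
    $cmd = svnLogCommand($rev, $repo);
    // shell_exec($cmd) would error out for revisions that don't
    // touch this path -- and swallowing that error still costs a
    // full round trip per nonexistent revision.
}
```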
So, I need something that returns only the revisions that actually exist for a given path. That way I would be able to build an array of valid revisions for the repository path and fetch them one by one.
All suggestions are welcome, thanks in advance.
I think your best bet would be to do the following:

svn log -l 100 -r 0:{$newest} .

That will retrieve the first 100 log entries for a given directory or file. Then read the last revision number returned and request the next 100:

svn log -l 100 -r {$last_ret_log}:{$newest} .

That way you get 99 new log entries with each request (the boundary revision is repeated), and you never ask for a revision that doesn't exist.
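The paging above could be sketched in PHP like this. The helper names are made up; `$fetchBatch` stands in for actually running `svn log --xml -l $limit -r $from:$to <URL>` and parsing the revision numbers out of its output:

```php
<?php
// Sketch of the paging strategy (hypothetical helper names).
// $fetchBatch($from, $to, $limit) returns the revision numbers a
// batch contains, in ascending order.
function collectRevisions(callable $fetchBatch, int $newest, int $limit = 100): array
{
    $revisions = [];
    $from = 0;
    while (true) {
        $batch = $fetchBatch($from, $newest, $limit);
        if ($batch === []) {
            break;
        }
        foreach ($batch as $rev) {
            // The first entry of each follow-up batch repeats the
            // previous batch's last revision, so skip duplicates.
            if ($revisions === [] || $rev > end($revisions)) {
                $revisions[] = $rev;
            }
        }
        $last = end($batch);
        if (count($batch) < $limit || $last >= $newest) {
            break; // short batch or reached HEAD: no more pages
        }
        $from = $last; // next request: -r {$last}:{$newest}
    }
    return $revisions;
}

// Usage with a stub fetcher simulating a path that only exists in
// some revisions (limit of 3 per "request" for demonstration):
$valid = [1, 3, 4, 7, 9, 12];
$stub = function (int $from, int $to, int $limit) use ($valid): array {
    $inRange = array_values(array_filter(
        $valid,
        fn ($r) => $r >= $from && $r <= $to
    ));
    return array_slice($inRange, 0, $limit);
};

$all = collectRevisions($stub, 12, 3);
```

Each real request stays small, so no single call risks a time-out, and once `collectRevisions()` returns you have exactly the array of valid revisions the question asks for.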