I have a situation where I have to receive requests in a Web API method, queue those requests, and then send them in bulk to a database (a Solr instance).
I am not really sure how to maintain a batch of requests from multiple sources. For now I am writing each request's data in JSON format to a file on disk. Later I will have a Windows service go through the folder, read all the files, update the database, and delete those files.
Here is what I am doing in my Web API:
public void Post(LogEntry value)
{
    value.EventID = Guid.NewGuid();
    value.ServerTime = DateTime.UtcNow;

    string json = JsonConvert.SerializeObject(value);
    using (StreamWriter sw = new StreamWriter(value.EventID.ToString()))
    {
        sw.Write(json);
    }
}
(Here EventID is a GUID.)
This process doesn't look right. There must be a way to maintain a queue of requests, but I am not really sure how to maintain one across multiple requests.
The reason I am doing this is that inserting into the Solr instance in batches is faster than inserting single records through SolrNet. I am expecting at least 100 requests per second on the Web API. I want to build batches of 1000 requests and update the Solr instance every 10 seconds. Please don't think that I need code; I just need to know what strategy I should adopt to maintain a queue of requests / state.
You could use a ConcurrentQueue<T>, if you're using .NET 4.0 or higher.
It is a thread-safe queue, which can then be drained at a desired time.
Edit:
Example:
This would be a wrapper for the queue:
public static class RequestQueue
{
    // A static initializer runs exactly once, so there is no lazy-init race
    // between two requests both seeing null and creating separate queues.
    private static readonly ConcurrentQueue<int> _queue = new ConcurrentQueue<int>();

    public static ConcurrentQueue<int> Queue
    {
        get { return _queue; }
    }
}
Then you could set up your Web API like this (this example stores integers for the sake of brevity):
public class ValuesController : ApiController
{
    public string Get()
    {
        var sb = new StringBuilder();
        foreach (var item in RequestQueue.Queue)
        {
            sb.Append(item.ToString());
        }
        return sb.ToString();
    }

    public void Post(int id)
    {
        RequestQueue.Queue.Enqueue(id);
    }
}
If you use this example you'll see that the queue holds the values across multiple requests. But since it lives in memory, those queued items will be gone if the app pool is recycled (for instance).
Now you could build in a check for when the queue holds 10 items and then save those to the DB, while creating another queue to store incoming values.
Like so:
public static class RequestQueue
{
    private static ConcurrentQueue<int> _queue;

    public static ConcurrentQueue<int> Queue
    {
        get
        {
            if (_queue == null)
            {
                _queue = new ConcurrentQueue<int>();
            }

            if (_queue.Count >= 10)
            {
                SaveToDB(_queue);
                _queue = new ConcurrentQueue<int>();
            }

            return _queue;
        }
    }

    public static void SaveToDB(ConcurrentQueue<int> queue)
    {
        foreach (var item in queue)
        {
            SaveItemToDB(item);
        }
    }
}
You'd need to clean this up a bit, but this setup should work. Also, you will need a locking mechanism around dumping the queue to the DB and creating a new instance; as written, two requests could hit the count check at the same time and save the same batch twice. I would write a console app with multiple threads that access this queue to test it.
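A sketch of what that locking could look like, with a timer added to cover your "flush every 10 seconds" requirement even when the batch size isn't reached (SaveItemToDB is a placeholder for your actual SolrNet call; the names and the 1000-item threshold come from your question, not from any library):

using System;
using System.Collections.Concurrent;
using System.Threading;

public static class RequestQueue
{
    private static readonly object _sync = new object();
    private static ConcurrentQueue<int> _queue = new ConcurrentQueue<int>();

    // Flushes every 10 seconds even if the batch never filled up.
    private static readonly Timer _flushTimer =
        new Timer(_ => Flush(), null, TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(10));

    public static void Enqueue(int item)
    {
        ConcurrentQueue<int> full = null;
        lock (_sync)
        {
            _queue.Enqueue(item);
            if (_queue.Count >= 1000)
            {
                // Swap in a fresh queue inside the lock; save outside it
                // so other requests aren't blocked on the DB call.
                full = _queue;
                _queue = new ConcurrentQueue<int>();
            }
        }
        if (full != null)
        {
            SaveToDB(full);
        }
    }

    public static void Flush()
    {
        ConcurrentQueue<int> full;
        lock (_sync)
        {
            if (_queue.IsEmpty)
            {
                return;
            }
            full = _queue;
            _queue = new ConcurrentQueue<int>();
        }
        SaveToDB(full);
    }

    private static void SaveToDB(ConcurrentQueue<int> batch)
    {
        int item;
        while (batch.TryDequeue(out item))
        {
            SaveItemToDB(item); // placeholder for the actual Solr insert
        }
    }

    private static void SaveItemToDB(int item)
    {
        // Your SolrNet code goes here; for a real batch you'd collect the
        // items into a list and add them to Solr in one call instead.
    }
}

The key idea is that only the swap of the queue reference happens under the lock; the (slow) database write happens on the already-detached queue, so incoming requests only ever contend for the brief swap.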