I'm learning about the uses of async/await in Scala. I have read this on https://github.com/scala/async:
Theoretically this code is asynchronous (non-blocking), but it's not parallelized:
def slowCalcFuture: Future[Int] = ...

def combined: Future[Int] = async {
  await(slowCalcFuture) + await(slowCalcFuture)
}

val x: Int = Await.result(combined, 10.seconds)
whereas this other one is parallelized:
def combined: Future[Int] = async {
  val future1 = slowCalcFuture
  val future2 = slowCalcFuture
  await(future1) + await(future2)
}
The only difference between them is the use of intermediate variables. How can this affect the parallelization?
Since this is similar to async & await in C#, maybe I can provide some insight. In C#, the general rule is that a Task that can be awaited should be returned 'hot', i.e. already running. I assume it's the same in Scala, where a Future returned from a function does not have to be explicitly started, but is already running as soon as it is created. If that's not the case, then the following is pure (and probably not true) speculation.
Let's analyze the first case:
async {
  await(slowCalcFuture) + await(slowCalcFuture)
}
We get to that block and hit the first await:
async {
  await(slowCalcFuture) + await(slowCalcFuture)
  ^^^^^
}
OK, so we're asynchronously waiting for that calculation to finish. Only when it has finished do we 'move on' to the rest of the block:
async {
  await(slowCalcFuture) + await(slowCalcFuture)
                          ^^^^^
}
Second await, so we're asynchronously waiting for the second calculation to finish. After that's done, we can compute the final result by adding the two integers.
As you can see, we're moving step-by-step through the awaits, awaiting the Futures one by one.
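In combinator terms (scala-async actually rewrites the block into a state machine, but the observable behaviour is the same), the first example amounts to something like this sketch, assuming the same slowCalcFuture and an implicit ExecutionContext in scope:

import scala.concurrent.{ExecutionContext, Future}

def combinedSequential(implicit ec: ExecutionContext): Future[Int] =
  slowCalcFuture.flatMap { a =>   // the first calculation runs to completion...
    slowCalcFuture.map { b =>     // ...and only then is the second one created and started
      a + b
    }
  }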
Let's take a look at the second example:
async {
  val future1 = slowCalcFuture
  val future2 = slowCalcFuture
  await(future1) + await(future2)
}
OK, so here's what (probably) happens:
async {
  val future1 = slowCalcFuture     // >> first future is started, but not awaited
  val future2 = slowCalcFuture     // >> second future is started, but not awaited
  await(future1) + await(future2)
  ^^^^^
}
Then we're awaiting the first Future, but both futures are already running. When the first one returns, the second might have already completed (so its result is available at once) or we might have to wait a little longer.
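Written with plain combinators, the second example corresponds to something like this sketch (again assuming slowCalcFuture and an implicit ExecutionContext in scope): the flatMap/map chain still reads the results one after the other, but both calculations are already in flight before either of them is awaited.

import scala.concurrent.{ExecutionContext, Future}

def combinedParallel(implicit ec: ExecutionContext): Future[Int] = {
  val future1 = slowCalcFuture     // both calculations are started here...
  val future2 = slowCalcFuture     // ...before we wait on either of them
  future1.flatMap { a =>
    future2.map { b =>
      a + b
    }
  }
}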
Now it's clear that the second example runs the two calculations in parallel and then waits for both of them to finish; when both are ready, it returns. The first example also runs the calculations in a non-blocking way, but sequentially.
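A quick way to convince yourself is to time both versions. This sketch assumes the scala-async module is on the classpath (and, on newer Scala versions, the -Xasync compiler option) and fakes the slow calculation with a one-second sleep, so the sequential version should take roughly two seconds and the parallel one roughly one:

import scala.async.Async.{async, await}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object TimingDemo extends App {
  def slowCalcFuture: Future[Int] = Future { Thread.sleep(1000); 21 }

  def sequential: Future[Int] = async {
    await(slowCalcFuture) + await(slowCalcFuture)
  }

  def parallel: Future[Int] = async {
    val f1 = slowCalcFuture
    val f2 = slowCalcFuture
    await(f1) + await(f2)
  }

  def timed[A](label: String)(body: => A): A = {
    val start = System.nanoTime()
    val result = body
    println(s"$label: ${(System.nanoTime() - start) / 1000000} ms")
    result
  }

  timed("sequential")(Await.result(sequential, 5.seconds))   // roughly 2000 ms
  timed("parallel")(Await.result(parallel, 5.seconds))       // roughly 1000 ms
}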