Add numSparks# to return the number of elements in the current capability's spark queue
Adding

```haskell
numSparks# :: State# s -> (# State# s, Int# #)
```

returning `dequeElements(cap->sparks)` from the C-- backend would permit user code in Haskell to inspect the current capability's spark backlog to decide whether or not to spark a task at all.
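If such a primop existed, wrapping it as an ordinary `IO` action inside GHC's libraries could look roughly like the following sketch. The wrapper name `numSparks` is illustrative; modern GHC does export `numSparks#` from `GHC.Exts`, which this sketch assumes:

```haskell
{-# LANGUAGE MagicHash, UnboxedTuples #-}
-- Sketch: exposing the primop as an IO action, in the usual style of
-- State#-threaded primop wrappers.
import GHC.Exts (Int (I#), numSparks#)
import GHC.IO (IO (IO))

numSparks :: IO Int
numSparks = IO $ \s -> case numSparks# s of
  (# s', n #) -> (# s', I# n #)

main :: IO ()
main = numSparks >>= print  -- typically 0 in a program that has created no sparks
```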
Consider how the following combinator from the 'speculation' package works. Under load, the spark created by `par` is simply dropped, but otherwise the runtime can use it to begin evaluating the function before the argument `a` is ready, improving parallelism when the guess is accurate.
```haskell
spec :: Eq a => a -> (a -> b) -> a -> b
spec guess f a =
  speculation `par`
    if guess == a
      then speculation
      else f a
  where
    speculation = f guess
```
Under high load:

```
foreground: [----- a -----]
foreground:                [-] (check g == a)
foreground:                    [---- f a ----]
overall:    [---------- spec g f a -----------]
```
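The behaviour above can be exercised with nothing beyond base's `par` (a self-contained sketch; the real 'speculation' package is the canonical implementation):

```haskell
import GHC.Conc (par)

-- The spec combinator from above, using base's par directly.
spec :: Eq a => a -> (a -> b) -> a -> b
spec guess f a = speculation `par` (if guess == a then speculation else f a)
  where speculation = f guess

main :: IO ()
main = do
  -- Correct guess: the speculated 'f guess' result is reused.
  print (spec 10 (* 2) (sum [1..4]))  -- sum [1..4] == 10, so prints 20
  -- Incorrect guess: falls back to computing f a directly.
  print (spec 9 (* 2) (sum [1..4]))   -- also prints 20, via f a
```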
However, a similar combinator for STM necessarily degrades much worse under load.
```haskell
specSTM :: Eq a => a -> (a -> STM b) -> a -> STM b
```
Because the computation of the function is an STM calculation that needs access to the current transaction, it can't be moved into a spark. However, it *could* inspect `numSparks#` to determine whether the current capability's spark queue is already loaded (and would therefore backlog anyway), and under load avoid evaluating `f guess` at all, allowing it to (somewhat clumsily) ape the graceful degradation under load enjoyed by `spec`.
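One way `specSTM` could consult such a counter is sketched below. The structure and the zero-backlog threshold are illustrative assumptions, not the 'speculation' package's actual code; the sketch uses `numSparks :: IO Int`, which later GHC releases did expose from `GHC.Conc`:

```haskell
import GHC.Conc (STM, atomically, numSparks, unsafeIOToSTM)

-- Hypothetical sketch: skip the speculation entirely when the spark
-- queue already has a backlog.
specSTM :: Eq a => a -> (a -> STM b) -> a -> STM b
specSTM guess f a = do
  backlog <- unsafeIOToSTM numSparks  -- peek at the spark queue depth
  if backlog > 0
    then f a                          -- loaded: f guess would be wasted work
    else do
      speculation <- f guess          -- speculate before forcing a
      if guess == a
        then return speculation
        else f a

main :: IO ()
main = print =<< atomically (specSTM 10 (pure . (* 2)) (sum [1..4]))  -- prints 20
```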
This unfortunately cannot be done directly in a third-party package via `foreign import prim`, because the Capability and spark queue are private to the RTS, so GHC's own codebase is the only place with access to the appropriate primitives.
For more timelines and examples, see 'speculation'.
Trac metadata
| Trac field | Value |
| --- | --- |
| Version | 6.12.3 |
| Type | FeatureRequest |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Compiler |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |