Refining the notion of triangularity
I was thinking that our current notion of triangularity may be too restrictive. Consider the following example:
```python
import loopy as lp

def test():
    knl = lp.make_kernel(
            "{[i,j,k]: 0<=i<n and 0<=j<=i and i=k}",
            "out[i] = sum(j, a[j])")
    knl = lp.realize_reduction(knl, force_scan=True,
            force_outer_iname_for_scan="i")
```
This results in the following error:
```
loopy.diagnostic.ReductionIsNotTriangularError: gist of test against hypothesis domain has sweep or scan dependent constraint: '[n] -> { [i, j, k] : k = i }'
```
The `k` constraint seems to be entirely irrelevant, and if the reduction were its own instruction it would be ignored.
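For reference, here is a minimal islpy sketch of where that leftover constraint comes from. The `hypothesis` set below is my reconstruction of the triangular hypothesis domain the check builds, not something dumped from loopy:

```python
import islpy as isl

# Loop domain of the kernel above.
domain = isl.BasicSet(
        "[n] -> { [i, j, k] : 0 <= i < n and 0 <= j <= i and i = k }")

# Assumed triangular hypothesis domain for sweep iname i and scan iname j.
hypothesis = isl.BasicSet(
        "[n] -> { [i, j, k] : 0 <= i < n and 0 <= j <= i }")

# gist() keeps only the constraints of 'domain' that 'hypothesis' does not
# already imply -- everything cancels except the equality on k.
print(domain.gist(hypothesis))
# prints something like: [n] -> { [i, j, k] : k = i }
```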
@inducer: thoughts? What would happen if we just projected away irrelevant inames such as `k`, based on the instruction in which the reduction lives, and then ran the triangularity check?
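A rough sketch of that projection step, assuming we can compute the set of inames the reduction actually uses (hard-coded to `{i, j}` here; the real version would derive it from the reduction expression and its surrounding instruction):

```python
import islpy as isl

domain = isl.BasicSet(
        "[n] -> { [i, j, k] : 0 <= i < n and 0 <= j <= i and i = k }")

# Inames the reduction references: the sweep iname i and the scan iname j.
# k only matters to the surrounding instruction, not to the reduction.
relevant_inames = {"i", "j"}

# Project out every iname the reduction does not reference.
projected = domain
for name in domain.get_var_names(isl.dim_type.set):
    if name not in relevant_inames:
        pos = projected.find_dim_by_name(isl.dim_type.set, name)
        projected = projected.project_out(isl.dim_type.set, pos, 1)

print(projected)
# prints something like: [n] -> { [i, j] : 0 <= i < n and 0 <= j <= i }
```

On the projected domain, the only remaining constraints relate `j` to the sweep iname `i`, so the existing triangularity check should pass.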
(I am filing this as a bug in the paper so we can make a note in the text for next time if we decide to accept a construct like this, but it might just as well be a bug in loopy.)