I came up with this problem recently but haven't been able to find an efficient approach. Since it's a self-created problem, I don't have official test cases.

Problem Statement

~~~~~
There is a sequence a of n integers. Process q queries given in order. For each query, you are given integers l, r (1 <= l <= r <= n) and an integer x. Perform the following in order:

- Add x to each of a[l], a[l + 1], ..., a[r].
- Let m = r - l + 1, and b = (b[1], b[2], ..., b[m]) = (a[l], a[l + 1], ..., a[r]); sort(b + 1, b + m + 1).
- Output (m * b[1] + (m - 1) * b[2] + ... + b[m]) % MOD (MOD = 1e9 + 7).

(n <= 1e5, q <= 1e5, |a[i]| <= 1e9)
~~~~~



