Problem statement

https://leetcode.com/problems/minimum-xor-sum-of-two-arrays/

Solution

I spent about an hour during the contest trying to find a mistake, so if you are wondering what the problem was, continue reading =)

The idea is to use dynamic programming on subsets; see https://leetcode.com/discuss/general-discussion/1125779/dynamic-programming-on-subsets-with-examples-explained for more details.

Here we briefly discuss the solution. Let us go from left to right over array A, choosing a partner from B for A[0], then for A[1], and so on. We keep two pieces of information in our dp function:

  1. mask is a binary mask of the unused elements from B. For example, if we have 01001, it means that we can still use the elements with indices 0 and 3 (reading bits from the right).
  2. i is the current index in A. Actually, it can be computed from the bitmask: it equals n minus the number of set bits in mask. But it is simpler to keep it as a separate parameter as well (see the small check after this list).
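For instance, with n = 5 both facts about the mask are easy to check directly (a small illustration, not part of the solution itself):

mask = 0b01001                                     # bits 0 and 3 are set
n = 5
print([j for j in range(n) if mask & (1 << j)])    # [0, 3], indices of B still usable
print(n - bin(mask).count("1"))                    # 3, this would be the current i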

Finally, what we do is just try every element of B that is not used yet, pair it with A[i], and run the function recursively, that is all!
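If you prefer iterative dp, here is a minimal bottom-up sketch of the same idea: an equivalent reformulation (my own, not the contest code), where mask now marks the already used elements of B and we fill masks in increasing order, so every smaller submask is ready before we need it:

from math import inf

def minimum_xor_sum(A, B):                 # hypothetical standalone helper
    n = len(A)
    dp = [inf] * (1 << n)
    dp[0] = 0                              # matching nothing costs 0
    for mask in range(1, 1 << n):
        i = bin(mask).count("1") - 1       # A[i] is the element matched last
        for j in range(n):
            if mask & (1 << j):            # B[j] belongs to this subset
                dp[mask] = min(dp[mask], dp[mask ^ (1 << j)] + (A[i] ^ B[j]))
    return dp[(1 << n) - 1]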

Why did I spend an hour finding my mistake? It is all about bitwise operator precedence: you NEED BRACKETS in (A[i]^B[j]). In Python + binds tighter than ^, so without brackets dp(...) + A[i]^B[j] is parsed as (dp(...) + A[i]) ^ B[j], which is not what we want.
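A quick demonstration of the trap on plain numbers, showing how the two parses differ:

print(5 + 3 ^ 2)     # 10, parsed as (5 + 3) ^ 2
print(5 + (3 ^ 2))   # 6, what we actually want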

Complexity

Time complexity is O(n*2^n), because we have 2^n states and O(n) transitions from one state to the others. Space complexity is O(2^n).

Code

from functools import lru_cache

class Solution:
    def minimumXORSum(self, A, B):
        n = len(A)

        @lru_cache(None)
        def dp(mask, i):
            # mask: set bits mark the elements of B that are still unused
            # i: index of the element of A we are matching now
            if i == n: return 0
            # try every unused B[j] as the partner of A[i];
            # the brackets in (A[i]^B[j]) are mandatory, see above
            return min(dp(mask ^ (1<<j), i + 1) + (A[i]^B[j]) for j in range(n) if mask & 1<<j)

        return dp((1<<n) - 1, 0)
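We can check it on the first example from the problem statement:

print(Solution().minimumXORSum([1, 2], [2, 3]))   # 2: pair 1 with 3 and 2 with 2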