There are **N** job openings at a company and **M** candidates for them. A candidate can be hired for the ith job only if they have every skill that job requires; there are **K** distinct skills in total. The company needs exactly one person per job, each candidate can be hired for at most one job, and filling the ith job yields a profit of profits[i]. The profit of filling a set of jobs is the sum of the profits of the jobs in that set (0 for the empty set).

You are given two 2D boolean arrays, skillsForJobs (of size N*K) and skillsOfCandidates (of size M*K), which represent the skills required for each job and the skills of each candidate, respectively.

**Your task is to calculate a maximum profit after hiring a subset of candidates for a subset of jobs.**

Example

- For profits = [4, 10],
- skillsForJobs = [[true, false], [false, true]],
- skillsOfCandidates = [[true, true], [true, false]],
- the output should be 14 (candidate 1 takes job 0 and candidate 0 takes job 1).

Constraints: N, M, K <= 50.

Solution?
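One possible approach (a sketch, not an official solution): build a bipartite graph between jobs and candidates, connecting job j to every candidate who has all of its required skills. Then process the jobs in descending order of profit and try to match each one via an augmenting path (Kuhn's algorithm). Because every edge into a given job carries the same weight (that job's profit), this greedy is optimal: the job subsets that can be simultaneously filled form a transversal matroid, and greedy by weight is optimal on a matroid. The function name `max_hiring_profit` below is my own; with N, M, K <= 50 this runs well within limits.

```python
def max_hiring_profit(profits, skillsForJobs, skillsOfCandidates):
    n, m = len(skillsForJobs), len(skillsOfCandidates)
    # eligible[j] = candidates who have every skill required by job j
    eligible = [
        [c for c in range(m)
         if all(has or not need
                for need, has in zip(skillsForJobs[j], skillsOfCandidates[c]))]
        for j in range(n)
    ]
    match = [-1] * m  # match[c] = job currently assigned to candidate c, or -1

    def try_assign(j, seen):
        # Kuhn's augmenting-path search: find a free candidate for job j,
        # possibly reassigning previously matched jobs along the way.
        for c in eligible[j]:
            if c not in seen:
                seen.add(c)
                if match[c] == -1 or try_assign(match[c], seen):
                    match[c] = j
                    return True
        return False

    total = 0
    # Greedy over jobs by descending profit; take a job iff it is matchable.
    for j in sorted(range(n), key=lambda j: -profits[j]):
        if try_assign(j, set()):
            total += profits[j]
    return total
```

On the example above, `max_hiring_profit([4, 10], [[True, False], [False, True]], [[True, True], [True, False]])` returns 14.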