The problem is a function f that gets a list of words D and a word w of length n, and returns a list of indices I where to split w such that each part is in D, or "false".
Clearly, I has length at most n and the indices in it are also at most n.
Also, you need to look at every word in D.
Thus, the trivial lower bound on the complexity of any algorithm that computes f is Ω(n + |D|).
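For concreteness, here is a minimal sketch of f as the textbook dynamic-programming word-break (assuming D is given as a set of strings; the names and the choice to return the end index of each part are mine, not from the problem statement):

    def f(D, w):
        # Standard word-break DP: ok[i] is True iff w[:i] can be split
        # into words from D; prev[i] remembers where the last word starts.
        D = set(D)
        n = len(w)
        ok = [False] * (n + 1)
        prev = [None] * (n + 1)
        ok[0] = True
        for i in range(1, n + 1):
            for j in range(i):
                if ok[j] and w[j:i] in D:
                    ok[i] = True
                    prev[i] = j
                    break
        if not ok[n]:
            return False  # plays the role of the "false" answer
        # Recover the split points (end index of each part).
        I = []
        i = n
        while i > 0:
            I.append(i)
            i = prev[i]
        return sorted(I)

    # f(["ab", "c"], "abc") -> [2, 3], i.e. "ab" | "c"

Note that as written this does O(n^2) substring lookups, so there is a real gap between the trivial linear lower bound above and what the obvious algorithm achieves.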
To prove a tighter lower bound, you would need to show that every algorithm that does it in linear time has a bug - you would need to find such a "pathological" case for every imaginable algorithm!
Afaik it is still unknown whether SAT has a quadratic lower bound! https://cstheory.stackexchange.com/questions/17789/is-there-...