Research ideas - giguru/fact-ai GitHub Wiki
Giguru & Dam:
- Instead of having sparse linear explanations, try sparse polynomial explanations. The formula is currently of the form y = Ax; we could try something like y = Ax + x^T B x, adding a quadratic term. Similarly, the transformation is currently t(x) = x + δ, which causes problems for clusters with differing variances. We could instead try t(x) = λx + δ (scalar multiplication, possibly correctly enlarging/shrinking the variance) or t(x) = Mx + δ (matrix-vector multiplication yielding a linearly mapped vector). This could give explanations with better coverage and correctness. But how does this affect the goal of the paper, explainability? Sparse linear explanations are comprehensible to most people, whereas sparse polynomial explanations become more "complex"/harder to explain.
- Dive into compressed sensing, since the paper borrows its formulas (and, to an extent, its requirements) from that field. Understanding it better, or reading up on recent developments that we can then project onto our paper, may yield a nice idea we can research.
- Find a new data set, not covered in the paper, on which we can demonstrate the method. Show the results and report on reproducibility for other real-life problems.
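A minimal sketch of the first idea above, to make the linear-vs-polynomial trade-off concrete. Everything here is an illustrative assumption (toy data, plain least squares instead of the paper's sparse solver): it only shows that a quadratic explanation y = Ax + x^T B x can fit interactions that a linear one y = Ax cannot.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: the target contains a genuine interaction term
# (x1 * x2), so a purely linear explanation must underfit it.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + 0.01 * rng.normal(size=n)

def quadratic_features(X):
    """Augment X with all pairwise products x_i * x_j (i <= j),
    i.e. the features needed for the x^T B x term."""
    cols = [X]
    d = X.shape[1]
    for i in range(d):
        for j in range(i, d):
            cols.append((X[:, i] * X[:, j])[:, None])
    return np.concatenate(cols, axis=1)

def fit_mse(Phi, y):
    """Least-squares fit; in the actual method sparsity would come from
    an L1 penalty, but plain lstsq keeps this sketch dependency-free."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return float(np.mean((Phi @ w - y) ** 2))

mse_linear = fit_mse(X, y)                       # y = Ax
mse_quadratic = fit_mse(quadratic_features(X), y)  # y = Ax + x^T B x
print(mse_linear, mse_quadratic)
```

The quadratic fit should reach roughly the noise level while the linear fit cannot, which is the "better coverage/correctness" half of the trade-off; the cost is that the explanation now has O(d²) candidate terms to present to a user.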