Hmm, so I’m very wary of defending tropical geometry when I know so little about it; if anyone more informed is reading please jump in! But until then, I’ll have a go.
tropical geometry might be relevant to ML, for the simple reason that the functions coming up in ML with ReLU activation are PL (piecewise linear)
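(As a quick concreteness check of that claim: a ReLU network really is PL, and one can even verify it numerically. Below is a minimal sketch of my own, with random hypothetical weights; restricted to any line in input space, a PL function has vanishing second differences except at finitely many kinks, one per hidden-unit hyperplane crossing.)

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)  # in (max, +) terms: the tropical sum of x and 0

# A hypothetical two-layer ReLU network with random weights, for illustration only.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def net(x):
    return (W2 @ relu(W1 @ x + b1) + b2)[0]

# Sample the network along a line segment; away from the finitely many
# breakpoints the restriction is affine, so second differences vanish.
p, q = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
ts = np.linspace(0.0, 1.0, 2001)
vals = np.array([net(p + t * (q - p)) for t in ts])
breaks = int(np.sum(np.abs(np.diff(vals, 2)) > 1e-8))
print(breaks)  # small: at most 2 nonzero second differences per hidden-unit kink
```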
I’m not sure I agree with this argument.
Hmm, even for a very small value of 'might'? I’m not saying that someone who wants to contribute to ML needs to seriously consider learning some tropical geometry, just that if one already knows tropical geometry it’s not a crazy idea to poke around a bit and see if there are applications.
The use of PL functions is by no means central to ML theory, and is an incidental aspect of early algorithms.
I agree this is an important point. I don’t actually have a good idea what activation functions people use in practice these days. Thinking about asymptotic linearity makes me think of the various papers appearing that use polynomial activation functions. Do you have an opinion on this? For people in algebraic geometry it’s appealing, as it generates lots of AG problems (maybe very hard ones), but I don’t have a good feeling as to whether it has much to do with 'real life' ML. I can link to some of the papers I’m thinking of if that’s helpful, or maybe you are already a bit familiar with them.
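(To make the polynomial-activation point concrete: with a polynomial activation the whole network computes a polynomial in its input, so questions about the network become questions about algebraic varieties, and the output degree grows like d^L for depth L and activation degree d. A toy symbolic check of my own, not taken from any particular paper:)

```python
import sympy as sp

# Hypothetical one-input network with symbolic weights w1, w2, w3.
x, w1, w2, w3 = sp.symbols('x w1 w2 w3')

act = lambda z: z**2            # degree-2 polynomial activation

layer1 = act(w1 * x)            # degree 2 in x after one layer
layer2 = act(w2 * layer1)       # degree 4 in x after two layers
out = sp.expand(w3 * layer2)

print(sp.degree(out, x))  # prints 4, i.e. 2**2 for two degree-2 layers
```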
I don’t see why one wouldn’t just use ordinary currents here (currents on a PL manifold can be made sense of after smoothing, or in a distribution-valued sense, etc.).
I think you’re right; this paper just came to mind because I was reading it recently.
whether tropical geometry has ever been useful (either in proving something or at least in reconceptualizing something in an interesting way) in linear programming.
1 Algebraic geometry in general (including tropical geometry) isn’t good at dealing with deep compositions of functions, and especially approximate compositions.
Fair, though one might also see that as an interesting challenge. I don’t have a feeling as to whether this is for really fundamental reasons, or whether people just haven’t tried very hard yet.
2 [….] I simply can’t think of any behavior that is at all meaningful from an AG-like perspective where the questions of fan combinatorics and degrees of polynomials are replaced by questions of approximate equality.
There are plenty of cases where “high degree” is enough (Faltings’s theorem is the first thing that comes to mind, but there are lots). But I agree that “degree approximately 5” feels quite unnatural.
A little googling suggests there are some applications. This paper seems to give an application of tropical geometry to the complexity of linear programming: https://inria.hal.science/hal-03505719/document and this list of conference abstracts seems to give other applications: https://him-application.uni-bonn.de/fileadmin/him/Workshops/TP3_21_WS1_Abstracts.pdf Whether they are ‘convincing’ I leave up to you.