This reminds me of ML algorithms that use gradient descent to minimize a multivariable function, which is sort of the opposite of what is said on this slide.
barath
Actually, there is also gradient ascent in ML, where you try to maximize a concave function rather than minimize a convex function (reaching the top of the mountain vs. reaching the bottom, respectively).
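To make the descent/ascent distinction concrete, here is a minimal sketch in plain Python (the quadratic objectives are made up for illustration, not taken from the slide): the only difference between the two methods is the sign of the step.

```python
# Hypothetical 1-D objectives, chosen only to illustrate the two update rules.

def grad_descent_step(x, grad_f, alpha=0.1):
    # minimize: step *against* the gradient, x <- x - alpha * grad f(x)
    return x - alpha * grad_f(x)

def grad_ascent_step(x, grad_g, alpha=0.1):
    # maximize: step *along* the gradient, x <- x + alpha * grad g(x)
    return x + alpha * grad_g(x)

grad_f = lambda x: 2.0 * (x - 3.0)   # convex bowl f(x) = (x - 3)^2, minimum at x = 3
grad_g = lambda x: -2.0 * (x + 2.0)  # concave hill g(x) = -(x + 2)^2, maximum at x = -2

x_min, x_max = 0.0, 0.0
for _ in range(200):
    x_min = grad_descent_step(x_min, grad_f)
    x_max = grad_ascent_step(x_max, grad_g)

print(x_min)  # ~3.0  (bottom of the bowl)
print(x_max)  # ~-2.0 (top of the hill)
```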
ahhuang
As mentioned above, this is a common method for minimizing objective functions in ML, although it's usually gradient descent.
dchen1
I've only really encountered this in ML; I had no idea this stuff showed up in graphics too. I've never thought about what kinds of things in graphics would need optimization. Can anyone give me a quick example?
keenan
There's really no difference between minimizing a convex function and maximizing a concave function: you're just putting a minus sign in front of the objective.
(On the other hand, maximizing a convex objective or minimizing a concave objective may not be as straightforward!)
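A quick sketch of that minus-sign point, again with a made-up 1-D objective rather than anything from the slide: running gradient ascent on a concave g produces exactly the same iterates as running gradient descent on -g.

```python
# Hypothetical concave objective g(x) = -(x - 1)^2, maximum at x = 1.

def grad_descent(grad, x, alpha=0.1, steps=200):
    for _ in range(steps):
        x = x - alpha * grad(x)
    return x

def grad_ascent(grad, x, alpha=0.1, steps=200):
    for _ in range(steps):
        x = x + alpha * grad(x)
    return x

grad_g     = lambda x: -2.0 * (x - 1.0)  # gradient of concave g
grad_neg_g = lambda x:  2.0 * (x - 1.0)  # gradient of convex -g

print(grad_ascent(grad_g, 5.0))       # ~1.0
print(grad_descent(grad_neg_g, 5.0))  # ~1.0, identical sequence of iterates
```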