Surely GPUs are good at matrix computation, but is there any ASIC that works better than a GPU for computer graphics? Or, if a current GPU is not powerful enough for some graphics task, are there cases, as in the AI area, of using several GPUs for the same task?
OtB_BlueBerry
Is there a specific reason (other than conventions) we transform in this order (scale, rotate, translate)?
motoole2
I'm not aware of ASICs being used for computer graphics, but a GPU is essentially an ASIC customized for computer graphics applications.
As for the order of translation / rotation / scale operations, there is no pre-determined order for these matrix multiplications! Moreover, changing the order of these operations also changes the output (the result is a different transformation!), since matrix multiplication is not commutative. There may also be multiple translation / rotation / scaling operations applied to a vertex, e.g., R_1 * T_1 * S_1 * T_2 * R_2 * S_2 * v. So choosing the correct order is important.
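To make the non-commutativity concrete, here is a minimal numpy sketch in 2D homogeneous coordinates (the translate/rotate/scale helpers are illustrative assumptions, not code from the course): the same three transforms applied in a different order send the same point to a different place.

```python
# Minimal sketch: transform order matters because matrix multiplication is not commutative.
import numpy as np

def translate(tx, ty):
    # 2D translation in homogeneous coordinates
    return np.array([[1, 0, tx],
                     [0, 1, ty],
                     [0, 0,  1]], dtype=float)

def rotate(theta):
    # 2D rotation about the origin
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]], dtype=float)

def scale(sx, sy):
    # 2D non-uniform scale about the origin
    return np.array([[sx,  0, 0],
                     [ 0, sy, 0],
                     [ 0,  0, 1]], dtype=float)

v = np.array([1.0, 0.0, 1.0])   # point (1, 0) in homogeneous coordinates
T, R, S = translate(2, 0), rotate(np.pi / 2), scale(2, 1)

print(T @ R @ S @ v)   # scale, then rotate, then translate -> [2. 2. 1.]
print(S @ R @ T @ v)   # translate, then rotate, then scale -> [0. 3. 1.]
```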