There is a square N by N array (N is finite) of squares, each square being 1 by 1.
Each square is assigned a positive integer, which I'll call its colour.
This array is tiled to infinity in every direction over some arbitrary plane, with the edges of adjacent copies of the array aligned.
An arbitrary triangle is drawn on this plane. It may be arbitrarily large or small, near to or far from the origin of the plane's coordinate system. I need to calculate the average colour of the triangle. By that I mean: for each and every square that the triangle covers, even partially, I add to a running sum (initially 0) the square's colour multiplied by the area of that square lying inside the triangle; once I have done this for every such square, dividing the sum by the triangle's total area gives the average colour of the triangle. Which leads me to my problem: since the triangle may be arbitrarily large, applying this laborious method directly to any possible triangle is not wise.
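To make the brute-force procedure concrete, here is a minimal sketch of it, under assumptions not stated above: the squares are axis-aligned with integer corners, and the colour of the square whose lower-left corner is (ix, iy) is looked up in the N by N tile with modular indexing to model the periodic tiling. All names (`clip_to_square`, `average_colour`, etc.) are my own, not from the post.

```python
import math

def clip_to_square(poly, x0, y0):
    """Sutherland-Hodgman clip of polygon `poly` (list of (x, y) tuples)
    to the unit square [x0, x0+1] x [y0, y0+1]."""
    def clip(poly, inside, intersect):
        out = []
        for i, cur in enumerate(poly):
            prev = poly[i - 1]
            if inside(cur):
                if not inside(prev):
                    out.append(intersect(prev, cur))
                out.append(cur)
            elif inside(prev):
                out.append(intersect(prev, cur))
        return out

    def x_cut(x):  # intersection of segment p-q with the vertical line at x
        return lambda p, q: (x, p[1] + (x - p[0]) / (q[0] - p[0]) * (q[1] - p[1]))

    def y_cut(y):  # intersection of segment p-q with the horizontal line at y
        return lambda p, q: (p[0] + (y - p[1]) / (q[1] - p[1]) * (q[0] - p[0]), y)

    for ins, itx in [
        (lambda p: p[0] >= x0,     x_cut(x0)),
        (lambda p: p[0] <= x0 + 1, x_cut(x0 + 1)),
        (lambda p: p[1] >= y0,     y_cut(y0)),
        (lambda p: p[1] <= y0 + 1, y_cut(y0 + 1)),
    ]:
        poly = clip(poly, ins, itx)
        if not poly:
            return []
    return poly

def area(poly):
    """Shoelace formula; absolute value so winding order doesn't matter."""
    s = 0.0
    for i, (x1, y1) in enumerate(poly):
        x2, y2 = poly[(i + 1) % len(poly)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def average_colour(triangle, tile):
    """triangle: three (x, y) vertices; tile: the N x N colour array."""
    n = len(tile)
    xs = [p[0] for p in triangle]
    ys = [p[1] for p in triangle]
    total, weighted = 0.0, 0.0
    # Visit every unit square in the triangle's bounding box.
    for ix in range(math.floor(min(xs)), math.ceil(max(xs))):
        for iy in range(math.floor(min(ys)), math.ceil(max(ys))):
            piece = clip_to_square(list(triangle), ix, iy)
            if len(piece) < 3:
                continue
            a = area(piece)
            total += a
            # Modular lookup models the infinite periodic tiling.
            weighted += a * tile[iy % n][ix % n]
    return weighted / total
```

The cost is proportional to the number of squares the bounding box covers, which is exactly why this does not scale to large triangles.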
Is there any way to scale the triangle down to within one of the finite tiles (the first copy of the square array, say) such that I can be certain its chromatic average is not altered at all, in order to make that average easier to calculate? Or would some other method work better than scaling (which just popped into mind)? Please enlighten. Thanks.