Did you know that if you draw a circle that fills the screen on
your 1080p high definition display, almost a million pixels are lit?
That's a lot of pixels! But do you know exactly how many pixels are lit?
Let's find out!
Assume that our display is set on a Cartesian grid where every pixel
is a perfect unit square. For example, one pixel occupies the area of a
square with corners (0,0) and (1,1). A circle can be drawn by specifying
its center in grid coordinates and its radius. On our display, a pixel
is lit if any part of it is covered by the circle being drawn; pixels
whose edge or corner is only touched by the circle, however, are not
lit.
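For instance, a circle of radius 1 centered at (5,5) covers part of each of the four pixels sharing the corner (5,5), so those four pixels are lit; the neighboring pixels that the circle merely touches at the single points (4,5), (6,5), (5,4), and (5,6) are not.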
Your job is to compute the exact number of pixels that are lit when a circle with a given position and radius is drawn.
The input consists of several test cases, each on a separate
line. Each test case consists of three integers, x, y, and
r (1 ≤ x, y, r ≤ 1,000,000), specifying respectively the center (x, y) and
the radius of the circle drawn. The input is followed by a single line with
x = y = r = 0, which should not be processed.
For each test case, output on a single line the number of pixels that are lit when the specified circle is drawn.
Assume that the entire circle will fit within the area of the display.
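The counting idea behind the program below: because the center and radius are integers, the center lies on a lattice point and the four quadrants are symmetric, so it suffices to count lit pixels in one quadrant and multiply by 4. Measuring coordinates relative to the center, the pixel whose corner nearest the center is (i, j), with i, j >= 0, is lit exactly when that corner lies strictly inside the circle, i.e. when i*i + j*j < r*r; if the nearest corner lies on or outside the circle, the pixel is at most touched. So for each column i from 0 to r-1, the quadrant contributes the number of j with j*j < r*r - i*i, which is the smallest integer h satisfying h*h >= r*r - i*i, i.e. ceil(sqrt(r*r - i*i)). For example, with r = 2 the columns i = 0 and i = 1 each contribute 2, giving 4 pixels per quadrant and 16 lit pixels in total.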
#include <stdio.h>
#include <math.h>

int main(void)
{
    long long x, y, r;
    while (scanf("%lld %lld %lld", &x, &y, &r) == 3) {
        if (x == 0 && y == 0 && r == 0)
            break;
        long long sum = 0;
        /* By symmetry, count lit pixels in one quadrant only. For column i
           (0 <= i < r) the count is the smallest h with h*h >= r*r - i*i. */
        for (long long i = 0; i < r; i++) {
            long long j = r * r - i * i;
            long long h = (long long)ceil(sqrt((double)j));
            /* repair any floating-point error so h is the exact ceiling */
            while (h * h < j) h++;
            while (h > 0 && (h - 1) * (h - 1) >= j) h--;
            sum += h;
        }
        printf("%lld\n", sum * 4);  /* four symmetric quadrants */
    }
    return 0;
}
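As a quick check (values computed from the counting rule above, not taken from an official test set), the input

1 1 1
5 5 2
3 3 3
0 0 0

should produce

4
16
36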