Hello, I need to calculate how close (as a percentage) a real point is to a theoretical point located at the center of a rectangle measuring 300 mm by 340 mm.
If the real point (shown in red) lies exactly on top of the black (theoretical) point, the result should be 100%, and as it moves away the value should decrease until it bottoms out at 0%.
Could someone tell me how this could be achieved, or what I should research in order to do it?
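For reference, this is roughly what I have in mind: a minimal sketch assuming the percentage falls off linearly with the Euclidean distance from the center and reaches 0% at the farthest corner of the rectangle (the function name and the width/height assignment are just illustrative):

```python
import math

# Rectangle dimensions (mm); the theoretical point sits at its center.
WIDTH_MM = 340.0
HEIGHT_MM = 300.0

def proximity_percent(real_x: float, real_y: float,
                      center_x: float = WIDTH_MM / 2,
                      center_y: float = HEIGHT_MM / 2) -> float:
    """100% when the real point coincides with the center, falling
    linearly to 0% at the farthest corner of the rectangle."""
    # Euclidean distance from the real point to the theoretical (center) point.
    distance = math.hypot(real_x - center_x, real_y - center_y)
    # The largest distance possible inside the rectangle is center-to-corner
    # (half the diagonal); use it to normalize the distance to [0, 1].
    max_distance = math.hypot(WIDTH_MM / 2, HEIGHT_MM / 2)
    closeness = 1.0 - min(distance / max_distance, 1.0)
    return closeness * 100.0

# Example: a point 10 mm to the right of and 5 mm above the center.
print(round(proximity_percent(WIDTH_MM / 2 + 10, HEIGHT_MM / 2 + 5), 1))
```

I am not sure whether a linear falloff (versus, say, normalizing by the nearest edge instead of the corner) is the right choice, which is part of what I am asking about.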