# Given the function of a line (linear), how can I get the number of occurrences that given points land above or below that line?

JARED VAHRENBERG on 29 Oct 2020
Edited: Ameer Hamza on 29 Oct 2020
So, I'm working on code where I know a line, say in the form y = m*x + b, and in this case it's just y = x (for simplicity). I have a couple of points, and I want to know whether each one is above or below that line.
I want the code to return an array like [abovecounts, belowcounts].
My initial thinking was to plug in the values for m and b, generate a matrix of x and y values ranging from, say, -10 to 10, and then count how many of my points are above the line. But my code counts each point once for every x value. For example, with y = x, I want to know whether (0, 1) is above it (which it obviously is), but the comparison runs at every x value, so I get a count of 10 instead of 1.
How can I define the line and then count the points once for above/below?

Ameer Hamza on 29 Oct 2020
Edited: Ameer Hamza on 29 Oct 2020
The method you described seems to be very inefficient and overkill to solve a very simple problem. Suppose you have a point (x1, y1). Just plug in the value of 'x1' in your equation y' = m*x1 + b. If the value of y1 > y', your point is above the line; otherwise, it is below the line.
For example
m = 1;
b = 0.1;
x = rand(10, 1);        % x-coordinates of the test points
y = rand(10, 1);        % y-coordinates of the test points
y_ = m*x + b;           % height of the line at each point's x
mask = y > y_;          % true where a point is above the line
counts = [sum(mask), sum(~mask)]   % [abovecounts, belowcounts]
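As a concrete check, here is a minimal sketch using the point (0, 1) from the question plus two made-up points against the line y = x. Each point is compared once, at its own x value, so the result is a single [abovecounts, belowcounts] pair rather than one comparison per sampled x:

```matlab
m = 1; b = 0;              % the line y = x
pts = [0 1; 2 1; 3 3];     % [x y] rows; (3,3) lies exactly on the line
above = pts(:,2) > m*pts(:,1) + b;   % one logical per point
counts = [sum(above), sum(~above)]   % returns [1 2]
```

Note that a point exactly on the line fails the strict `>` test and is counted as "below"; use `>=` instead if on-the-line points should count as "above".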