Guide arrow at edge of screen

How would I program such a thing?

Say I have two (X, Y) coordinates: one at the center of the screen, while the other is off the screen somewhere, let's say far off in the upper-right direction. How would I program an arrow or symbol of some kind, drawn at the edge of the screen, that informs the player which direction the second coordinate is from the center of the screen?

Many games like FreeSpace, FreeLancer, GTA 1 and 2, etc. have such guide arrows... I know how to get the angle between the two points, and I know how to do it with some rather naive code, but how should I go about doing it and still be slick at the same time?


Interesting Question.

Personally, I would go about doing it something like this...

1) Determine the distance to the edge of the screen using the given angle.
2) Using this distance, calculate the arrow's location by converting the distance into horizontal and vertical components using sin/cos. (You are going to have to do some manipulation here with the actual pixel location, etc.)
3) Draw the arrow at that location (rotated appropriately, of course).

There is most likely a much better method, but this is what I can think of right off of the top of my head.


There's probably a line collision detection method that would apply well here. Let's pretend there's a rectangle which is 20 pixels in from each side of the screen. Fire a ray from the center of the screen to the offscreen object. Calculate where on the rectangle the ray intersects. There's your coordinates.

I just came up with that, but it sounds cool :)
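A quick sketch of that ray-versus-inset-rectangle idea, assuming the "20 pixels in" rectangle and a 640x480 screen (all names here are placeholders, not from the thread):

```c
#include <math.h>

#define SCREEN_W 640
#define SCREEN_H 480
#define INSET    20

/* Fire a ray from the screen center toward (tx, ty) and find where it
   crosses a rectangle inset INSET pixels from each screen edge. */
void clip_to_rect(double tx, double ty, double *ax, double *ay)
{
    double cx = SCREEN_W / 2.0, cy = SCREEN_H / 2.0;
    double dx = tx - cx, dy = ty - cy;
    double hw = cx - INSET, hh = cy - INSET; /* rectangle half-extents */

    /* Scale the direction so the larger component just reaches the rect. */
    double sx = (dx != 0.0) ? hw / fabs(dx) : 1e30;
    double sy = (dy != 0.0) ? hh / fabs(dy) : 1e30;
    double s  = (sx < sy) ? sx : sy;

    *ax = cx + dx * s;
    *ay = cy + dy * s;
}
```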


Actually, you could simplify this by using similar triangles.

Let me explain:

Basically you can use the ratios of corresponding sides of the triangles to calculate the coordinates. So, if you have the distance from the point to the screen, and from the point to the other point, you can use those two numbers to form a ratio, and multiply any dimension of the other triangle by it to get the same proportion on the base triangle.

Using this, there might even be a way to completely eliminate using angles! But most likely not.


Both your ideas sound very interesting, and probably would work.

Carrus85: But how do you find the distance to the edge of the screen at a given angle? It sounds like you would have to do one of those line-collision tests like 23 was saying, which brings me to:

23yrold3yrold: That line collision thing is what first came to my head, but I thought it would be rather inefficient...

I think I am going to do this: I won't put the guiding arrow at the edges of the screen. Instead, I'll put it a pre-determined distance from your character in the center of the screen, so all the arrows will form a circle. This will be much easier, less CPU-intensive, and still serve its purpose.

What do you think?

Am I just lazy? ;)

EDIT: Woah, just did this "credit goes to" thing. Of course, credit goes to anyone else willing to post their respective thoughts. :)


I think that will work, but if the arrows are too big they might obscure vision. Maybe make some kind of HUD setup: a large, transparent, semi-thick radar circle around the player, with small arrows on it. Depends on how many objects we're talking about and how many different types, or what info you want to convey, or if it's just a "You want to go this way" arrow ... I'm sure you can get something set up :)

And my idea isn't inefficient at all. On the contrary; it would probably be pretty darn fast 8-)


Maybe this will help... TRIANGLE DATA

Ok, crash course in basic geometry/calculus...

Deltax = x2 - x1;
Deltay = y2 - y1;

Ratio = Deltay/Deltax

Value of offset on screen = ratio*distance_from_edge_of_screen

That SHOULD work. Notice I did that without using any raycasting or anything... I didn't even use the angle! The angle in this case is only relevant for the actual sprite rotation...
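As a sketch (names mine; note this assumes Deltax isn't zero, i.e. the target isn't straight above or below, and you still have to pick which edge the ray actually exits through):

```c
/* Similar-triangles version, no trig: the slope times the horizontal
   distance to the edge gives the vertical offset at that edge. */
double edge_offset_y(double x1, double y1, double x2, double y2,
                     double edge_x)
{
    double ratio = (y2 - y1) / (x2 - x1);   /* Deltay / Deltax */
    return y1 + ratio * (edge_x - x1);      /* y where the ray hits edge_x */
}
```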


In order not to get an arrow that runs from the center of the screen to the side (Carrus nicely demonstrated how to get the edge location), you'll have to implement some simple calculus. Using Carrus's diagram:

theta (angle at point 1) = atan(delta_y/delta_x)

now say we want the arrow (along the hypotenuse) to be 10 pixels in length:

arrow_start_x = screen_width - (10 * cos(theta))
arrow_start_y = (screen_height / 2) + ((delta_y/delta_x) * (screen_width / 2)) - (10 * sin(theta))
arrow_end_x = screen_width
arrow_end_y = (screen_height / 2) + ((delta_y/delta_x) * (screen_width / 2))

Now, of course, with order of operations you probably wouldn't need nearly as many parentheses, but hey. Hope that helps... I believe it doesn't matter what angle or position the off-screen coordinate is at; the above code will still work... I think. :P

You'd have to change the end point using conditionals (probably) depending on the quadrant of the screen the arrow needs to be on, but the start point should be good.


Ros, in order to calculate the end point of a 10-pixel arrow, you don't even need to use calculus! It can be accomplished with basic geometry.

Just take the ratio of the two triangles' hypotenuses (it doesn't matter which triangle you use for the other; I suggest using the distance from point 1 to point 2). This gives you the scale factor between the small corner triangle and the larger one. Then you simply multiply deltax by that value to get the x-coordinate offset, and deltay by that value to get the y-coordinate offset.

Sorry, I just like to do things without trig functions if at all possible.


I didn't see this function mentioned here, and I think it fits.

To find the angle between point a and point b, you can pass the delta's of the x and y to this function:

atan2(mouse_y - y, mouse_x - x);

which returns the angle in radians, if I remember correctly.
If x and y are your player, let's say, and mouse_x/y is your cursor, this will be the angle from the player to the cursor...
After that, just find the edge of the screen at that angle. I'm so rusty on my trig... I can't remember how to use radians, so I'll stop right now :P Sorry, can't make an example!

Thomas Harte

An entire function for you, based largely upon comments above:

void DrawArrow(int x, int y, float xtangent, float ytangent);

void AddArrow(int x1, int y1, int x2, int y2)
{
   /* assumes (x1, y1) is the centre of the screen */
   x2 -= x1;
   y2 -= y1;

   float xratio, yratio, len;
   xratio = fabs(x2) / (float)(SCREEN_W >> 1);
   yratio = fabs(y2) / (float)(SCREEN_H >> 1);
   len = 1.0f / sqrt(x2*x2 + y2*y2);

   if(xratio > yratio)   /* ray exits through the left or right edge */
      DrawArrow((x2 < 0) ? 0 : SCREEN_W, y1 + y2/xratio, x2*len, y2*len);
   else                  /* ray exits through the top or bottom edge */
      DrawArrow(x1 + x2/yratio, (y2 < 0) ? 0 : SCREEN_H, x2*len, y2*len);
}

void DrawArrow(int x, int y, float xtangent, float ytangent)
{
   line(x, y, x + 10*xtangent, y + 10*ytangent);
   line(x, y, x - 5*ytangent + 5*xtangent, y + 5*xtangent + 5*ytangent);
   line(x, y, x + 5*ytangent + 5*xtangent, y - 5*xtangent + 5*ytangent);

   /* a much better solution here would be to draw a textured quad */
}

An optimised function might keep a lookup table of tangents indexed by the edge-of-screen pixel the arrow tip falls on, saving the sqrt. Such a table would, after all, only be 17.5kb even if you stuck with floats and didn't make any observations about symmetry.


Woah, does that work? :o

I currently have the not-on-edge-of-screen arrow implemented, but I could definitely change it to that system later.

I'm having mad pointer problems currently with the game, so I'm chuggin' along... :-/

Thomas Harte

Woah, does that work?

It isn't tested or anything, so there is bound to be some stupid error in it, but it'll either work or very nearly work! And obviously you'll have to actually start passing BITMAP *'s around so that you can change my line() calls into meaningful versions.

Thread #283220.