network gaming
Neil Walker
Member #210
April 2000
Hello, I'm trying to weigh up the pros/cons between a networked game that initialises itself as both the server and a client (as a player), versus using the same library but running the server and the client as separate applications (e.g. like bzflag). (Ignoring the details of determining the controlling user, etc.) As for the network library, I haven't decided yet; I'm most drawn to raknet, but enet interests me too, as does the library on baf's server - daynse or something - I forget its name.
kentl
Member #2,905
November 2002
Well, how the scheduler handles processes and threads depends on the platform; Windows and Linux behave differently here. But are you really concerned with which is more efficient? (Both methods will be efficient enough.) What's the easiest and most convenient way for you? Personally I would separate the two, as I find it more logical: a computer can act as a game server even when I don't want to play the game on that computer. But that's just me. Both ways work.
Peter Wang
Member #23
April 2000
Another option you might want to consider is to have one application and two threads, but to use cooperative multithreading instead of preemptive multithreading. This is sometimes called fibres. You get some of the advantages of threads (easy communication between threads, and you only need one copy of the game data in memory) without all the headaches of preemptive multithreading. Of course, sometimes you do need preemption, or maybe you'd like your program to make use of multiple CPUs.
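On POSIX systems the ucontext calls give you exactly this. A toy sketch of two fibres handing control back and forth (error checking omitted, and the fibre bodies are just placeholders for real game/network work):

    #include <stdio.h>
    #include <ucontext.h>

    static ucontext_t main_ctx, game_ctx, net_ctx;

    /* the "network" fibre: does a bit of work, then yields back */
    static void net_fibre(void)
    {
        for (int i = 0; i < 3; i++) {
            printf("net: poll sockets (%d)\n", i);
            swapcontext(&net_ctx, &game_ctx);   /* yield to the game fibre */
        }
    }

    /* the "game" fibre: runs one tick, then hands control to the network fibre */
    static void game_fibre(void)
    {
        for (int i = 0; i < 3; i++) {
            printf("game: logic tick (%d)\n", i);
            swapcontext(&game_ctx, &net_ctx);   /* yield to the net fibre */
        }
        swapcontext(&game_ctx, &main_ctx);      /* all done */
    }

    int main(void)
    {
        static char game_stack[64 * 1024], net_stack[64 * 1024];

        getcontext(&game_ctx);
        game_ctx.uc_stack.ss_sp = game_stack;
        game_ctx.uc_stack.ss_size = sizeof game_stack;
        game_ctx.uc_link = &main_ctx;
        makecontext(&game_ctx, game_fibre, 0);

        getcontext(&net_ctx);
        net_ctx.uc_stack.ss_sp = net_stack;
        net_ctx.uc_stack.ss_size = sizeof net_stack;
        net_ctx.uc_link = &main_ctx;
        makecontext(&net_ctx, net_fibre, 0);

        swapcontext(&main_ctx, &game_ctx);      /* start cooperating */
        printf("done\n");
        return 0;
    }

Since nothing is preempted, the two fibres can share game data freely without locks; the trade-off is that a fibre that forgets to yield stalls everything.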
Neil Walker
Member #210
April 2000
tbh, I was more drawn to the separate server, as then all I have to code in the game is the client connections. But will an Allegro game be OK with this? From experience, Allegro tends to use up 100% of the CPU most of the time.
Frank Drebin
Member #2,987
December 2002
if your game doesn't run 100 cycles per second (like mine does
Audric
Member #907
January 2001
Neil said: allegro tends to use up 100% of the cpu most of the time

It's not Allegro, it's the game programmer's code.

edit: It's a matter of busy waiting.
Michael Jensen
Member #2,870
October 2002
Yeah, it's definitely the busy waiting -- make sure not to use busy waiting in your game code, and not in your game server's code either (it's really easy to do in a network app, it's just so damn convenient!). Also, most of the Allegro timer examples I see implement busy waiting. A simple way to fix it: when you lock your game to, say, 60fps, and the screen has been updated, and it's not yet time to call your game's logic methods, call:

    rest(10);

http://www.allegro.cc/manual/api/timer-routines/rest

It's much easier, IMHO, to do it as two separate applications. If you're stuck in C++ (possibly for Linux, portability, or comfort) I'd make the game server a console app (many commercial games do this). Preferably I'd do the game server in C# or VB.NET, because you get a very convenient timer class, easy-to-build forms (much nicer than a console interface), and managed sockets/pinging/web services (making it really damn easy to keep track of who is running an internet server), etc. .NET sockets can even be put into an event-based mode, where your code doesn't have to check for new connections or for new data -- it's told when it has it. "Bye bye" to server-side busy waiting! Then you can simply use a timer that ticks every so often and does the server-side game logic, and you're mostly done...
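For instance, a bare-bones Allegro 4 loop along those lines (just a sketch: update_logic/draw_frame are stand-ins for your own functions, and 60 is an arbitrary rate):

    #include <allegro.h>

    #define CYCLES_PER_SEC 60

    volatile int ticks = 0;                /* bumped by the timer interrupt */
    void ticker(void) { ticks++; }
    END_OF_FUNCTION(ticker)

    void update_logic(void) { /* move the game world one step */ }
    void draw_frame(void)   { /* redraw the screen */ }

    int main(void)
    {
        int done = 0;   /* logic steps completed so far */

        allegro_init();
        install_timer();
        install_keyboard();
        LOCK_VARIABLE(ticks);
        LOCK_FUNCTION(ticker);
        install_int_ex(ticker, BPS_TO_TIMER(CYCLES_PER_SEC));

        while (!key[KEY_ESC]) {
            while (done < ticks) {         /* catch up on pending logic steps */
                update_logic();
                done++;
            }
            draw_frame();
            if (done >= ticks)             /* screen updated, logic not due yet */
                rest(10);                  /* so give the CPU back */
        }
        return 0;
    }
    END_OF_MAIN()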
Neil Walker
Member #210
April 2000
I was planning on making the server a simple console app using raknet, but then I thought I could always create it with Allegro in conjunction with the 'allegro console' library, to add a few basic scrolling text areas to display server information, allow server commands, etc.
Audric
Member #907
January 2001
Well, if you keep busy waiting, your game will attempt to run as fast as the computer allows. People who are playing your game on a computer worse than your testing machine will be thankful. From my experience with some professional Windows servers, it was really infuriating to watch a software server run a weekend update for 8 hours while the CPU (busy time) peaked at 23%. The middleware was acting multi-task-friendly at all times, and there was no way for a programmer or administrator to tell it "run as fast as you can, dummy".
Michael Jensen
Member #2,870
October 2002
Quote: Well, if you keep busy waiting, your game will attempt to run as fast as the computer allows.

You only perform a rest if there is time left over -- if your game only does an lps/fps of 50 or so, and the computer is capable of much more, you don't just sit there in a busy loop. And on old computers, if there's extra time, you should be resting too, as this frees time slices up for other threads (maybe they have AIM or Winamp running in the background). This will speed your game up in the end, since the OS is going to jerk your time slice away at some point anyway; it's just better to let it know when you're ready to have it jerked away.

A game server/game is different from a network application -- I'm not saying "give up your game loop cycles", I'm saying give up the extra cycles that would be spent in a wait loop doing nothing anyway, as that slows the overall computer down. If your game runs at full speed on my PC at 25% usage, it should only take 25%, and on someone else's machine where it needs 50% it will only take 50% -- etc. This is easily accomplished.

Offtopic: if you find a generic application running too slow or too fast, you can promote/demote its thread priority manually via the task manager.
Audric
Member #907
January 2001
You could use a high-precision timer to determine how much time is remaining after all your logic (and display), before beginning a new cycle. This information could let your program decide whether it should yield once, wait (and for how long), or busy-wait. I can't speak from experience as I haven't tested recently, but the docs and posts I've seen in the last 6 months all point to the same issue: the Allegro timers are not accurate on Windows. I don't think missing a retrace is important; it's the speed inconsistency that bothers me. Note: please feel free to correct me if I'm wrong, I'd love to see a solution that would automatically:
ImLeftFooted
Member #3,935
October 2003
Use GetTicks / gettimeofday for a high precision timer.
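e.g. something like this on POSIX (a sketch; on Windows you'd use timeGetTime or QueryPerformanceCounter instead):

    #include <sys/time.h>

    /* milliseconds elapsed since the first call -- POSIX only */
    long ms_now(void)
    {
        static struct timeval start;
        static int started = 0;
        struct timeval now;

        if (!started) {
            gettimeofday(&start, NULL);
            started = 1;
        }
        gettimeofday(&now, NULL);
        return (now.tv_sec - start.tv_sec) * 1000L
             + (now.tv_usec - start.tv_usec) / 1000L;
    }

Call it at the top and bottom of a cycle; the difference tells you how much slack you have before the next one is due.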
Myrdos
Member #1,772
December 2001
The trick here is the difference between a sleep and a yield. If you "yield" your thread every logic update, you're still using 100% CPU. Whenever the OS sees a yield, it checks whether any other processes want to run. If not, or if their load is very light, it reschedules your program. If your program sleeps, it "yields" and also tells the OS not to bother rescheduling it for x milliseconds. So you want to yield if your program needs as much CPU as it can get, and sleep otherwise. Sleeping for 0 milliseconds is a good way to yield.

The following code implements such a system, and maintains a constant number of logic updates per second while allowing the FPS to drop on slower machines. It never lets the FPS drop below 1, so that very old machines can at least get a little feedback. The basic idea is this: if we're redrawing graphics and haven't updated logic, sleep. If we're redrawing graphics and the logic has changed, yield.

Globally:
Initialization:

    LOCK_VARIABLE(_logicTime);
    LOCK_VARIABLE(_engineWarning);
    LOCK_FUNCTION(ticker);
    if (install_int_ex(ticker, BPS_TO_TIMER(CYCLES_PER_SEC)) < 0)
        cout << "Couldn't start ticker function" << endl;

Game Engine:
In the logic part of your code, to calculate FPS:

    if (logicCounter >= CYCLES_PER_SEC)
    {
        currentFPS = graphicsCounter;
        _gfxTime = 0;
        graphicsCounter = 0;
        logicCounter = 0;
    }//if logicCounter

In the logic part of your code, display a warning if logic updates were skipped:

    if (_engineWarning)
    {
        _engineWarning = false;
        cout << "WARNING: logic updates were skipped. Your computer is probably too slow to run this game." << endl;
    }//if _engineWarning

This is for a 2D game; if you were doing a 3D game you might want to update your graphics even if there was no logic update. (But you would still sleep.)
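Condensed, the sleep-versus-yield decision in the main loop goes something like this (my paraphrase of the scheme, reusing the names from the snippets above -- not the exact engine code, and update_logic/draw_frame stand in for your game's own functions):

    while (!done)
    {
        int updatedLogic = 0;

        while (_logicTime > 0)      // the ticker says logic is due
        {
            _logicTime--;
            update_logic();         // your game's logic
            logicCounter++;
            updatedLogic = 1;
        }

        if (updatedLogic)
        {
            draw_frame();           // your game's drawing
            graphicsCounter++;
            rest(0);                // yield: busy machine, but let others run
        }
        else
            rest(1);                // sleep: nothing new to show yet
    }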
Audric
Member #907
January 2001
rest(20) will have a duration of between 20 and 25 ms, even if nothing is running in the background. Or I'm just completely paranoid... :-/
Myrdos
Member #1,772
December 2001
Audric: I have found rest to be very accurate on an unburdened system. See here for more, and some stats from Windows and Linux. The "inaccuracy" of sleeping on an unburdened system has nothing to do with Allegro's timers. It's only that the OS won't reschedule a sleeping program to run until some minimum time has elapsed. If I sleep for 20 ms, I tend to get 20 ms (if the OS is unburdened). If I sleep for 1 ms, I tend to get at least 10-20 ms, depending on the OS. But it doesn't matter in the slightest. The beauty of a 'proper' game engine, where logic and graphics are separated, is that you can have quite large variations in the time between two consecutive logic updates. No one will notice as long as you have the correct number of updates each second. The kind of precise timing you're longing for doesn't exist in desktop OSes. If you think you can get millisecond-accurate scheduling, guess again.
Audric
Member #907
January 2001
Thanks DMC, there have been so many threads about this, I'm still confused. I'll convert my work-in-progress to your method for testing. (I was simply using vsync + draw + 1 to N logic updates to catch up on a tick timer.) Neil, sorry for the derailing :/ So, back on the topic of server and client apps...
Michael Jensen
Member #2,870
October 2002
Quote: By the way, the server can be a blind console, so it wouldn't require Allegro.

I would make it that way if I were doing it in C, since there's no reason for it to even need Allegro; it also becomes extremely portable (not that Allegro isn't portable -- but you could compile it on anything that has sockets and C, even a system that Allegro hasn't been ported to... etc.)

Quote: The basic idea is this: If we're redrawing graphics and haven't updated logic, sleep. If we're redrawing graphics and the logic has changed, yield.

Sorry for keeping the train wreck going, but this doesn't make any sense to me. If the logic needs an update, why would we sleep or yield? Shouldn't we only yield as an alternative to busy waiting? If the application has things it needs to do, I don't understand why we should sleep or yield; unless our application is running too fast, we shouldn't. (That's the whole reason we have a busy wait loop: to slow the program down -- we don't busy wait for no reason, only when the program is running too fast and has extra time...)
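In plain C you can get the same "event based" effect with select(): the server blocks until a socket is readable or a timeout fires, so it never busy waits. A rough POSIX sketch (the listening-socket setup is left out, and 'listener' is a placeholder name):

    #include <stdio.h>
    #include <sys/select.h>
    #include <sys/time.h>

    /* 'listener' is assumed to be an already-bound, listening socket */
    void server_loop(int listener)
    {
        for (;;)
        {
            fd_set readable;
            struct timeval tick = { 0, 50000 };  /* wake at least every 50 ms */

            FD_ZERO(&readable);
            FD_SET(listener, &readable);

            /* blocks until the socket is readable or the tick elapses --
               no CPU is burned while waiting */
            if (select(listener + 1, &readable, NULL, NULL, &tick) < 0)
                break;

            if (FD_ISSET(listener, &readable))
            {
                /* accept() / recv() here */
            }

            /* run one slice of server-side game logic here */
        }
    }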
Audric
Member #907
January 2001
If you don't yield when you decide it's the right moment, the OS will take control when IT decides. It's better to yield yourself than to risk getting caught in the middle of a blit to the screen. (edit: scrapped a justification on rest(), irrelevant)
Michael Jensen
Member #2,870
October 2002
Right, but you have to decide when the right moment is, and the right moment is never when you have work that needs to be done; it's when you have extra time (and you almost always have extra time). So if the screen needs to render, or logic needs to update, you should do that before considering waiting. If the system is too slow, you never call rest() at all, and the OS will yank away your thread when it needs to -- you can't help that, and adding a rest would only make your program suffer more. If your computer is fast enough and the program is sitting in a busy wait loop (pushing CPU usage to 100%), you should yield/rest then... no? I don't understand.
Myrdos
Member #1,772
December 2001
I created a new thread for the CPU usage discussion here. (I think we've derailed this one enough)
piccolo
Member #3,163
January 2003
In my mmrpg I use different exes: one is the client and one is the server. In the code I use a variable so the code knows which main.cpp file is interpreting it. For testing, I open the server and the client on the same computer. Because the game is not threaded some data is lost. This is because when the client or server is sending data it can not receive data at the same time.
GullRaDriel
Member #3,861
September 2003
piccolo: you fail.
piccolo
Member #3,163
January 2003
I do not understand your comment.
GullRaDriel
Member #3,861
September 2003
piccolo said: For testing, I open the server and client on the same computer. Because the game is not threaded some data is lost. This is because when the client or server is sending data it can not receive data at the same time.

It's not because they are not threaded. Your OS is multitasking. You just haven't set the right switch mode (RTM). I could give you my code (which is not the best, by far) where a server running in the background sends and receives data from multiple clients on the same computer. I will not, though: go search for the set_display_switch_mode function in da fuck'in manual! Hope it helps.

EDIT: some various edits.
piccolo
Member #3,163
January 2003
Oh, that's what you meant; my typing was misleading.

Quote: Because the game is not threaded some data is lost. This is because when the client or server is sending data it can not receive data at the same time.

That should have had its own line, with a space. The lack of threading is why data is lost. That is what I meant to say.