This thread is locked; no one can reply to it. |
network gaming
Bruce Perry
Member #270
April 2000
Funny. I didn't have any problems with data loss when I wrote a networked game that didn't use threads. Are you using UDP or TCP?
piccolo
Member #3,163
January 2003
I'm using TCP. wow
Bruce Perry
Member #270
April 2000
That shouldn't happen; TCP doesn't do that. My game uses TCP as well, and I have had zero problems with data loss. You're obviously doing something wrong, or misinterpreting some results somewhere. You should start a thread (EDIT: no, not with pthreads) for this problem and share some code.
Michael Jensen
Member #2,870
October 2002
piccolo, that's not because of multithreading; your program is broken. I've written console chat programs that were not multithreaded, and chatted with several users on a network, and/or several fake users on the same machine as the server -- it all worked fine: no multithreading, no data loss, TCP of course.

edit: TCP is implemented with its own buffer, so your application does not need to be multithreaded -- if you wait 5 seconds to read the data, it should still be there. /edit

UDP, on the other hand, is where you can expect packet loss, especially if you're saturating the network with UDP packets...
piccolo
Member #3,163
January 2003
Hmmm, OK. I thought that because the game has a game loop and a network loop that are not multithreaded and run one after the other, data would be lost, because while in the game loop you can not receive stuff that is meant for the network loop. That's why I said you would have to thread both loops so they run at the same time.
Michael Jensen
Member #2,870
October 2002
It sounds like a bad design to me... Why can't you just poll the network for messages in your game loop?
piccolo
Member #3,163
January 2003
All data is polled in the network loop, then processed in the game loop.
GullRaDriel
Member #3,861
September 2003
Because when his app is in the background it is not running. The app just 'hangs' until it gets focus again.

Try adding this to your initialization routine:

    if (set_display_switch_mode(SWITCH_BACKGROUND) != 0) {
        fprintf(stderr, "Warning: can not change switch mode to SWITCH_BACKGROUND\n");
        if (set_display_switch_mode(SWITCH_BACKAMNESIA) != 0) {
            fprintf(stderr, "Error: can not change switch mode to BACKAMNESIA\n");
            return FALSE;
        }
    }

"Code is like shit - it only smells if it is not yours"
piccolo
Member #3,163
January 2003
I'm telling you, I'm using that already.
GullRaDriel
Member #3,861
September 2003
piccolo said: i telling you I'm using that already.

You are just now telling me that. You do not check its return value; how do you know you are really using a good switch mode? Is it configured on both client and server? SWITCH_BACKAMNESIA works fine in fullscreen, while SWITCH_BACKGROUND works in windowed mode. If all the previous things are OK, it is your implementation that is buggy.
piccolo
Member #3,163
January 2003
Yes, it's the same on both client and server. I think I'm going to make 3 more network core instructions.
Jonatan Hedborg
Member #4,886
July 2004
So, uh, you are going to duplicate what TCP is already doing, only worse? Just make it work like you have it. TCP does not drop packets (it drops the connection if a certain packet takes too long to reach its destination).
GullRaDriel
Member #3,861
September 2003
Me said: You do not check its return value.

Well?
Michael Jensen
Member #2,870
October 2002
JH is right, TCP will not drop your packets -- I've run a console server app and an Allegro app (of my own design) side by side on the same machine and not had any problems thus far, and I have even tested it on several machines and networks... You start the console app, let it serve clients, start a client, and connect. If the connection becomes unstable, the TCP connection will disconnect, the end. It will never drop, lose, re-order, or scramble your packets; if it absolutely needs to, it just terminates your connection.

The OS is multithreaded, so your apps don't need to be, though having them not take 100% of the CPU in a wait loop, as discussed earlier, is nice. If the OS wants to do something, it does it without asking permission: it will yank the thread away, process the TCP/IP stack, and then give the thread back in its own sweet damn time. There's something wrong with your program or your understanding of network programming.

Edit: Also, in Windows, I'm not sure that you can run two separate applications at once that use the same implementation of Allegro timers. But I might be mistaken.

Edit: And I'm pretty sure that any calls to vsync() will block indefinitely while your app is minimized, etc. This is a bad design; just don't do it -- use a console app, a service, or a Windows GUI app as your dedicated server.