
Help with never ending loop

Started by Atom, May 15, 2004, 09:29 AM


Atom

Maybe someone can help me. I am making a utility for CS and it requires a main loop that will run over and over, kind of like a game loop. The problem is that the process runs at 100% CPU usage and pretty much fails. How can I fix this?

You guys don't really need this, but this is the code I have right now that's going to go in the loop.

Dim hwnd As Long

' Look for the Counter-Strike window by its title
hwnd = FindWindow(vbNullString, "Counter-Strike")
If hwnd > 0 Then
    ' Draw the overlay text onto the window
    Call DrawText(hwnd, 100, 500, "Counter-Strike Detected", &HFF&)
    Call DrawText(hwnd, 100, 515, ".:CST00LZ vZero by Atom:.", &HF48020)
End If
I am back! aINC is dead, ThinkTank PRO is alive.
VB, JAVA, ASM, C, its all yummy to me.

Adron

Quote from: Atom on May 15, 2004, 09:29 AM
Maybe someone can help me. I am making a utility for CS and it requires a main loop that will run over and over, kind of like a game loop. The problem is that the process runs at 100% CPU usage and pretty much fails. How can I fix this?

You guys don't really need this, but this is the code I have right now that's going to go in the loop.

Dim hwnd As Long

' Look for the Counter-Strike window by its title
hwnd = FindWindow(vbNullString, "Counter-Strike")
If hwnd > 0 Then
    ' Draw the overlay text onto the window
    Call DrawText(hwnd, 100, 500, "Counter-Strike Detected", &HFF&)
    Call DrawText(hwnd, 100, 515, ".:CST00LZ vZero by Atom:.", &HF48020)
End If


Having 100% CPU usage from a simple game loop is normal. If you only need to check every X seconds or milliseconds, you should insert a delay statement into the loop. Something as simple as "Sleep 100" would probably do.
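
A minimal sketch of what that could look like in VB6 (Sleep and FindWindow need the API declares shown; DrawText is Atom's own helper from the snippet above, and the Sub name is just a placeholder):

Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
Private Declare Function FindWindow Lib "user32" Alias "FindWindowA" _
    (ByVal lpClassName As String, ByVal lpWindowName As String) As Long

Sub MainLoop()
    Dim hwnd As Long
    Do
        ' Look for the Counter-Strike window by its title
        hwnd = FindWindow(vbNullString, "Counter-Strike")
        If hwnd > 0 Then
            Call DrawText(hwnd, 100, 500, "Counter-Strike Detected", &HFF&)
        End If
        Sleep 100   ' yield the CPU for ~100 ms instead of busy-spinning
    Loop            ' add an exit condition (or DoEvents) as needed
End Sub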

Atom

Well, I might as well use a timer then, I guess. Thanks for your help!
The only problem with a timer is that I draw text onto the CS window, and it of course flickers with the 1 ms delay.
I am back! aINC is dead, ThinkTank PRO is alive.
VB, JAVA, ASM, C, its all yummy to me.

Lenny

IIRC, the Visual Basic timer control is only accurate to the nearest 55 ms.

You would be better off using Sleep as Adron said, or a high-resolution timer...
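
For instance, timeGetTime from winmm.dll reports milliseconds and is easy to call from VB6; a rough sketch:

Private Declare Function timeGetTime Lib "winmm.dll" () As Long

Sub TimeIt()
    Dim startMs As Long
    startMs = timeGetTime   ' millisecond tick count from the multimedia timer
    ' ... code to be timed ...
    Debug.Print timeGetTime - startMs & " ms elapsed"
End Sub
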
The Bovine Revolution
Something unimportant


WARNING: The preceding message may have contained content unsuitable for young children.

Skywing

Quote from: Lenny on May 15, 2004, 10:58 PM
IIRC, the Visual Basic timer control is only accurate to the nearest 55 ms.

You would be better off using Sleep as Adron said, or a high-resolution timer...
Sleep is only accurate to at best 10ms or so.  No delay execution function will provide you with better than 10ms minimum scheduling granularity on a typical x86 system (excluding busy-waits) because that is as low as the system clock interrupt goes.
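
A quick way to see that granularity from VB6 is to time a batch of 1 ms Sleeps; a rough sketch (API declares included):

Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
Private Declare Function GetTickCount Lib "kernel32" () As Long

Sub MeasureSleep()
    Dim t As Long, i As Long
    t = GetTickCount
    For i = 1 To 100
        Sleep 1   ' ask for a 1 ms delay...
    Next i
    ' ...but each call rounds up to the system tick, so on a stock
    ' NT-era system this typically reports well over 100 ms in total.
    Debug.Print (GetTickCount - t) & " ms for 100 x Sleep 1"
End Sub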

You should probably write to CS's backbuffer to avoid the flicker.

Grok

Quote from: Skywing on May 17, 2004, 10:22 PM
Sleep is only accurate to at best 10ms or so.  No delay execution function will provide you with better than 10ms minimum scheduling granularity on a typical x86 system (excluding busy-waits) because that is as low as the system clock interrupt goes.

Configurable?  Did you find this in the IX86 programmer's guides?  I'll read about it when I get home.

Skywing

Quote from: Grok on May 18, 2004, 09:59 AM
Quote from: Skywing on May 17, 2004, 10:22 PM
Sleep is only accurate to at best 10ms or so.  No delay execution function will provide you with better than 10ms minimum scheduling granularity on a typical x86 system (excluding busy-waits) because that is as low as the system clock interrupt goes.

Configurable?  Did you find this in the IX86 programmer's guides?  I'll read about it when I get home.
None of the standard x86 HALs for NT will let you go below 10ms.  I don't think you'll be able to go below 10ms with typical x86-based system hardware either (even if you convinced the HAL to try).

This isn't something that you would find in the Intel CPU manuals because it's a limitation of the timer chip and not the CPU.

Adron

I think you can go below 10 ms. IIRC it's just the standard timing. In DOS days, you could reprogram the timer to fit your application, but to keep the clock running and stuff updating, you had to call the original handler every 10 ms. Increase timer frequency by X times, then call original handler every X ticks.

Skywing

Quote from: Adron on May 19, 2004, 05:54 AM
I think you can go below 10 ms. IIRC it's just the standard timing. In DOS days, you could reprogram the timer to fit your application, but to keep the clock running and stuff updating, you had to call the original handler every 10 ms. Increase timer frequency by X times, then call original handler every X ticks.
Are you guaranteed hardware support for <10ms resolution, though?  Or is that just a nice feature that some computer systems may have?

Adron

Quote from: Skywing on May 19, 2004, 09:22 AM
Quote from: Adron on May 19, 2004, 05:54 AM
I think you can go below 10 ms. IIRC it's just the standard timing. In DOS days, you could reprogram the timer to fit your application, but to keep the clock running and stuff updating, you had to call the original handler every 10 ms. Increase timer frequency by X times, then call original handler every X ticks.
Are you guaranteed hardware support for <10ms resolution, though?  Or is that just a nice feature that some computer systems may have?

There are never guarantees with computers, but:

PORT 0040-005F - PIT - PROGRAMMABLE INTERVAL TIMER (8253, 8254)
Notes:   XT & AT use ports 40h-43h; PS/2 uses ports 40h, 42h-44h, and 47h
   the counter chip is driven with a 1.193 MHz clock (1/4 of the
   original PC's 4.77 MHz CPU clock)

0040  RW  PIT  counter 0, counter divisor         (XT, AT, PS/2)
   Used to keep the system time; the default divisor of (1)0000h
   produces the 18.2Hz clock tick.

That's the default DOS timer: a frequency of 18.2 Hz from a divisor of 10000h (65536). If you set the divisor to 1, you should get a frequency of 1192755 Hz. And then perhaps you'll be stuck in the interrupt forever, but.... :P

edit: Interesting thing to try: Make a driver that outputs a lower value to that port and see what happens? Port 42h is the speaker frequency, so you can obviously program the timer chip to generate higher frequencies than 100 Hz.
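
As a quick sanity check on those numbers, the PIT's output rate is just its input clock divided by the divisor; a throwaway calculation using the commonly quoted 1193182 Hz input clock:

' PIT output frequency = input clock / divisor
Const PIT_CLOCK As Double = 1193182#

Sub PitRates()
    Debug.Print PIT_CLOCK / 65536   ' divisor 10000h (DOS default) -> ~18.2 Hz
    Debug.Print PIT_CLOCK / 11932   ' divisor for roughly a 100 Hz (10 ms) tick
    Debug.Print PIT_CLOCK / 1       ' divisor 1 -> ~1.19 MHz interrupt rate
End Sub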

Atom

My posts always spark the most interesting questions.
I am back! aINC is dead, ThinkTank PRO is alive.
VB, JAVA, ASM, C, its all yummy to me.

MyndFyre

Quote from: Adron on May 24, 2004, 04:40 PM

PORT 0040-005F - PIT - PROGRAMMABLE INTERVAL TIMER (8253, 8254)
Notes:   XT & AT use ports 40h-43h; PS/2 uses ports 40h, 42h-44h, and 47h
   the counter chip is driven with a 1.193 MHz clock (1/4 of the
   original PC's 4.77 MHz CPU clock)

0040  RW  PIT  counter 0, counter divisor         (XT, AT, PS/2)
   Used to keep the system time; the default divisor of (1)0000h
   produces the 18.2Hz clock tick.

That's the default DOS timer: a frequency of 18.2 Hz from a divisor of 10000h (65536). If you set the divisor to 1, you should get a frequency of 1192755 Hz. And then perhaps you'll be stuck in the interrupt forever, but.... :P

Hrm....  I think "perhaps" is giving it too much credit; based on my (admittedly-limited) knowledge of the timer mechanism, if there is an interrupt on every cycle -- wouldn't you be stuck forever on the interrupt, no "perhaps" about it?  You could also overload the interrupt registers....

Hrm, sounds like a fun project. :)
Quote
Every generation of humans believed it had all the answers it needed, except for a few mysteries they assumed would be solved at any moment. And they all believed their ancestors were simplistic and deluded. What are the odds that you are the first generation of humans who will understand reality?

After 3 years, it's on the horizon.  The new JinxBot, and BN#, the managed Battle.net Client library.

Quote from: chyea on January 16, 2009, 05:05 PM
You've just located global warming.

Stwong

If it lags CS, just add a DoEvents somewhere.  Works like a charm, usually.

Adron

Quote from: Myndfyre on June 01, 2004, 07:59 PM
Quote from: Adron on May 24, 2004, 04:40 PM
   the counter chip is driven with a 1.193 MHz clock (1/4 of the
   original PC's 4.77 MHz CPU clock)

Hrm....  I think "perhaps" is giving it too much credit; based on my (admittedly-limited) knowledge of the timer mechanism, if there is an interrupt on every cycle -- wouldn't you be stuck forever on the interrupt, no "perhaps" about it?  You could also overload the interrupt registers....

Hrm, sounds like a fun project. :)

Well, the timer frequency isn't equal to the CPU frequency. So I'm not sure whether the interrupt handling time will be enough to keep you occupied all the time, or if you'll be able to execute a few regular instructions once in a while.

But it shouldn't be too hard to test. Just write a simple driver to output those values to the port and see what happens.

Lycaon

Try throwing in a DoEvents at the very end of your loop if you haven't done so already.  This allows other processes their chance at grabbing processor time; otherwise, the system devotes most of its CPU resources to the VB loop (I don't know if this is a screwup on M$'s part or if it's there intentionally).

The only downside to this is that it might be excessive.  A Do / Loop with a DoEvents in it can run a few thousand times per second.  If that's an issue, you could throw in another small Do / Loop to limit your code to looping, say, every 5 ms.



Private Declare Function GetTickCount Lib "kernel32" () As Long

Dim oTime As Long

Do

    ' My Big Loop (tm)

    oTime = GetTickCount

    Do
        DoEvents
    Loop Until oTime + 5 < GetTickCount ' spin in DoEvents for roughly 5 ms

Loop ' End of my big loop


Alternatively, you could use the Sleep API if you don't mind precision no better than about 10 ms.


Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

Do

    ' My Big Loop (tm)

    Sleep 10 ' Pause for 10 ms

Loop ' End of my big loop



If you REALLY need better precision you might be able to do something with QueryPerformanceCounter...  But if you need that much precision, why are you programming in VB? :P
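
For what it's worth, QueryPerformanceCounter can be called from VB6 by declaring its 64-bit parameters As Currency (the 10,000x Currency scaling cancels out when you take the ratio); a rough sketch:

Private Declare Function QueryPerformanceCounter Lib "kernel32" _
    (lpPerformanceCount As Currency) As Long
Private Declare Function QueryPerformanceFrequency Lib "kernel32" _
    (lpFrequency As Currency) As Long

Sub TimeSomething()
    Dim freq As Currency, c1 As Currency, c2 As Currency
    If QueryPerformanceFrequency(freq) = 0 Then Exit Sub   ' no high-res counter available

    QueryPerformanceCounter c1
    ' ... code to be timed ...
    QueryPerformanceCounter c2

    Debug.Print (c2 - c1) / freq * 1000# & " ms elapsed"
End Sub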

Edit:  Argh, I really need to read the posts right above mine, hehe.