The command I want to run takes a few minutes and returns lots of unimportant data; the important stuff appears in the first 10 or so lines of the output. What I want to do is run this command through exec(), kill the process after 30 seconds, and still be able to get the output that was generated in those 30 seconds.
Using the popular "PsExecute" function, I can set a timeout of 30 seconds. However, it doesn't let me retrieve the output.
I can't think of a way to both kill the process after a set number of seconds and still get the output. Can this be done?
Does the data you find valuable equate to the first 10 lines of a file you are reading/analyzing?
Another question, how important is it that this process be done in real-time? In other words, can you run the several minute process say at 1:00 AM and then just read the output in your script on demand?
push the results of the command into a file and read the file back?
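To illustrate the idea in shell terms (the `long_cmd` function here is a hypothetical stand-in for the real program): the redirect writes each line to the file as it is produced, so you can read back whatever exists so far while the command is still running.

```shell
# Hypothetical stand-in for the real long-running program.
long_cmd() { i=0; while [ $i -lt 5 ]; do echo "line $i"; i=$((i+1)); sleep 0.1; done; }

# Redirect output to a file and run in the background.
long_cmd > /tmp/cmd_output.txt &
CMD_PID=$!

# A moment later, read back whatever has been written so far.
sleep 0.25
head -n 2 /tmp/cmd_output.txt

wait "$CMD_PID"
```

Note that some programs block-buffer their output when it isn't a terminal, so lines may arrive in chunks rather than one at a time.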
Thanks for the replies.
It is important for it to run in real time, or at least close to it. The process in question sometimes takes a few minutes, but other times can take hours to run, which would waste resources when all I need is the first 10 lines of the output.
@StarLion: I thought about doing this, but I couldn't figure out how to write the results of the command to a file in real time. I can write all the results to a file and then look at the first 10 lines, but by then hours have passed and the data is no longer useful. How can I write the results to a file in real time, so that if I kill the command, the results so far will still be there?
Oooeeee... it turns out the writing-to-file wasn't working because PsExec wasn't using it in my command. I edited the function to save the output to a temp file instead. It looks like this solved my problem, since I'm assuming I'll be able to read the contents of that file. I'm guessing this isn't the 'proper' way to get the output, but if it works, it works.
I'm still wondering how you'd kill the process, though. You have your data in a file now, but meanwhile server cycles are spinning away needlessly while it keeps processing.
I guess it must be possible to either grab the process ID and kill it, or maybe set a really short timeout on the script.
Yes, I grab the process ID when it is run, and then kill it after so many seconds. You can do that with the custom functions found here:
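In shell terms, the PID-and-kill approach looks roughly like this (`sleep 300` is a hypothetical stand-in for the long-running program, and the 1-second wait stands in for the 30-second limit):

```shell
# Hypothetical stand-in for the long-running program.
sleep 300 &
CMD_PID=$!          # $! holds the PID of the last background job

sleep 1             # stand-in for the 30-second grace period
kill "$CMD_PID"     # terminate the process once time is up
wait "$CMD_PID" 2>/dev/null || true
```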
My problem wasn't killing the process, but getting the output. Saving the output to a file from the command line, such as "program -stuff > /tmp/file", lets me get a partial output before the process is killed.
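As a sketch of the whole recipe in shell (assuming GNU coreutils' `timeout` is available; the inline `while` loop is a hypothetical stand-in for the real program): the command is killed after the limit, the redirect still keeps everything written before the kill, and `head` pulls out the lines you care about.

```shell
# Kill the command after 1 second (30 in the real case); the redirect
# still keeps whatever was written before the kill.
timeout 1 sh -c 'while :; do echo data; sleep 0.1; done' > /tmp/partial.txt || true

# Read the first few lines of the partial output.
head -n 10 /tmp/partial.txt
```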