New version 4.5 with scraping sessions from the command line

hi guys

Congratulations on the new version. Great work.
I just played around with it and found something a bit strange with the command line.

When I try to run

jre/bin/java -jar screen-scraper.jar -s "xxx" -p "xxx"

from Linux, the command prompt doesn't come back after the scraping session finishes. It seems the process is still running, even though the log file says it finished.

Has anyone else run into the same situation?

br
//Max

We were just running some tests, and it seems to return for us.

If it's not a problem, could I ask you to email [email protected] (which will come directly to me) and attach the scraping session and maybe a little shell script you can use to run the scrape? That way we can look at some internals of the scrape, and profile it with some development tools, if need be.

Of course, if the scrape contains any sensitive information at all (username, password, etc.), you can certainly rely on the fact that such information won't be stored on our computers any longer than it takes to figure this one out.

Tim

hi Tim

I fixed this problem with this API call: scraper.setDoLazyScrape( false );

I guess the default value for this API in screen-scraper is true.

thx

//Max

Ah, okay-- that makes sense now. Yes, lazy is how it normally works for workbench and server mode, but in your context, setting it to "false" will make the scraper work on only that single scrape, and when it finishes, it should return.
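
For anyone who finds this later, here's roughly what that looks like when driving screen-scraper from external Java code. This is only a sketch-- "My Session" is a placeholder name, and it assumes the RunnableScrapingSession class from the Java API:

    import com.screenscraper.scraper.RunnableScrapingSession;

    public class RunScrape
    {
        public static void main( String[] args ) throws Exception
        {
            RunnableScrapingSession session = new RunnableScrapingSession( "My Session" );

            // With lazy scraping off, scrape() blocks until the session
            // finishes, so the JVM exits and the command prompt returns.
            session.setDoLazyScrape( false );
            session.scrape();
        }
    }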

(Sorry to repeat the issue there-- just trying to make it clear for any others who may view this in the future.)

Glad you found the fix!

Tim

I just started to work with v4.5, and ran into the same problem while running on Vista. This solution seems to fix it... but if I add setDoLazyScrape(false) to all of my scrapes, are there any potential issues I should be aware of?

Also, I've noticed that log messages no longer show up in the command window.

If lazy is true, then screen-scraper just makes a new execution thread to run the scrape. If false, the console window itself runs the scrape. I don't think there's anything more to be aware of, but if lazy is false, you can't very well close the console you're using to run it, or else you'll terminate the thread and the scrape will be forced to quit.
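
Conceptually, the difference is something like this (an illustration only, not screen-scraper's actual source):

    public class LazyIllustration
    {
        public static void main( String[] args )
        {
            final boolean doLazyScrape = false; // what setDoLazyScrape controls

            Runnable scrape = new Runnable()
            {
                public void run()
                {
                    System.out.println( "scraping..." ); // stand-in for the real work
                }
            };

            if ( doLazyScrape )
            {
                // Returns immediately; the scrape keeps running on its own thread.
                new Thread( scrape ).start();
            }
            else
            {
                // Runs on the calling thread, blocking until the scrape finishes,
                // so the console only gets its prompt back when everything is done.
                scrape.run();
            }
        }
    }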

In your batch file, you can redirect the messages to a log if you want, with the "[command] > C:\path\to\file.log" notation. I'm not sure about the subtleties involved with logging and lazy being true or false...
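
For example (the paths and "xxx" placeholders are just stand-ins, as above), the full line in the batch file might look like this; adding "2>&1" is worth a try too, in case the messages are going to stderr rather than stdout:

    jre\bin\java -jar screen-scraper.jar -s "xxx" -p "xxx" > C:\path\to\file.log 2>&1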

It turns out it's because I'm running the Scraping Session through a script. Screen-scraper apparently doesn't like this anymore. The "[command] > C:\path\to\file.log" option just results in an empty log file.

However, if I call the Scraping Session directly, everything seems to work as before.

Is there a way to bypass the logging altogether when running from the command line? A command-line option/parameter, perhaps?

I still haven't upgraded to 4.5; for now I'm piping to a log file, but it's not really necessary. (Though I'd still need logging when running screen-scraper as an independent app.)

Well, if you don't do the piping part, then the log will go to its normal place (screen-scraper/log/[sessionname][unixtimestamp].log).

You can set the "logging level" from within screen-scraper, on the log tab of the session. You could at least turn it down to "Error", which will only record log messages when actual HTTP requests fail or script errors occur.

Am I on the right track?
Tim