Same scrape on multiple URLs

Hi there. I've got roughly 100 URLs that I need to scrape, e.g.

http://www.domain.co.uk/shelves/Breakfast.html
http://www.domain.co.uk/shelves/Desserts.html

Each URL is in the same format, and I've created a single scraping session that works for all of them. I could create a separate scraping session for each of the 100 URLs, but that seems like the long way round, as I'd basically be repeating the same task 100 times!

Ideally, I'd like a way to automate a single scraping session to process these 100 URLs in order. Does anyone know how?

Many thanks

Great program and thanks for the basic edition...

The easy solution would be to use the server mode in the Professional or Enterprise editions and write a little scrape that launches each session. Barring that, you can do something similar in the Basic edition using a batch file. See the tutorial on how to start screen-scraper remotely; it shows how to make a batch file that starts a scrape.
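As a rough sketch of that batch-file idea (a dry run only: it just prints the command that would launch the session for each URL, so drop the "echo" to actually run it; the urls.txt file name and session name here are assumptions for illustration):

```shell
#!/bin/sh
# Sample input; in practice this file would already exist with your
# 100 URLs, one per line.
printf '%s\n' \
  'http://www.domain.co.uk/shelves/Breakfast.html' \
  'http://www.domain.co.uk/shelves/Desserts.html' > urls.txt

# Loop over the list and print the command-line invocation that would
# launch the same scraping session once per URL. The URL is handed to
# the session as a session variable named URL via -p.
while IFS= read -r url; do
  echo jre/bin/java -jar screen-scraper.jar -s "Product Page 1 Scrape" -p "URL=$url"
done < urls.txt
```

On Windows the same idea should work as a FOR /F loop over the text file in a .bat file.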

Thanks

Thanks for the reply, Jason. I found a section on running scraping sessions from the CLI, e.g.

"jre\bin\java" -jar screen-scraper.jar -s "Product Page 1 Scrape"

This worked fine for the sessions I had already created, but I couldn't figure out how to write it so it would dynamically take a URL and process it. I was expecting something like this:

"jre\bin\java" -jar screen-scraper.jar -s "Product Page 1 Scrape" -p URL http:\\www.domain.com\products\page2.html

Kind Regards

Pretty close

"jre\bin\java" -jar screen-scraper.jar -s "Product Page 1 Scrape" -p "URL=http:\\www.domain.com\products\page2.html"

That assumes that you're setting the URL with a session variable named URL, but it would work.

Hi Jason.

I FIRST TRIED THIS:
-------------------
I changed the URL on the PROPERTIES tab to the session variable ~#URL#~ and tried the following from a batch file:

"jre\bin\java" -jar screen-scraper.jar -s "Product Page 1 Scrape" -p "URL=http:\\www.domain.com\products\page2.html"

However, this failed to run anything, seemingly because it wasn't picking up the URL variable.

THEN TRIED THIS AND IT WORKED:
------------------------------
Following Tutorial 7, I just created a text file containing all my URLs, set the URL on the PROPERTIES tab to the session variable ~#URL#~, and ran the scraping session with the script below. It reads in each URL in turn. Saved me a lot of time!

// Text file containing one URL per line.
File inputFile = new File( "urls.txt" );

// These two objects are needed to read the file.
FileReader in = new FileReader( inputFile );
BufferedReader buffRead = new BufferedReader( in );

// Read the file in line-by-line. Each line in the text file
// will contain a URL.
String nextURL;
while( ( nextURL = buffRead.readLine() ) != null )
{
    // Set a session variable corresponding to the URL.
    session.setVariable( "URL", nextURL );

    // Scrape this particular URL.
    session.scrapeFile( "test" );
}

// Close up the objects to indicate we're done reading the file.
buffRead.close();
in.close();

Many thanks for all your help.