screen-scraper public support

Questions and answers regarding the use of screen-scraper. Anyone can post. Monitored occasionally by screen-scraper staff.

Looping local folder with VBScript

On this forum I found an example script that loops through the files in a local folder, but it was in Java. I am using VBScript and would like to scrape files named 1.html, 2.html, etc. They are all in the same local folder.

I don't know programming at all, so I was hoping for some help. I did find this script for looping:
// How deep to scan (of course you could also pass this to the method)
const int HowDeepToScan = 4;
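
Since no VBScript version was posted, here is a minimal sketch in screen-scraper's interpreted Java; the same counter-loop idea carries over to VBScript. It assumes a scrapeable file named "Local page" whose URL field is something like file:///C:/pages/~#NUM#~.html, and that screen-scraper will accept a file:// URL (worth confirming):

// Loop over 1.html, 2.html, ... and scrape each one.
// The name "Local page" and the count of 50 are placeholders.
for (int i = 1; i <= 50; i++)
{
    session.setVariable("NUM", String.valueOf(i));
    session.scrapeFile("Local page"); // must match the scrapeable file's name exactly
}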

Exporting

Please correct me if I'm wrong, but it seems that when I export a scraping session to XML, the scripts are not exported along with it.

Is there a way around this?

Using a 'Next' link that fires off JavaScript

Anyone tried grabbing info from 118.com?

This link http://www.118.co.uk/SearchResults.aspx?query=taxi&type=BusinessType is a list of taxi companies in the UK. Clicking the Next link fires off JavaScript that fetches the next page.
The page number is not held in the query string of the link, or anything that simple.
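
For what it's worth, the usual trick with an ASP.NET 'Next' link is to reproduce the POST that __doPostBack() sends: capture the hidden __VIEWSTATE and __EVENTVALIDATION fields with extractor patterns, then request the page again with __EVENTTARGET pointed at the pager control. A rough sketch in screen-scraper's interpreted Java, where the scrapeable-file name and the control ID are assumptions:

// Run after the results page has been scraped and the hidden fields have been
// extracted into session variables (VIEWSTATE, EVENTVALIDATION).
// "SearchResults POST" is a hypothetical scrapeable file whose POST parameters
// reference ~#VIEWSTATE#~, ~#EVENTVALIDATION#~ and ~#EVENTTARGET#~.
session.setVariable("EVENTTARGET", "ctl00$resultsPager$lnkNext"); // read the real control ID from the page source
session.scrapeFile("SearchResults POST");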

SS through privoxy-tor

Hi,
Back again with a problem. I can't seem to get screen-scraper working through a proxy. I edited the screen-scraper properties config to use Privoxy (which listens on port 8118) running on the same host. Screen-scraper runs fine, but no data is scraped. I am able to access the web through the Privoxy + Tor setup using a browser.

-Arun
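
One quick sanity check is to confirm that the Privoxy/Tor chain is reachable from Java at all, independently of screen-scraper's own settings. A minimal stand-alone sketch (the test URL is just an example):

import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

public class ProxyCheck {
    public static void main(String[] args) throws Exception {
        // Privoxy's default listener: localhost:8118
        Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("127.0.0.1", 8118));
        HttpURLConnection conn =
            (HttpURLConnection) new URL("http://check.torproject.org/").openConnection(proxy);
        System.out.println("HTTP status via Privoxy: " + conn.getResponseCode());
    }
}

If that prints a status code, the proxy chain is fine and the problem is in screen-scraper's proxy settings; if it hangs or throws, the problem is outside screen-scraper.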

A Whole Heap of Issues

I've retrieved a DataSet, which I traverse. I use an external Java program to contact the server.

Once I've got around 30 records I get a Java heap out-of-memory exception, so I've either got a memory leak or I'm just plain low on memory.

Granted, I am running a large third-party library, a dev database and an all-singing, all-dancing IDE on a Celeron, but I have plans to expand. Is there a way of 'chunking' or streaming the data into bite-size portions to be managed by my external program?
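
One pattern worth trying, sketched here on the assumption that the records come out of an extractor pattern: instead of accumulating everything in a DataSet and pulling it across in one go, attach a small interpreted-Java script to the pattern (run "after each pattern match") that writes each record straight to disk, and let the external program read the file at its own pace. Raising the JVM heap with -Xmx is the blunter alternative. The token names and file path below are assumptions:

// Runs once per matched record; nothing accumulates in memory between matches.
import java.io.FileWriter;

FileWriter out = new FileWriter("C:/scrapes/results.csv", true); // append mode
out.write(dataRecord.get("NAME") + "," + dataRecord.get("PRICE") + "\n"); // assumed token names
out.close();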

The value "null" was passed to the "scrapeFil

I keep getting this in the session log.

I went through the tutorials OK and played with them a bit, but whenever I create my own session and scripts I tend to end up with this message in the log, and thus no details get grabbed. The structure is:

1. Script - start
2. Scrapeable file - list of links (session variable COMPID)
   a. For each match (after pattern application): script - Grab Details (see the sketch below)
   b. Scrapeable file - details page (using COMPID in the URL)
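
A common cause of that message is a null value reaching session.scrapeFile, either because the scrapeable-file name passed in doesn't match anything or because the session variable feeding the call was never set. A sketch of what the Grab Details script (run after each pattern match on the list page) might look like, with the token and scrapeable-file names assumed:

// "Grab Details" -- interpreted Java, run after each pattern match on the list page.
// COMPID and "Details page" are assumed names; they must match your extractor
// token and scrapeable file exactly.
compId = dataRecord.get("COMPID");
if (compId != null)
{
    session.setVariable("COMPID", compId);
    session.scrapeFile("Details page");
}
else
{
    session.log("COMPID was null; check the extractor pattern's token name.");
}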

Pluggable Scraping

Is it possible for an external Java program to upload scrapeable files to the server?

I want to hold a set of scrapeable files in exported XML format in a database, then have a Java program load the files onto the server before getting the server to serve each request.

How feasible is that?
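
The database-to-server half of that is plain JDBC and file I/O; the open question is how screen-scraper picks the XML up again. A sketch assuming the exported XML can simply be written into a folder the server imports from (the path, table and column names below are all hypothetical, and the import mechanism itself is something to confirm with the screen-scraper folks):

import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PushScrapeableFiles {
    public static void main(String[] args) throws Exception {
        // Hypothetical JDBC URL, table and column names.
        Connection conn = DriverManager.getConnection("jdbc:mysql://localhost/scrapers", "user", "pass");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT name, xml FROM scraping_sessions");
        while (rs.next()) {
            // Hypothetical import directory on the screen-scraper server.
            FileWriter out = new FileWriter("/opt/screen-scraper/import/" + rs.getString("name") + ".xml");
            out.write(rs.getString("xml"));
            out.close();
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}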

UnFair

I've been trying to scrape odds from www.betfair.co.uk;
however, it seems that they cleverly hide the information in JavaScript. Do you guys know of any ways around this?

Extractor Pattern Question

Hi there

I have spent the past day trying to figure out the right way to do this, but I am completely lost. If someone could help me out, I would greatly appreciate it.

The data I am extracting is an address (which includes street address, city, province, postal code and country).

The outputted code is:
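
(The sample output never made it into the post.) Purely as an illustration of the technique, if the page contained something like the hypothetical snippet below, a single extractor pattern with one ~@TOKEN@~ per field would pull the address apart:

Hypothetical HTML:
  <div class="address">123 Main St<br>Toronto, ON M5V 2T6<br>Canada</div>

Extractor pattern:
  <div class="address">~@STREET@~<br>~@CITY@~, ~@PROVINCE@~ ~@POSTAL_CODE@~<br>~@COUNTRY@~</div>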

DataSets and DataRecords - HOW TO

OK, I've worked through all of the tutorials and I'm at a loss as to how to reference a DataSet of DataRecords from a remoteSession.

I have a setup similar to tutorial 3.

I have one script that calls another script for each pattern application.
I'm using this to extract a product list.

The second scrapeable file forms its URL using a session variable that identifies the product ID. Further, the second scrapeable file has an extractor pattern with a token named ~@DATARECORD@~, and the sub-extractor patterns extract from that match.
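
In case it helps, here is a minimal external-program sketch. It assumes the scraping session is called "Products", that a script attached to the extractor pattern (run "after pattern is applied") stored the data with session.setVariable("PRODUCTS", dataSet), and that PRODUCT_ID is one of the sub-extractor token names:

import com.screenscraper.common.DataRecord;
import com.screenscraper.common.DataSet;
import com.screenscraper.scraper.RemoteScrapingSession;

public class ReadProducts {
    public static void main(String[] args) throws Exception {
        RemoteScrapingSession remoteSession = new RemoteScrapingSession("Products");
        remoteSession.scrape();

        // The whole DataSet comes back as a single session variable.
        DataSet products = (DataSet) remoteSession.getVariable("PRODUCTS");
        for (int i = 0; i < products.getNumDataRecords(); i++) {
            DataRecord record = products.getDataRecord(i);
            // DataRecord keys are the sub-extractor token names.
            System.out.println(record.get("PRODUCT_ID"));
        }

        remoteSession.disconnect();
    }
}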