screen-scraper support for licensed users
Stuck on "Session Expired"
I've been building and running scrapes for quite a while now, and I haven't gotten this stuck in a long time. I'm trying to make what seems to be a pretty straightforward POST request to the following URL:
http://www.autozone.com/autozone/storelocator/storeLocatorMain.jsp
Action on Error code in screen-scraper
I occasionally get annoying 502 (Bad Gateway) errors that disrupt my scrapes when using Tor on Ubuntu Linux. Is there a way that SS can take action on a specific error message like this? If possible, I would like the session to pause and then re-scrape the current scrapeable file whenever a 502 error is received.
Is that doable?
/Johan
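As a sketch of the control flow being asked for (treat every name below as hypothetical; screen-scraper's own scripting API may offer hooks not covered here), the retry-on-502 pattern looks like this in plain Java, with an IntSupplier standing in for issuing the scrapeable file's request:

```java
import java.util.function.IntSupplier;

public class RetryOn502 {
    // `request` stands in for issuing the scrapeable file's HTTP request and
    // returning its status code; in a real scrape script this would be a call
    // that re-scrapes the current file, followed by a status check.
    public static int retryWhile502(IntSupplier request, int maxAttempts, long pauseMillis) {
        int status = request.getAsInt();
        for (int attempt = 1; status == 502 && attempt < maxAttempts; attempt++) {
            sleep(pauseMillis);             // corresponds to pausing the session
            status = request.getAsInt();    // re-issue the request
        }
        return status;
    }

    private static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}
```

In a scrape script the sleep would become a session pause and the supplier a call that re-scrapes the current scrapeable file.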
Next page ++
I'm using the example script from the "memory conscious next page" technique. For some reason the OFFSET variable never increases from its initial value, so the page sequence never gets going. To my knowledge I have not changed anything that should interfere with the logic in the example below. The NEXT_URL variable is saved to a session variable and has "positive" values. I am calling the script "after each pattern application" on the Search results page and on the following Next search results page. Is there something missing in the example script, or am I missing something?
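One guess worth checking before digging deeper: if OFFSET is held as a String, the + operator concatenates rather than adds, so the offset can look stuck near its initial value. A minimal plain-Java illustration of parsing it before adding (the page size of 25 is an assumption):

```java
public class OffsetAdvance {
    // If OFFSET comes back from a session variable as a String, "0" + 25
    // yields "025" (concatenation), so the numeric offset never really grows.
    // Parse it to an int before adding the page size.
    public static int advance(Object offsetVar, int pageSize) {
        int offset = (offsetVar == null) ? 0 : Integer.parseInt(offsetVar.toString());
        return offset + pageSize;
    }
}
```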
The "session" POST variable changes for each search
I recorded search results and created a scraping session, but the "session" POST variable changes for each search. For example, the value "9046" will not work for the next search, and since there are many users on the site the next number is not predictable. I tried removing the "session" row, but that did not work.
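The usual approach is to scrape the page containing the search form first, capture the current "session" value with an extractor pattern into a session variable, and reference that variable in the POST parameters instead of the recorded "9046". As a plain-Java sketch of the extraction step, assuming the token appears as a hidden input field (the exact markup here is a guess):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SessionToken {
    // Pull the fresh "session" value out of the search form's HTML so each
    // POST can carry the current token instead of a recorded one.
    public static String extractSession(String html) {
        Matcher m = Pattern.compile("name=\"session\"\\s+value=\"(\\d+)\"").matcher(html);
        return m.find() ? m.group(1) : null;
    }
}
```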
Received "An error occurred while preparing to issue the HTTP request: null"
I read the forum topic "An error occurred while preparing to issue the HTTP request: null" posted 6/24/09 but can't seem to fix a similar problem. I have an extractor pattern that grabs HTML pieces and saves them into session variables. The saved session variables are then inserted into the URL.
The log is as follows:
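One common cause of this "null" error is that a session variable inserted into the URL was never actually set, or that the extracted text contains characters that are illegal in a URL. A defensive plain-Java sketch of the URL-building step (all names are illustrative):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class SafeUrl {
    // Building a URL from extracted HTML pieces fails if a piece is null or
    // contains characters like spaces; null-check and URL-encode each value first.
    public static String buildUrl(String base, String param, String value) {
        if (value == null) {
            throw new IllegalArgumentException("missing session variable: " + param);
        }
        try {
            return base + "?" + param + "=" + URLEncoder.encode(value, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e);  // UTF-8 is always available
        }
    }
}
```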
Scraping Ajax inner HTML
I'm trying to scrape a site whose second menu fetches its options with ajax based on the selection in the preceding drop-down. I've read what has been written on the subject in earlier posts, but I don't really understand how I can pick up the responses in a scrapeable file. This is what the ajax part looks like:
// If ajax support
if (ajax) {
    // leave only one element as an option, exclude the rest
    document.formSearch.districtList.options.length = 1;
    idOpcao = document.getElementById("opcoes");
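Rather than executing the JavaScript, the usual route is to find the URL the ajax call requests (via a proxy session or by reading the JS source) and scrape that URL directly as its own scrapeable file, then extract values from its raw response. Assuming, purely for illustration, that the response carries HTML option elements, pulling the values out could look like:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AjaxOptions {
    // Extract the value attributes from <option> elements in an ajax response.
    // The response format is an assumption; adapt the pattern to what the
    // endpoint actually returns (it may be plain text, XML, or JSON).
    public static List<String> parseOptions(String response) {
        List<String> values = new ArrayList<>();
        Matcher m = Pattern.compile("<option value=\"([^\"]+)\">").matcher(response);
        while (m.find()) {
            values.add(m.group(1));
        }
        return values;
    }
}
```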
Link not located in the response in screen-scraper
Hi
I have a problem. When I access a web page from IE there is a link around a header. When I access it from screen-scraper the page does not contain the link. A lot of different .js files are loaded after the main page. How can I get the link to be accessible in SS?
Hans
session.isRunning() problem
I am trying to use the session.isRunning() method, but cannot get it to work. When I terminate a session in the workbench and later print out the following:
s = "" + session.isRunning();
session.logError(s);
the program writes true, even though I did terminate the session, and it should be false.
Is this a bug, or an error on my side?
Hans
Attempt to invoke method: get() on undefined variable or class name:
I thought I would find lots of examples of why this error occurs, but I could not find any hints in the archive. I'm trying to build a memory-efficient Next page method using:
if (dataRecord.get("NEXT_URL") != null)
{
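The error message usually means dataRecord is not defined in the context where the script runs; it only exists for scripts triggered on a pattern match. (A related pitfall in this kind of next-page check is a trailing semicolon after the if condition, as in `if (...);`, which ends the statement so the block beneath it always runs.) With a plain Map standing in for screen-scraper's dataRecord, a defensive version of the check could look like:

```java
import java.util.Map;

public class NextUrlCheck {
    // A Map stands in for screen-scraper's dataRecord here. Guard against the
    // record being absent before calling get(), and make sure no semicolon
    // separates the `if` condition from its block.
    public static String nextUrl(Map<String, Object> dataRecord) {
        if (dataRecord != null && dataRecord.get("NEXT_URL") != null) {
            return dataRecord.get("NEXT_URL").toString();
        }
        return null;
    }
}
```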
dataReceivers and lazy scrapes
Remote scraping sessions in lazy mode drop the connection between client and server after the scrape is started, which makes it impossible to pass session variables back and forth. Does this severed connection also affect the use of the dataReceiver/session.sendDataToClient() interface?