screen-scraper public support
screen-scraper in a GUI-less environment: how do I set it to show only the fourth level of logging?
Hi guys
I'm running screen-scraper in a GUI-less environment, and by default it generates a huge log file at the lowest level of logging (logDebug), the same as in the GUI environment.
How can I change this to the fourth level of logging, logError, in the GUI-less environment?
Thx.
Max
Read an extractor pattern, check for a condition, then pass control to a second extractor pattern?
I am using the screen-scraper Basic Edition V4.5.
My project uses a site that takes a stock ticker symbol and brings up a screen with a list of dividend info and then a list of earnings info in one scrapeable file. Both lists are sorted by date. I have 2 extractor patterns (1 for dividends, 1 for earnings). Each pattern has a script that writes the data to a .csv file "After each pattern application" if it meets my selection criteria. When the entire file is scraped I read a new ticker from a .csv file.
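The "write the row only if it meets my selection criteria" step can be sketched in plain Java. This is a minimal sketch, not screen-scraper's API: the cutoff date, the CSV path, and the column values are hypothetical, and inside an actual "After each pattern application" script the values would come from dataRecord rather than method arguments.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.time.LocalDate;

public class DividendFilter {
    // Hypothetical selection criterion: keep only records on or after this date.
    static final LocalDate CUTOFF = LocalDate.of(2010, 1, 1);

    // Returns true when a record passes the selection criteria.
    static boolean meetsCriteria(String isoDate) {
        return !LocalDate.parse(isoDate).isBefore(CUTOFF);
    }

    // Appends one CSV row; in screen-scraper this would run once per
    // pattern match, with values pulled from the extracted data record.
    static void appendRow(String path, String ticker, String isoDate, String amount)
            throws IOException {
        try (FileWriter out = new FileWriter(path, true)) {
            out.write(ticker + "," + isoDate + "," + amount + "\n");
        }
    }

    public static void main(String[] args) throws IOException {
        if (meetsCriteria("2011-03-15")) {
            appendRow("dividends.csv", "IBM", "2011-03-15", "0.65");
        }
        System.out.println(meetsCriteria("2011-03-15")); // true
        System.out.println(meetsCriteria("2009-12-31")); // false
    }
}
```

To hand control between the two patterns, the same boolean could instead be stored in a session variable that a later script checks before applying the second pattern.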
Dynamically Modifying the URL
I'm trying to figure out how to scrape a site with some sort of a sessionid embedded in its url path. For example:
http://www.abc.com/(fbyhzf45k12nudyhfpysps45)/jsp/user/SignOn.aspx
I wrote a script that is invoked after the file is scraped to retrieve that session ID and save it in a session variable:
// Grab the URL of the page that was just scraped.
String url = scrapeableFile.getCurrentURL();
// Pull out the token between the parentheses, e.g. "fbyhzf45k12nudyhfpysps45".
String sessionId = url.substring(url.indexOf('(') + 1, url.indexOf(')'));
// Store it so later scrapeable files can reference it.
session.setVariable("SESSIONID", sessionId);
My question now is: how do I use that SESSIONID value in the URLs of the succeeding scrapeable files?
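The extraction itself can be checked as plain Java. In screen-scraper, a session variable set this way is typically embedded in a later scrapeable file's URL with the ~#SESSIONID#~ token; the string-concatenation below only stands in for that substitution, and the NextPage.aspx path is hypothetical.

```java
public class SessionIdDemo {
    // Extracts the parenthesized session token from a URL.
    static String extractSessionId(String url) {
        return url.substring(url.indexOf('(') + 1, url.indexOf(')'));
    }

    public static void main(String[] args) {
        String url = "http://www.abc.com/(fbyhzf45k12nudyhfpysps45)/jsp/user/SignOn.aspx";
        String sessionId = extractSessionId(url);
        System.out.println(sessionId); // fbyhzf45k12nudyhfpysps45

        // In screen-scraper the next scrapeable file's URL would be written as
        //   http://www.abc.com/(~#SESSIONID#~)/jsp/user/NextPage.aspx
        // which the engine expands to the value stored by session.setVariable:
        String next = "http://www.abc.com/(" + sessionId + ")/jsp/user/NextPage.aspx";
        System.out.println(next);
    }
}
```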
Get [Binary Data] only on certain pages - advanced firewall?
Hi,
I am trying to get the product information from this site:
http://tiny.cc/EsFBA
Type1 is the main category and Type2 is the sub-category with the product list. The scraping session must log in, then scrape Type1 (main category) > Type2 (sub-category with product list) > product details.
AJAX Passing Sessions during scraping not working
I find that more sites are now using AJAX, which presents new challenges when trying to scrape.
I'm trying to scrape the following site:
http://www.marksandspencer.com/Red-Wine-Food-Wine/b/44097030?ie=UTF8&sor...
The error message was: Attempt to invoke method: setVariable() on undefined variable or class name
"The error message was: Attempt to invoke method: setVariable() on undefined variable or class name"
Anybody know what caused this?
/Max
swilsonmc are you still looking for help with screen scrape projects?
Invoking screen-scraper
Thank you for this opportunity. During the last three weeks, I have used screen-scraper to continuously gather price quotes for a particular stock that I may invest in. I am using a VB.Net executable called from Windows Task Scheduler at 6:00 a.m. (VB.Net source code attached.) The VB executable runs the initializing script in my screen-scraper 'QuotesFromScottrade' scraping session. (JavaScript attached.) Shortly after screen-scraper starts, the VB executable quits. Screen-scraper quits at 2 p.m.
Cannot capture entire dataset in database table
I am trying to scrape a page and write all of the data records captured in the dataset to a database while running a scraping session. I am using the same setup as in tutorials 2 and 5. I have set up the scraping session to run a script that calls some external .NET code to store the data in a database table. The problem is that only the very last record in the dataset makes it into the database table. I can successfully write the extracted data to a file, but I cannot get the same results into the database. Any ideas?
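A common cause of this symptom is inserting only the current data record in a script that runs once after the whole pattern is applied, instead of looping over every record in the dataset (or running the insert "After each pattern application"). The sketch below shows the looping shape in plain, self-contained Java; the column names NAME and PRICE and the in-memory "table" are hypothetical stand-ins for the real database insert, and in screen-scraper the loop would walk the DataSet's records rather than a hand-built list.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DataSetInsertDemo {
    // Stand-in for the database table; a real script would issue one
    // INSERT per record (e.g. via the external .NET code or JDBC).
    static final List<String> table = new ArrayList<>();

    static void insertRecord(Map<String, String> record) {
        table.add(record.get("NAME") + "|" + record.get("PRICE"));
    }

    public static void main(String[] args) {
        // Simulated extracted dataset of three records.
        List<Map<String, String>> dataSet = new ArrayList<>();
        String[][] rows = {{"Widget", "1.99"}, {"Gadget", "4.50"}, {"Gizmo", "3.25"}};
        for (String[] row : rows) {
            Map<String, String> rec = new HashMap<>();
            rec.put("NAME", row[0]);
            rec.put("PRICE", row[1]);
            dataSet.add(rec);
        }
        // Insert EVERY record, not just the last one.
        for (Map<String, String> record : dataSet) {
            insertRecord(record);
        }
        System.out.println(table.size()); // 3
    }
}
```

If the insert script is instead set to run after each pattern application, each matched record triggers its own insert and no loop is needed.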