Losing Sessions/ScrapeableFiles on Closing the Workbench
So apparently I've run into the next problem. It doesn't immediately stop me from working, but it's nonetheless a major annoyance.
Whenever I close the workbench, screen-scraper seems to delete/lose all my scraping sessions and their associated scrapeable files. When I reopen the workbench, only the scripts remain. I tried restoring from some of the database backups that had been created, but they seem to be affected too: after reopening screen-scraper, the scripts are the only things that show up.
You probably still know this from my last issue, but I am operating in a CentOS environment where I recently did a fresh installation of screen-scraper enterprise and updated to the latest version, 6.0.61a. The only thing I've done with it so far is import and test my projects one at a time.
Usually the server runs 24/7, so there was no reason to close screen-scraper until now. Count me surprised when I closed it today for the first time since the installation and ran into this issue.
Since the server is up 24/7,
Since the server is up 24/7, are you stopping the server before you open the workbench? You'll generally get an error if you try to have both the server and the workbench open at the same time, though there is a way around that. If you do have both running, that could cause an issue. Are you using the scripts that ship with screen-scraper to start/stop the server, or do you use another method?
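
For reference, on a Linux install those start/stop scripts are usually invoked from the screen-scraper directory, something like the following (the exact script name can vary by version, so check what's actually in your installation directory):

cd /path/to/screen-scraper
./server start
./server stop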
Sorry for the confusion -
Sorry for the confusion - by "server" I didn't mean the screen-scraper server application but the machine itself, which runs 24/7. Usually I just use the workbench and leave it open all the time. When I need to work with it, I connect to the machine via VNC.
Because we haven't automated our scrapes yet (something planned for later), there hasn't been much reason to use screen-scraper's server application.
Since the workbench is meant
Since the workbench is meant for editing scrapes, something like this can happen if there is a problem in the internal database. The best way I can think of to prevent it would be to make sure the workbench gets closed, and to run the scrape from the command line or the server. The command line is easy if you don't want to run in server mode. This would be in the screen-scraper directory:
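
If I recall the command-line options correctly, it looks something like this on Linux, using the JRE bundled with screen-scraper (the session name below is just a placeholder for one of your own scraping sessions; double-check the flags against the command-line documentation for your version):

cd /path/to/screen-scraper
jre/bin/java -jar screen-scraper.jar -s "My Scraping Session"

Running it this way means the internal database is only ever opened by one process at a time, which avoids the kind of corruption that can lose sessions.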