running scripts in the workbench in v4.5

I was previously able to run scripts by right-clicking on one and choosing "Run Script". I can't seem to find a similar option anywhere - was it removed?

thx
Joshua

You are right: in order to simplify the design and internal structure of screen-scraper, version 4 removed the "Run Script" option.

The preferred (and more scrape-centric) way of running projects is to have a scraping session which can optionally call an initialization script "before session begins". This way, the control flow is always centered on the scraping session itself.

I assume you mean 4.5, because it was working just fine in 4.0!

The reason I ask is because in my current setup, I first run an initialization script. This script creates the runnableScrapingSession, potentially initializes a couple of variables, then calls runnableScrapingSession.scrape().

After looking into this a bit, it would appear that this used to be the only way to run things via the command line. Unfortunately, I take it this means I have to change virtually all of my scraping sessions now? The worst part is I only initialize a couple variables in those scripts, but they're kind of important...

Hrm... there's no secret undocumented macro feature is there? ;)

Well, no macros, no.

The workbench has never really been a great place for running a dynamically created scraping session; it's designed around the main "Run scraping session" button.

Our normal approach is simply to put an "Initialize" script on the scraping session, set to run "Before" the session begins, and make normal calls to session.setVariable("var", "value");. Your approach is very much the desired one for running from the command line, though there are several ways to accomplish the same thing. (For instance, you can pass a URL GET-like string to the session when invoking the screen-scraper jar file, which will auto-load simple string/number session variables.)
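To illustrate what that GET-like string amounts to: it's just key=value pairs joined with "&", each becoming a simple string session variable. The parsing below is only a sketch to show the idea (screen-scraper's actual parsing is internal, and the variable names are made up):

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: shows how a GET-like string such as
// "city=Boston&maxPages=3" maps onto simple string session variables.
public class ParamStringDemo {
    public static Map<String, String> parse(String paramString) {
        Map<String, String> vars = new HashMap<>();
        for (String pair : paramString.split("&")) {
            int eq = pair.indexOf('=');
            if (eq > 0) {
                // Everything before '=' is the variable name, everything after is the value.
                vars.put(pair.substring(0, eq), pair.substring(eq + 1));
            }
        }
        return vars;
    }

    public static void main(String[] args) {
        Map<String, String> vars = parse("city=Boston&maxPages=3");
        System.out.println(vars.get("city") + " / " + vars.get("maxPages"));
    }
}
```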

Honestly, I often put an Init script on the scrape and have it set test values for the required session variables. Then I guard each test-value assignment with a check to see whether the variable was already set (which could only have been done externally); if it was, the test-value assignment is skipped. This lets me run from the workbench with test values, and then externally with some controlling language (be it Java, Python, etc.) without altering anything within the scrape. That way the scrape can work independently (in the workbench) or with the assistance of an external controller.
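A minimal sketch of that guard pattern. In a real screen-scraper script the engine hands you the session object; here a HashMap-backed stub stands in so the logic can run on its own, and the variable names are just examples:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the "guarded test values" pattern. In a real screen-scraper
// script, "session" is provided by the engine; this stub only stands in
// so the guard logic is runnable on its own.
public class GuardedInitDemo {
    static class SessionStub {
        private final Map<String, Object> vars = new HashMap<>();
        public Object getVariable(String name) { return vars.get(name); }
        public void setVariable(String name, Object value) { vars.put(name, value); }
    }

    // The Init script body: only fill in a test value when the variable
    // wasn't already set by an external controller.
    public static void initScript(SessionStub session) {
        if (session.getVariable("SEARCH_TERM") == null) {
            session.setVariable("SEARCH_TERM", "test-widgets"); // workbench test value
        }
    }

    public static void main(String[] args) {
        // Run from the workbench: nothing set externally, so the test value is used.
        SessionStub workbench = new SessionStub();
        initScript(workbench);

        // Run from an external controller: the value was set before the scrape started,
        // so the guard leaves it alone.
        SessionStub external = new SessionStub();
        external.setVariable("SEARCH_TERM", "real-widgets");
        initScript(external);

        System.out.println(workbench.getVariable("SEARCH_TERM") + " / "
                + external.getVariable("SEARCH_TERM"));
    }
}
```

The same if-null guard works unchanged inside an actual Init script; only the stub goes away.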

As for emulating the whole right-click-and-run thing, I'd still suggest making that very script your "Before scraping session begins" initialization script.

If you create dynamic scrapes in the workbench, the "start" and "stop" buttons for the scrape will not work, because they're not tied to the workbench interface, only to the scraping engine behind the scenes. You could thus never force the session to stop scraping without closing down the entire program. And if you try to run another one before the first one is finished, the log output gets really screwy and the "last response" tabs are never reliable, since you don't know which of the hidden dynamic scrapes is being represented by the single scraping-session icon in the workbench. The two scrapes just sort of fumble around in memory, not tied to any controlling interface.

So... yeah. If part of that wasn't super clear, just let me know.

Tim