Speed script generation for new sites

Here are some things I would like to see to speed up the generation of a script.

1. Parameter Boundary Condition Generator

The parameters for a page tend to either come from a previous page or be fixed via a session variable from the start.

So, similar to how you compare a scrapeable file to a proxy transaction, I propose that you mouse over the parameter value and then right-click to get a pop-up menu.

The menu would have the following options:
Create Session Variable
Match Session Variable
Replace With Session Variable
Replace All Matching Parameters With Session Variable

Create Session Variable
==================
Prompt the user for a name and then create an entry of ~#FooBar#~; in turn, it creates a script titled "initialization" that holds the session variable and sets its value to what was on screen.
The script is also attached to the scrapeable file and set to run at the very start.
Also, internally store the initial value used so it can be referred to when the user does some operation on the same parameter value on the next page.
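The mechanics above could be sketched roughly as follows. This is purely illustrative: the function names and the in-memory bookkeeping are my own inventions, and the generated script body assumes the tool's scripts can call something like session.setVariable (the exact scripting API may differ).

```python
# Hypothetical sketch of what "Create Session Variable" might do internally.
# make_token, create_session_variable, and initial_values are illustrative
# names, not the tool's actual API.

initial_values = {}  # remember the original on-screen value per variable


def make_token(name: str) -> str:
    """Build the ~#Name#~ placeholder used in parameter values."""
    return f"~#{name}#~"


def create_session_variable(name: str, on_screen_value: str) -> str:
    """Record the initial value and emit an 'initialization' script body
    that would set the session variable before any scraping runs."""
    initial_values[name] = on_screen_value
    return f'session.setVariable("{name}", "{on_screen_value}");'


print(create_session_variable("FooBar", "12345"))
print(make_token("FooBar"))  # what replaces the value in the parameter list
```

The stored initial value is what lets the later "Replace All Matching" operation recognize the same parameter on subsequent pages.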

Match Session Variable
==================
Push the user to the "previous" page's Last Response; the user is then prompted with the first search result for the parameter value in the text area.
The user then highlights text to become a pattern; obviously the parameter value is going to become the session variable.
Now the program can either create the pattern in the previous page or simply store the pattern match in the clipboard for the user to paste wherever they want.
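A rough sketch of that pattern-building step, assuming an extractor-token syntax like ~@FooBar@~ for the matched value (the token syntax, function name, and the amount of surrounding context are all assumptions for illustration):

```python
# Sketch: locate the parameter value in the previous page's last response
# and build an extractor pattern with the value swapped for a token.

def build_pattern(last_response: str, param_value: str,
                  name: str, context: int = 10) -> str:
    """Wrap a little surrounding text around an extractor token placed
    where the first occurrence of param_value was found."""
    idx = last_response.find(param_value)
    if idx == -1:
        raise ValueError("parameter value not found in previous response")
    start = max(0, idx - context)
    end = min(len(last_response), idx + len(param_value) + context)
    return (last_response[start:idx]
            + f"~@{name}@~"
            + last_response[idx + len(param_value):end])


html = '<input type="hidden" name="token" value="12345">'
print(build_pattern(html, "12345", "FooBar"))
```

In the real feature the user's highlighted region would define the context, rather than a fixed character count.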

Replace With Session Variable
=======================
The popup displays a drop-down of all existing session variables, along with the current value if known.
The user selects one and the program enters ~#FooBar#~.

Replace All Matching Parameters With Session Variable
=======================================
Same as Replace With Session Variable, but automatically performs the operation for all matching parameter values in each scrapeable file.
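The bulk replacement could work something like this sketch. The data layout (a dict of scrapeable file name to its parameters) is an assumption purely for illustration; the real tool would walk its own internal model:

```python
# Sketch of "Replace All Matching Parameters With Session Variable":
# every parameter whose value equals the selected one gets the token.

def replace_all_matching(files: dict, value: str, name: str) -> dict:
    token = f"~#{name}#~"
    return {
        fname: {k: (token if v == value else v) for k, v in params.items()}
        for fname, params in files.items()
    }


files = {
    "002 Next":  {"sessionid": "ABC123", "page": "2"},
    "003 Final": {"sessionid": "ABC123", "submit": "Go"},
}
print(replace_all_matching(files, "ABC123", "SESSIONID"))
```

Matching on the stored initial value (from Create Session Variable) is what makes this safe to run across every scrapeable file at once.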

2. Concept of an instant filter

The dropdown on the log screen could be enhanced to be interactive and affect the current display: if I change it to Debug, then only debug statements would be visible.

I realize this is a bit challenging because you must track each line's source (debug, error, etc.), but you could implement one simple improvement: add a dropdown option of
"Instant Warn" which scans the whole log and displays only the lines that start with a user-definable character, say "*".

This would be helpful when you want all the data but want to highlight just key warnings without scrolling through all the text.

* Warning Only 5 Records found, expecting 25 for page xyz
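The filter itself is simple; a minimal sketch (the function name and the marker default are just illustrative):

```python
# Minimal sketch of the "Instant Warn" filter: keep only log lines that
# start with a user-definable marker character (here "*").

def instant_filter(log_lines, marker="*"):
    return [line for line in log_lines if line.lstrip().startswith(marker)]


log = [
    "Scraping file: 001 Home",
    "* Warning Only 5 Records found, expecting 25 for page xyz",
    "Scraping file: 002 Next",
]
for line in instant_filter(log):
    print(line)
```

Because the marker is user-definable, you can adopt whatever convention you already use in your own warning messages.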

3. Allow for subfolders, to create neater organization

I generally store my scripts for one job in a folder, but I would like to organize work into something like:

- Current Work
  + FaceBook Scrape
    + FaceBook Scrape Scripts

- Previous Work
  + Twitter Scrape
    + Twitter Scrape Scripts
  + SocialText Scrape
    + SocialText Scrape Scripts

4. Generate a better initial scrapeable file name based on the target, not on the source name
FooBar XYZ -- 001 Home
FooBar XYZ -- 002 Next
FooBar XYZ -- 003 Final

Take the title of the selected scrapeable file container/target as the prefix, instead of the proxy source name.

I make all my entries have a prefix so they sort in order.
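The naming scheme above could be generated like this sketch: the target's title plus a zero-padded counter, so entries always sort in creation order (function and variable names are illustrative only):

```python
# Sketch: build scrapeable file names from the selected target's title
# plus a zero-padded sequence number.

def next_name(target_title: str, existing: list, page_label: str) -> str:
    """Count how many names already share this prefix and append
    the next three-digit number and a page label."""
    count = sum(1 for n in existing if n.startswith(target_title))
    return f"{target_title} -- {count + 1:03d} {page_label}"


names = []
for label in ["Home", "Next", "Final"]:
    names.append(next_name("FooBar XYZ", names, label))
print(names)
# → ['FooBar XYZ -- 001 Home', 'FooBar XYZ -- 002 Next', 'FooBar XYZ -- 003 Final']
```

Zero-padding to three digits keeps alphabetical order identical to numeric order up to 999 files.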