Alpha Change Log


Alpha versions are used to test minor bug fixes and feature enhancements before they are added to stable versions. As such, anything in an alpha version is prone to change and instability while it is being improved. This log follows those changes as they are made, for your convenience.

View Release Notes for public versions.

Alpha Version Logs


  • Added a new "Apply Whitespace Tidying" option to the Advanced tab of an Extractor Pattern. Disabling it runs the extractor pattern, with its newlines preserved, against the last response with its newlines preserved.
  • Improved the table parsing code.
  • Updated the internal database to a newer version.
  • Added a proxy import/export feature (also supports importing from a Charles proxy export as an XML session export).
  • Updated the user agent CSV file, and added methods to get a user agent by browser type (as well as by how recent it is). The methods use the file but with an optimized search approach rather than a database.
  • Minor cosmetic changes (spacer after CPU use in web interface).
  • Now prompts on close if a scrape is still running.
  • Updated the completions to show NotNull and Nullable when present for methods and parameters, and fixed some rare bugs with Javadoc parsing.
  • Updated so clients download new Javadocs from the server after a new alpha release.
  • Updated RemoteScrapingSession so any object type can be set with setVariable or retrieved with getVariable. This can't be used with custom script classes, and it only returns a copy of the value, not a reference to it.
  • Fixed a bug with cookies in the Async client and the saveStateToString method (the domain wasn't being saved properly when using the Async client), and added additional cookie information when saving to a file.
  • Updated the default user agent string to the current Chrome version.
  • Updated icons.
  • Fixed a bug with getCurrentUrl on ScrapeableFile when using the Async Client with GET parameters (previously returned the URL without the parameters).
  • Added support for importing multiple files simultaneously.
  • Added the ability to specify POST data when generating a ScrapeableFile from a URL.
  • Added ability to export "logs" folder to a zip archive.
  • Notifications when a ScrapeableFile's response can't be text-wrapped can now be suppressed by setting the "HideTextWrapNotifications" property to "true" in the file.
  • Fixed an issue where RemoteScrapingSessions would not stop properly in Professional edition.
  • Added a checkbox on the "Advanced" tab of ScrapeableFiles that allows you to disable syntax highlighting in the "Last Response" tab.
  • Added an option under the "File" menu to open the screen-scraper install directory.
  • Added the ability to import/export screen-scraper folders.
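The getVariable change above returns a copy of the stored value rather than a reference. As an illustration of why that distinction matters, the sketch below deep-copies a value by serializing it; the serialization round-trip is a hypothetical mechanism (screen-scraper's internal copying may work differently), but it shows that mutating the retrieved copy leaves the stored original untouched.

```java
import java.io.*;
import java.util.HashMap;

// Illustrates copy (not reference) semantics like the getVariable change above.
// The serialization round-trip is a hypothetical mechanism, not
// screen-scraper's actual implementation.
public class CopySemanticsDemo {

    // Deep-copy any Serializable value by round-tripping it through a byte stream.
    @SuppressWarnings("unchecked")
    public static <T extends Serializable> T deepCopy(T value) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(value);
            }
            try (ObjectInputStream in =
                     new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
                return (T) in.readObject();
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        HashMap<String, String> stored = new HashMap<>();
        stored.put("city", "Provo");

        HashMap<String, String> retrieved = deepCopy(stored);
        retrieved.put("city", "Lehi"); // mutating the copy...

        System.out.println(stored.get("city")); // ...leaves the original intact: prints "Provo"
    }
}
```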


  • Added a new tidier (JSoup) and set it as the default.
  • Added session.setCustomScriptVariable, which sets a variable that will be available in every script called for a session.
  • Added session.setAutoCloseAfterScrapeEnds(Closeable closeable).


  • Fixed a bug where cookie Value/Domain values were improperly saved when session.saveStateToString was called.
  • Minor bug fix related to cookie handling.


  • Bug fix for proxying a site that POSTs with no entity (Content-Length of 0).
  • Added support in the DataManager for the new Java 8 time classes.
  • Added the request method to the compare proxy transaction window.
  • The Default Tokens (named) now show up in the edit token popup as a dropdown (Professional edition and above).
  • Fixed a bug where multiple confirm overwrite dialog boxes would show when importing multiple files.
  • Added the "extractorPatternName" implicit variable which contains the name of the extractor pattern that invoked the script.
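The DataManager's support for the Java 8 time classes presumably maps them onto the java.sql types a JDBC driver expects. The standard JDK conversions for that mapping look like this (a sketch of what such support typically performs, not screen-scraper's internal code):

```java
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;

// Standard JDK bridges from the Java 8 time classes to the java.sql types a
// JDBC driver expects. A sketch of the mapping, not screen-scraper's code.
public class TimeConversionDemo {
    public static java.sql.Date toSqlDate(LocalDate date)             { return java.sql.Date.valueOf(date); }
    public static java.sql.Time toSqlTime(LocalTime time)             { return java.sql.Time.valueOf(time); }
    public static java.sql.Timestamp toSqlTimestamp(LocalDateTime dt) { return java.sql.Timestamp.valueOf(dt); }

    public static void main(String[] args) {
        System.out.println(toSqlDate(LocalDate.of(2016, 3, 14))); // prints 2016-03-14
        System.out.println(toSqlTime(LocalTime.of(9, 26, 53)));   // prints 09:26:53
    }
}
```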


  • The tidier will automatically be set to JSON when generating a ScrapeableFile from a proxy transaction if the Content-Type of the response is JSON.
  • Changed the default HTTP client to the Ning Async client.
  • Added import support for Charles JSON session, which includes the notes field for the request.
  • Bug fix so files are downloaded lazily when using session.downloadFile.
  • Updated so filtered proxy transactions scroll much more smoothly in the workbench.
  • Fixed a proxy transaction comparison view issue when the scrapeable file had 2 of the same parameter key but the proxy only had 1.
  • Updated the Async client so that if a scraping session is stopped, any active requests are also stopped (large file downloads won't run to completion).
  • Added support for passing a java.util.Date object to the data manager for a java.sql.Date or java.sql.Time.
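The java.util.Date support above likely relies on the millisecond-based java.sql constructors, since java.sql.Date, Time, and Timestamp are all thin wrappers around an epoch-millis value. A minimal sketch (the helper names are illustrative, not the data manager's actual API):

```java
// java.sql.Date, Time, and Timestamp all wrap an epoch-millis value, so a
// java.util.Date converts via getTime(). Helper names are illustrative,
// not the data manager's actual API.
public class SqlDateBridge {
    public static java.sql.Date toSqlDate(java.util.Date d)           { return new java.sql.Date(d.getTime()); }
    public static java.sql.Time toSqlTime(java.util.Date d)           { return new java.sql.Time(d.getTime()); }
    public static java.sql.Timestamp toSqlTimestamp(java.util.Date d) { return new java.sql.Timestamp(d.getTime()); }

    public static void main(String[] args) {
        java.util.Date now = new java.util.Date();
        System.out.println(toSqlTimestamp(now)); // same instant, in java.sql.Timestamp form
    }
}
```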


  • session.downloadFile now streams the file data to the output file rather than downloading to memory and then writing to disk afterwards.
  • The Async client now properly requests gzipped content on redirects.
  • Fixed file handle leak when viewing logs in the web interface.
  • Added Async client v2 as an HTTP client option.
  • Added XML tidier.
  • Various other updates related to JavaScript and PDF parsing.


  • Added the protocol to the compare proxy transaction window.
  • Updated underlying libraries for navigation actions.


  • Added neko-htmlunit.jar.
  • User agent string update.
  • Added a callback time for session.scrapeString.
  • Updated a few jar files.


  • Updated for Java 9 stability.
  • Fixed various thread-related issues.
  • Updates and bug fixes in the DataManager.
  • Added a cURL HTTP client (cURL must be installed on the machine).


  • Updated DataManager to work with Sybase.
  • Updated class loader for backward compatibility.
  • Updated runonce.script file to handle intermittent issue.


  • Reduced Data Manager lock contention in multithread mode.
  • Workbench log improvements.
  • Optimized Async HTTP Client to run better with a large number of active concurrent scrapes.
  • Added convenience methods to sutil for working with JSON objects: getJSONObject, getJSONString, and getJSONArray.
  • Added sutil.alphabetLoopWithSubdivision (adds a built-in ability to do an alphabet loop search with subdivision).
  • Various other minor bug fixes and improvements.
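For context on the alphabetLoopWithSubdivision entry above, the technique works roughly like this: search each term "a" through "z", and whenever a term returns more results than the site will display, subdivide it ("s" becomes "sa" through "sz") and recurse. The sketch below is a generic illustration of that idea; the class, method names, and countResults callback are hypothetical, not sutil's actual signature.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Generic illustration of an alphabet loop search with subdivision.
// Names and signatures are hypothetical, not sutil's actual API.
public class AlphabetLoopDemo {
    private static final String ALPHABET = "abcdefghijklmnopqrstuvwxyz";

    // Returns the search terms that are safe to scrape directly
    // (i.e. each returns at most maxResults results).
    public static List<String> termsToScrape(Function<String, Integer> countResults, int maxResults) {
        List<String> terms = new ArrayList<>();
        for (char c : ALPHABET.toCharArray()) {
            collect(String.valueOf(c), countResults, maxResults, terms);
        }
        return terms;
    }

    private static void collect(String term, Function<String, Integer> countResults,
                                int maxResults, List<String> terms) {
        if (countResults.apply(term) <= maxResults) {
            terms.add(term); // few enough results: scrape this term directly
        } else {
            for (char c : ALPHABET.toCharArray()) {
                collect(term + c, countResults, maxResults, terms); // too many: subdivide
            }
        }
    }

    public static void main(String[] args) {
        // Simulated search where only "a" exceeds the 100-result cap.
        List<String> terms = termsToScrape(t -> t.equals("a") ? 500 : 10, 100);
        System.out.println(terms.size()); // "a" is replaced by "aa".."az"
    }
}
```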


  • Bug fix for completions popup in the workbench.
  • Added sutil.runMultipleSessionsConcurrentlyForInput to allow for easily running multiple threads of a scrape to share workload.
  • Added the com.screenscraper.util.suppliers package with basic supplier classes for common uses with runMultipleSessionsConcurrentlyForInput.
  • Added support for the 308 status code (redirect with the same POST payload) to the HTTP clients.
  • Updated to latest HtmlUnit version.
  • Added data manager convenience method to track record counts by save type: dm.setBasicAutoIncrementRecordsByWriteType.
  • Bug fix to automatic anonymization proxy validation.
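Regarding the 308 entry above: per RFC 7538, a 308 Permanent Redirect (like a 307) must be replayed with the original method and request body, whereas clients conventionally rewrite a redirected POST as a GET for 301/302/303. A minimal sketch of that decision (illustrative; not screen-scraper's actual client code):

```java
// Sketch of the redirect-method decision an HTTP client makes; illustrative,
// not screen-scraper's actual client code.
public class RedirectPolicy {
    // True when the redirect must be replayed with the original method and POST payload.
    public static boolean preservesMethodAndBody(int statusCode) {
        return statusCode == 307 || statusCode == 308;
    }

    // The method to use when following a redirect with the given status code.
    public static String redirectMethod(int statusCode, String originalMethod) {
        if (preservesMethodAndBody(statusCode)) {
            return originalMethod; // 307 Temporary / 308 Permanent Redirect
        }
        return "GET";              // 301/302/303: conventionally rewritten to GET
    }

    public static void main(String[] args) {
        System.out.println(redirectMethod(308, "POST")); // prints POST
        System.out.println(redirectMethod(302, "POST")); // prints GET
    }
}
```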